WorldWideScience

Sample records for standard commercial software

  1. Prediction of ice accretion and anti-icing heating power on wind turbine blades using standard commercial software

    International Nuclear Information System (INIS)

    Villalpando, Fernando; Reggio, Marcelo; Ilinca, Adrian

    2016-01-01

    An approach to numerically simulate ice accretion on 2D sections of a wind turbine blade is presented. The method uses standard commercial ANSYS-Fluent and Matlab tools. The Euler-Euler formulation is used to calculate the water impingement on the airfoil, and a UDF (User Defined Function) has been devised to turn the airfoil's solid wall into a permeable boundary. Mayer's thermodynamic model is implemented in Matlab for computing ice thickness and for updating the airfoil contour. A journal file is executed to systematize the procedure: meshing, droplet trajectory calculation, thermodynamic model application for computing ice accretion, and the updating of airfoil contours. The proposed ice prediction strategy has been validated using iced airfoil contours obtained experimentally in the AMIL refrigerated wind tunnel (Anti-icing Materials International Laboratory). Finally, a numerical prediction method has been generated for anti-icing assessment, and its results are compared with data obtained in this laboratory. - Highlights: • A methodology for ice accretion prediction using commercial software is proposed. • Euler model gives better prediction of airfoil water collection with detached flow. • A source term is used to change from a solid wall to a permeable wall in Fluent. • Energy needed for ice-accretion mitigation system is predicted.
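
    To illustrate the kind of quasi-steady accretion step this record describes, the sketch below grows ice thickness from an impinging water flux scaled by the freezing fraction that a thermodynamic model of the kind mentioned above would return. It is not the authors' implementation; the collection efficiencies, freezing fractions, liquid water content and airspeed are hypothetical placeholder values.

```python
# Illustrative quasi-steady ice accretion step (hypothetical values, not the
# authors' code): thickness grows from the impinging water mass flux scaled by
# the freezing fraction supplied by a thermodynamic (energy balance) model.
RHO_ICE = 917.0      # ice density, kg/m^3
LWC = 0.4e-3         # liquid water content, kg/m^3 (assumed)
V_INF = 70.0         # free-stream speed, m/s (assumed)

def ice_growth(beta, freezing_fraction, dt):
    """Ice thickness added on one surface panel over time step dt [s]."""
    water_flux = beta * LWC * V_INF            # impinging mass flux, kg/(m^2 s)
    ice_flux = freezing_fraction * water_flux  # portion that actually freezes
    return ice_flux * dt / RHO_ICE             # thickness increment, m

# One accretion cycle over a few hypothetical panels of the airfoil contour.
panels = [{"beta": 0.55, "f": 0.8}, {"beta": 0.30, "f": 0.9}, {"beta": 0.05, "f": 1.0}]
for p in panels:
    p["dh_mm"] = 1e3 * ice_growth(p["beta"], p["f"], dt=60.0)
print([round(p["dh_mm"], 3) for p in panels])  # mm of ice per panel after 60 s
```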

  2. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation... computer software or commercial computer software documentation was obtained. (b) If the Government has a...

  3. Standard software for CAMAC

    International Nuclear Information System (INIS)

    Lenkszus, F.R.

    1978-01-01

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  4. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Commercial computer software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition....7202 Commercial computer software and commercial computer software documentation. ...

  5. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  6. Commercial off-the-shelf software dedication process based on the commercial grade survey of supplier

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lee, J. S.; Chon, S. W.; Lee, G. Y.; Park, J. K.

    2000-01-01

    The Commercial Off-The-Shelf (COTS) software dedication process can apply a combination of methods, like the hardware commercial grade item dedication process. In general, these methods are: method 1 (special test and inspection), method 2 (commercial grade survey of supplier), method 3 (source verification), and method 4 (acceptable supplier/item performance record). In this paper, the suggested procedure-oriented dedication process for COTS software, based on method 2, is consistent with EPRI/TR-106439 and NUREG/CR-6421 requirements. An additional tailoring policy based on codes and standards related to COTS software may also be found in the suggested commercial software dedication process. The suggested commercial software dedication process has been developed for a commercial I and C software dedicator who performs COTS qualification according to the dedication procedure

  7. Network Externality and Commercial Software Piracy

    OpenAIRE

    Sougata Poddar

    2005-01-01

    Contrary to earlier findings on end-user piracy, where the existence of strong network externality was shown to be a reason for allowing limited piracy, we find that when the piracy is commercial in nature the optimal policy for the original software developer is to protect its product, irrespective of the strength of network externality in the software users' market.

  8. Future of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.

  9. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  10. Integrating commercial software in accelerator control- case study

    International Nuclear Information System (INIS)

    Pace, Alberto

    1994-01-01

    Using existing commercial software is the dream of any control system engineer, since the development cost reduction can reach one order of magnitude. This dream often vanishes when a uniform and consistent architecture is required across a large number of components and applications, as it is difficult to integrate several commercial packages that often impose different user interface and communication standards. This paper describes the approach and standards chosen for the CERN ISOLDE control system, which have allowed several commercial packages to be integrated into the system as they are, keeping the software development cost to a minimum. (author). 10 refs., 2 tabs., 9 figs

  11. The impact of commercial open source software on proprietary software producers and social welfare

    OpenAIRE

    Xing, Mingqing

    2014-01-01

    Purpose: A growing amount of commercial open source software, based on free open source software, is appearing in many segments of the software market. The purpose of this study is to investigate how commercial open source software affects the proprietary software producer’s pricing (market share or profit), consumer surplus and social welfare. Design/methodology: To analyze the impact of commercial open source software on the proprietary software producer, this study constructs two vertical-differentiation m...

  12. 48 CFR 27.405-3 - Commercial computer software.

    Science.gov (United States)

    2010-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  13. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  14. Financial Planning in Transit : Use of Commercially Available Microcomputer Software

    Science.gov (United States)

    1983-11-01

    This report addresses the potential of using commercially available microcomputer software for transit financial planning activities. Discussions with transit operators identified the need for inexpensive, easy to use software for ridership and fare ...

  15. Regional vegetation management standards for commercial pine ...

    African Journals Online (AJOL)

    Although the understanding gained from these trials allowed for the development of vegetation management standards, their operational and economic viability need to be tested on a commercial basis. Four pine trials were thus initiated to test the applicability of these standards when utilised on a commercial scale. Two of ...

  16. Diversification and Challenges of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1994-01-01

    The author poses certain questions in this paper: 'In the future, should there be just one software engineering standards set? If so, how can we work towards that goal? What are the challenges of internationalizing standards?' Based on the author's personal view, the statement of his position is as follows: 'There should NOT be just one set of software engineering standards in the future. At the same time, there should NOT be the proliferation of standards, and the number of sets of standards should be kept to a minimum. It is important to understand the diversification of the areas which are spanned by the software engineering standards.' The author goes on to describe the diversification of processes, the diversification in the national and international character of standards organizations, the diversification of the professional organizations producing standards, the diversification of the types of businesses and industries, and the challenges of internationalizing standards.

  17. A company perspective on software engineering standards

    International Nuclear Information System (INIS)

    Steer, R.W.

    1988-01-01

    Software engineering standards, as implemented via formal policies and procedures, have historically been used in the nuclear industry, especially for codes used in the design, analysis, or operation of the plant. Over the past two decades, a significant amount of software has been put in place to perform these functions, while the overall software life cycle has become better understood, more and different computer systems have become available, and industry has become increasingly aware of the advantages gained when these procedures are used in the development and maintenance of this large amount of software. The use of standards and attendant procedures is thus becoming increasingly important as more computerization is taking place, both in the design and the operation of the plant. It is difficult to categorize software used in activities related to nuclear plants in a simple manner. That difficulty is due to the diversity of those uses, with attendant diversity in the methods and procedures used in the production of the software, compounded by a changing business climate in which significant software engineering expertise is being applied to a broader range of applications on a variety of computing systems. The use of standards in the various phases of the production of software thus becomes more difficult as well. This paper discusses the various types of software and the importance of software standards in the development of each of them

  18. Freeware Versus Commercial Office Productivity Software

    Science.gov (United States)

    2016-12-01

    proprietary system to open source” (Vaughan-Nichols, 2009). It is reported that “98% of enterprise-level companies use open source software offerings ... functionality of cloud computing’s five characteristics in a DOD environment. H. SUMMARY: With the use of any available OSS or proprietary software ... non-government agencies, and civilian companies. To navigate this new and open environment, the DOD can no longer rely on closed productivity systems

  19. Software Development Standard for Mission Critical Systems

    Science.gov (United States)

    2014-03-17

    6.2 for the OCD DID identifier. 5.3.3 System Requirements Definition: 1. Based on the analysis of user needs, the operational concepts, and other ... Aerospace Report No. TR-RS-2015-00012, Software Development Standard for Mission Critical Systems, March 17, 2014, Richard J. Adams, Suellen ... Contract number FA8802-14-C-0001

  20. Laboratory Connections. Commercial Interfacing Packages: Part II: Software and Applications.

    Science.gov (United States)

    Powers, Michael H.

    1989-01-01

    Describes the titration of a weak base with a strong acid and subsequent treatment of experimental data using commercially available software. Provides a BASIC program for determining first and second derivatives of data input. Lists 6 references. (YP)

  1. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
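
    As an illustration of the kind of prediction-quality check such a protocol implies (it is not taken from the report), the sketch below scores a baseline model's predictions against metered energy use with two common goodness-of-fit metrics; the monthly values are hypothetical.

```python
import math

def cv_rmse(measured, predicted):
    """Coefficient of variation of the RMSE, in percent."""
    n = len(measured)
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    return 100.0 * rmse / (sum(measured) / n)

def nmbe(measured, predicted):
    """Normalized mean bias error, in percent."""
    n = len(measured)
    bias = sum(m - p for m, p in zip(measured, predicted)) / n
    return 100.0 * bias / (sum(measured) / n)

# Hypothetical monthly energy use (kWh): metered vs. model-predicted.
metered = [410, 395, 380, 360, 350, 345, 400, 420, 390, 375, 365, 405]
predicted = [400, 405, 370, 355, 360, 340, 390, 430, 395, 370, 360, 410]
print(f"CV(RMSE) = {cv_rmse(metered, predicted):.1f}%, NMBE = {nmbe(metered, predicted):.1f}%")
```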

  2. ESSCOTS for Learning: Transforming Commercial Software into Powerful Educational Tools.

    Science.gov (United States)

    McArthur, David; And Others

    1995-01-01

    Gives an overview of Educational Support Systems based on commercial off-the-shelf software (ESSCOTS), and discusses the benefits of developing such educational software. Presents results of a study that revealed the learning processes of middle and high school students who used a geographical information system. (JMV)

  3. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    OpenAIRE

    Jump, David

    2014-01-01

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing...

  4. Improvement of gamma calibration procedures with commercial management software

    International Nuclear Information System (INIS)

    Lucena, Rodrigo F.; Potiens, Maria da Penha A.; Santos, Gelson P.; Vivolo, Vitor

    2007-01-01

    In this work, the gamma calibration procedure of the Instruments Calibration Laboratory (LCI) of the IPEN-CNEN-SP was improved with the use of the commercial management software Autolab™ from Automa Company. That software was adapted for our specific use in the calibration procedures. The evaluation of the uncertainties in the gamma calibration protocol was improved by the LCI staff, and all the worksheets and the final calibration report layout were developed in commercial software such as Excel™ and Word™ from Microsoft™. (author)

  5. Improvement of gamma calibration procedures with commercial management software

    Energy Technology Data Exchange (ETDEWEB)

    Lucena, Rodrigo F.; Potiens, Maria da Penha A.; Santos, Gelson P.; Vivolo, Vitor [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mails: rodrigoifusp@yahoo.com.br; mppalbu@ipen.br; gpsantos@ipen.br; vivolo@ipen.br

    2007-07-01

    In this work, the gamma calibration procedure of the Instruments Calibration Laboratory (LCI) of the IPEN-CNEN-SP was improved with the use of the commercial management software Autolab™ from Automa Company. That software was adapted for our specific use in the calibration procedures. The evaluation of the uncertainties in the gamma calibration protocol was improved by the LCI staff, and all the worksheets and the final calibration report layout were developed in commercial software such as Excel™ and Word™ from Microsoft™. (author)

  6. Standardization of Software Application Development and Governance

    Science.gov (United States)

    2015-03-01

    Common Infrastructure • Common SOA Stack • Common Commercial Technologies • Re-use of Common Services • Pattern discovery and Template Development ... EXECUTIVE SUMMARY: Department of Defense (DOD) ... reducing overhead across programs. Consequently, service-oriented architecture (SOA) strategies may support the alignment to cloud computing, and

  7. AN EVALUATION OF FIVE COMMERCIAL IMMUNOASSAY DATA ANALYSIS SOFTWARE SYSTEMS

    Science.gov (United States)

    An evaluation of five commercial software systems used for immunoassay data analysis revealed numerous deficiencies. Often, the utility of statistical output was compromised by poor documentation. Several data sets were run through each system using a four-parameter calibration f...

  8. Dilemmas within Commercial Involvement in Open Source Software

    DEFF Research Database (Denmark)

    Ciesielska, Malgorzata; Westenholz, Ann

    2016-01-01

    Purpose – The purpose of this paper is to contribute to the literature about commercial involvement in open source software, the levels of this involvement and the consequences of attempting to mix various logics of action. Design/methodology/approach – This paper uses the case study approach based on mixed methods: literature reviews and news searches, electronic surveys, qualitative interviews and observations. It combines discussions from several research projects as well as previous publications to present the scope of commercial choices within open source software and their consequences. Findings – The findings show that higher levels of involvement in open source software communities pose important questions about the balance between economic, technological, and social logics, as well as the benefits of being autonomous, having access to collaborative networks and minimizing risks related...

  9. Sandia software guidelines. Volume 3. Standards, practices, and conventions

    Energy Technology Data Exchange (ETDEWEB)

    1986-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies software standards, conventions, and practices. These guidelines are the result of a collective effort within Sandia National Laboratories to define recommended deliverables and to document standards, practices, and conventions which will help ensure quality software. 66 refs., 5 figs., 6 tabs.

  10. Quench Simulation of Superconducting Magnets with Commercial Multiphysics Software

    CERN Document Server

    AUTHOR|(SzGeCERN)751171; Auchmann, Bernhard; Niiranen, Jarkko; Maciejewski, Michal

    The simulation of quenches in superconducting magnets is a multiphysics problem of highest complexity. Operated at 1.9 K above absolute zero, the material properties of superconductors and superfluid helium vary by several orders of magnitude over a range of only 10 K. The heat transfer from metal to helium goes through different transfer and boiling regimes as a function of temperature, heat flux, and transferred energy. Electrical, magnetic, thermal, and fluid dynamic effects are intimately coupled, yet live on vastly different time and spatial scales. While the physical models may be the same in all cases, it is an open debate whether the user should opt for commercial multiphysics software like ANSYS or COMSOL, write customized models based on general purpose network solvers like SPICE, or implement the physics models and numerical solvers entirely in custom software like the QP3, THEA, and ROXIE codes currently in use at the European Organisation for Nuclear Research (CERN). Each approach has its strengt...

  11. 78 FR 17875 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2013-03-25

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB59 Commercial Driver's License Testing and Commercial Learner's.... The 2011 final rule amended the commercial driver's license (CDL) knowledge and skills testing standards and established new minimum Federal standards for States to issue the commercial learner's permit...

  12. 77 FR 26989 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2012-05-08

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB02 Commercial Driver's License Testing and Commercial Learner's... effective on July 8, 2011. That final rule amended the commercial driver's license (CDL) knowledge and skills testing standards and established new minimum Federal standards for States to issue the commercial...

  13. An IMRT dose distribution study using commercial verification software

    International Nuclear Information System (INIS)

    Grace, M.; Liu, G.; Fernando, W.; Rykers, K.

    2004-01-01

    Full text: The introduction of IMRT requires users to confirm that the isodose distributions and relative doses calculated by their planning system match the doses delivered by their linear accelerators. To this end, the commercially available software VeriSoft TM (PTW-Freiburg, Germany) was trialled to determine if the tools and functions it offered would be of benefit to this process. The CMS Xio (Computer Medical System) treatment planning system was used to generate IMRT plans that were delivered with an upgraded Elekta SL15 linac. Kodak EDR2 film sandwiched in RW3 solid water (PTW-Freiburg, Germany) was used to measure the IMRT fields delivered with 6 MV photons. The isodoses and profiles measured with the film generally agreed with the planned doses to within ± 3% or ± 3 mm; in some regions (outside the IMRT field) the match fell to within ± 5%. The isodose distributions of the planning system and the film could be compared on screen, allowing electronic records of the comparison to be kept if so desired. The features and versatility of this software have been of benefit to our IMRT QA program. Furthermore, the VeriSoft TM software allows for quick and accurate, automated planar film analysis. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
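
    The ±3%/±3 mm agreement quoted above is a common composite criterion for planar dose comparison. The toy 1D check below is not the VeriSoft algorithm, only an illustration of the idea, and the profile values are entirely hypothetical.

```python
import numpy as np

def passes_3pct_3mm(measured, planned, positions_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """A measured point passes if, within dist_tol_mm of its position, some
    planned point has a dose within dose_tol of its local planned value
    (a simplified composite of dose difference and distance-to-agreement)."""
    flags = []
    for pos, d_meas in zip(positions_mm, measured):
        ok = np.any(
            (np.abs(planned - d_meas) <= dose_tol * planned)
            & (np.abs(positions_mm - pos) <= dist_tol_mm)
        )
        flags.append(bool(ok))
    return flags

x = np.arange(0.0, 50.0, 1.0)                         # positions, mm
planned = 100.0 * np.exp(-((x - 25.0) / 15.0) ** 2)   # hypothetical planned profile
measured = planned * (1.0 + 0.02 * np.sin(x / 5.0))   # hypothetical film readout
flags = passes_3pct_3mm(measured, planned, x)
print(f"{sum(flags)}/{len(flags)} points within 3%/3 mm")
```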

  14. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  15. Noise data management using commercially available data-base software

    International Nuclear Information System (INIS)

    Damiano, B.; Thie, J.A.

    1988-01-01

    A data base has been created using commercially available software to manage the data collected by an automated noise data acquisition system operated by Oak Ridge National Laboratory at the Fast Flux Test Facility (FFTF). The data base was created to store, organize, and retrieve selected features of the nuclear and process signal noise data, because the large volume of data collected by the automated system makes manual data handling and interpretation based on visual examination of noise signatures impractical. Compared with manual data handling, use of the data base allows the automatically collected data to be utilized more fully and effectively. The FFTF noise data base uses the Oracle Relational Data Base Management System implemented on a desktop personal computer

  16. 76 FR 26853 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2011-05-09

    ... Safety Administration 49 CFR Parts 383, 384 and 385 Commercial Driver's License Testing and Commercial... Administration 49 CFR Parts 383, 384, and 385 [Docket No. FMCSA-2007-27659] RIN 2126-AB02 Commercial Driver's License Testing and Commercial Learner's Permit Standards AGENCY: Federal Motor Carrier Safety...

  17. Software measurement standards for areal surface texture parameters: part 2—comparison of software

    International Nuclear Information System (INIS)

    Harris, P M; Smith, I M; Giusca, C; Leach, R K; Wang, C

    2012-01-01

    A companion paper in this issue describes reference software for the evaluation of areal surface texture parameters, focusing on the definitions of the parameters and giving details of the numerical algorithms employed in the software to implement those definitions. The reference software is used as a benchmark against which software in a measuring instrument can be compared. A data set is used as input to both the software under test and the reference software, and the results delivered by the software under test are compared with those provided by the reference software. This paper presents a comparison of the results returned by the reference software with those reported by proprietary software for surface texture measurement. Differences between the results can be used to identify where algorithms and software for evaluating the parameters differ. They might also be helpful in identifying where parameters are not sufficiently well-defined in standards. (paper)

  18. 75 FR 32983 - Commercial Driver's License (CDL) Standards: Exemption

    Science.gov (United States)

    2010-06-10

    ...-28480] Commercial Driver's License (CDL) Standards: Exemption AGENCY: Federal Motor Carrier Safety... commercial driver's license (CDL) as required by current regulations. FMCSA reviewed NAAA's application for... demonstrate alternatives its members would employ to ensure that their commercial motor vehicle (CMV) drivers...

  19. Anticipatory Standards and the Commercialization of Nanotechnology

    International Nuclear Information System (INIS)

    Rashba, Edward; Gamota, Daniel

    2003-01-01

    Standardization will play an increasing role in creating a smooth transition from the laboratory to the marketplace as products based on nanotechnology are developed and move into broad use. Traditionally, standards have evolved out of a need to achieve interoperability among existing products, create order in markets, simplify production and ensure safety. This view does not account for the escalating trend in standardization, especially in emerging technology sectors, in which standards working groups anticipate the evolution of a technology and facilitate its rapid development and entry to the marketplace. It is important that the nanotechnology community views standards as a vital tool to promote progress along the nanotechnology value chain - from nanoscale materials that form the building blocks for components and devices to the integration of these devices into functional systems. This paper describes the need for and benefits derived from developing consensus standards in nanotechnology, and how standards are created. Anticipatory standards can nurture the growth of nanotechnology by drawing on the lessons learned from a standards effort that has revolutionized and continues to revolutionize the telecommunications industry. Also, a brief review is presented on current efforts in the US to create nanotechnology standards

  20. Archival standards in archival open access software and offering appropriate software for internal archival centers

    Directory of Open Access Journals (Sweden)

    Abdolreza Izadi

    2016-12-01

    Full Text Available The purpose of this study is to examine descriptive metadata standards in archival open source software, to determine the most appropriate descriptive metadata standard(s), and to establish which software supports these standards. The approach of the present study is a combination of library research, the Delphi method and a descriptive survey. Data were gathered with fiches in the library research, a questionnaire in the Delphi method and a checklist in the descriptive survey. The statistical population consists of 5 archival open source software packages. The findings suggest that 5 metadata standards, consisting of EAD, ISAD, EAC-CPF, ISAAR and ISDF, were judged by the Delphi panel members to be the most appropriate descriptive metadata standards for use in archival software. Moreover, in terms of support for these standards, ICA-ATOM and Archivist toolkit were judged to be the most appropriate archival software.

  1. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  2. Transport behaviour of commercially available 100-Ω standard resistors

    CSIR Research Space (South Africa)

    Schumacher, B

    2001-04-01

    Full Text Available Several types of commercial 100-Ω resistors can be used with the cryogenic current comparator to maintain the resistance unit, derived from the Quantized Hall Effect (QHE), and to disseminate this unit to laboratory resistance standards. Up...

  3. Practical support for Lean Six Sigma software process definition using IEEE software engineering standards

    CERN Document Server

    Land, Susan K; Walz, John W

    2012-01-01

    Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards addresses the task of meeting the specific documentation requirements in support of Lean Six Sigma. This book provides a set of templates supporting the documentation required for basic software project control and management and covers the integration of these templates for their entire product development life cycle. Find detailed documentation guidance in the form of organizational policy descriptions, integrated set of deployable document templates, artifacts required in suppo

  4. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  5. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards is a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
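
    To make the role of the discount rate concrete, here is a small illustrative life-cycle cost comparison, not DOE's model; the equipment prices, energy costs, lifetime and the 3%/7% rates are hypothetical. The present value of the energy savings, and therefore whether the more efficient unit wins, depends directly on the commercial discount rate chosen.

```python
def life_cycle_cost(first_cost, annual_energy_cost, discount_rate, years):
    """First cost plus discounted operating cost over the equipment lifetime."""
    pv_operating = sum(
        annual_energy_cost / (1.0 + discount_rate) ** t for t in range(1, years + 1)
    )
    return first_cost + pv_operating

# Hypothetical baseline vs. higher-efficiency commercial unit over 15 years.
for rate in (0.03, 0.07):
    base = life_cycle_cost(first_cost=1000.0, annual_energy_cost=300.0,
                           discount_rate=rate, years=15)
    efficient = life_cycle_cost(first_cost=1250.0, annual_energy_cost=240.0,
                                discount_rate=rate, years=15)
    print(f"rate={rate:.0%}: baseline LCC=${base:,.0f}, efficient LCC=${efficient:,.0f}, "
          f"LCC savings=${base - efficient:,.0f}")
```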

  6. Reducing the risk of failure: Software Quality assurance standards and methods

    International Nuclear Information System (INIS)

    Elphick, J.; Cope, H.

    1992-01-01

    An effective Software Quality Assurance (SQA) program provides an overall approach to software engineering and the establishment of proven methods for the production of reliable software. And, in the authors' experience, the overall costs over the software life cycle are diminished with the application of quality methods. In their experience, the issues for implementing quality standards and practices are many. This paper addresses those issues as well as the lessons learned from developing and implementing a number of software quality assurance programs. Their experience includes the development and implementation of their own NRC-accepted SQA program and an SQA program for an engineering software developer, as well as developing SQA procedures, standards, and methods for utilities, medical and commercial clients. Some of the issues addressed in this paper are: setting goals and defining quality; applying the software life cycle; addressing organizational issues; providing flexibility and increasing productivity; producing effective documentation; maintaining quality records; imposing software configuration management; conducting reviews, audits, and controls; verification and validation; and controlling software procurement

  7. Performance comparison of the commercial CFD software for the prediction of turbulent flow through tube bundles

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Gong Hee; Bang, Young Seok; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-10-15

    Because turbulent flow through tube bundles can be found in many important industrial applications, such as the PWR reactor, steam generator, CANDU calandria and the lower plenum of the VHTR, extensive studies have been made both experimentally and numerically. Although licensing applications supported by commercial CFD software have recently been increasing, no commercial CFD software has so far obtained a license from the regulatory body. Therefore, it is necessary to perform a systematic assessment of the prediction performance of commercial CFD software. The main objective of the present study is to numerically simulate turbulent flow through both staggered and in-line tube bundles using two popular commercial CFD packages, ANSYS CFX and FLUENT, and to compare the simulation results with the experimental data for the assessment of these packages' prediction performance.

  8. Application software standardizationCERN as an example

    CERN Document Server

    Welch, L C

    1979-01-01

    A method of standardizing universally useful software is discussed, using CERN as an example of one such standard which is successful. A two-level standard is suggested, wherein the lower level is coded in assembler at each participating site and tends to consist of subroutines which perform elementary tasks. The second level consists of routines which solve more complex data handling or mathematical problems. Numerical results are given showing the triple benefit of smaller programs and faster programs while maintaining transportability. (1 ref.).

  9. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Commercial computer software-Licensing. 1852.227-86 Section 1852.227-86 Federal Acquisition Regulations System NATIONAL AERONAUTICS AND SPACE ADMINISTRATION CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 1852.227-86 Commercial...

  10. Standards guide for space and earth sciences computer software

    Science.gov (United States)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  11. Energy efficiency standards for residential and commercial equipment: Additional opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Rosenquist, Greg; McNeil, Michael; Iyer, Maithili; Meyers, Steve; McMahon, Jim

    2004-08-02

    Energy efficiency standards set minimum levels of energy efficiency that must be met by new products. Depending on the dynamics of the market and the level of the standard, the effect on the market for a given product may be small, moderate, or large. Energy efficiency standards address a number of market failures that exist in the buildings sector. Decisions about efficiency levels often are made by people who will not be responsible for the energy bill, such as landlords or developers of commercial buildings. Many buildings are occupied for their entire lives by very temporary owners or renters, each unwilling to make long-term investments that would mostly reward subsequent users. And sometimes what looks like apathy about efficiency merely reflects inadequate information or time invested to evaluate it. In addition to these sector-specific market failures, energy efficiency standards address the endemic failure of energy prices to incorporate externalities. In the U.S., energy efficiency standards for consumer products were first implemented in California in 1977. National standards became effective starting in 1988. By the end of 2001, national standards were in effect for over a dozen residential appliances, as well as for a number of commercial sector products. Updated standards will take effect in the next few years for several products. Outside the U.S., over 30 countries have adopted minimum energy performance standards. Technologies and markets are dynamic, and additional opportunities to improve energy efficiency exist. There are two main avenues for extending energy efficiency standards. One is upgrading standards that already exist for specific products. The other is adopting standards for products that are not covered by existing standards. In the absence of new and upgraded energy efficiency standards, it is likely that many new products will enter the stock with lower levels of energy efficiency than would otherwise be the case. Once in the stock

  12. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
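
    As a hedged sketch of the kind of calculation such activation-analysis software automates, the relative (comparator) method below derives an element concentration from the sample-to-standard ratio of decay-corrected peak count rates; the peak areas, masses and half-life are hypothetical, and real software applies many more corrections.

```python
import math

def decay_corrected_rate(peak_counts, live_time_s, cooling_time_s, half_life_s):
    """Peak count rate corrected back to the end of irradiation."""
    decay_factor = math.exp(-math.log(2.0) * cooling_time_s / half_life_s)
    return peak_counts / live_time_s / decay_factor

# Relative method: sample and standard irradiated and counted under the same conditions.
half_life = 83.4 * 3600.0                        # s, hypothetical nuclide
rate_sample = decay_corrected_rate(15200, 3600, 7200, half_life)
rate_standard = decay_corrected_rate(40100, 3600, 5400, half_life)

mass_std_element_ug = 50.0                       # known element mass in the standard (assumed)
mass_sample_g = 2.50                             # sample mass (assumed)
concentration_ug_per_g = (rate_sample / rate_standard) * mass_std_element_ug / mass_sample_g
print(f"concentration = {concentration_ug_per_g:.2f} ug/g")
```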

  13. Outsourcing the development of specific application software using the ESA software engineering standards the SPS software Interlock System

    CERN Document Server

    Denis, B

    1995-01-01

    CERN is considering outsourcing as a solution to the reduction of staff. The need to re-engineer the SPS Software Interlock System provided an opportunity to explore the applicability of outsourcing to our specific controls environment, and the ESA PSS-05 standards were selected for the requirements specification, the development, the control and monitoring, and the project management. The software produced by the contractor is now fully operational. After outlining the scope and the complexity of the project, a discussion of the ESA PSS-05 standards will be presented: the choice, the way these standards improve the outsourcing process, the quality induced, but also the need to adapt them and their limitations in the definition of the customer-supplier relationship. The success factors and the difficulties of development under contract will also be discussed. The maintenance aspect and the impact on in-house developments will finally be addressed.

  14. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix

  15. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    Energy Technology Data Exchange (ETDEWEB)

    Preckshot, G.G.; Scott, J.A. [Lawrence Livermore National Lab., CA (United States)

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix.

  16. Capacity Management as a Service for Enterprise Standard Software

    Directory of Open Access Journals (Sweden)

    Hendrik Müller

    2017-12-01

    Full Text Available Capacity management approaches optimize component utilization from a strong technical perspective. In fact, the quality of the involved services is considered only implicitly, by linking it to resource capacity values. This practice makes it hard to evaluate design alternatives with respect to given service levels that are expressed in user-centric metrics, such as the mean response time for a business transaction. We argue that the historical workload traces already collected often contain a variety of performance-related information that allows for the integration of performance prediction techniques through machine learning. Since enterprise applications make extensive use of standard software that is shipped by large software vendors to a wide range of customers, standardized prediction models can be trained and provisioned as part of a capacity management service, which we propose in this article. Therefore, we integrate knowledge discovery activities into well-known capacity planning steps, which we adapt to the special characteristics of enterprise applications. Using a real-world example, we demonstrate how prediction models that were trained on large-scale monitoring data enable cost-efficient measurement-based prediction techniques to be used in early design and redesign phases of planned or running applications. Finally, based on the trained model, we demonstrate how to simulate and analyze future workload scenarios. Using a Pareto approach, we were able to identify cost-effective design alternatives for an enterprise application whose capacity is being managed.
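
    A minimal sketch of the idea, with made-up numbers and a plain least-squares fit standing in for the authors' trained models: fit a prediction model on historical workload observations so that the mean response time of a planned workload can be estimated before capacity is committed.

```python
import numpy as np

# Hypothetical monitoring trace: (concurrent users, transaction-mix share) -> mean response time [ms].
X = np.array([[50, 0.2], [100, 0.2], [150, 0.3], [200, 0.3], [250, 0.4], [300, 0.4]], dtype=float)
y = np.array([120.0, 150.0, 210.0, 260.0, 340.0, 400.0])

# Ordinary least squares with an intercept term, as a stand-in for a trained prediction model.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_response_time(users, mix_share):
    """Estimated mean response time (ms) for a planned workload point."""
    return float(np.dot([users, mix_share, 1.0], coef))

# Evaluate a future workload scenario against a 300 ms service-level target.
predicted = predict_response_time(275, 0.35)
print(f"predicted mean response time: {predicted:.0f} ms "
      f"({'meets' if predicted <= 300 else 'violates'} the 300 ms target)")
```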

  17. Using commercially available off-the-shelf software and hardware to develop an intranet-based hypertext markup language teaching file.

    Science.gov (United States)

    Wendt, G J

    1999-05-01

    This presentation describes the technical details of implementing a process to create digital teaching files stressing the use of commercial off-the-shelf (COTS) software and hardware and standard hypertext markup language (HTML) to keep development costs to a minimum.

  18. Strapdown Airborne Gravimetry Using a Combination of Commercial Software and Stable-Platform Gravity Estimates

    DEFF Research Database (Denmark)

    Jensen, Tim E.; Nielsen, J. Emil; Olesen, Arne V.

    2017-01-01

    ... into the long-wavelengths of the gravity estimates. This has made the stable-platform approach the preferred method for geodetic applications. In the summer of 2016, during a large airborne survey in Malaysia, a SIMU system was flown alongside a traditional LaCoste&Romberg (LCR) gravimeter. The SIMU observations were combined with GNSS observations using the commercial software product “Inertial Explorer” from NovAtel’s Waypoint software suite, and it is shown how gravity estimates can be derived from these results. A statistical analysis of the crossover differences yields an RMS error of 2.5 mGal, which...

  19. 77 FR 30919 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2012-05-24

    ..., and 385 [Docket No. FMCSA-2007-27659] Commercial Driver's License Testing and Commercial Learner's... published a final rule titled ``Commercial Driver's License Testing and Commercial Learner's Permit... additional drivers, primarily those transporting certain tanks temporarily attached to the commercial motor...

  20. Commercial Pilot Practical Test Standards for Lighter-Than-Air Balloon, Airship

    Science.gov (United States)

    1997-05-01

    The Commercial Pilot Lighter-Than-Air Practical Test Standards (PTS) book has been published by the Federal Aviation Administration (FAA) to establish the standards for commercial pilot certification practical tests for the lighter-than-air category,...

  1. Formation of research group for standard nuclear engineering software development

    International Nuclear Information System (INIS)

    Okajima, Shigeaki; Sakamoto, Yukio

    2011-01-01

    JAEA set up a new research group for Standard Nuclear Engineering Software Development in the Nuclear Science and Engineering Directorate in April 2011. The paper introduces the aim and role of this new group in computer simulation technology, which is important for nuclear science and industrial development. Hitherto, efforts in Japan have been made mainly to develop new computer codes and databases. The new group is expected to maintain and modify the developed codes and databases in accordance with users' needs. The Evaluated Nuclear Data Library, the Monte Carlo N-Particle Transport Code System, Evaluated Actinide Data, the Heavy Ion Transport Code System, Monte Carlo Codes for Neutron and Photon Transport Calculation and others are included in the explanation. (S. Ohno)

  2. Commercial software upgrades may significantly alter Perfusion CT parameter values in colorectal cancer

    International Nuclear Information System (INIS)

    Goh, Vicky; Shastry, Manu; Endozo, Raymondo; Groves, Ashley M.; Engledow, Alec; Peck, Jacqui; Reston, Jonathan; Wellsted, David M.; Rodriguez-Justo, Manuel; Taylor, Stuart A.; Halligan, Steve

    2011-01-01

    To determine how commercial software platform upgrades affect derived parameters for colorectal cancer. Following ethical approval, 30 patients with suspected colorectal cancer underwent Perfusion CT using integrated 64 detector PET/CT before surgery. Analysis was performed using software based on modified distributed parameter analysis (Perfusion software version 4; Perfusion 4.0), then repeated using the previous version (Perfusion software version 3; Perfusion 3.0). Tumour blood flow (BF), blood volume (BV), mean transit time (MTT) and permeability surface area product (PS) were determined for identical regions-of-interest. Slice-by-slice and 'whole tumour' variance was assessed by Bland-Altman analysis. Mean BF, BV and PS were 20.4%, 59.5%, and 106% higher, and MTT was 14.3% shorter, for Perfusion 4.0 than Perfusion 3.0. The mean differences (95% limits of agreement) were +13.5 (-44.9 to 72.0), +2.61 (-0.06 to 5.28), -1.23 (-6.83 to 4.36), and +14.2 (-4.43 to 32.8) for BF, BV, MTT and PS respectively. The within-subject coefficient of variation was 36.6%, 38.0%, 27.4% and 60.6% for BF, BV, MTT and PS respectively, indicating moderate to poor agreement. Software version upgrades of the same software platform may result in significantly different parameter values, requiring adjustments for cross-version comparison. (orig.)
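
    For reference, the agreement statistics quoted in this record follow directly from paired per-patient values; the sketch below shows, with hypothetical blood-flow numbers, how a mean difference, 95% limits of agreement and a within-subject coefficient of variation are obtained.

```python
import math

# Hypothetical paired blood-flow values (ml/min/100 g) from two software versions.
v4 = [78.1, 65.4, 90.2, 55.7, 70.3, 83.5]
v3 = [64.9, 55.0, 74.8, 47.2, 59.1, 69.0]

diffs = [a - b for a, b in zip(v4, v3)]
n = len(diffs)
mean_diff = sum(diffs) / n
sd_diff = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (n - 1))
loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

# Within-subject coefficient of variation from the paired differences.
pair_means = [(a + b) / 2 for a, b in zip(v4, v3)]
within_sd = math.sqrt(sum(d ** 2 for d in diffs) / (2 * n))
wcv = 100.0 * within_sd / (sum(pair_means) / n)

print(f"mean difference = {mean_diff:.2f}, 95% limits of agreement = ({loa_low:.2f}, {loa_high:.2f})")
print(f"within-subject CV = {wcv:.1f}%")
```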

  3. New Modelling Capabilities in Commercial Software for High-Gain Antennas

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Lumholt, Michael; Meincke, Peter

    2012-01-01

    This paper presents an overview of selected new modelling algorithms and capabilities in commercial software tools developed by TICRA. A major new area is design and analysis of printed reflectarrays, where a fully integrated design environment is under development, allowing fast and accurate characterization of the reflectarray element, an initial phase-only synthesis, followed by a full optimization procedure taking into account the near-field from the feed and the finite extent of the array. Another interesting new modelling capability is made available through the DIATOOL software, which is a new type of EM software tool aimed at extending the ways engineers can use antenna measurements in the antenna design process. The tool allows reconstruction of currents and near fields on a 3D surface conformal to the antenna, by using the measured antenna field as input. The currents on the antenna...

  4. Dosimetric and workflow evaluation of first commercial synthetic CT software for clinical use in pelvis

    Science.gov (United States)

    Tyagi, Neelam; Fontenla, Sandra; Zhang, Jing; Cloutier, Michelle; Kadbi, Mo; Mechalakos, Jim; Zelefsky, Michael; Deasy, Joe; Hunt, Margie

    2017-04-01

    To evaluate a commercial synthetic CT (syn-CT) software for use in prostate radiotherapy. Twenty-five prostate patients underwent CT and MR simulation scans in treatment position on a 3T MR scanner. A commercially available MR protocol was used that included a T2w turbo spin-echo sequence for soft-tissue contrast and a dual-echo 3D mDIXON fast field echo (FFE) sequence for generating syn-CT. A dual-echo 3D FFE B0 map was used for patient-induced susceptibility distortion analysis, and a new 3D balanced-FFE sequence was evaluated for identification of implanted gold fiducial markers and subsequent image guidance during radiotherapy delivery. Tissues were classified as air, adipose, water, trabecular/spongy bone and compact/cortical bone and assigned bulk HU values. The accuracy of syn-CT for treatment planning was analyzed by transferring the structures and plan from planning CT to syn-CT and recalculating the dose. Accuracy of localization at the treatment machine was evaluated by comparing registration of kV radiographs to either digitally reconstructed radiographs (DRRs) generated from syn-CT or traditional DRRs generated from the planning CT. Similarly, accuracy of setup using CBCT and syn-CT was compared to that using the planning CT. Finally, an MR-only simulation workflow was established and end-to-end testing was completed on five patients undergoing MR-only simulation. Dosimetric comparison between the original CT and syn-CT plans was within 0.5% on average for all structures. The de-novo optimized plans on the syn-CT met institutional clinical objectives for target and normal structures. Patient-induced susceptibility distortion based on B0 maps was within 1 mm and 0.5 mm in the body and prostate respectively. DRR and CBCT localization based on MR-localized fiducials showed a standard deviation of <1 mm. End-to-end testing and the MR simulation workflow were successfully validated. MRI-derived synthetic CT can be successfully used for an MR
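
    The abstract describes assigning bulk HU values to classified tissues (air, adipose, water, trabecular and cortical bone). A minimal sketch of that mapping step is shown below; the specific HU values and function names are illustrative assumptions, not the vendor's actual numbers:

```python
import numpy as np

# Hypothetical bulk HU values per tissue class; the commercial syn-CT
# software's actual assignments are not given in the abstract.
BULK_HU = {
    "air": -1000,
    "adipose": -90,
    "water": 0,
    "trabecular_bone": 200,
    "cortical_bone": 700,
}

def labels_to_synthetic_ct(label_volume, class_names):
    """Map an integer tissue-label volume to a bulk-HU synthetic CT volume."""
    syn_ct = np.zeros(label_volume.shape, dtype=np.int16)
    for idx, name in enumerate(class_names):
        syn_ct[label_volume == idx] = BULK_HU[name]
    return syn_ct

classes = ["air", "adipose", "water", "trabecular_bone", "cortical_bone"]
labels = np.random.randint(0, len(classes), size=(4, 4, 4))  # toy label volume
print(labels_to_synthetic_ct(labels, classes))
```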

  5. 76 FR 39018 - Commercial Driver's License Testing and Commercial Learner's Permit Standards; Corrections

    Science.gov (United States)

    2011-07-05

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB02 Commercial Driver's License Testing and Commercial Learner's..., 2011, that will be effective on July 8, 2011. This final rule amends the commercial driver's license... to issue the commercial learner's permit (CLP). Since the final rule was published, FMCSA identified...

  6. Multi-institutional Validation Study of Commercially Available Deformable Image Registration Software for Thoracic Images

    Energy Technology Data Exchange (ETDEWEB)

    Kadoya, Noriyuki, E-mail: kadoya.n@rad.med.tohoku.ac.jp [Department of Radiation Oncology, Tohoku University Graduate School of Medicine, Sendai (Japan); Nakajima, Yujiro; Saito, Masahide [Department of Radiation Oncology, Tohoku University Graduate School of Medicine, Sendai (Japan); Miyabe, Yuki [Department of Radiation Oncology and Image-Applied Therapy, Kyoto University Graduate School of Medicine, Kyoto (Japan); Kurooka, Masahiko [Department of Radiation Oncology, Kanagawa Cancer Center, Yokohama (Japan); Kito, Satoshi [Department of Radiotherapy, Tokyo Metropolitan Cancer and Infectious Diseases Center, Komagome Hospital, Tokyo (Japan); Fujita, Yukio [Department of Radiation Oncology, Tokai University School of Medicine, Hachioji (Japan); Sasaki, Motoharu [Department of Radiological Technology, Tokushima University Hospital, Tokushima (Japan); Arai, Kazuhiro [Department of Radiation Physics and Technology, Southern Tohoku Proton Therapy Center, Koriyama (Japan); Tani, Kensuke [Department of Radiation Oncology, St Luke's International Hospital, Tokyo (Japan); Yagi, Masashi [Department of Carbon Ion Radiotherapy, Osaka University Graduate School of Medicine, Suita (Japan); Wakita, Akihisa [Department of Radiation Oncology, National Cancer Center Hospital, Tokyo (Japan); Tohyama, Naoki [Department of Radiation Oncology, Tokyo Bay Advanced Imaging and Radiation Oncology Clinic Makuhari, Chiba (Japan); Jingu, Keiichi [Department of Radiation Oncology, Tohoku University Graduate School of Medicine, Sendai (Japan)]

    2016-10-01

    Purpose: To assess the accuracy of the commercially available deformable image registration (DIR) software for thoracic images at multiple institutions. Methods and Materials: Thoracic 4-dimensional (4D) CT images of 10 patients with esophageal or lung cancer were used. Datasets for these patients were provided by DIR-lab (dir-lab.com) and included a coordinate list of anatomic landmarks (300 bronchial bifurcations) that had been manually identified. Deformable image registration was performed between the peak-inhale and -exhale images. Deformable image registration error was determined by calculating the difference at each landmark point between the displacement calculated by the DIR software and that indicated by the landmark. Results: Eleven institutions participated in this study: 4 used RayStation (RaySearch Laboratories, Stockholm, Sweden), 5 used MIM Software (Cleveland, OH), and 3 used Velocity (Varian Medical Systems, Palo Alto, CA). The ranges of the average absolute registration errors over all cases were as follows: 0.48 to 1.51 mm (right-left), 0.53 to 2.86 mm (anterior-posterior), 0.85 to 4.46 mm (superior-inferior), and 1.26 to 6.20 mm (3-dimensional). For each DIR software package, the average 3-dimensional registration error (range) was as follows: RayStation, 3.28 mm (1.26-3.91 mm); MIM Software, 3.29 mm (2.17-3.61 mm); and Velocity, 5.01 mm (4.02-6.20 mm). These results demonstrate that there was moderate variation among institutions, although the DIR software was the same. Conclusions: We evaluated the commercially available DIR software using thoracic 4D-CT images from multiple centers. Our results demonstrated that DIR accuracy differed among institutions because it was dependent on both the DIR software and procedure. Our results could be helpful for establishing prospective clinical trials and for the widespread use of DIR software. In addition, for clinical care, we should try to find the optimal DIR procedure using thoracic 4D
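
    The registration-error metric described here, the difference at each landmark between the DIR-predicted displacement and the manually identified landmark displacement, can be illustrated with a short sketch. The array layout and function name below are assumptions, not the study's code:

```python
import numpy as np

def dir_registration_error(landmarks_inhale, landmarks_exhale, dvf_displacements):
    """Per-landmark DIR error: DIR-predicted displacement minus the
    displacement implied by the manually identified landmark pair.

    landmarks_inhale, landmarks_exhale: (N, 3) landmark coordinates in mm.
    dvf_displacements: (N, 3) displacements sampled from the DIR deformation
    field at the inhale landmark positions (illustrative input format).
    """
    truth = np.asarray(landmarks_exhale, float) - np.asarray(landmarks_inhale, float)
    err = np.asarray(dvf_displacements, float) - truth
    abs_err = np.abs(err)                 # per-axis: right-left, ant-post, sup-inf
    err_3d = np.linalg.norm(err, axis=1)  # 3-dimensional error per landmark
    return abs_err.mean(axis=0), err_3d.mean()

# Returns the mean absolute error per axis and the mean 3D error,
# the two quantities summarized in the abstract.
```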

  7. Multi-institutional Validation Study of Commercially Available Deformable Image Registration Software for Thoracic Images

    International Nuclear Information System (INIS)

    Kadoya, Noriyuki; Nakajima, Yujiro; Saito, Masahide; Miyabe, Yuki; Kurooka, Masahiko; Kito, Satoshi; Fujita, Yukio; Sasaki, Motoharu; Arai, Kazuhiro; Tani, Kensuke; Yagi, Masashi; Wakita, Akihisa; Tohyama, Naoki; Jingu, Keiichi

    2016-01-01

    Purpose: To assess the accuracy of the commercially available deformable image registration (DIR) software for thoracic images at multiple institutions. Methods and Materials: Thoracic 4-dimensional (4D) CT images of 10 patients with esophageal or lung cancer were used. Datasets for these patients were provided by DIR-lab (dir-lab.com) and included a coordinate list of anatomic landmarks (300 bronchial bifurcations) that had been manually identified. Deformable image registration was performed between the peak-inhale and -exhale images. Deformable image registration error was determined by calculating the difference at each landmark point between the displacement calculated by the DIR software and that indicated by the landmark. Results: Eleven institutions participated in this study: 4 used RayStation (RaySearch Laboratories, Stockholm, Sweden), 5 used MIM Software (Cleveland, OH), and 3 used Velocity (Varian Medical Systems, Palo Alto, CA). The ranges of the average absolute registration errors over all cases were as follows: 0.48 to 1.51 mm (right-left), 0.53 to 2.86 mm (anterior-posterior), 0.85 to 4.46 mm (superior-inferior), and 1.26 to 6.20 mm (3-dimensional). For each DIR software package, the average 3-dimensional registration error (range) was as follows: RayStation, 3.28 mm (1.26-3.91 mm); MIM Software, 3.29 mm (2.17-3.61 mm); and Velocity, 5.01 mm (4.02-6.20 mm). These results demonstrate that there was moderate variation among institutions, although the DIR software was the same. Conclusions: We evaluated the commercially available DIR software using thoracic 4D-CT images from multiple centers. Our results demonstrated that DIR accuracy differed among institutions because it was dependent on both the DIR software and procedure. Our results could be helpful for establishing prospective clinical trials and for the widespread use of DIR software. In addition, for clinical care, we should try to find the optimal DIR procedure using thoracic 4D

  8. Increasing software testability with standard access and control interfaces

    Science.gov (United States)

    Nikora, Allen P; Some, Raphael R.; Tamir, Yuval

    2003-01-01

    We describe an approach to improving the testability of complex software systems with software constructs modeled after the hardware JTAG bus, which is used to provide visibility and controllability in testing digital circuits.

  9. Space and Missile Systems Center Standard: Software Development

    Science.gov (United States)

    2015-01-16

    Glossary: Defense Acquisition Acronyms and Terms, Eleventh Edition, September 2003. Dixon 2006. Dixon, J. M., C. M. Rink, and C. V. Sather, Digital ASIC ...Circuits (ASICs) and Field-Programmable Gate Arrays (FPGAs), see (Sather 2010) and (Dixon 2006). 4.1 Software Development Process The framework used...members performing software-related work on the contract. 2. Each software team member shall enforce the compliance of all subordinate software

  10. Issues and relationships among software standards for nuclear safety applications. Version 2.0

    International Nuclear Information System (INIS)

    Scott, J.A.; Preckshot, G.G.; Lawrence, J.D.; Johnson, G.L.

    1996-01-01

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the development of draft regulatory guides for selected software engineering standards. This report describes the results of the initial task in this work. The selected software standards and a set of related software engineering standards were reviewed, and the resulting preliminary elements of the regulatory positions are identified in this report. The importance of a thorough understanding of the relationships among standards useful for developing safety-related software is emphasized. The relationship of this work to the update of the Standard Review Plan is also discussed

  11. Evaluation of mass spectral library search algorithms implemented in commercial software.

    Science.gov (United States)

    Samokhin, Andrey; Sotnezova, Ksenia; Lashin, Vitaly; Revelsky, Igor

    2015-06-01

    Performance of several library search algorithms (against EI mass spectral databases) implemented in commercial software products (acd/specdb, chemstation, gc/ms solution and ms search) was estimated. The test set contained 1000 mass spectra, which were randomly selected from the NIST'08 (RepLib) mass spectral database. It was shown that the composite (also known as identity) algorithm implemented in the ms search (NIST) software gives statistically the best results: the correct compound occupied the first position in the list of possible candidates in 81% of cases; the correct compound was within the list of top ten candidates in 98% of cases. It was found that use of the presearch option can lead to rejection of the correct answer from the list of possible candidates (therefore the presearch option should not be used, if possible). Overall performance of the library search algorithms was estimated using receiver operating characteristic curves. Copyright © 2015 John Wiley & Sons, Ltd.
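
    The hit-rate figures quoted (correct compound ranked first in 81% of cases, within the top ten in 98%) come from evaluating ranked candidate lists. A hedged sketch of such a top-k evaluation, independent of any of the named products, with assumed data structures:

```python
def hit_rates(ranked_results, correct_ids, ks=(1, 10)):
    """Fraction of query spectra whose correct compound appears in the
    top-k candidates returned by a library search (toy evaluation helper).

    ranked_results: list of candidate-ID lists, best match first.
    correct_ids: the true compound ID for each query spectrum.
    """
    rates = {}
    n = len(correct_ids)
    for k in ks:
        hits = sum(1 for cands, true_id in zip(ranked_results, correct_ids)
                   if true_id in cands[:k])
        rates[k] = hits / n
    return rates

# e.g. hit_rates(results, truth) might return {1: 0.81, 10: 0.98}
# for an algorithm behaving like the best one in the study.
```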

  12. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  13. Software-Defined Solutions for Managing Energy Use in Small to Medium Sized Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Peffer, Therese [Univ. of California, Berkeley, CA (United States); Council on International Education Exchange (CIEE), Portland, ME (United States); Blumstein, Carl [Council on International Education Exchange (CIEE), Portland, ME (United States); Culler, David [Univ. of California, Berkeley, CA (United States). Electrical Engineering and Computer Sciences (EECS); Modera, Mark [Univ. of California, Davis, CA (United States). Western Cooling Efficiency Center (WCEC); Meier, Alan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-09-10

    The Project uses state-of-the-art computer science to extend the benefits of Building Automation Systems (BAS) typically found in large buildings (>100,000 square feet) to medium-sized commercial buildings (<50,000 sq ft). The BAS developed in this project, termed OpenBAS, uses an open-source and open software architecture platform, user interface, and plug-and-play control devices to facilitate adoption of energy efficiency strategies in the commercial building sector throughout the United States. At the heart of this “turn key” BAS is the platform with three types of controllers—thermostat, lighting controller, and general controller—that are easily “discovered” by the platform in a plug-and-play fashion. The user interface showcases the platform and provides the control system set-up, system status display and means of automatically mapping the control points in the system.

  14. Social network forensics: using commercial software in a university forensics lab environment

    Science.gov (United States)

    Halkin, Pavel; Kröger, Knut; Creutzburg, Reiner

    2013-05-01

    The aim of this article is to give a practical overview of forensic investigation of social network cases using certain commercial software packages in a university forensics lab environment. Students have to learn the usefulness of forensic procedures to ensure evidence collection, evidence preservation, forensic analysis, and reporting. It is demonstrated how to investigate important data from social network users. Different investigation scenarios are presented that are well suited for forensics lab work at a university. In particular, we focus on the new version of Belkasoft Evidence Center and compare it with other well-known tools regarding functionality, usability and capabilities.

  15. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    Science.gov (United States)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning, diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using commercial Materialise Mimics software and open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of the mandible were reconstructed using both the commercial Materialise Mimics and the open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. Both models were compared using the Wilcoxon Signed Rank Test and the Hausdorff Distance. No significant differences were obtained between the 3D models of the mandible produced using the Mimics and MITK software. The 3D model of the mandible produced using the MITK open-source software is comparable to that produced by the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise the operational cost.

  16. Software Verification and Validation for Commercial Statistical Packages Utilized by the Statistical Consulting Section of SRTC

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.B.

    2001-01-16

    The purpose of this report is to provide software verification and validation (V&V) for the statistical packages utilized by the Statistical Consulting Section (SCS) of the Savannah River Technology Center (SRTC). The need for this V&V stems from the requirements of the Quality Assurance (QA) programs that are frequently applicable to the work conducted by SCS. This document is designed to comply with software QA requirements specified in the 1Q Manual Quality Assurance Procedure 20-1, Revision 6. Revision 1 of this QA plan adds JMP Version 4 to the family of (commercially available) statistical tools utilized by SCS. JMP Version 3.2.2 is maintained as a support option due to features unique to this version of JMP that have not as yet been incorporated into Version 4. SCS documents that include JMP output should provide a clear indication of the version or versions of JMP that were used. The IBM Personal Computer 300PL and 300XL are both Pentium II-based desktops. Therefore, the software verification and validation in this report is valid interchangeably between both platforms. As new computing platforms, statistical packages, or revisions to existing packages are introduced into the Statistical Consulting Section, the appropriate problems from this report are to be re-evaluated, and this report is to be revised to address their verification and validation.

  17. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    Science.gov (United States)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades, attention has been paid to the interaction of foundation structures with subsoil and to the development of interaction models. Because analytical solutions of subsoil-structure interaction can be deduced only for some simple load shapes, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM, the finite element method). Numerical analysis offers greater possibilities for taking into account the real factors involved in subsoil-structure interaction and was also used in this article. This makes it possible to design foundation structures more efficiently while keeping them reliable and safe. Several software packages can currently deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, stress-strain analysis of the elastic half-space is carried out by means of Gauss numerical integration and the Jacobian of the transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed on the premises of the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with the values observed during the experimental loading test.
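
    The abstract attributes MKPINTER's half-space analysis to Gauss numerical integration with the Jacobian of the transformation. The sketch below is a generic illustration of those two ingredients, 2x2 Gauss-Legendre quadrature over a four-node isoparametric quadrilateral; it is not MKPINTER code and the function names are assumptions:

```python
import numpy as np

def gauss_quad_quad4(xy, f, n_gp=2):
    """Integrate f(x, y) over a 4-node quadrilateral using Gauss-Legendre
    quadrature in the isoparametric (xi, eta) square and the Jacobian of
    the coordinate transformation. xy: (4, 2) corners, counter-clockwise."""
    gp, gw = np.polynomial.legendre.leggauss(n_gp)
    total = 0.0
    for xi, wx in zip(gp, gw):
        for eta, wy in zip(gp, gw):
            # bilinear shape functions and their (d/dxi, d/deta) derivatives
            N = 0.25 * np.array([(1 - xi) * (1 - eta), (1 + xi) * (1 - eta),
                                 (1 + xi) * (1 + eta), (1 - xi) * (1 + eta)])
            dN = 0.25 * np.array([[-(1 - eta), -(1 - xi)],
                                  [ (1 - eta), -(1 + xi)],
                                  [ (1 + eta),  (1 + xi)],
                                  [-(1 + eta),  (1 - xi)]])
            J = dN.T @ xy                      # 2x2 Jacobian of the mapping
            x, y = N @ xy                      # mapped physical point
            total += wx * wy * f(x, y) * np.linalg.det(J)
    return total

# Sanity check: the area of the unit square should come out as 1.0
print(gauss_quad_quad4(np.array([[0.0, 0], [1, 0], [1, 1], [0, 1]]), lambda x, y: 1.0))
```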

  18. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    Science.gov (United States)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open-source photogrammetry software to produce a time-tagged 2 m posting elevation model of the Arctic and an 8 m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management strategies the team needed in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  19. An evaluation of the documented requirements of the SSP UIL and a review of commercial software packages for the development and testing of UIL prototypes

    Science.gov (United States)

    Gill, Esther Naomi

    1986-01-01

    A review was conducted of software packages currently on the market which might be integrated with the interface language and aid in reaching the objectives of customization, standardization, transparency, reliability, maintainability, language substitution, expandability, portability, and flexibility. Recommendations are given for the best choices in hardware and software acquisition for in-house testing of these possible integrations. Recommended acquisitions include tools to aid expert-system development and/or novice program development, artificial-intelligence voice technology, touch screen, joystick or mouse utilization, and networking. Other recommendations concerned using the Ada language for the user interface language shell, because of its high level of standardization, its structure, its ability to accept and execute programs written in other programming languages, and its DOD ownership and control; and keeping the user interface language simple, so that a wide range of users will find the commercialization of space within their reach, which is, after all, the purpose of the Space Station.

  20. Comparison of ISO 9000 and recent software life cycle standards to nuclear regulatory review guidance

    Energy Technology Data Exchange (ETDEWEB)

    Preckshot, G.G.; Scott, J.A.

    1998-01-20

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the assessment of certain quality and software life cycle standards to determine whether additional guidance for the U.S. nuclear regulatory context should be derived from the standards. This report describes the nature of the standards and compares the guidance of the standards to that of the recently updated Standard Review Plan.

  1. Comparison of ISO 9000 and recent software life cycle standards to nuclear regulatory review guidance

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1998-01-01

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the assessment of certain quality and software life cycle standards to determine whether additional guidance for the U.S. nuclear regulatory context should be derived from the standards. This report describes the nature of the standards and compares the guidance of the standards to that of the recently updated Standard Review Plan

  2. Round table discussion: Quality control and standardization of nuclear medicine software

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    In summary the round table came to the following important conclusions: Nuclear medicine software systems need better documentation, especially regarding details of algorithms and limitations, and user friendliness could be considerably improved. Quality control of software is an integral part of quality assurance in nuclear medicine and should be performed at all levels of the software. Quality control of applications software should preferably be performed with assistance of generally accepted software phantoms. A basic form of standardization was welcomed and partly regarded as essential by all participants. Some areas such as patient study files could be standardized in the near future, whereas other areas such as the standardization of clinical applications programs or acquisition protocols still present major difficulties. An international cooperation in the field of standardization of software and other topics has already been started on the European level and should be continued and supported. (orig.)

  3. On the possibility of using commercial software packages for thermoluminescence glow curve deconvolution analysis

    International Nuclear Information System (INIS)

    Pagonis, V.; Kitis, G.

    2002-01-01

    This paper explores the possibility of using commercial software for thermoluminescence glow curve deconvolution (GCD) analysis. The program PEAKFIT has been used to perform GCD analysis of complex glow curves of quartz and dosimetric materials. First-order TL peaks were represented successfully using the Weibull distribution function. Second-order and general-order TL peaks were represented accurately by using logistic asymmetric functions with varying symmetry parameters. Analytical expressions were derived for determining the energy E from the parameters of the logistic asymmetric functions. The accuracy of these analytical expressions for E was tested for a wide variety of kinetic parameters and was found to be comparable to that of the commonly used expressions in the TL literature. The goodness of fit of the analytical functions used here was tested using the figure of merit and found to be comparable to the accuracy of recently published GCD expressions for first- and general-order kinetics. (author)
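
    The paper's approach, representing a first-order TL peak with a Weibull-shaped function and judging the fit with a figure of merit, can be illustrated with a small curve-fitting sketch. The parameterization below is generic and the data are synthetic; neither is taken from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_peak(T, A, T0, scale, shape):
    """Weibull-shaped glow peak; a generic parameterization for illustration,
    not necessarily the exact form used in the paper."""
    z = np.clip((T - T0) / scale, 1e-12, None)
    return A * (shape / scale) * z ** (shape - 1) * np.exp(-z ** shape)

def figure_of_merit(y, y_fit):
    """FOM (%) = 100 * sum|residual| / sum(fit), a common GCD quality metric."""
    return 100.0 * np.abs(y - y_fit).sum() / y_fit.sum()

# Synthetic "measured" glow curve (arbitrary units) and a fit to it
T = np.linspace(300.0, 500.0, 400)                          # temperature, K
y = weibull_peak(T, 5e4, 330.0, 60.0, 2.6) + np.random.normal(0, 50, T.size)
popt, _ = curve_fit(weibull_peak, T, y, p0=[4e4, 320.0, 50.0, 2.0])
print(popt, figure_of_merit(y, weibull_peak(T, *popt)))
```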

  4. Customizing Standard Software as a Business Model in the IT Industry

    DEFF Research Database (Denmark)

    Kautz, Karlheinz; Rab, Sameen M.; Sinnet, Michael

    2011-01-01

    This research studies a new business model in the IT industry, the customization of standard software as the sole foundation for a software company’s earnings. Based on a theoretical background which combines the concepts of inter-organizational networks and open innovation, we provide an interpretive case study of a small software company which customizes a standard product. We investigate the company’s interactions with a large global software company which is the producer of the original software product and with other companies which are involved in the software customization process. We find that the customizing company and software customizations depend not only on initiatives which are set off internally in the company, but on how the customizing organization’s inter-organizational network and interaction with other organizations is built up. The case company has built its network...

  5. On Parallelizing Single Dynamic Simulation Using HPC Techniques and APIs of Commercial Software

    Energy Technology Data Exchange (ETDEWEB)

    Diao, Ruisheng; Jin, Shuangshuang; Howell, Frederic; Huang, Zhenyu; Wang, Lei; Wu, Di; Chen, Yousu

    2017-05-01

    Time-domain simulations are heavily used in today’s planning and operation practices to assess power system transient stability and post-transient voltage/frequency profiles following severe contingencies, in order to comply with industry standards. Because of the increased modeling complexity, it is several times slower than real time for state-of-the-art commercial packages to complete a dynamic simulation for a large-scale model. With the growing stochastic behavior introduced by emerging technologies, the power industry has seen a growing need for performing security assessment in real time. This paper presents a parallel implementation framework to speed up a single dynamic simulation by leveraging the existing stability model library in commercial tools through their application programming interfaces (APIs). Several high-performance computing (HPC) techniques are explored, such as parallelizing the calculation of generator current injection, identifying fast linear solvers for the network solution, and parallelizing data outputs when interacting with the APIs in the commercial package TSAT. The proposed method has been tested on a WECC planning base case with detailed synchronous generator models and exhibits outstanding scalable performance with sufficient accuracy.
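
    One of the techniques named, parallelizing the calculation of generator current injections, can be sketched with a worker pool over a toy classical-machine model. The model, parameter names, and pool size are illustrative assumptions, not the paper's implementation; a production version would batch many machines per worker to amortize the per-call overhead:

```python
import numpy as np
from multiprocessing import Pool

def generator_current_injection(args):
    """Norton-equivalent current injection for one machine (toy classical
    model): I = E' / (j * X'd); the model itself is an assumption."""
    e_mag, delta, xd_prime = args
    e_internal = e_mag * np.exp(1j * delta)   # internal EMF phasor
    return e_internal / (1j * xd_prime)

def injections_parallel(machines, processes=4):
    """Evaluate all machine injections in a worker pool, mirroring the idea
    of parallelizing the generator-current-injection step."""
    with Pool(processes) as pool:
        return np.array(pool.map(generator_current_injection, machines))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    machines = [(1.05, d, 0.3) for d in rng.uniform(0.0, 0.5, 1000)]
    print(injections_parallel(machines)[:3])
```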

  6. IEEE [Institute of Electrical and Electronics Engineers] standards and nuclear software quality engineering

    International Nuclear Information System (INIS)

    Daughtrey, T.

    1988-01-01

    Significant new nuclear-specific software standards have recently been adopted under the sponsorship of the American Nuclear Society and the American Society of Mechanical Engineers. The interest of the US Nuclear Regulatory Commission has also been expressed through its issuance of NUREG/CR-4640. These efforts all indicate a growing awareness of the need for thorough, referenceable expressions of the way to build in and evaluate quality in nuclear software. A broader professional perspective can be seen in the growing number of software engineering standards sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Computer Society. This family of standards represents a systematic effort to capture professional consensus on quality practices throughout the software development life cycle. The only omission, the implementation phase, is treated by accepted American National Standards Institute standards or de facto standards for programming languages

  7. [Development of a software standardizing optical density with operation settings related to several limitations].

    Science.gov (United States)

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software that standardizes optical density and normalizes the procedures and results of standardization, in order to effectively solve several problems that arise during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to address the problems one might encounter during standardization. The Matlab GUI was used as the development tool. The software was tested with detection results for sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (Windows XP/Win 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on I-STOD. It can be easily operated and can effectively standardize the test results of indirect ELISA.

  8. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards `work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  9. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    Science.gov (United States)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program, which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out research on automatic navigation, guidance, flight controls, and electronic displays. The techniques developed for time and cost reduction include automatic documentation aids, an automatic software configuration, and an all-software generation and validation system.

  10. An investigation into drug-related problems identifiable by commercial medication review software

    Directory of Open Access Journals (Sweden)

    Colin Curtain

    2013-05-01

    Full Text Available. Background: Accredited pharmacists conduct home medicines reviews (HMRs) to detect and resolve potential drug-related problems (DRPs). A commercial expert system, Medscope Review Mentor (MRM), has been developed to assist pharmacists in the detection and resolution of potential DRPs. Aims: This study compares the types of DRPs identified with the commercial system, which uses multiple classification ripple down rules (MCRDR), with the findings of pharmacists. Method: HMR data from 570 reviews collected from accredited pharmacists were entered into MRM and the DRPs were identified. A list of themes describing the main concept of each DRP identified by MRM was developed to allow comparison with the pharmacists. Theme types, frequencies, similarity and dissimilarity were explored. Results: The expert system was capable of detecting a wide range of potential DRPs: 2854 themes, compared to 1680 themes for the pharmacists. The system identified the same problems as pharmacists in many patient cases. Ninety of the 119 types of themes identifiable by pharmacists were also identifiable by the software. MRM could identify the same problems in the same patients as pharmacists for 389 problems, resulting in a low overlap of similarity with an averaged Jaccard Index of 0.09. Conclusion: MRM found significantly more potential DRPs than pharmacists. MRM identified a wide scope of DRPs approaching the range of DRPs that were identified by pharmacists. Differences may be associated with system consistency and perhaps human oversight or human selective prioritisation. DRPs identified by the system were still considered relevant even though the system identified a larger number of problems.
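
    The averaged Jaccard Index reported here measures the overlap between the theme sets found by the software and by the pharmacists. A minimal sketch of that overlap calculation (the data structures are assumed, not taken from the study):

```python
def jaccard(themes_software, themes_pharmacist):
    """Jaccard index between the sets of DRP themes found by the software
    and by the pharmacist for one patient review (illustrative helper)."""
    a, b = set(themes_software), set(themes_pharmacist)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def averaged_jaccard(per_review_pairs):
    """Mean Jaccard index over all patient reviews."""
    scores = [jaccard(s, p) for s, p in per_review_pairs]
    return sum(scores) / len(scores)

# e.g. averaged_jaccard([(["dose too high"], ["dose too high", "interaction"]), ...])
```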

  11. Software architecture standard for simulation virtual machine, version 2.0

    Science.gov (United States)

    Sturtevant, Robert; Wessale, William

    1994-01-01

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all the simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.

  12. 75 FR 33663 - Commercial Driver's License (CDL) Standards; Volvo Trucks North America, Renewal of Exemption

    Science.gov (United States)

    2010-06-14

    ... Federal Motor Carrier Safety Administration Commercial Driver's License (CDL) Standards; Volvo Trucks.... ACTION: Notice of renewal of exemption; request for comments. SUMMARY: FMCSA renews Volvo Trucks North America's (Volvo) exemption from the Agency's requirement for certain drivers of commercial motor vehicles...

  13. 75 FR 47880 - Commercial Driver's License (CDL) Standards; Volvo Trucks North America, Renewal of Exemption

    Science.gov (United States)

    2010-08-09

    ... Federal Motor Carrier Safety Administration Commercial Driver's License (CDL) Standards; Volvo Trucks.... ACTION: Notice of final disposition. SUMMARY: FMCSA announces its decision to continue in effect Volvo Trucks North America's (Volvo) exemption for five of its drivers to enable them to test-drive commercial...

  14. 75 FR 45198 - Commercial Driver's License (CDL) Standards; Volvo Trucks North America, Renewal of Exemption

    Science.gov (United States)

    2010-08-02

    ... Federal Motor Carrier Safety Administration Commercial Driver's License (CDL) Standards; Volvo Trucks... requirement to hold a commercial driver's license (CDL) submitted by Volvo Trucks North America (Volvo) on behalf of an employee. Volvo requested renewal of the CDL exemption for a Swedish engineer employed by...

  15. 76 FR 25761 - Commercial Driver's License (CDL) Standards; Volvo Trucks North America, Renewal of Exemption

    Science.gov (United States)

    2011-05-05

    ... TRANSPORTATION Federal Motor Carrier Safety Administration Commercial Driver's License (CDL) Standards; Volvo... from the requirement to hold a commercial driver's license (CDL) sought by Volvo Trucks North America (Volvo) on behalf of five employees. Volvo requested renewal of the CDL exemption for five Swedish...

  16. 75 FR 33662 - Commercial Driver's License (CDL) Standards; Volvo Trucks North America, Inc.'s Exemption...

    Science.gov (United States)

    2010-06-14

    ... Federal Motor Carrier Safety Administration Commercial Driver's License (CDL) Standards; Volvo Trucks... announces its decision to grant Volvo Trucks North America, Inc.'s (Volvo) application for an exemption for two Volvo drivers to drive commercial motor vehicles (CMVs) in the United States without possessing...

  17. Competing Compatibility Standards and Network Externalities in the PC Software Market.

    OpenAIRE

    Gandal, Neil

    1995-01-01

    This paper is an empirical study of the value of four file compatibility standards for transferring data in the personal computer software market. The results are that only the LOTUS file compatibility standard is significant in explaining price variations and it is significant in both the spreadsheet and database management system markets. This supports the hypothesis that the personal computer software market exhibits complementary network externalities. Copyright 1995 by MIT Press.

  18. The Relationship between Transformational and Transactional Leadership Styles and Innovation Commitment and Output at Commercial Software Companies

    Science.gov (United States)

    Golla, Eric James

    2012-01-01

    The purpose of this quantitative study was to discover whether relationships exist between leadership styles and innovation commitment and innovation output at commercial software companies. The leadership styles included transformational and transactional, and the innovation variables included (a) the percentage of expenses allocated to…

  19. Overview of the ANS [American Nuclear Society] mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A.O.

    1991-01-01

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs

  20. MRI/TRUS fusion software-based targeted biopsy: the new standard of care?

    Science.gov (United States)

    Manfredi, M; Costa Moretti, T B; Emberton, M; Villers, A; Valerio, M

    2015-09-01

    The advent of multiparametric MRI has made it possible to change the way in which prostate biopsy is done, allowing biopsies to be directed at suspicious lesions rather than taken randomly. The subject of this review relates to a computer-assisted strategy, the MRI/US fusion software-based targeted biopsy, and to its performance compared to the other sampling methods. Different devices with different methods to register MR images to live TRUS are currently in use to allow software-based targeted biopsy. The main clinical indications of MRI/US fusion software-based targeted biopsy are re-biopsy in men with persistent suspicion of prostate cancer after a first negative standard biopsy and the follow-up of patients under active surveillance. Some studies have compared MRI/US fusion software-based targeted versus standard biopsy. In men at risk with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease as compared to standard biopsy; some studies have also shown decreased detection of insignificant disease. Only two studies directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and the diagnostic ability seems to be in favor of the software approach. To date, no study comparing software-based targeted biopsy against in-bore MRI biopsy is available. The new software-based targeted approach seems to have the characteristics to be added to the standard pathway for achieving accurate risk stratification. Once reproducibility and cost-effectiveness have been verified, the actual issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.

  1. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    Directory of Open Access Journals (Sweden)

    Cevdet Kızıl

    2014-08-01

    Full Text Available The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. This study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms were distributed to university students personally and via the internet. This paper includes significant research questions such as “Are accounting academicians informed and knowledgeable on the new Turkish commercial code and Turkish accounting standards?”, “Do accounting academicians integrate the new Turkish commercial code and Turkish accounting standards into their lectures?”, “How does modern accounting education methodology and technology coincide with the teaching of the new Turkish commercial code and Turkish accounting standards?”, “Do universities offer mandatory and elective courses which cover the new Turkish commercial code and Turkish accounting standards?” and “If such courses are offered, what are their names, percentage in the curriculum and degree of coverage?” The research contributes to the literature in several ways. Firstly, the new Turkish commercial code and Turkish accounting standards are current significant topics for the accounting profession. Furthermore, accounting education provides a basis for implementation in the public and private sectors. Besides, one of the intentions of the new Turkish commercial code and Turkish accounting standards is to foster transparency. That is definitely a critical concept also in terms of mergers, acquisitions and investments. Stakeholders of today’s business world, such as investors, shareholders, entrepreneurs, auditors and government, are in need of more standardized global accounting principles. Thus, revision and redesign of accounting education play an important role. The emphasized points also clearly prove the necessity and functionality of this research.

  2. 76 FR 67480 - Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB...

    Science.gov (United States)

    2011-11-01

    ...] Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB) Approval... Commercial Diving Operations Standard (29 CFR part 1910, subpart T). DATES: Comments must be submitted... existing Standard on Commercial Diving Operations (29 CFR part 1910, Subpart T)...

  3. How Modeling Standards, Software, and Initiatives Support Reproducibility in Systems Biology and Systems Medicine.

    Science.gov (United States)

    Waltemath, Dagmar; Wolkenhauer, Olaf

    2016-10-01

    Only reproducible results are of significance to science. The lack of suitable standards and appropriate support of standards in software tools has led to numerous publications with irreproducible results. Our objectives are to identify the key challenges of reproducible research and to highlight existing solutions. In this paper, we summarize problems concerning reproducibility in systems biology and systems medicine. We focus on initiatives, standards, and software tools that aim to improve the reproducibility of simulation studies. The long-term success of systems biology and systems medicine depends on trustworthy models and simulations. This requires openness to ensure reusability and transparency to enable reproducibility of results in these fields.

  4. Comparison of BUT Big 6 commercial turkey production performance in Iran with the breed standards

    Directory of Open Access Journals (Sweden)

    P Haghighi Khoshkhoo

    2010-05-01

    Full Text Available In order to evaluate the management of commercial turkey production and finding existing problems in Iran, 15 commercial flocks of BUT Big6 breed (including 72000 commercial turkeys were selected randomly and monitored up to marketing age. In this study flock performances based on mean body weight and its coefficient of variation percentage, feed consumption, feed conversion rate, livability percentage and European Efficacy Factor were recorded in 15 flocks for each sex individually and statistically analyzed by SPSS program .The results showed that the mean body weight, livability percentage and European Efficacy Factor in all flocks were significantly lower than the breed's standards (P

  5. Accounting treatment of software development costs according to applicable accounting standards

    Directory of Open Access Journals (Sweden)

    Dilyana Markova

    2017-05-01

    Full Text Available The growth of the software sector worldwide is outpacing the creation and updating of the accounting standards that regulate the reporting of the products and services it creates. Applicable standards are interpreted differently across countries, and that leads to incomplete reports. This calls for the adoption and application of explanations that give specific guidelines and rules on the accounting treatment of R&D expenditure at each phase of the software project life cycle and on the disclosure of that information in the financial statements.

  6. Comparison of different types of commercial filtered backprojection and ordered-subset expectation maximization SPECT reconstruction software.

    Science.gov (United States)

    Seret, Alain; Forthomme, Julien

    2009-09-01

    The aim of this study was to compare the performance of filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM) reconstruction algorithms available in several types of commercial SPECT software. Numeric simulations of SPECT acquisitions of 2 phantoms were used: the National Electrical Manufacturers Association line phantom used for the assessment of SPECT resolution and a phantom with uniform, hot-rod, and cold-rod compartments. For FBP, no filtering and filtering of the projections with either a Butterworth filter (order 3 or 6) or a Hanning filter at various cutoff frequencies were considered. For OSEM, the number of subsets was 1, 4, 8, or 16, and the number of iterations was chosen to obtain a product number of iterations times the number of subsets equal to 16, 32, 48, or 64. The line phantom enabled us to obtain the reconstructed central, radial, and tangential full width at half maximum. The uniform compartment of the second phantom delivered the reconstructed mean pixel counts and SDs from which the coefficients of variation were calculated. Hot contrast and cold contrast were obtained from its rod compartments. For FBP, the full width at half maximum, mean pixel count, coefficient of variation, and contrast were almost software independent. The only exceptions were a smaller (by 0.5 mm) full width at half maximum for one of the software types, higher mean pixel counts for 2 of the software types, and better contrast for 2 of the software types under some filtering conditions. For OSEM, the full width at half maximum differed by 0.1-2.5 mm with the different types of software but was almost independent of the number of subsets or iterations. There was a marked dependence of the mean pixel count on the type of software used, and there was a moderate dependence of the coefficient of variation. Contrast was almost software independent. The mean pixel count varied greatly with the number of iterations for 2 of the software types, and
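
    The uniformity and contrast figures described, a coefficient of variation from the uniform compartment and hot/cold contrast from the rod compartments, can be expressed as short helpers. The contrast definition below is one common choice and may differ from the study's exact formula; the function names are assumptions:

```python
import numpy as np

def coefficient_of_variation(uniform_roi_counts):
    """CV (%) of a uniform-region ROI: SD / mean * 100."""
    roi = np.asarray(uniform_roi_counts, float)
    return roi.std(ddof=1) / roi.mean() * 100.0

def rod_contrast(rod_roi_mean, background_mean):
    """Hot (positive) or cold (negative) rod contrast relative to the
    uniform background; one common definition, used here for illustration."""
    return (rod_roi_mean - background_mean) / background_mean

# e.g. coefficient_of_variation(uniform_pixels), rod_contrast(180.0, 120.0)
```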

  7. Accuracy of noninvasive coronary stenosis quantification of different commercially available dedicated software packages.

    Science.gov (United States)

    Dikkers, Riksta; Willems, Tineke P; de Jonge, Gonda J; Marquering, Henk A; Greuter, Marcel J W; van Ooijen, Peter M A; van der Weide, Marijke C Jansen; Oudkerk, Matthijs

    2009-01-01

    The purpose of this study was to investigate the noninvasive quantification of coronary artery stenosis using cardiac software packages and vessel phantoms with known stenosis severity. Four different sizes of vessel phantoms were filled with contrast agent and scanned on a 64-slice multidetector computed tomography scanner. Diameter and area stenosis were evaluated by 2 observers, blinded to the true measures, using 5 different software packages. Measurements were compared with the true measures of the vessel phantoms. The absolute difference in stenosis measurements and the intraobserver and interobserver variabilities were assessed. All software packages showed a trend toward larger differences for the smaller vessel phantoms. The absolute difference of the automatic measurements was significantly higher compared with that of the manual measurements in all 5 evaluated software packages for all vessel phantoms (P < 0.05). Manual stenosis measurements are significantly more accurate compared with automatic measurements, and therefore, manual adjustments are still essential for noninvasive assessment of coronary artery stenosis.
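
    Diameter and area stenosis as evaluated here are usually expressed relative to a reference segment. A hedged worked example of those two percentages (generic formulas for illustration, not the packages' implementations):

```python
def percent_diameter_stenosis(d_min_mm, d_ref_mm):
    """% diameter stenosis = (1 - minimal / reference diameter) * 100."""
    return (1.0 - d_min_mm / d_ref_mm) * 100.0

def percent_area_stenosis(a_min_mm2, a_ref_mm2):
    """% area stenosis = (1 - minimal / reference lumen area) * 100."""
    return (1.0 - a_min_mm2 / a_ref_mm2) * 100.0

# A 2 mm residual lumen inside a 4 mm reference vessel:
print(percent_diameter_stenosis(2.0, 4.0))   # 50.0 % diameter stenosis
print(percent_area_stenosis(3.14, 12.57))    # ~75 % area stenosis
```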

  8. Available to commercial market, government 'secret' software integrates 70 health applications.

    Science.gov (United States)

    1997-12-01

    Government's secret weapon is yours for just $25! The Department of Veterans Affairs VISTA software system could be the answer to less expensive systems integration for many health care organizations. That's what a Georgia hospital has found since installing the core of the software, including the surgery module that tracks a variety of data and outputs nine different types of reports. Here are the details on the product, and how you can order it.

  9. Software Quality and Testing: What DoD Can Learn from Commercial Practices

    Science.gov (United States)

    1992-08-31

    Defects, and Correction of Processes. [Figure 1: Software Quality Control, covering development, testing, and process improvement.] ...only when users understand the manual procedure the tool will automate, and the benefit of automating it. With regard to software testing in DoD, we can...testing - the process of exercising or evaluating a system or system components by manual or automated means to verify that it satisfies specified

  10. Academic and Non-Profit Accessibility to Commercial Remote Sensing Software

    Science.gov (United States)

    O'Connor, A. S.; Farr, B.

    2013-12-01

    Remote Sensing as a topic of teaching and research at the university and college level continues to increase. As more data is made freely available and software becomes easier to use, more and more academic and non-profit institutions are turning to remote sensing to solve their tough, large-spatial-scale problems. Exelis Visual Information Solutions (VIS) has been supporting teaching and research endeavors for over 30 years, with a special emphasis over the last 5 years, with scientifically proven software and accessible training materials. The Exelis VIS academic program extends to US and Canadian 2-year and 4-year colleges and universities with tools for analyzing aerial and satellite multispectral and hyperspectral imagery, airborne LiDAR and Synthetic Aperture Radar. The Exelis VIS academic programs, using the ENVI Platform, enable labs and classrooms to be outfitted with software and make software accessible to students. The ENVI software provides students with hands-on experience with remote sensing software, gives professors an easy teaching platform, and gives researchers scientifically vetted software they can trust. Training materials are provided at no additional cost and can serve either as a basis for course curriculum development or for self-paced learning. Non-profit organizations like The Nature Conservancy (TNC) and CGIAR have deployed ENVI and IDL enterprise-wide licensing, allowing researchers all over the world to have cost-effective access to COTS software for their research. Exelis VIS has also contributed licenses to the NASA DEVELOP program. Exelis VIS is committed to supporting the academic and NGO community with affordable enterprise licensing, access to training materials, and technical expertise to help researchers tackle today's Earth and Planetary science big data challenges.

  11. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

    In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as "the family of SHMS." When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation.

  12. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  13. Analysis in the Utility of Commercial Wargaming Simulation Software for Army Organizational Leadership Development

    National Research Council Canada - National Science Library

    Macintyre, Kerry

    2000-01-01

    ... analysis, operational test and evaluation, and campaign development. The intent of this monograph was to determine if commercial wargame simulations could be used to develop the organizational leadership abilities of Army officers...

  14. Evaluating the Relation Between Coding Standard Violations and Faults Within and Across Software Versions

    NARCIS (Netherlands)

    Boogerd, C.; Moonen, L.

    2009-01-01

    In spite of the widespread use of coding standards and tools enforcing their rules, there is little empirical evidence supporting the intuition that they prevent the introduction of faults in software. In previous work, we performed a pilot study to assess the relation between rule violations and

  15. A GPP-based Software-Defined Radio Front-end for WLAN Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Arkesteijn, V.J.; Slump, Cornelis H.; Klumperink, Eric A.M.; Nauta, Bram

    2004-01-01

    This paper presents a software-defined radio testbed for the physical layer of wireless LAN standards. All baseband physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in real-time. This has been tested in combination with a CMOS integrated

  16. Leveraging Software Architectures through the ISO/IEC 42010 standard: A Feasibility Study

    NARCIS (Netherlands)

    Tamburri, D.A.; Lago, P.; Muccini, H.; Proper, E.; Lankhorst, M.; Schoenherr, M.

    2011-01-01

    The state of the practice in enterprise and software architecture learnt that relevant architectural aspects should be illustrated in multiple views, targeting the various concerns of different stakeholders. This has been expressed a.o. in the ISO/IEC 42010 Standard on architecture descriptions. In

  17. 76 FR 4412 - Commercial Driver's License (CDL) Standards; Volvo Trucks North America, Renewal of Exemption

    Science.gov (United States)

    2011-01-25

    ... Federal Motor Carrier Safety Administration Commercial Driver's License (CDL) Standards; Volvo Trucks.... ACTION: Notice of final disposition. SUMMARY: FMCSA announces its final decision regarding Volvo Trucks North America's (Volvo) application for an exemption for Andreas Hamsten to enable him to continue to...

  18. 76 FR 3587 - Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial...

    Science.gov (United States)

    2011-01-20

    ... Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial-Institutional, and... Fossil fuel-fired electric utility steam generating units. Federal Government 22112 Fossil fuel-fired... 22112 Fossil fuel-fired electric utility steam generating units owned by municipalities. 921150 Fossil...

  19. 76 FR 3517 - Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial...

    Science.gov (United States)

    2011-01-20

    ... Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial-Institutional, and... following: Category NAICS \\1\\ Examples of regulated entities Industry 221112 Fossil fuel-fired electric utility steam generating units. Federal Government 22112 Fossil fuel-fired electric utility steam...

  20. Improved detection of pulmonary nodules on energy-subtracted chest radiographs with a commercial computer-aided diagnosis software: comparison with human observers

    International Nuclear Information System (INIS)

    Szucs-Farkas, Zsolt; Patak, Michael A.; Yuksel-Hatz, Seyran; Ruder, Thomas; Vock, Peter

    2010-01-01

    To retrospectively analyze the performance of a commercial computer-aided diagnosis (CAD) software in the detection of pulmonary nodules in original and energy-subtracted (ES) chest radiographs. Original and ES chest radiographs of 58 patients with 105 pulmonary nodules measuring 5-30 mm and images of 25 control subjects with no nodules were randomized. Five blinded readers evaluated firstly the original postero-anterior images alone and then together with the subtracted radiographs. In a second phase, original and ES images were analyzed by a commercial CAD program. CT was used as reference standard. CAD results were compared to the readers' findings. True-positive (TP) and false-positive (FP) findings with CAD on subtracted and non-subtracted images were compared. Depending on the reader's experience, CAD detected between 11 and 21 nodules missed by readers. Human observers found three to 16 lesions missed by the CAD software. CAD used with ES images produced significantly fewer FPs than with non-subtracted images: 1.75 and 2.14 FPs per image, respectively (p=0.029). The difference for the TP nodules was not significant (40 nodules on ES images and 34 lesions in non-subtracted radiographs, p = 0.142). CAD can improve lesion detection both on energy subtracted and non-subtracted chest images, especially for less experienced readers. The CAD program marked less FPs on energy-subtracted images than on original chest radiographs. (orig.)

  1. Computer systems and software description for Standard-E+ Hydrogen Monitoring System (SHMS-E+)

    International Nuclear Information System (INIS)

    Tate, D.D.

    1997-01-01

    The primary function of the Standard-E+ Hydrogen Monitoring System (SHMS-E+) is to determine tank vapor space gas composition and gas release rate, and to detect gas release events. Characterization of the gas composition is needed for safety analyses. The lower flammability limit, as well as the peak burn temperature and pressure, are dependent upon the gas composition. If there is little or no knowledge about the gas composition, safety analyses utilize compositions that yield the worst case in a deflagration or detonation. Knowledge of the true composition could lead to reductions in the assumptions and therefore there may be a potential for a reduction in controls and work restrictions. Also, knowledge of the actual composition will be required information for the analysis that is needed to remove tanks from the Watch List. Similarly, the rate of generation and release of gases is required information for performing safety analyses, developing controls, designing equipment, and closing safety issues. This report outlines the computer system design layout description for the Standard-E+ Hydrogen Monitoring System

  2. EXPERIENCE API – NEW STANDARD OF E-LEARNING SOFTWARE AND EXAMPLES OF ITS PRACTICAL USE

    Directory of Open Access Journals (Sweden)

    Oleksandr A. Shcherbyna

    2016-07-01

    The purpose of the article is to analyze the features of a new standard for e-learning software, the Experience API (xAPI), previously also known as the Tin Can API. The standard defines how xAPI clients – the software students work with during the e-learning process – interact with xAPI servers, the Learning Record Stores (LRS) that store data about their results. The standard also defines the LRS data representation format and a way of transferring data between LRSs, which makes it possible to combine several LRSs into a distributed database that can accumulate information about a person's training in formal, non-formal and informal education throughout life. The article contains a review of available xAPI clients and xAPI servers, and the results of their testing, which confirm that they can be used in our educational institutions.
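
    As a rough illustration of the statement format that the xAPI standard defines, the sketch below builds a minimal statement (actor, verb, object) and posts it to an LRS over HTTP. The LRS URL and credentials are placeholders, and the snippet is a generic example of the specification's statements endpoint rather than code from the article.

    ```python
    # Minimal xAPI sketch: send one "completed" statement to a Learning Record Store.
    # The LRS URL and credentials are hypothetical placeholders.
    import requests

    LRS_STATEMENTS_URL = "https://lrs.example.org/xapi/statements"
    AUTH = ("lrs_user", "lrs_password")

    statement = {
        "actor": {"name": "Example Student", "mbox": "mailto:student@example.org"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                 "display": {"en-US": "completed"}},
        "object": {"id": "http://example.org/courses/module-1",
                   "definition": {"name": {"en-US": "Module 1"}}},
    }

    response = requests.post(
        LRS_STATEMENTS_URL,
        json=statement,
        auth=AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},  # version header required by the spec
    )
    response.raise_for_status()
    print("Statement id(s) assigned by the LRS:", response.json())
    ```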

  3. The Implications of Incumbent Intellectual Property Strategies for Open Source Software Success and Commercialization

    Science.gov (United States)

    Wen, Wen

    2012-01-01

    While open source software (OSS) emphasizes open access to the source code and avoids the use of formal appropriability mechanisms, there has been little understanding of how the existence and exercise of formal intellectual property rights (IPR) such as patents influence the direction of OSS innovation. This dissertation seeks to bridge this gap…

  4. Software engineering of a navigation and guidance system for commercial aircraft

    Science.gov (United States)

    Lachmann, S. G.; Mckinstry, R. G.

    1975-01-01

    The avionics experimental configuration of the considered system is briefly reviewed, taking into account the concept of an advanced air traffic management system, flight critical and noncritical functions, and display system characteristics. Cockpit displays and the navigation computer are examined. Attention is given to the functions performed in the navigation computer, major programs in the navigation computer, and questions of software development.

  5. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
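
    The baseline models discussed in the report relate whole-building consumption to influencing parameters such as weather and operation schedule. The snippet below is a deliberately simple, hedged sketch of that idea (an ordinary least-squares fit on synthetic data); it is not the proprietary model of any EMIS product, nor the evaluation methodology itself.

    ```python
    # Illustrative whole-building baseline model of the kind the report evaluates.
    # Synthetic data; not any vendor's proprietary model.
    import numpy as np

    def fit_baseline(temp_f, occupied, energy_kwh):
        """Least-squares fit: energy ~ intercept + outdoor temperature + occupancy flag."""
        X = np.column_stack([np.ones_like(temp_f), temp_f, occupied])
        coef, *_ = np.linalg.lstsq(X, energy_kwh, rcond=None)
        return coef

    def predict(coef, temp_f, occupied):
        X = np.column_stack([np.ones_like(temp_f), temp_f, occupied])
        return X @ coef

    # Baseline period (made-up daily values)
    temp_base = np.array([55.0, 60.0, 70.0, 80.0, 85.0])
    occ_base = np.array([1.0, 1.0, 1.0, 1.0, 0.0])
    kwh_base = np.array([900.0, 950.0, 1100.0, 1300.0, 1000.0])
    coef = fit_baseline(temp_base, occ_base, kwh_base)

    # Reporting period: avoided energy use = predicted baseline - metered consumption
    temp_rep = np.array([58.0, 75.0, 82.0])
    occ_rep = np.array([1.0, 1.0, 1.0])
    kwh_metered = np.array([880.0, 1050.0, 1150.0])
    savings = predict(coef, temp_rep, occ_rep) - kwh_metered
    print("Estimated whole-building savings (kWh):", savings.sum())
    ```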

  6. Real time access to commercial microwave link data: Details of the data acquisition software, the database and its web frontend

    Science.gov (United States)

    Keis, Felix; Chwala, Christian; Kunstmann, Harald

    2015-04-01

    Using commercial microwave link networks for precipitation estimation has become popular in recent years. Acquiring the necessary data from the network operators is, however, still difficult. Usually, data is provided to researchers with a large temporal delay and on an irregular basis. Driven by the demand to facilitate this data accessibility, a custom acquisition software for microwave links has been developed in joint cooperation with our industry partner Ericsson. It is capable of recording data from a great number of microwave links simultaneously and of forwarding the data instantaneously to a newly established KIT-internal database. It makes use of the Simple Network Management Protocol (SNMP) and collects the transmitter and receiver power levels via asynchronous SNMP requests. The software is currently in its first operational test phase, recording data from several hundred Ericsson microwave links in southern Germany. Furthermore, the software is used to acquire data with 1 Hz temporal resolution from four microwave links operated by the skiing resort in Garmisch-Partenkirchen. For convenient access to this amount of data we have developed a web frontend for the emerging microwave link database. It provides dynamic real-time visualization and basic processing of the recorded transmitter and receiver power levels. Here we will present details of the custom data acquisition software with a focus on the design of the KIT microwave link database and on the specifically developed web frontend.
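
    The acquisition software itself is not shown here, but the general pattern it relies on, polling power levels from a link via SNMP, can be sketched as follows. The host address and OID are hypothetical (real OIDs are vendor-specific), and the pysnmp-based code is an illustration rather than the authors' implementation.

    ```python
    # Hedged sketch of polling a microwave link's received power level over SNMP.
    # Host and OID are hypothetical; this is not the acquisition software from the abstract.
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, getCmd)

    def poll_rx_level(host, oid, community="public"):
        error_indication, error_status, _, var_binds = next(
            getCmd(SnmpEngine(),
                   CommunityData(community),
                   UdpTransportTarget((host, 161), timeout=2, retries=1),
                   ContextData(),
                   ObjectType(ObjectIdentity(oid)))
        )
        if error_indication or error_status:
            raise RuntimeError(f"SNMP error: {error_indication or error_status}")
        return int(var_binds[0][1])  # received power level as reported by the device

    if __name__ == "__main__":
        # Purely illustrative link address and OID
        level = poll_rx_level("192.0.2.10", "1.3.6.1.4.1.99999.1.2.3.0")
        print("RX level:", level)
    ```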

  7. Fuzzy system for risk analysis in software projects through the attributes of quality standards iso 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    With the growth in demand for products and services in the IT area, companies have difficulty establishing a metric or measure of service quality that captures qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), a Likert scale, the GQM (Goal-Question-Metric) method as an indicator of software quality, and Boehm's project risk analysis model were used to assess service quality and support decision-making according to the demand and requests for software development. With the aim of improving the quality of the services provided, the application is used to integrate the team and follow the life cycle of a project from its initial phase, and to assist in comparisons against the proposed schedule during requirements elicitation.
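
    As a hedged illustration of the fuzzy-logic ingredient of such a system, the sketch below maps a Likert-style quality score onto low/medium/high risk grades using triangular membership functions; the breakpoints and labels are assumptions made for illustration, not the authors' actual rule base.

    ```python
    # Minimal fuzzy-membership sketch (not the authors' system): a Likert-style
    # quality score in [1, 5] is fuzzified into low/medium/high risk grades.
    def triangular(x, a, b, c):
        """Triangular membership function with feet a, c and peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuzzify_risk(likert_score):
        # Breakpoints below are illustrative assumptions.
        return {
            "high_risk":   triangular(likert_score, 0.5, 1.5, 3.0),
            "medium_risk": triangular(likert_score, 1.5, 3.0, 4.5),
            "low_risk":    triangular(likert_score, 3.0, 4.5, 5.5),
        }

    if __name__ == "__main__":
        for score in (1, 2.5, 4, 5):
            print(score, fuzzify_risk(score))
    ```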

  8. Software verification and validation for commercial statistical packages utilized by the statistical consulting section of SRTC

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.B.

    2000-03-22

    The purpose of this report is to provide software verification and validation for the statistical packages used by the Statistical Consulting Section (SCS) of the Savannah River Technology Center. The need for this verification and validation stems from the requirements of the Quality Assurance programs that are frequently applicable to the work conducted by SCS. The IBM Personal Computer 300PL and 300XL are both Pentium II based desktops. Therefore the software verification and validation in this report is valid interchangeably between both platforms. As new computing platforms, statistical packages, or revisions to existing packages are reevaluated using these new tools, this report is to be revised to address their verification and validation.

  9. Progress on standardization and automation in software development on W7X

    Energy Technology Data Exchange (ETDEWEB)

    Kuehner, Georg, E-mail: kuehner@ipp.mpg.de [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Bluhm, Torsten [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Heimann, Peter [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Hennig, Christine [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Kroiss, Hugo [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Krom, Jon; Laqua, Heike; Lewerentz, Marc [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Maier, Josef [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Schacht, Joerg; Spring, Anett; Werner, Andreas [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Zilker, Manfred [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2012-12-15

    Highlights: ► For W7X software development the use of ISO/IEC15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction on priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X being subject to changes all along its projected lifetime the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC-12207 provide a guideline for structuring of development work and improving process and product quality. A considerable number of tools has emerged supporting and automating parts of development work. On W7X progress has been made during the last years in exploiting the benefit of automation and management during software development: –Continuous build, integration and automated test of software artefacts. ∘Syntax checks and code quality metrics. ∘Documentation generation. ∘Feedback for developers by temporal statistics. –Versioned repository for build products (libraries, executables). –Separate snapshot and release repositories and automatic deployment. –Semi-automatic provisioning of applications. –Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized

  10. Progress on standardization and automation in software development on W7X

    International Nuclear Information System (INIS)

    Kühner, Georg; Bluhm, Torsten; Heimann, Peter; Hennig, Christine; Kroiss, Hugo; Krom, Jon; Laqua, Heike; Lewerentz, Marc; Maier, Josef; Schacht, Jörg; Spring, Anett; Werner, Andreas; Zilker, Manfred

    2012-01-01

    Highlights: ► For W7X software development the use of ISO/IEC15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction on priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X being subject to changes all along its projected lifetime the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC-12207 provide a guideline for structuring of development work and improving process and product quality. A considerable number of tools has emerged supporting and automating parts of development work. On W7X progress has been made during the last years in exploiting the benefit of automation and management during software development: –Continuous build, integration and automated test of software artefacts. ∘Syntax checks and code quality metrics. ∘Documentation generation. ∘Feedback for developers by temporal statistics. –Versioned repository for build products (libraries, executables). –Separate snapshot and release repositories and automatic deployment. –Semi-automatic provisioning of applications. –Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized interfaces providing the capability to construct arbitrary function aggregates for dedicated tests of quality attributes as availability, reliability

  11. Environmental release of engineered nanomaterials from commercial tiles under standardized abrasion conditions.

    Science.gov (United States)

    Bressot, Christophe; Manier, Nicolas; Pagnoux, Cécile; Aguerre-Chariol, Olivier; Morgeneyer, Martin

    2017-01-15

    The study presented here focuses on commercial antibacterial tiles, whose emission of (nano)particles due to abrasion has so far barely been investigated. The tiles have been characterized regarding their surface properties and composition throughout their chain of use, i.e. from their state at commercialization until the experimental end of service life. In contrast to plane standard tiles, their surfaces are hilly. In the depressions, titanium dioxide is found at the surface, and is thus theoretically protected by the hilly areas against abrasion of the tile's surface. Furthermore, a deposition technique has been put in place by producers for coating the aforementioned commercial tiles with titanium dioxide, making them similar to those commercially available; it consists in depositing titanium dioxide on the surface, the latter fixing the former. This development allows a better understanding of future options for product formulation, and thus improvement with respect to particle release. The tests reveal the aerosolization, from commercial antibacterial tiles, of micron and submicron particles in the inhalable range, or particles that can be released into the environment; the release from the coated tiles was found to be significantly higher compared to the non-coated tiles. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. A ternary phase-field model incorporating commercial CALPHAD software and its application to precipitation in superalloys

    International Nuclear Information System (INIS)

    Wen, Y.H.; Lill, J.V.; Chen, S.L.; Simmons, J.P.

    2010-01-01

    A ternary phase-field model was developed that is linked directly to commercial CALPHAD software to provide quantitative thermodynamic driving forces. A recently available diffusion mobility database for ordered phases is also implemented to give a better description of the diffusion behavior in alloys. Because the targeted application of this model is the study of precipitation in Ni-based superalloys, a Ni-Al-Cr model alloy was constructed. A detailed description of this model is given in the paper. We have considered the misfit effects of the partitioning of the two solute elements. Transformation rules of the dual representation of the γ+γ ' microstructure by CALPHAD and by the phase field are established and the link with commercial CALPHAD software is described. Proof-of-concept tests were performed to evaluate the model and the results demonstrate that the model can qualitatively reproduce observed γ ' precipitation behavior. Uphill diffusion of Al is observed in a few diffusion couples, showing the significant influence of Cr on the chemical potential of Al. Possible applications of this model are discussed.

  13. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    OpenAIRE

    Cevdet Kızıl; Ayşe Tansel Çetin; Ahmed Bulunmaz

    2014-01-01

    The aim of this article is to investigate the impact of new Turkish commercial code and Turkish accounting standards on accounting education. This study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms are distributed to university students personally and via the internet.This paper includes significant research questions such as “Are accounting academicians informed and knowledgeable on new Turkish commerc...

  14. Comparison of Standard 90.1-2007 and the 2009 IECC with Respect to Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.; Bartlett, Rosemarie; Halverson, Mark A.

    2009-12-11

    The U.S. Department of Energy’s (DOE’s) Building Energy Codes Program (BECP) has been asked by some states and energy code stakeholders to address the comparability of the 2009 International Energy Conservation Code® (IECC) as applied to commercial buildings and ANSI/ASHRAE/IESNA Standard 90.1-2007 (hereinafter referred to as Standard 90.1-07). An assessment of comparability will help states respond to and implement conditions specified in the State Energy Program (SEP) Formula Grants American Recovery and Reinvestment Act Funding Opportunity, Number DE-FOA-0000052, and eliminate the need for the states individually or collectively to perform comparative studies of the 2009 IECC and Standard 90.1-07. The funding opportunity announcement contains the following conditions: (2) The State, or the applicable units of local government that have authority to adopt building codes, will implement the following: (A) A residential building energy code (or codes) that meets or exceeds the most recent International Energy Conservation Code, or achieves equivalent or greater energy savings. (B) A commercial building energy code (or codes) throughout the State that meets or exceeds the ANSI/ASHRAE/IESNA Standard 90.1-2007, or achieves equivalent or greater energy savings . (C) A plan to achieve 90 percent compliance with the above energy codes within eight years. This plan will include active training and enforcement programs and annual measurement of the rate of compliance. With respect to item (B) above, many more states, regardless of the edition date, directly adopt the IECC than Standard 90.1-07. This is predominately because the IECC is a model code and part of a coordinated set of model building codes that state and local government have historically adopted to regulate building design and construction. This report compares the 2009 IECC to Standard 90.1-07 with the intent of helping states address whether the adoption and application of the 2009 IECC for commercial

  15. SIMPLIFIED CHARGED PARTICLE BEAM TRANSPORT MODELING USING COMMONLY AVAILABLE COMMERCIAL SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    D. Douglas; K. Beard; J. Eldred; P. Evtushenko; A. Jenkins; W. Moore; L. Osborne; D. Sexton; C. Tennant

    2007-06-18

    Particle beam modeling in accelerators has been the focus of considerable effort since the 1950s. Many generations of tools have resulted from this process, each leveraging both prior experience and increases in computer power. However, continuing innovation in accelerator technology results in systems that are not well described by existing tools, so the software development process is on-going. We discuss a novel response to this situation, which was encountered when Jefferson Lab began operation of its energy-recovering linacs. These machines were not readily described with legacy software; therefore a model was built using Microsoft Excel. This interactive simulation can query data from the accelerator, use it to compute machine parameters, analyze difference orbit data, and evaluate beam properties. It can also derive new accelerator tunings and rapidly evaluate the impact of changes in machine configuration. As it is spreadsheet-based, it can be easily user-modified in response to changing requirements. Examples for the JLab IR Upgrade FEL are presented.
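
    The kind of arithmetic such a spreadsheet model performs can be illustrated with first-order transfer matrices. The sketch below is a generic textbook example (a drift and a thin-lens quadrupole with made-up parameters), not the JLab workbook itself.

    ```python
    # Generic first-order beam-transport arithmetic; element parameters are
    # illustrative placeholders, not JLab machine values.
    import numpy as np

    def drift(length_m):
        """2x2 transfer matrix for (x, x') through a field-free drift."""
        return np.array([[1.0, length_m],
                         [0.0, 1.0]])

    def thin_quad(focal_length_m):
        """2x2 thin-lens quadrupole matrix (focusing plane)."""
        return np.array([[1.0, 0.0],
                         [-1.0 / focal_length_m, 1.0]])

    # Particle state: transverse position x [m] and angle x' [rad]
    x0 = np.array([1.0e-3, 0.5e-3])

    # Example beamline: 1.5 m drift, quad with f = 0.8 m, then 2.0 m drift
    beamline = drift(2.0) @ thin_quad(0.8) @ drift(1.5)
    x1 = beamline @ x0
    print("Final (x, x'):", x1)
    ```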

  16. 76 FR 9817 - Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB...

    Science.gov (United States)

    2011-02-22

    ...] Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB) Approval... Commercial Diving Operations Standard (29 CFR part 1910, subpart T). DATES: Comments must be submitted... obtaining information (29 U.S.C. 657). Subpart T applies to diving and related support operations conducted...

  17. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse and domain engineering toolkits, and bases its material on the IEEE SWEBOK (Software Engineering Body of Knowledge). Slides for instructors are available online.

  18. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    Science.gov (United States)

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  19. R and D issues in structural design standard for commercialized fast reactor components

    International Nuclear Information System (INIS)

    Shibamoto, Hiroshi; Tanaka, Yoshihiko; Inoue, Kazuhiko; Kasahara, Naoto; Morishita, Masaki

    2003-01-01

    Conceptual design studies of Japanese commercialized fast reactors (FRs) are being carried out. With careful consideration of safety, these designs aim at economic improvements for practical use and are rationalized by adopting simple and innovative components. To certify the design concepts and validate structural integrity, research and development of the Fast Reactor Structural Design Standard (FDS) for commercialized fast reactor components is now under way. Based on the general characteristics of FRs and the design needs of commercialized FRs, the main subjects of R and D were identified. As for failure criteria, the development of a rational treatment of the creep design region is addressed. Ratcheting fatigue tests are conducted to confirm the limit below which ratcheting strain has negligible effects on structural strength. In addition, to contribute to design rationalization through high-precision prediction of elasto-plastic and creep behavior, efforts are being made to establish inelastic analysis methodologies for design. A guideline on inelastic analysis for design related to FDS has been prepared. This guideline is applied to evaluate the structural integrity of critical parts of components in the conceptual design of commercialized FRs. Furthermore, aimed at mitigating thermal loads, a guideline on thermal load modeling for design related to FDS is under development. (author)

  20. An independent monitor unit calculation by commercial software as a part of a radiotherapy treatment planning system quality control

    International Nuclear Information System (INIS)

    Nechvil, K.; Mynarik, J.

    2014-01-01

    For the independent calculation of monitor units (MU), the commercial software RadCalc (Lifeline Software Inc., Tyler, TX) was chosen from among several similar available programs. The program was configured and used to verify the doses calculated by the commercially available treatment planning system Eclipse, version 8.6.17 (Varian Medical Systems Inc., Palo Alto), which is used in clinical routine for creating treatment plans. The results of each plan were compared with phantom dose measurements made with an ionization chamber at the same point at which the calculations were done (in Eclipse and RadCalc), namely the isocentre. The TPS is configured with measured beam data (PDD and OAR). These beam data were exported and then imported into RadCalc, giving consistent yet independent data sets for the TPS and RadCalc. The reference conditions were set identically in RadCalc and the TPS, so that consistency between the TPS and RadCalc output factors was achieved (collimator scatter factor Sc, phantom scatter factor Sp). These output factors were also measured with the ionization chamber in a water phantom and compared with the TPS. Based on clinical dose-response data, ICRU recommends that dosimetric systems be able to deliver doses with an accuracy of at least 5%. Many factors, such as the layout of anatomic structures, patient positioning, and accelerator-related factors (dose calibration and mechanical parameters), cause random and systematic errors in dose delivery. Further problems can arise from the system databases and the associated information transfer, and from the TPS itself, which contains, among other things, different dose calculation algorithms. (authors)
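
    The arithmetic behind an independent point-dose check can be sketched in simplified form. The formula and factor values below are a generic isocentric hand calculation with placeholder numbers; RadCalc's actual algorithm additionally performs a modified Clarkson integration and models MLC transmission, so this is an illustration only.

    ```python
    # Simplified independent MU check for an isocentric (SAD) setup.
    # Generic hand-calculation formula with placeholder factors; not RadCalc's algorithm.
    def monitor_units(dose_cGy, cal_cGy_per_MU, Sc, Sp, TMR):
        """MU = dose / (calibration output * collimator scatter * phantom scatter * TMR)."""
        return dose_cGy / (cal_cGy_per_MU * Sc * Sp * TMR)

    if __name__ == "__main__":
        mu = monitor_units(
            dose_cGy=200.0,        # prescribed dose to the reference point
            cal_cGy_per_MU=1.0,    # machine calibration at reference conditions
            Sc=0.995,              # collimator scatter factor for the collimator setting
            Sp=0.990,              # phantom scatter factor for the effective field size
            TMR=0.780,             # tissue-maximum ratio at the calculation depth
        )
        print(f"Independent MU estimate: {mu:.1f}")
    ```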

  1. An adaptive software defined radio design based on a standard space telecommunication radio system API

    Science.gov (United States)

    Xiong, Wenhao; Tian, Xin; Chen, Genshe; Pham, Khanh; Blasch, Erik

    2017-05-01

    Software-defined radio (SDR) has become a popular tool for implementing and testing communications performance. The advantages of the SDR approach include a re-configurable design, adaptive response to changing conditions, efficient development, and highly versatile implementation. To take advantage of these benefits, the space telecommunication radio system (STRS) was proposed by the NASA Glenn Research Center (GRC) along with a standard application program interface (API) structure. Each component of the system uses a well-defined API to communicate with other components. The benefit of a standard API is that it relaxes the platform limitations of each component and allows additional options. For example, the waveform-generating process can be hosted on a field-programmable gate array (FPGA), a personal computer (PC), or an embedded system; as long as the API requirements are met, the selected waveform will work with the complete system. In this paper, we demonstrate the design and development of an adaptive SDR following the STRS and standard API protocol. We introduce, step by step, the SDR testbed system including the controlling graphical user interface (GUI), database, GNU Radio hardware control, and universal software radio peripheral (USRP) transceiving front end. In addition, a performance evaluation is shown on the effectiveness of the SDR approach for space telecommunication.
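
    The idea of components that talk to each other only through a well-defined API can be sketched with an abstract interface, shown below. The class and method names are illustrative stand-ins and are not the actual STRS API signatures.

    ```python
    # Illustrative component interface in the spirit of an SDR platform API.
    # Class and method names are NOT the real STRS API; they only show the pattern.
    from abc import ABC, abstractmethod

    class WaveformComponent(ABC):
        """Any waveform implementation (FPGA, PC, embedded) exposes the same interface."""

        @abstractmethod
        def configure(self, params: dict) -> None:
            """Apply platform-independent configuration (e.g., carrier, bandwidth)."""

        @abstractmethod
        def start(self) -> None:
            """Begin generating/processing samples."""

        @abstractmethod
        def stop(self) -> None:
            """Halt processing and release platform resources."""

    class PcWaveform(WaveformComponent):
        """Toy PC-hosted waveform: callers depend on the API, not the platform behind it."""
        def configure(self, params: dict) -> None:
            self.params = params
        def start(self) -> None:
            print("streaming samples with", self.params)
        def stop(self) -> None:
            print("stopped")

    def run(component: WaveformComponent):
        component.configure({"carrier_hz": 2.4e9, "bandwidth_hz": 1e6})
        component.start()
        component.stop()

    run(PcWaveform())
    ```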

  2. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adapting standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing....

  3. Effective dose and organ doses estimation taking tube current modulation into account with a commercial software package

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Rendon, X. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); Bosmans, H.; Zanca, F. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Oyen, R. [University Hospitals Leuven, Department of Radiology, Leuven (Belgium)

    2015-07-15

    To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)
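
    For context, the effective dose E reported by such software is a tissue-weighted sum of organ equivalent doses, E = Σ_T w_T H_T. The sketch below applies a subset of the ICRP 103 weighting factors to invented organ doses; a real calculation needs the complete tissue list and remainder handling, so this is illustrative only.

    ```python
    # Illustrative tissue-weighted sum E = sum(w_T * H_T) over a SUBSET of organs.
    # Organ dose values are invented; a full calculation must use the complete
    # ICRP 103 tissue list, including the remainder tissues.
    ICRP103_WEIGHTS = {        # subset of ICRP Publication 103 tissue weighting factors
        "breast": 0.12,
        "lung": 0.12,
        "stomach": 0.12,
        "liver": 0.04,
        "gonads": 0.08,
    }

    organ_dose_mSv = {         # example equivalent doses from a chest/abdomen CT (made up)
        "breast": 12.0,
        "lung": 14.0,
        "stomach": 13.5,
        "liver": 13.0,
        "gonads": 3.0,
    }

    partial_E = sum(ICRP103_WEIGHTS[t] * organ_dose_mSv[t] for t in organ_dose_mSv)
    print(f"Partial effective dose from the listed organs: {partial_E:.2f} mSv")
    ```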

  4. Effective dose and organ doses estimation taking tube current modulation into account with a commercial software package

    International Nuclear Information System (INIS)

    Lopez-Rendon, X.; Bosmans, H.; Zanca, F.; Oyen, R.

    2015-01-01

    To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)

  5. Upgrade and standardization of real-time software for telescope systems at the Gemini telescopes

    Science.gov (United States)

    Rambold, William N.; Gigoux, Pedro; Urrutia, Cristian; Ebbers, Angelic; Taylor, Philip; Rippa, Mathew J.; Rojas, Roberto; Cumming, Tom

    2014-07-01

    The real-time control systems for the Gemini Telescopes were designed and built in the 1990s using state-of-the-art software tools and operating systems of that time. Since these systems are in use every night they have not been kept up to date and are now obsolete and very labor-intensive to support. Gemini is currently engaged in a major upgrade of its telescope control systems. This paper reviews the studies performed to select and develop a new standard operating environment for Gemini real-time systems and the work performed so far in implementing it.

  6. Experience implementing energy standards for commercial buildings and its lessons for the Philippines

    Energy Technology Data Exchange (ETDEWEB)

    Busch, John; Deringer, Joseph

    1998-10-01

    Energy efficiency standards for buildings have been adopted in over forty countries. This policy mechanism is pursued by governments as a means of increasing energy efficiency in the buildings sector, which typically accounts for about a third of most nations' energy consumption and half of their electricity consumption. This study reports on experience with implementation of energy standards for commercial buildings in a number of countries and U.S. states. It is conducted from the perspective of providing useful input to the Government of the Philippines' (GOP) current effort at implementing their building energy standard. While the impetus for this work is technical assistance to the Philippines, the intent is to shed light on the broader issues attending implementation of building energy standards that would be applicable there and elsewhere. The background on the GOP building energy standard is presented, followed by the objectives for the study, the approach used to collect and analyze information about other jurisdictions' implementation experience, results, and conclusions and recommendations.

  7. XML as a standard I/O data format in scientific software development

    International Nuclear Information System (INIS)

    Song Tianming; Yang Jiamin; Yi Rongqing

    2010-01-01

    XML is an open standard data format with strict syntax rules, which is widely used in large-scale software development. It is adopted as I/O file format in the development of SpectroSim, a simulation and data-processing system for soft x-ray spectrometer used in ICF experiments. XML data that describe spectrometer configurations, schema codes that define syntax rules for XML and report generation technique for visualization of XML data are introduced. The characteristics of XML such as the capability to express structured information, self-descriptive feature, automation of visualization are explained with examples, and its feasibility as a standard scientific I/O data file format is discussed. (authors)
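
    A minimal example of the pattern described, writing and re-reading an instrument configuration as XML with strict, self-descriptive markup, is sketched below using Python's standard library; the element and attribute names are hypothetical and do not come from SpectroSim.

    ```python
    # Sketch of XML as an I/O format for an instrument configuration.
    # Element and attribute names are hypothetical, not SpectroSim's actual schema.
    import xml.etree.ElementTree as ET

    # --- write a configuration file ---
    root = ET.Element("spectrometer", name="softxray-1")
    channel = ET.SubElement(root, "channel", id="1")
    ET.SubElement(channel, "energy_range", unit="eV", min="100", max="1500")
    ET.SubElement(channel, "detector", type="CCD", pixels="2048")
    ET.ElementTree(root).write("config.xml", encoding="utf-8", xml_declaration=True)

    # --- read it back (tags are self-descriptive, so the file documents itself) ---
    tree = ET.parse("config.xml")
    for ch in tree.getroot().findall("channel"):
        rng = ch.find("energy_range")
        print("channel", ch.get("id"), "covers",
              rng.get("min"), "-", rng.get("max"), rng.get("unit"))
    ```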

  8. Simple method for the determination of rosiglitazone in human plasma using a commercially available internal standard.

    Science.gov (United States)

    Mamidi, Rao N V S; Benjamin, Biju; Ramesh, Mullangi; Srinivas, Nuggehally R

    2003-09-01

    To the best of our knowledge, bioanalytical methods to determine rosiglitazone in human plasma reported in the literature use internal standards that are not commercially available. Our purpose was to develop a simple method for the determination of rosiglitazone in plasma employing a commercially available internal standard (IS). After the addition of celecoxib (IS), plasma (0.25 mL) samples were extracted into ethyl acetate. The residue after evaporation of the organic layer was dissolved in 750 microL of mobile phase and 50 microL was injected onto the HPLC. The separation was achieved using a Hichrom KR 100, 250 x 4.6 mm C(18) column with a mobile phase composed of potassium dihydrogen phosphate buffer (0.01 M, pH 6.5):acetonitrile:methanol (40:50:10, v/v/v). The flow-rate of the mobile phase was set at 1 mL/min. The column eluate was monitored by a fluorescence detector set at an excitation wavelength of 247 nm and an emission wavelength of 367 nm. Linear relationships (r(2) > 0.99) were observed between the peak area ratio of rosiglitazone to IS and the rosiglitazone concentration across the range 5-1000 ng/mL. The intra-run precision (%RSD) and accuracy (%Dev) in the measurement of rosiglitazone were 80% for both rosiglitazone and IS from human plasma. The lower limit of quantitation of the assay was 5 ng/mL. In summary, the methodology for rosiglitazone measurement in plasma was simple, sensitive and employed a commercially available IS. Copyright 2003 John Wiley & Sons, Ltd.

  9. Developing evidence-based prescriptive ventilation rate standards for commercial buildings in California: a proposed framework

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Fisk, William J.

    2014-02-01

    Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each

  10. Inside a VAMDC data node—putting standards into practical software

    Science.gov (United States)

    Regandell, Samuel; Marquart, Thomas; Piskunov, Nikolai

    2018-03-01

    Access to molecular and atomic data is critical for many forms of remote sensing analysis across different fields. Many atomic and molecular databases are however highly specialised for their intended application, complicating querying and combining data between sources. The Virtual Atomic and Molecular Data Centre, VAMDC, is an electronic infrastructure that allows each database to register as a ‘node’. Through services such as VAMDC’s portal website, users can then access and query all nodes in a homogenised way. Today all major atomic and molecular databases are attached to VAMDC. This article describes the software tools we developed to help data providers create and manage a VAMDC node. It gives an overview of the VAMDC infrastructure and of the various standards it uses. The article then discusses the development choices made and how the standards are implemented in practice. It concludes with a full example of implementing a VAMDC node using a real-life case as well as future plans for the node software.
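
    Queries to a node typically go through the VAMDC-TAP interface. The sketch below shows the general shape of such a request against a placeholder node URL; the endpoint path and parameter values are assumptions based on the public VAMDC documentation and should be checked against the current standards before use.

    ```python
    # General shape of a query to a VAMDC node; the node URL is a placeholder and
    # the parameter names/values are assumptions to be verified against the standard.
    import requests

    NODE_SYNC_URL = "https://node.example.org/tap/sync"   # hypothetical node endpoint

    params = {
        "LANG": "VSS2",                                   # VAMDC query language (assumed)
        "FORMAT": "XSAMS",                                # standard XML output schema
        "QUERY": "SELECT ALL WHERE AtomSymbol = 'Fe'",    # illustrative query
    }

    response = requests.get(NODE_SYNC_URL, params=params, timeout=60)
    response.raise_for_status()
    with open("fe_lines.xsams.xml", "wb") as fh:
        fh.write(response.content)                        # XSAMS document for further processing
    ```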

  11. 49 CFR 393.100 - Which types of commercial motor vehicles are subject to the cargo securement standards of this...

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false Which types of commercial motor vehicles are... Which types of commercial motor vehicles are subject to the cargo securement standards of this subpart... Section 393.100 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL MOTOR...

  12. Commercial and Industrial Solid Waste Incineration Units (CISWI): New Source Performance Standards (NSPS) and Emission Guidelines (EG) for Existing Sources

    Science.gov (United States)

    Learn about the New Source Performance Standards (NSPS) for commercial and industrial solid waste incineration (CISWI) units including emission guidelines and compliance times for the rule. Read the rule history and summary, and find supporting documents

  13. Evaluations of UltraiQ software for objective ultrasound image quality assessment using images from a commercial scanner.

    Science.gov (United States)

    Long, Zaiyang; Tradup, Donald J; Stekel, Scott F; Gorny, Krzysztof R; Hangiandreou, Nicholas J

    2018-01-16

    We evaluated a commercially available software package that uses B-mode images to semi-automatically measure quantitative metrics of ultrasound image quality, such as contrast response, depth of penetration (DOP), and spatial resolution (lateral, axial, and elevational). Since measurement of elevational resolution is not a part of the software package, we achieved it by acquiring phantom images with transducers tilted at 45 degrees relative to the phantom. Each measurement was assessed in terms of measurement stability, sensitivity, repeatability, and semi-automated measurement success rate. All assessments were performed on a GE Logiq E9 ultrasound system with linear (9L or 11L), curved (C1-5), and sector (S1-5) transducers, using a CIRS model 040GSE phantom. In stability tests, the measurements of contrast, DOP, and spatial resolution remained within a ±10% variation threshold in 90%, 100%, and 69% of cases, respectively. In sensitivity tests, contrast, DOP, and spatial resolution measurements followed the expected behavior in 100%, 100%, and 72% of cases, respectively. In repeatability testing, intra- and inter-individual coefficients of variations were equal to or less than 3.2%, 1.3%, and 4.4% for contrast, DOP, and spatial resolution (lateral and axial), respectively. The coefficients of variation corresponding to the elevational resolution test were all within 9.5%. Overall, in our assessment, the evaluated package performed well for objective and quantitative assessment of the above-mentioned image qualities under well-controlled acquisition conditions. We are finding it to be useful for various clinical ultrasound applications including performance comparison between scanners from different vendors. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
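
    The repeatability figures quoted above are coefficients of variation. A small sketch of that computation for repeated phantom readings (synthetic numbers, not the study data) is given below.

    ```python
    # Coefficient of variation for repeated image-quality measurements.
    # The readings are synthetic, not data from the study.
    import statistics

    def coefficient_of_variation(values):
        """CV (%) = 100 * sample standard deviation / mean."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # e.g., five repeated depth-of-penetration readings (cm) by one observer
    dop_cm = [6.10, 6.15, 6.05, 6.12, 6.08]
    print(f"Intra-observer CV: {coefficient_of_variation(dop_cm):.2f}%")
    ```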

  14. OSPAR standard method and software for statistical analysis of beach litter data.

    Science.gov (United States)

    Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit

    2017-09-15

    The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
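
    The ensemble of tests named above can be approximated with standard scientific Python, as sketched below on synthetic counts; Kendall's tau is used as a stand-in for the Mann-Kendall test and the Wilcoxon rank-sum for the step trend test, and this is not the Litter Analyst implementation.

    ```python
    # Approximate stand-ins for the trend-analysis ensemble (synthetic data;
    # not the Litter Analyst implementation).
    import numpy as np
    from scipy import stats

    years = np.arange(2009, 2015)
    counts = np.array([180, 165, 170, 150, 140, 132])   # items of one litter type per survey year

    # Monotonic trend (Kendall's tau underlies the Mann-Kendall test)
    tau, p_trend = stats.kendalltau(years, counts)

    # Theil-Sen slope: robust estimate of the change in items per year
    slope, intercept, lo, hi = stats.theilslopes(counts, years)

    # Step trend: Wilcoxon rank-sum between the first and second half of the record
    z, p_step = stats.ranksums(counts[:3], counts[3:])

    print(f"Kendall tau={tau:.2f} (p={p_trend:.3f}), "
          f"Theil-Sen slope={slope:.1f} items/yr, step-trend p={p_step:.3f}")
    ```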

  15. Comparison of Commercial Structure-From Photogrammety Software Used for Underwater Three-Dimensional Modeling of Coral Reef Environments

    Science.gov (United States)

    Burns, J. H. R.; Delparte, D.

    2017-02-01

    Structural complexity in ecosystems creates an assortment of microhabitat types and has been shown to support greater diversity and abundance of associated organisms. The 3D structure of an environment also directly affects important ecological parameters such as habitat provisioning and light availability and can therefore strongly influence ecosystem function. Coral reefs are architecturally complex 3D habitats, whose structure is intrinsically linked to the ecosystem's biodiversity, productivity, and function. The field of coral ecology has, however, been primarily limited to using 2-dimensional (2D) planar survey techniques for studying the physical structure of reefs. This conventional approach fails to capture or quantify the intricate structural complexity of corals that influences habitat facilitation and biodiversity. A 3-dimensional (3D) approach can obtain accurate measurements of architectural complexity, topography, rugosity, volume, and other structural characteristics that affect the biodiversity and abundance of reef organisms. Structure-from-Motion (SfM) photogrammetry is an emerging computer vision technology that provides a simple and cost-effective method for 3D reconstruction of natural environments. SfM has been used in several studies to investigate the relationship between habitat complexity and ecological processes in coral reef ecosystems. This study compared two commercial SfM software packages, Agisoft Photoscan Pro and Pix4Dmapper Pro 3.1, in order to assess the capability and spatial accuracy of these programs for conducting 3D modeling of coral reef habitats at three spatial scales.

  16. Feature Selection for Evolutionary Commercial-off-the-Shelf Software: Studies Focusing on Time-to-Market, Innovation and Hedonic-Utilitarian Trade-Offs

    Science.gov (United States)

    Kakar, Adarsh Kumar

    2013-01-01

    Feature selection is one of the most important decisions made by product managers. This three article study investigates the concepts, tools and techniques for making trade-off decisions of introducing new features in evolving Commercial-Off-The-Shelf (COTS) software products. The first article investigates the efficacy of various feature…

  17. Assessing the Content and Quality of Commercially Available Reading Software Programs: Do They Have the Fundamental Structures to Promote the Development of Early Reading Skills in Children?

    Science.gov (United States)

    Grant, Amy; Wood, Eileen; Gottardo, Alexandra; Evans, Mary Ann; Phillips, Linda; Savage, Robert

    2012-01-01

    The current study developed a taxonomy of reading skills and compared this taxonomy with skills being trained in 30 commercially available software programs designed to teach emergent literacy or literacy-specific skills for children in preschool, kindergarten, and Grade 1. Outcomes suggest that, although some skills are being trained in a…

  18. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    Science.gov (United States)

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  19. 40 CFR 745.230 - Work practice standards for conducting lead-based paint activities: public and commercial...

    Science.gov (United States)

    2010-07-01

    ... Activities § 745.230 Work practice standards for conducting lead-based paint activities: public and... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Work practice standards for conducting lead-based paint activities: public and commercial buildings, bridges and superstructures. 745.230...

  20. Radiotherapy pre-treatment dose validation: A second verification of monitor units (MU) with a commercial software

    Directory of Open Access Journals (Sweden)

    Iqbal Al Amri

    2012-01-01

    Full Text Available Inversely planned intensity-modulated radiotherapy (IMRT) and stereotactic small field radiotherapy should be verified before treatment execution. A second verification is carried out for planned treatments in IMRT and 3D conformal radiotherapy (3D-CRT) using a monitor verification commercial dose calculation management software (DCMS). For the same reference point the ion-chamber measured doses are compared for IMRT plans. DCMS (Diamond) computes dose based on modified Clarkson integration, accounting for multi-leaf collimators (MLC) transmission and measured collimator scatter factors. DCMS was validated with treatment planning system (TPS) (Eclipse 6.5 Version, Varian, USA) separately. Treatment plans computed from TPS are exported to DCMS using DICOM interface. Doses are re-calculated at selected points for fields delivered to IMRT phantom (IBA Scanditronix Wellhofer) in high-energy linac (Clinac 2300 CD, Varian). Doses measured at central axis, for the same points using CC13 (0.13 cc) ion chamber with Dose 1 Electrometer (Scanditronix Wellhofer) are compared with calculated data on DCMS and TPS. The data of 53 IMRT patients with fields ranging from 5 to 9 are reported. The computed dose for selected monitor units (MU) by Diamond showed good agreement with planned doses by TPS. DCMS dose prediction matched well in 3D-CRT forward plans (0.8 ± 1.3%, n = 37) and in IMRT inverse plans (−0.1 ± 2.2%, n = 37). Ion chamber measurements agreed well with Eclipse planned doses (−2.1 ± 2.0%, n = 53) and re-calculated DCMS doses (−1.5 ± 2.6%, n = 37) in phantom. DCMS dose validation is in reasonable agreement with TPS. DCMS calculations corroborate well with ionometric measured doses in most of the treatment plans.

  1. Radiotherapy pre-treatment dose validation: A second verification of monitor units (MU) with a commercial software

    Science.gov (United States)

    Al Amri, Iqbal; Ravichandran, Ramamoorthy; Sivakumar, Somangili Satyamoorthi; Binukumar, Johnson Pichi; Davis, Chirayathmanjiyil Antony; Al Rahbi, Zakia; Al Shukeili, Khalsa; Al Kindi, Fatima

    2012-01-01

    Inversely planned intensity-modulated radiotherapy (IMRT) and stereotactic small field radiotherapy should be verified before treatment execution. A second verification is carried out for planned treatments in IMRT and 3D conformal radiotherapy (3D-CRT) using a monitor verification commercial dose calculation management software (DCMS). For the same reference point the ion-chamber measured doses are compared for IMRT plans. DCMS (Diamond) computes dose based on modified Clarkson integration, accounting for multi-leaf collimators (MLC) transmission and measured collimator scatter factors. DCMS was validated with treatment planning system (TPS) (Eclipse 6.5 Version, Varian, USA) separately. Treatment plans computed from TPS are exported to DCMS using DICOM interface. Doses are re-calculated at selected points for fields delivered to IMRT phantom (IBA Scanditronix Wellhofer) in high-energy linac (Clinac 2300 CD, Varian). Doses measured at central axis, for the same points using CC13 (0.13 cc) ion chamber with Dose 1 Electrometer (Scanditronix Wellhofer) are compared with calculated data on DCMS and TPS. The data of 53 IMRT patients with fields ranging from 5 to 9 are reported. The computed dose for selected monitor units (MU) by Diamond showed good agreement with planned doses by TPS. DCMS dose prediction matched well in 3D-CRT forward plans (0.8 ± 1.3%, n = 37) and in IMRT inverse plans (–0.1 ± 2.2%, n = 37). Ion chamber measurements agreed well with Eclipse planned doses (–2.1 ± 2.0%, n = 53) and re-calculated DCMS doses (–1.5 ± 2.6%, n = 37) in phantom. DCMS dose validation is in reasonable agreement with TPS. DCMS calculations corroborate well with ionometric measured doses in most of the treatment plans. PMID:23293456
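
    The agreement figures reported above are mean ± SD of point-dose deviations between the independent check and the TPS. A minimal sketch of that comparison, using hypothetical dose values rather than the study's data, could be:

```python
import numpy as np

# Hypothetical point doses (cGy) at the verification point for a few plans
tps_dose  = np.array([200.0, 180.5, 210.2, 195.0])   # treatment planning system
dcms_dose = np.array([201.1, 179.8, 208.9, 196.2])   # independent MU-check software

# Percent deviation of the second check relative to the TPS, reported as mean +/- SD
deviation = 100.0 * (dcms_dose - tps_dose) / tps_dose
print(f"mean deviation: {deviation.mean():+.1f}% +/- {deviation.std(ddof=1):.1f}%")
```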

  2. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  3. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of nuclear simulator software is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation of the feasibility prototype, we have identified that the approach makes computer-based predictive simulation feasible, owing to both the remarkable improvement in real-time performance and the reduced effort required for real-time implementation under standard PC hardware and Real-Time Linux environments

  4. Diagnostic X-Ray dosimeters using standard Float Zone (FZ) and XRA-50 commercial diodes

    Energy Technology Data Exchange (ETDEWEB)

    Gonçalves, Josemary A.C.; Bueno, Carmen C., E-mail: josemary@ipen.br, E-mail: ccbueno@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN-CNEN/SP), São Paulo, SP (Brazil); Barros, Vinicius S.M.; Asfora, Viviane K.; Khoury, Helen J., E-mail: vsmdbarros@gmail.com, E-mail: vikhoury@gmail.com, E-mail: hjkhoury@gmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Física

    2017-07-01

    The results obtained with a standard float zone (FZ) silicon diode, processed at the Helsinki Institute of Physics, used as an on-line diagnostic X-ray dosimeter are described in this work. The device was connected in short-circuit current mode to the input of an integrating electrometer. The response repeatability and the current sensitivity coefficient of the diode were measured with diagnostic X-ray beams in the range of 40-80 kV. The dose response of the device, evaluated from 10 mGy up to 500 mGy, was linear with high charge sensitivity. Nevertheless, significant energy dependence was observed in the charge sensitivity of the FZ device for energies below 70 kV. The dosimetric characteristics of this FZ diode were compared to those of an XRA-50 commercial Si diode, specifically designed for X-ray dosimetry. The results obtained with the FZ diode showed that it can be an alternative choice for diagnostic X-ray dosimetry, although it needs to be calibrated for individual X-ray beam energies. Studies of the long-term stability and radiation hardness of these diodes are under way. (author)

  5. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in real time.

  6. Development of a viability standard curve for microencapsulated probiotic bacteria using confocal microscopy and image analysis software.

    Science.gov (United States)

    Moore, Sarah; Kailasapathy, Kasipathy; Phillips, Michael; Jones, Mark R

    2015-07-01

    Microencapsulation is proposed to protect probiotic strains from food processing procedures and to maintain probiotic viability. Little research has described the in situ viability of microencapsulated probiotics. This study successfully developed a real-time viability standard curve for microencapsulated bacteria using confocal microscopy, fluorescent dyes and image analysis software. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. 75 FR 2921 - Commercial Driver's License Standards: Application for Exemption; Volvo Trucks North America (Volvo)

    Science.gov (United States)

    2010-01-19

    ... Exemption; Volvo Trucks North America (Volvo) AGENCY: Federal Motor Carrier Safety Administration (FMCSA... Volvo Trucks North America (Volvo) has applied for an exemption from the Federal requirement for a driver of commercial motor vehicles (CMVs) to hold a commercial driver's license (CDL). Volvo requests...

  8. 75 FR 8181 - Commercial Driver's License Standards: Application for Exemption; Volvo Trucks North America (Volvo)

    Science.gov (United States)

    2010-02-23

    ... Exemption; Volvo Trucks North America (Volvo) AGENCY: Federal Motor Carrier Safety Administration (FMCSA... Volvo Trucks North America (Volvo) has applied for an exemption from the Federal requirement for a driver of commercial motor vehicles (CMVs) to hold a commercial driver's license (CDL). Volvo requests...

  9. 76 FR 38153 - California State Nonroad Engine Pollution Control Standards; Commercial Harbor Craft Regulations...

    Science.gov (United States)

    2011-06-29

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9427-1] California State Nonroad Engine Pollution Control... engines on commercial harbor craft. CARB has requested that EPA issue a new authorization under section... propulsion and auxiliary engines on new and in-use commercial harbor crafts, with some exceptions.\\6...

  10. BioContainers: an open-source and community-driven framework for software standardization.

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
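
    As a rough illustration of the container-based approach (using the generic Docker SDK for Python rather than any BioContainers-specific tooling, and a placeholder image name), running a containerized tool might look like:

```python
import docker  # docker-py SDK; assumes a local Docker daemon is running

client = docker.from_env()

# Placeholder image name/tag -- substitute a real BioContainers image for actual use
image = "biocontainers/some-tool:some-tag"

# Run a one-off containerized command in an isolated environment and capture its output
output = client.containers.run(image, "some-tool --version", remove=True)
print(output.decode())
```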

  11. Geoscience data standards, software implementations, and the Internet. Where we came from and where we might be going.

    Science.gov (United States)

    Blodgett, D. L.

    2014-12-01

    Geographic information science and the coupled database and software systems that have grown from it have been evolving since the early 1990s. The multi-file shapefile package, invented early in this evolution, is an example of a highly generalized file format that can be used as an archival format, an interchange format, and a format for program execution. There are other formats, such as GeoTIFF and NetCDF, that have similar characteristics. These de-facto standard (in contrast to the formally defined and published standards) formats, while not initially designed for machine-readable web-services, are used in them extensively. Relying on these formats allows legacy software to be adapted to web-services, but may require complicated software development to handle dynamic introspection of these legacy file formats' metadata. A generalized system of web-service types that offer archive, interchange, and run-time capabilities based on commonly implemented file formats and established web-service specifications has emerged from exemplar implementations. For example, an Open Geospatial Consortium (OGC) Web Feature Service is used to serve sites or model polygons and an OGC Sensor Observation Service provides time series data for the sites. The broad system of data formats, web-service types, and freely available software that implements the system will be described. The presentation will include a perspective on the future of this basic system and how it relates to scientific domain specific information models such as the Open Geospatial Consortium standards for geographic, hydrologic, and hydrogeologic data.
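
    As a small illustration of the self-describing nature of these de-facto standard formats, a NetCDF file's metadata can be introspected with the netCDF4 Python package; the file name below is a hypothetical placeholder:

```python
from netCDF4 import Dataset  # assumes the netCDF4 package and a local sample file

# Hypothetical file name; any CF-style NetCDF time-series file would do
ds = Dataset("streamflow_site_01.nc", mode="r")

# Dynamic introspection of the file's self-describing metadata
print(list(ds.dimensions.keys()))
print(list(ds.variables.keys()))
for name, var in ds.variables.items():
    print(name, getattr(var, "units", "no units attribute"))

ds.close()
```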

  12. 77 FR 48108 - Energy Conservation Standards for Commercial Clothes Washers: Public Meeting and Availability of...

    Science.gov (United States)

    2012-08-13

    ... commercial matters regulated by U.S. antitrust laws. After the public meeting and the expiration of the.... Code, for editorial reasons.) The Energy Policy Act of 2005 (EPACT 2005), Public Law 109-58, further...

  13. Software Realization on the MSC nanoRISC Hardware Platform, for Communication according to the IEC61850 Standard

    Directory of Open Access Journals (Sweden)

    A. V. Kabović

    2015-06-01

    Full Text Available This paper describes software realization and its implementation for the communication, according to the IEC61850 standard, between the module for monitoring teleprotection devices and the control/monitoring server in a power substation. Teleprotection devices have an important role in the transmission of messages for power line section tripping. The software is implemented on the “MSC nanoRISC-S3C2416 MB2” hardware platform type, which belongs to the COM (computer on module) systems.

  14. New AICPA standards aid accounting for the costs of internal-use software.

    Science.gov (United States)

    Luecke, R W; Meeting, D T; Klingshirn, R G

    1999-05-01

    Statement of Position (SOP) No. 98-1, "Accounting for the Costs of Computer Software Developed or Obtained for Internal Use," issued by the American Institute of Certified Public Accountants in March 1998, provides financial managers with guidelines regarding which costs involved in developing or obtaining internal-use software should be expensed and which should be capitalized. The SOP identifies three stages in the development of internal-use software: the preliminary project stage, the application development stage, and the postimplementation-operation stage. The SOP provides that all costs incurred during the preliminary project stage should be expensed as incurred. During the application development stage, costs associated with developing or obtaining the software should be capitalized, while costs associated with preparing data for use within the new system should be expensed. Costs incurred during the postimplementation-operation stage, typically associated with training and application maintenance, should be expensed.

  15. Merging ORS Standards to Facilitate Rapid Development of Reusable Spacecraft Software, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — DNet has been actively pursuing strategies for shortening the software development portion of the satellite development life-cycle for some time. We recognized upon...

  16. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

    One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software. 2 refs

  17. The Human Physiome: how standards, software and innovative service infrastructures are providing the building blocks to make it achievable.

    Science.gov (United States)

    Nickerson, David; Atalag, Koray; de Bono, Bernard; Geiger, Jörg; Goble, Carole; Hollmann, Susanne; Lonien, Joachim; Müller, Wolfgang; Regierer, Babette; Stanford, Natalie J; Golebiewski, Martin; Hunter, Peter

    2016-04-06

    Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome.

  18. Testing of Software Routine to Determine Deviate and Cumulative Probability: ModStandardNormal Version 1.0

    International Nuclear Information System (INIS)

    A.H. Monib

    1999-01-01

    The purpose of this calculation is to document that the software routine ModStandardNormal Version 1.0, which is a Visual Fortran 5.0 module, provides correct results for a normal distribution up to five significant figures (three significant figures at the function tails) for a specified range of input parameters. The software routine may be used for quality affecting work. Two types of output are generated in ModStandardNormal: a deviate, x, given a cumulative probability, p, between 0 and 1; and a cumulative probability, p, given a deviate, x, between -8 and 8. This calculation supports Performance Assessment, under Technical Product Development Plan, TDP-EBS-MD-000006 (Attachment I, DIRS 3) and is written in accordance with the AP-3.12Q Calculations procedure (Attachment I, DIRS 4)
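
    The routine's two outputs correspond to the inverse CDF (deviate from probability) and the CDF (probability from deviate) of the standard normal distribution. A reference sketch in Python with SciPy, against which such a routine's five-significant-figure results could be spot-checked, is:

```python
from scipy.stats import norm

# Deviate x for a given cumulative probability p (inverse CDF / percent-point function)
p = 0.975
x = norm.ppf(p)          # ~1.95996 for the standard normal

# Cumulative probability p for a given deviate x
p_back = norm.cdf(x)     # recovers ~0.975

print(round(x, 5), round(p_back, 5))
```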

  19. Effects of increasing dietary standardized ileal digestible lysine for gilts grown in a commercial finishing environment.

    Science.gov (United States)

    Shelton, N W; Tokach, M D; Dritz, S S; Goodband, R D; Nelssen, J L; DeRouchey, J M

    2011-11-01

    Three experiments were conducted to determine the effects of increasing dietary standardized ileal digestible (SID) Lys on growing and finishing gilts. Diets in all 3 experiments were corn-soybean meal-based and contained 0.15% l-Lys•HCl and 3% added fat from choice white grease. Desired SID Lys concentrations were achieved by altering levels of corn and soybean meal in the diet. Each experiment consisted of 6 treatments with 7 pens per treatment and approximately 27 gilts (PIC 337 × 1050) per pen. In Exp. 1, 1,085 gilts (initially 38.2 kg) were fed diets formulated to contain SID Lys concentrations of 0.7, 0.8, 0.9, 1.0, 1.1, or 1.2% for 28 d, which were analyzed to be total Lys concentrations of 0.78, 0.86, 0.99, 1.06, 1.14, and 1.24%, respectively. As SID Lys increased, ADG and G:F improved (quadratic, P Gilts in this trial required approximately 21.8 g of SID Lys intake per kilogram of BW gain from 38 to 65 kg. In Exp. 2, 1,092 (initially 55.2 kg) gilts were fed diets formulated to contain SID Lys concentrations of 0.66, 0.74, 0.82, 0.90, 0.98, or 1.06% for 28 d, which were analyzed to be total Lys concentrations of 0.75, 0.73, 0.84, 0.90, 0.95, and 0.97%, respectively. Both ADG (quadratic, P = 0.12) and G:F improved (linear, P Gilts in this trial required approximately 19.6 g of SID Lys per kilogram of BW gain from 55 to 80 kg. In Exp. 3, 1,080 gilts (initially 84.1 kg) were fed diets formulated to contain SID Lys concentrations of 0.54, 0.61, 0.68, 0.75, 0.82, or 0.89% for 29 d, which were analyzed to be total Lys concentrations of 0.62, 0.92, 0.79, 0.99, 0.93, and 1.07%, respectively. As the SID Lys concentration increased, ADG and G:F improved (linear, P Gilts in this trial required 23.0 g of SID Lys per kg of BW gain from 85 to 110 kg. The ideal SID Lys:ME ratio was based on the requirement determined by broken-line analysis in Exp. 1, 2, and 3, with the greatest level being tested in Exp. 3. This equation, SID Lys:ME ratio = -0.011 × BW, kg + 3

  20. 77 FR 4881 - Commercial Driver's license (CDL) Standards; Rotel North American Tours, LLC; Application for...

    Science.gov (United States)

    2012-01-31

    .... requirements. German drivers are preferred because they speak the language fluently and perform a variety of... permitting 22 drivers employed by Rotel and possessing German CDLs, to operate commercial motor vehicles in... holders of German CDLs. Rotel asks that the current exemption, due to expire July 30, 2012, be renewed...

  1. Standard Assays Do Not Predict the Efficiency of Commercial Cellulase Preparations Towards Plant Materials

    NARCIS (Netherlands)

    Kabel, Mirjam A.; Maarel, Marc J.E.C. van der; Klip, Gert; Voragen, Alphons G.J.; Schols, Henk A.

    2006-01-01

    Commercial cellulase preparations are potentially effective for processing biomass feedstocks in order to obtain bioethanol. In plant cell walls, cellulose fibrils occur in close association with xylans (monocotyls) or xyloglucans (dicotyls). The enzymatic conversion of cellulose/xylans is a complex

  2. Reproducibility of dynamic contrast-enhanced MRI and dynamic susceptibility contrast MRI in the study of brain gliomas: a comparison of data obtained using different commercial software.

    Science.gov (United States)

    Conte, Gian Marco; Castellano, Antonella; Altabella, Luisa; Iadanza, Antonella; Cadioli, Marcello; Falini, Andrea; Anzalone, Nicoletta

    2017-04-01

    Dynamic susceptibility contrast MRI (DSC) and dynamic contrast-enhanced MRI (DCE) are useful tools in the diagnosis and follow-up of brain gliomas; nevertheless, both techniques leave open the issue of data reproducibility. We evaluated the reproducibility of data obtained using two different commercial software packages for perfusion map calculation and analysis, as the software itself can be one of the potential sources of variability. DSC and DCE analyses from 20 patients with gliomas were tested for both intrasoftware (intraobserver and interobserver) reproducibility and intersoftware reproducibility, as well as the impact of different postprocessing choices [vascular input function (VIF) selection and deconvolution algorithms] on the quantification of the perfusion biomarkers plasma volume (Vp), volume transfer constant (K trans ) and rCBV. Data reproducibility was evaluated with the intraclass correlation coefficient (ICC) and Bland-Altman analysis. For all the biomarkers, the intra- and interobserver reproducibility showed almost perfect agreement within each software package, whereas for intersoftware reproducibility the ICC ranged from 0.311 to 0.577, indicating fair to moderate agreement; Bland-Altman analysis showed high dispersion of the data, confirming these findings. Comparison of different VIF estimation methods for DCE biomarkers resulted in ICCs of 0.636 for K trans and 0.662 for Vp; comparison of two deconvolution algorithms in DSC resulted in an ICC of 0.999. The use of a single software package ensures very good intraobserver and interobserver reproducibility. Caution should be taken when comparing data obtained using different software, or different postprocessing within the same software, as reproducibility is no longer guaranteed.
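
    A minimal sketch of the Bland-Altman agreement statistics used above (bias and 95% limits of agreement), computed on hypothetical K trans values rather than the study's data, could be:

```python
import numpy as np

# Hypothetical K trans values for the same lesions from two software packages
ktrans_a = np.array([0.12, 0.25, 0.31, 0.18, 0.22])
ktrans_b = np.array([0.14, 0.22, 0.35, 0.17, 0.25])

diff = ktrans_a - ktrans_b
bias = diff.mean()                      # systematic offset between packages
half_width = 1.96 * diff.std(ddof=1)    # half-width of the 95% limits of agreement

print(f"bias={bias:+.3f}, limits of agreement: "
      f"{bias - half_width:.3f} to {bias + half_width:.3f}")
```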

  3. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    International Nuclear Information System (INIS)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-01-01

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site are as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed
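
    The DICE overlap index used in this comparison can be computed directly from two binary volumes. A minimal sketch, with synthetic masks standing in for the auto-contoured and manually drawn VOIs, is:

```python
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DICE overlap between two boolean volume masks (1.0 = perfect agreement)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# Hypothetical 3D masks: auto-contoured VOI vs. the manually drawn reference
auto_voi = np.zeros((64, 64, 32), dtype=bool); auto_voi[20:40, 20:40, 10:20] = True
ref_voi  = np.zeros((64, 64, 32), dtype=bool); ref_voi[22:42, 20:40, 10:20] = True

print(f"DICE = {dice(auto_voi, ref_voi):.3f}")
```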

  4. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods.

    Science.gov (United States)

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-03-01

    This data article describes a controlled, spiked proteomic dataset for which the "ground truth" of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tools parameters, but also testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in details in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tools results for the detection of variant proteins with different absolute expression levels and fold change values.

  5. Microbiological and physicochemical analysis of yateí (Tetragonisca angustula) honey for assessing quality standards and commercialization

    Directory of Open Access Journals (Sweden)

    Amada B Pucciarelli

    2014-12-01

    Full Text Available Due to the interest in the production and trading of yateí (Tetragonisca angustula) honey in the province of Misiones, Argentina, in this work we assessed microbiological and physicochemical parameters in order to contribute to the elaboration of standards for quality control and promote commercialization. Results showed that yateí honey samples had significantly different microbiological and physicochemical characteristics in comparison to established quality standards for Apis mellifera honey. Thus, we observed that values for pH (3.72), glucose (19.01 g/100 g) and fructose (23.74 g/100 g) were lower than A. mellifera quality standards, while acidity (79.42 meq/kg), moisture (24%), and mould and yeast count (MY) (3.02 log CFU/g) were higher. The acid content was correlated with glucose (R²=0.75) and fructose (R²=0.68) content, and also with mould and yeast counts (R²=0.45) to a lesser extent. The incidence of microorganisms in yateí honey samples reached 42.85% and 39% for Clostridium sulfite-reducers and Bacillus spp., respectively. No C. botulinum or B. cereus cells were detected. Enterococcus spp. and Staphylococcus spp. incidence was similar (ca. 7.14%), whereas Escherichia coli and Salmonella spp. were not detected. We conclude that the microbiological and physicochemical properties of yateí honey are different from those of A. mellifera honey; hence, different quality standards could be implemented to promote its commercialization.

  6. An evaluation of the impact of state Renewable Portfolio Standards (RPS) on retail, commercial, and industrial electricity prices

    Science.gov (United States)

    Puram, Rakesh

    The Renewable Portfolio Standard (RPS) has become a popular mechanism for states to promote renewable energy, and its popularity has spurred a potential bill within Congress for a nationwide Federal RPS. While RPS benefits have been touted by several groups, it also has detractors. Among the concerns is that RPS standards could raise electricity rates, given that renewable energy is costlier than traditional fossil fuels. The evidence on the impact of RPS on electricity prices is murky at best: complex models by NREL and USEIA rely on computer programs with several assumptions, which makes empirical studies difficult, and they predict only slight increases in electricity rates associated with RPS standards. Recent theoretical models and empirical studies have found price increases, but often fail to comprehensively include several sets of variables, which could confound results. Utilizing a combination of past papers and studies to triangulate variables, this study aims to develop both a rigorous fixed effects regression model and a theoretical framework to explain the results. This study analyzes state-level panel data from 2002 to 2008 to analyze the effect of RPS on residential, commercial, and industrial electricity prices, controlling for several factors including the amount of electricity generation from renewable and non-renewable sources, customer incentives for renewable energy, macroeconomic and demographic indicators, and fuel price mix. The study contrasts several regressions to illustrate important relationships and how the inclusion and exclusion of various variables affect electricity rates. Regression results indicate that the presence of RPS within a state increases commercial and residential electricity rates, but has no discernible effect on the industrial electricity rate. Although RPS adoption tends to increase electricity prices, the magnitude of the effect is small. The models also indicate that jointly all
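
    A fixed effects panel regression of the kind described can be sketched with statsmodels; the file and column names below are hypothetical placeholders, not the study's dataset:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical state-year panel: retail price (cents/kWh), RPS indicator, controls
df = pd.read_csv("state_panel_2002_2008.csv")   # hypothetical file and column names

# Two-way fixed effects via state and year dummies; cluster-robust SEs by state
model = smf.ols("price_residential ~ rps + renew_share + income + C(state) + C(year)",
                data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["state"]})
print(result.params["rps"], result.bse["rps"])
```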

  7. Using a commercial mathematics software package for on-line analysis at the BNL Accelerator Test Facility

    International Nuclear Information System (INIS)

    Malone, R.; Wang, X.J.

    1999-01-01

    By writing both a custom Windows NT(TM) dynamic link library and generic companion server software, the intrinsic functions of MathSoft Mathcad(TM) have been extended with new capabilities which permit direct access to the control system databases of the Brookhaven National Laboratory Accelerator Test Facility. Under this scheme, a Mathcad worksheet executing on a personal computer becomes a client which can both import and export data to a control system server via a network stream socket connection. The result is an alternative, mathematically oriented view of controlling the accelerator interactively.

  8. Quantitative comparison and evaluation of two commercially available, two-dimensional electrophoresis image analysis software packages, Z3 and Melanie.

    Science.gov (United States)

    Raman, Babu; Cheung, Agnes; Marten, Mark R

    2002-07-01

    While a variety of software packages are available for analyzing two-dimensional electrophoresis (2-DE) gel images, no comparisons between these packages have been published, making it difficult for end users to determine which package would best meet their needs. The goal here was to develop a set of tests to quantitatively evaluate and then compare two software packages, Melanie 3.0 and Z3, in three of the fundamental steps involved in 2-DE image analysis: (i) spot detection, (ii) gel matching, and (iii) spot quantitation. To test spot detection capability, automatically detected protein spots were compared to manually counted, "real" protein spots. Spot matching efficiency was determined by comparing distorted (both geometrically and nongeometrically) gel images with undistorted original images, and quantitation tests were performed on artificial gels with spots of varying Gaussian volumes. In spot detection tests, Z3 performed better than Melanie 3.0 and required minimal user intervention to detect approximately 89% of the actual protein spots and relatively few extraneous spots. Results from gel matching tests depended on the type of image distortion used. For geometric distortions, Z3 performed better than Melanie 3.0, matching 99% of the spots, even for extreme distortions. For nongeometrical distortions, both Z3 and Melanie 3.0 required user intervention and performed comparably, matching 95% of the spots. In spot quantitation tests, both Z3 and Melanie 3.0 predicted spot volumes relatively well for spot ratios less than 1:6. For higher ratios, Melanie 3.0 did much better. In summary, results suggest Z3 requires less user intervention than Melanie 3.0, thus simplifying differential comparison of 2-DE gel images. Melanie 3.0, however, offers many more optional tools for image editing, spot detection, data reporting and statistical analysis than Z3. All image files used for these tests and updated information on the software are available on the internet
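
    Although the commercial packages' algorithms are proprietary, the basic spot detection and quantitation steps they automate can be illustrated with SciPy on a synthetic gel image (threshold, label connected regions, integrate intensity per spot); everything below is a hypothetical toy example:

```python
import numpy as np
from scipy import ndimage

# Placeholder 2-DE gel image: light background plus a few synthetic dark "spots"
gel = np.ones((256, 256))
rr, cc = np.ogrid[:256, :256]
for (r, c) in [(60, 80), (120, 150), (200, 90)]:
    gel -= 0.8 * np.exp(-(((rr - r) ** 2 + (cc - c) ** 2) / (2 * 5.0 ** 2)))

# Simple automatic spot detection: threshold, then label connected regions
mask = gel < 0.6                       # hypothetical intensity threshold
labels, n_spots = ndimage.label(mask)

# Spot quantitation: integrated "darkness" within each labelled region
volumes = ndimage.sum(1.0 - gel, labels, index=range(1, n_spots + 1))
print(n_spots, np.round(volumes, 1))
```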

  9. Development, analysis, and evaluation of a commercial software framework for the study of Extremely Low Probability of Rupture (xLPR) events at nuclear power plants.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinich, Donald A.; Helton, Jon Craig; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-12-01

    Sandia National Laboratories (SNL) participated in a Pilot Study to examine the process and requirements to create a software system to assess the extremely low probability of pipe rupture (xLPR) in nuclear power plants. This project was tasked to develop a prototype xLPR model leveraging existing fracture mechanics models and codes coupled with a commercial software framework to determine the framework, model, and architecture requirements appropriate for building a modular-based code. The xLPR pilot study was conducted to demonstrate the feasibility of the proposed developmental process and framework for a probabilistic code to address degradation mechanisms in piping system safety assessments. The pilot study includes a demonstration problem to assess the probability of rupture of DM pressurizer surge nozzle welds degraded by primary water stress-corrosion cracking (PWSCC). The pilot study was designed to define and develop the framework and model; then construct a prototype software system based on the proposed model. The second phase of the project will be a longer term program and code development effort focusing on the generic, primary piping integrity issues (xLPR code). The results and recommendations presented in this report will be used to help the U.S. Nuclear Regulatory Commission (NRC) define the requirements for the longer term program.

  10. The series production in a standardized fabrication line for silicide fuels and commercial aspects

    International Nuclear Information System (INIS)

    Wehner, E.L.; Hassel, H.W.

    1987-01-01

    NUKEM has been responsible for the development and fabrication of LEU fuel elements for MTR reactors within the framework of the German AF program since 1979. The AF program is part of the international RERTR efforts, which were initiated by the INFCE Group in 1978. This paper describes the current status of development and the transition from prototype to series production in a standardized manufacturing line for silicide fuels at NUKEM. Technical provisions and a customer-oriented, standardized product range aim at economical manufacturing. (Author)

  11. Software Licensing

    OpenAIRE

    Nygrýnová, Dominika

    2014-01-01

    Summary: Software Licensing The thesis deals with different practical aspects of commercial software licensing from the perspective of the Czech legal system. The focus is put on software license agreement as the most important legal instrument granting rights of use for computer programs. The thesis opens with a summary of Czech legislation in force in this area in the context of European community law and international law. The legislation in effect is largely governed by the Copyright Act....

  12. 78 FR 55889 - Energy Conservation Program: Energy Conservation Standards for Commercial Refrigeration Equipment

    Science.gov (United States)

    2013-09-11

    .... National Impact Analysis--National Energy Savings and Net Present Value 1. Shipments a. VOP.RC.L Shipments... Standards Cases 3. National Energy Savings 4. Net Present Value of Customer Benefit 5. Benefits From Effects... Regulatory Burden 3. National Impact Analysis a. Amount and Significance of Energy Savings b. Net Present...

  13. 77 FR 43015 - Energy Conservation Standards for Commercial and Industrial Electric Motors: Public Meeting and...

    Science.gov (United States)

    2012-07-23

    ... period (PBP), and (5) national impact analysis (NIA). The preliminary TSD presents the methodology and... that reflects the real consumer cost of capital and describes the LCC in present-value terms. The PBP... present value (NPV) of total customer costs and savings expected to result from new standards at specific...

  14. 76 FR 77521 - California State Nonroad Engine Pollution Control Standards; Commercial Harbor Craft Regulations...

    Science.gov (United States)

    2011-12-13

    ... inconsistent certification requirements. C. Burden of Proof In Motor and Equip. Mfrs Assoc. v. EPA, 627 F.2d... schedule, EPA cannot change an aspect of California's regulation. EPA is only authorized to review California's standards to determine compliance with section 209. It is not authorized to change California's...

  15. 76 FR 15553 - National Emission Standards for Hazardous Air Pollutants for Area Sources: Industrial, Commercial...

    Science.gov (United States)

    2011-03-21

    ... final emission standards for control of mercury and polycyclic organic matter emissions from coal-fired... the economic impacts? D. What are the benefits? E. What are the water and solid waste impacts? F. What... services and drinking places. 62 Health care and social assistance. \\1\\ North American Industry...

  16. 78 FR 7487 - National Emission Standards for Hazardous Air Pollutants for Area Sources: Industrial, Commercial...

    Science.gov (United States)

    2013-02-01

    ... technologies HAP hazardous air pollutants Hg mercury HQ Headquarters ISO International Standards Organization... services and drinking places. 62 Health care and social assistance. 22111 Electric power generation. \\a...: Residential boiler means a boiler used to provide heat and/or hot water and/or as part of a residential...

  17. Software interface for high-speed readout of particle detectors based on the CoaXPress communication standard

    Science.gov (United States)

    Hejtmánek, M.; Neue, G.; Voleš, P.

    2015-06-01

    This article is devoted to the software design and development of a high-speed readout application used for interfacing particle detectors via the CoaXPress communication standard. The CoaXPress provides an asymmetric high-speed serial connection over a single coaxial cable. It uses a widely available 75 Ω BNC standard and can operate in various modes with a data throughput ranging from 1.25 Gbps up to 25 Gbps. Moreover, it supports a low speed uplink with a fixed bit rate of 20.833 Mbps, which can be used to control and upload configuration data to the particle detector. The CoaXPress interface is an upcoming standard in medical imaging; its use therefore promises long-term compatibility and versatility. This work presents an example of how to develop a DAQ system for a pixel detector. For this purpose, a flexible DAQ card was developed using the XILINX Spartan 6 FPGA. The DAQ card is connected to the framegrabber FireBird CXP6 Quad, which is plugged into the PCI Express bus of a standard PC. The data transmission was performed between the FPGA and framegrabber card via the standard coaxial cable in communication mode with a bit rate of 3.125 Gbps. Using the Medipix2 Quad pixel detector, a framerate of 100 fps was achieved. The front-end application makes use of the FireBird framegrabber software development kit and is suitable for data acquisition as well as control of the detector through the registers implemented in the FPGA.

  18. Software interface for high-speed readout of particle detectors based on the CoaXPress communication standard

    International Nuclear Information System (INIS)

    Hejtmánek, M.; Neue, G.; Voleš, P.

    2015-01-01

    This article is devoted to the software design and development of a high-speed readout application used for interfacing particle detectors via the CoaXPress communication standard. The CoaXPress provides an asymmetric high-speed serial connection over a single coaxial cable. It uses a widely available 75 Ω BNC standard and can operate in various modes with a data throughput ranging from 1.25 Gbps up to 25 Gbps. Moreover, it supports a low speed uplink with a fixed bit rate of 20.833 Mbps, which can be used to control and upload configuration data to the particle detector. The CoaXPress interface is an upcoming standard in medical imaging; its use therefore promises long-term compatibility and versatility. This work presents an example of how to develop a DAQ system for a pixel detector. For this purpose, a flexible DAQ card was developed using the XILINX Spartan 6 FPGA. The DAQ card is connected to the framegrabber FireBird CXP6 Quad, which is plugged into the PCI Express bus of a standard PC. The data transmission was performed between the FPGA and framegrabber card via the standard coaxial cable in communication mode with a bit rate of 3.125 Gbps. Using the Medipix2 Quad pixel detector, a framerate of 100 fps was achieved. The front-end application makes use of the FireBird framegrabber software development kit and is suitable for data acquisition as well as control of the detector through the registers implemented in the FPGA.

  19. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    Directory of Open Access Journals (Sweden)

    La Macchia Mariangela

    2012-09-01

    Full Text Available Abstract Purpose To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Methods and materials Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. Results The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site are as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). Conclusions From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  20. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer.

    Science.gov (United States)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-09-18

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC.To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site are as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  1. A brain-computer interface as input channel for a standard assistive technology software.

    Science.gov (United States)

    Zickler, Claudia; Riccio, Angela; Leotta, Francesco; Hillian-Tress, Sandra; Halder, Sebastian; Holz, Elisa; Staiger-Sälzer, Pit; Hoogerwerf, Evert-Jan; Desideri, Lorenzo; Mattia, Donatella; Kübler, Andrea

    2011-10-01

    Recently brain-computer interface (BCI) control was integrated into the commercial assistive technology product QualiWORLD (QualiLife Inc., Paradiso-Lugano, CH). Usability of the first prototype was evaluated in terms of effectiveness (accuracy), efficiency (information transfer rate and subjective workload/NASA Task Load Index) and user satisfaction (Quebec User Evaluation of Satisfaction with assistive Technology, QUEST 2.0) by four end-users with severe disabilities. Three assistive technology experts evaluated the device from a third person perspective. The results revealed high performance levels in communication and internet tasks. Users and assistive technology experts were quite satisfied with the device. However, none could imagine using the device in daily life without improvements. Main obstacles were the EEG-cap and low speed.

  2. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    Directory of Open Access Journals (Sweden)

    Schiphorst R

    2005-01-01

    Full Text Available We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in real time. The testbed consists of a transmitter PC with a DAC board and a receiver PC with an ADC board. In our project, we have implemented two different types of standards on this testbed, a continuous-phase-modulation-based standard, Bluetooth, and an OFDM-based standard, HiperLAN/2. However, our testbed can easily be extended to other standards, because the only limitation in our testbed is the maximal channel bandwidth of 20 MHz and of course the processing capabilities of the used PC. The transmitter functions require at most 714 M cycles per second and the receiver functions need 1225 M cycles per second on a Pentium 4 processor. In addition, baseband experiments have been carried out successfully.

  3. Classical table services in commercial catering: standardization proposal and clarifications for future researches

    Directory of Open Access Journals (Sweden)

    Rodolfo Wendhausen Krause

    2016-08-01

    Full Text Available This study synthesizes the scientific literature on the four main types/styles of individual table service in full-service gastronomic establishments with the authors' own empirical knowledge. As secondary objectives, it seeks to simplify and standardize the types of classic service in restaurants. These objectives were pursued through a positivist methodological approach, using as research techniques a comparative analysis and synthesis of the state of the art on the typology of classical services together with the authors' empirical knowledge. The standardization proposal was subsequently validated by a panel of evaluators. The services were simplified into three basic categories: French service, direct English service, and plated service. Because this is an exploratory study, the proposal is only the beginning of scientific research on the subject and should be investigated in greater depth in future work; the mise en place, in particular, is an area in need of research of this nature.

  4. Software Metrics and Software Metrology

    CERN Document Server

    Abran, Alain

    2010-01-01

    Most of the software measures currently proposed to the industry bring few real benefits to either software managers or developers. This book looks at the classical metrology concepts from science and engineering, using them as criteria to propose an approach to analyze the design of current software measures and then design new software measures (illustrated with the design of a software measure that has been adopted as an ISO measurement standard). The book includes several case studies analyzing strengths and weaknesses of some of the software measures most often quoted. It is meant for sof

  5. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    Science.gov (United States)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project and National Space Biomedical Research Institute (NSBRI) funded researchers by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available to them for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to utilize traditional motion capture systems due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed utilizing open source computer vision code with commercial off the shelf (COTS) video camera hardware. While the system's accuracy is lower than that of laboratory setups, it provides a means to produce quantitative comparison motion capture kinematic data. Additionally, data such as required exercise volume for small spaces such as the Orion capsule can be determined. METHODS: OpenCV is an open source computer vision library that provides the
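
    As a rough illustration of the kind of COTS-camera tracking described above, the sketch below uses OpenCV to threshold a single brightly coloured marker in each video frame and log its centroid over time. It is only a generic sketch under assumed conditions (one marker, a hand-picked HSV colour range, a hypothetical file name exercise.mp4); it is not the DAP team's actual implementation.

      import cv2
      import numpy as np

      cap = cv2.VideoCapture("exercise.mp4")                    # hypothetical input video
      lower = np.array([40, 80, 80])                            # assumed HSV range for the marker
      upper = np.array([80, 255, 255])

      trajectory = []                                           # (frame index, x, y) in pixels
      frame_idx = 0
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
          mask = cv2.inRange(hsv, lower, upper)                 # isolate marker pixels
          m = cv2.moments(mask)
          if m["m00"] > 0:                                      # marker visible in this frame
              trajectory.append((frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]))
          frame_idx += 1
      cap.release()

      print(f"Tracked marker in {len(trajectory)} of {frame_idx} frames")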

  6. Summary of Public Comments and Responses for Industrial, Commercial, and Institutional Boilers and Process Heaters National Emission Standards for Hazardous Air Pollutants (NESHAP) for Major Sources

    Science.gov (United States)

    This page has a 12/2012 document that provides EPA’s responses to public comments on EPA’s Proposed National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers and Process Heaters

  7. Calculation of residence times and radiation doses using the standard PC software Excel

    International Nuclear Information System (INIS)

    Herzog, H.; Zilken, H.; Niederbremer, A.; Friedrich, W.; Mueller-Gaertner, H.W.

    1997-01-01

    We developed a program which aims to facilitate the calculation of radiation doses to single organs and the whole body. IMEDOSE uses Excel to include calculations, graphical displays, and interactions with the user in a single general-purpose PC software tool. To start the procedure the input data are copied into a spreadsheet. They must represent percentage uptake values of several organs derived from measurements in animals or humans. To extrapolate these data up to seven half-lives of the radionuclide, fitting to one or two exponential functions is included and can be checked by the user. By means of the approximate time-activity information the cumulated activity or residence times are calculated. Finally these data are combined with the absorbed fraction doses (S-values) given by MIRD pamphlet No. 11 to yield radiation doses, the effective dose equivalent and the effective dose. These results are presented in a final table. Interactions are realized with push-buttons and drop-down menus. Calculations use the Visual Basic tool of Excel. In order to test our program, biodistribution data of fluorine-18 fluorodeoxyglucose were taken from the literature (Meija et al., J Nucl Med 1991; 32:699-706). For a 70-kg adult the resulting radiation doses of all target organs listed in MIRD 11 were different from the ICRP 53 values by 1%±18% on the average. When the residence times were introduced into MIRDOSE3 (Stabin, J Nucl Med 1996; 37:538-546) the mean difference between our results and those of MIRDOSE3 was -3%±6%. Both outcomes indicate the validity of the present approach. (orig.)
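
    The chain of computations described here - fitting organ uptake to exponentials, integrating to obtain residence times, and multiplying by S-values - can be sketched compactly. The following Python sketch makes simplifying assumptions (mono-exponential organ curves, physical decay folded into an effective decay constant, and purely illustrative S-values; the MIRD 11 tables are not reproduced):

      import numpy as np

      T_PHYS_H = 109.77 / 60.0          # F-18 physical half-life in hours

      def residence_time_h(frac_uptake0, biol_half_life_h):
          """Residence time (h) for A(t) = A0 * f0 * exp(-lambda_eff * t)."""
          lam_eff = np.log(2) / T_PHYS_H + np.log(2) / biol_half_life_h
          return frac_uptake0 / lam_eff            # integral of the exponential from 0 to infinity

      # Illustrative organ fits and S-values (NOT the values of the cited study)
      organs = {"liver": (0.05, 2.0), "brain": (0.07, 100.0)}   # (fractional uptake, biol. T1/2 in h)
      s_mGy_per_MBq_h = {"liver": 0.010, "brain": 0.020}        # hypothetical self-dose S-values

      for organ, (f0, t_bio) in organs.items():
          tau = residence_time_h(f0, t_bio)
          print(f"{organ}: residence time {tau:.3f} h, "
                f"self-dose {tau * s_mGy_per_MBq_h[organ]:.4f} mGy/MBq")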

  8. Calculation of residence times and radiation doses using the standard PC software Excel.

    Science.gov (United States)

    Herzog, H; Zilken, H; Niederbremer, A; Friedrich, W; Müller-Gärtner, H W

    1997-12-01

    We developed a program which aims to facilitate the calculation of radiation doses to single organs and the whole body. IMEDOSE uses Excel to include calculations, graphical displays, and interactions with the user in a single general-purpose PC software tool. To start the procedure the input data are copied into a spreadsheet. They must represent percentage uptake values of several organs derived from measurements in animals or humans. To extrapolate these data up to seven half-lives of the radionuclide, fitting to one or two exponential functions is included and can be checked by the user. By means of the approximate time-activity information the cumulated activity or residence times are calculated. Finally these data are combined with the absorbed fraction doses (S-values) given by MIRD pamphlet No. 11 to yield radiation doses, the effective dose equivalent and the effective dose. These results are presented in a final table. Interactions are realized with push-buttons and drop-down menus. Calculations use the Visual Basic tool of Excel. In order to test our program, biodistribution data of fluorine-18 fluorodeoxyglucose were taken from the literature (Meija et al., J Nucl Med 1991; 32:699-706). For a 70-kg adult the resulting radiation doses of all target organs listed in MIRD 11 were different from the ICRP 53 values by 1%+/-18% on the average. When the residence times were introduced into MIRDOSE3 (Stabin, J Nucl Med 1996; 37:538-546) the mean difference between our results and those of MIRDOSE3 was -3%+/-6%. Both outcomes indicate the validity of the present approach.

  9. Calculation of residence times and radiation doses using the standard PC software Excel

    Energy Technology Data Exchange (ETDEWEB)

    Herzog, H.; Zilken, H.; Niederbremer, A.; Friedrich, W. [Institute of Medicine, Research Center Juelich, Juelich (Germany); Mueller-Gaertner, H.W. [Institute of Medicine, Research Center Juelich, Juelich (Germany)]|[Department of Nuclear Medicine, Heinrich-Heine University Hospital Duesseldorf (Germany)

    1997-12-01

    We developed a program which aims to facilitate the calculation of radiation doses to single organs and the whole body. IMEDOSE uses Excel to include calculations, graphical displays, and interactions with the user in a single general-purpose PC software tool. To start the procedure the input data are copied into a spreadsheet. They must represent percentage uptake values of several organs derived from measurements in animals or humans. To extrapolate these data up to seven half-lives of the radionuclide, fitting to one or two exponential functions is included and can be checked by the user. By means of the approximate time-activity information the cumulated activity or residence times are calculated. Finally these data are combined with the absorbed fraction doses (S-values) given by MIRD pamphlet No. 11 to yield radiation doses, the effective dose equivalent and the effective dose. These results are presented in a final table. Interactions are realized with push-buttons and drop-down menus. Calculations use the Visual Basic tool of Excel. In order to test our program, biodistribution data of fluorine-18 fluorodeoxyglucose were taken from the literature (Meija et al., J Nucl Med 1991; 32:699-706). For a 70-kg adult the resulting radiation doses of all target organs listed in MIRD 11 were different from the ICRP 53 values by 1%±18% on the average. When the residence times were introduced into MIRDOSE3 (Stabin, J Nucl Med 1996; 37:538-546) the mean difference between our results and those of MIRDOSE3 was -3%±6%. Both outcomes indicate the validity of the present approach. (orig.) With 5 figs., 2 tabs., 18 refs.

  10. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Full Text Available Water quality monitoring is prone to error in its recording or measuring process. Monitoring of river water quality not only aims to recognize water quality dynamics, but also to evaluate the data in order to create river management and water pollution policy, so as to maintain human health and sanitation requirements and to preserve biodiversity. Evaluation of water quality monitoring needs to start by identifying the significant water quality parameters. This research aimed to identify the significant parameters using two transformation/standardization methods on the water quality data: the river Water Quality Index, WQI (Indeks Kualitas Air Sungai, IKAs) method, and standardization to mean 0 and variance 1, so that the variability of the water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. Principal Component Analysis (PCA; Analisa Komponen Utama, AKU), with the help of the Scilab software, was used to process the secondary data on water quality parameters of the Gadjah Wong river for 2004-2013. The Scilab result was cross-examined with the result from the Excel-based Biplot Add-In software. The results showed that only 18 of the 35 water quality parameters had passable data quality. The two transformation/standardization methods gave different types and numbers of significant parameters. With the mean 0, variance 1 standardization, the significant water quality parameters relative to the mean concentration of each parameter were TDS, SO4, EC, TSS, NO3N, COD, BOD5, grease and oil, and NH3N. With the river WQI transformation/standardization, the water quality significant parameter showed the level of
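
    The mean-0/variance-1 standardization and the subsequent principal component analysis were carried out in Scilab; the same steps can be sketched with NumPy. The sketch below runs on randomly generated placeholder data (the parameter names mirror the abstract, but the numbers are not the Gadjah Wong monitoring data):

      import numpy as np

      rng = np.random.default_rng(0)
      params = ["TDS", "SO4", "EC", "TSS", "NO3N", "COD", "BOD5", "GreaseOil", "NH3N"]
      X = rng.lognormal(mean=1.0, sigma=0.5, size=(120, len(params)))   # placeholder data

      # Standardization to mean 0 and variance 1, per parameter
      Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

      # PCA via eigen-decomposition of the covariance matrix of the standardized data
      eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
      order = np.argsort(eigvals)[::-1]
      eigvals, eigvecs = eigvals[order], eigvecs[:, order]

      print("Share of variance on PC1/PC2:", np.round(eigvals[:2] / eigvals.sum(), 3))

      # Parameters with the largest absolute loadings on PC1 are candidate significant parameters
      for name, w in sorted(zip(params, eigvecs[:, 0]), key=lambda t: -abs(t[1]))[:3]:
          print(f"{name}: PC1 loading {w:+.2f}")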

  11. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  12. The ALICE DATE the benefits of using hardware and software industry standards in a real-time environment

    CERN Document Server

    Beker, H; Divià, R; Ganz, M; Kocper, B; Tomasicchio, G; Van de Vyvre, P; Vascotto, Alessandro

    1997-01-01

    In 1996, the ALICE DAQ group was confronted with future demands which were exceeding the capabilities of the system used up to then. After some review of the market, we decided to integrate a new data-acquisition system using several different new components. In six months, a system was developed using new VME boards, a new version of Unix, a new switching network and new I/O interfaces. Despite the very short deadline, the integration of this system was extremely effective, and it was ready on time and used during the tests of fall '97. One of the reasons for this success was the use of industrially supported hardware and software standards. The general architecture of the system is described together with the different input/output devices used in the system (Fast Ethernet, PCI, FDDI, Fast Wide SCSI) and the corresponding performances. The software environment, tools and languages (Uni...

  13. High-performance liquid chromatographic analysis of cyclosporin A in rat blood and liver using a commercially available internal standard.

    Science.gov (United States)

    Chimalakonda, Anjaneya P; Shah, Rakhi B; Mehvar, Reza

    2002-05-25

    All the available HPLC assays of cyclosporin A (CyA) use internal standards that are not commercially available. Our purpose was to develop an HPLC assay for measurements of CyA in rat blood and liver using a commercially available internal standard (I.S.). After the addition of tamoxifen (I.S.), blood (0.25 ml) or the liver homogenate (1 ml) samples were extracted into a mixture of ether:methanol (95:5). The residue after evaporation of the organic layer was dissolved in 200 microl of an injection solution and washed with 1 ml of hexane before analysis. The separation was achieved using an LC-1 column (70 degrees C) with a mobile phase of methanol-acetonitrile-0.01 M KH(2)PO(4) (50:25:25, v/v) and a flow-rate of 1 ml/min. Detection was at 205 nm. Cyclosporin A and I.S. eluted at 5 and 7 min, respectively, free from endogenous peaks. Linear relationships (r>0.98) were observed between the CyA:I.S. peak area ratios and the CyA concentrations within the range of 0.2-10 microg/ml for blood and 0.1-4 microg/ml for the liver homogenates. The intra- and inter-run C.V.s and errors for both the blood and liver samples were <15%. The extraction efficiency (n=5) was close to 100% for both CyA and I.S. in both blood and liver homogenates. The lower limit of quantitation of the assay was 0.2 or 0.1 microg/ml based on 250 microl of blood or 1 ml of liver homogenate, respectively. The assay was capable of measuring blood and liver concentrations of CyA in a rat injected intravenously with a single 5-mg/kg dose of the drug.
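
    The quantitation scheme described above - a linear fit of the CyA:I.S. peak-area ratio against CyA concentration, then back-calculation of unknowns - is the usual internal-standard calibration. A minimal Python sketch with invented peak areas (not the study's data):

      import numpy as np

      # Hypothetical calibration standards: CyA concentration (ug/mL) vs CyA/tamoxifen area ratio
      conc = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
      area_ratio = np.array([0.11, 0.27, 0.52, 1.05, 2.60, 5.15])

      slope, intercept = np.polyfit(conc, area_ratio, 1)        # least-squares calibration line
      r = np.corrcoef(conc, area_ratio)[0, 1]
      print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, r = {r:.4f}")

      # Back-calculate an unknown blood sample from its measured peak-area ratio
      unknown_ratio = 1.80
      print(f"estimated CyA: {(unknown_ratio - intercept) / slope:.2f} ug/mL")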

  14. Thermal analysis of a solar collector with a standard approach and software used to study windows

    Energy Technology Data Exchange (ETDEWEB)

    Simko, T.; Harrison, S.J. [Queen' s Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering

    2007-07-01

    A method of calculating the overall heat loss coefficient of a solar collector was presented. The method was based on a standard approach used to obtain total window U-values. A model of the solar collector was developed with a finite element analysis (FEA) program. Heat loss from the solar collector was represented as the product of the gross collector area, the overall heat loss coefficient, and the difference between the assumed mean absorber plate temperature and the ambient temperature. The edge heat loss coefficient was approximated by assuming that there was a 1-D sideways heat flow through the edge area of the collector. Regional heat loss coefficients obtained with the model were then used to calculate the overall heat loss coefficient. Equations used for parallel tube type collectors were applied to the serpentine tube collector. The sightline of the solar collector was defined as the position along the top cover below the absorber plate. The same definitions for the extents of the frame, edge and center-of-glass regions of a window were applied to the collector. Multiple U-values were defined to account for heat flows outward across the top, bottom, and side surfaces of the collector. The absorber plate was simulated as being isothermal. Results were then compared with an experimental study in order to validate the method. The method was also compared with results obtained from a conventional analysis for estimating heat loss coefficients. It was concluded that the new method provided more accurate results than those obtained using the conventional method. 16 refs., 1 tab., 5 figs.
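
    The window-style approach described above reduces to an area-weighted combination of regional heat loss coefficients, with the collector heat loss then written as Q = A_gross * U_overall * (T_plate - T_ambient). A short sketch with hypothetical regional values (not the paper's results):

      # Hypothetical regional U-values (W/m2.K) and areas (m2), using the window convention
      regions = {
          "frame":           {"U": 3.5, "area": 0.30},
          "edge":            {"U": 2.8, "area": 0.25},
          "centre_of_glass": {"U": 2.2, "area": 1.95},
      }

      gross_area = sum(r["area"] for r in regions.values())
      u_overall = sum(r["U"] * r["area"] for r in regions.values()) / gross_area

      t_plate, t_ambient = 60.0, 10.0       # assumed mean absorber plate / ambient temperatures (C)
      heat_loss = gross_area * u_overall * (t_plate - t_ambient)
      print(f"U_overall = {u_overall:.2f} W/m2.K, heat loss = {heat_loss:.0f} W")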

  15. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills

  16. Requirement Volatility, Standardization and Knowledge Integration in Software Projects: An Empirical Analysis on Outsourced IS Development Projects

    Directory of Open Access Journals (Sweden)

    Rajesri Govindaraju

    2015-08-01

    Full Text Available Information systems development (ISD) projects are highly complex, with different groups of people having to collaborate and exchange their knowledge. Considering the intensity of knowledge exchange that takes place in outsourced ISD projects, in this study a conceptual model was developed, aiming to examine the influence of four antecedents, i.e. standardization, requirement volatility, internal integration, and external integration, on two dependent variables, i.e. process performance and product performance. Data were collected from 46 software companies in four big cities in Indonesia. The collected data were examined to verify the proposed theoretical model using the partial least squares structural equation modeling (PLS-SEM) technique. The results show that process performance is significantly influenced by internal integration and standardization, while product performance is significantly influenced by external integration and requirement volatility. This study contributes to a better understanding of how knowledge integration can be managed in outsourced ISD projects in view of increasing their success.

  17. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and
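
    The early stages of such a pipeline - band-pass filtering and re-referencing before artifact rejection - can be sketched generically with NumPy/SciPy. This is only an illustration on synthetic data, not HAPPE's actual (MATLAB/EEGLAB-based) implementation:

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 250.0                                        # assumed sampling rate (Hz)
      rng = np.random.default_rng(1)
      eeg = rng.normal(scale=20e-6, size=(32, int(60 * fs)))   # 1 min of synthetic 32-channel EEG (V)

      # 1. Zero-phase band-pass filter, 1-40 Hz
      b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
      filtered = filtfilt(b, a, eeg, axis=1)

      # 2. Re-reference to the common average
      rereferenced = filtered - filtered.mean(axis=0, keepdims=True)

      # 3. Crude artifact screening: flag channels whose variance is far above the median
      var = rereferenced.var(axis=1)
      bad = np.where(var > 5 * np.median(var))[0]
      print(f"Flagged {bad.size} channel(s) as potentially artifactual: {bad.tolist()}")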

  18. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Full Text Available Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  19. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  20. Software testing in roughness calculation

    International Nuclear Information System (INIS)

    Chen, Y L; Hsieh, P F; Fu, W E

    2005-01-01

    A test method to determine the function quality provided by software for roughness measurement is presented in this study. The function quality of the software requirements should be part of, and assessed through, the entire life cycle of the software package. The specific function, or output accuracy, is crucial for the analysis of the experimental data. For scientific applications, however, commercial software is usually embedded in a specific instrument, which is used for measurement or analysis during the manufacturing process. In general, the error ratio caused by the software becomes more apparent when dealing with relatively small quantities, like measurements in the nanometer-scale range. The model of 'using a data generator' proposed by NPL (UK) was applied in this study. An example of roughness software was tested and analyzed by the above-mentioned process. After selecting the 'reference results', the 'reference data' were generated by a programmable 'data generator'. The filter function with a 0.8 mm cutoff value, defined in ISO 11562, was tested with 66 sinusoidal data sets at different wavelengths. Test results from the commercial software and a CMS-written program were compared to the theoretical data calculated from the ISO standards. For the filter function in this software, the results showed a significant disagreement between the reference and test results. The short-cutoff feature for filtering at high frequencies does not function properly, while the long-cutoff feature shows the maximum difference in the filtering ratio, more than 70%, between wavelengths of 300 μm and 500 μm. In conclusion, the commercial software needs to be tested more extensively for each specific application, with an appropriately designed reference dataset, to ensure its function quality.
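
    The filter test described above - passing sinusoids of known wavelength through the profile filter and comparing the transmission ratio with the theoretical value - can be reproduced generically. The sketch below assumes the Gaussian weighting function of ISO 11562, whose mean-line amplitude transmission is exp(-pi*(alpha*lambda_c/lambda)^2) with alpha = sqrt(ln 2 / pi); the roughness-profile transmission is its complement. The printed values come from that formula and the simulated sinusoids, not from the paper's measurements.

      import numpy as np

      ALPHA = np.sqrt(np.log(2) / np.pi)          # ISO Gaussian profile filter constant

      def gaussian_mean_line(profile, dx, cutoff_mm):
          """Mean (waviness) line from the ISO Gaussian weighting function, by direct convolution."""
          s = np.arange(-3 * cutoff_mm, 3 * cutoff_mm + dx, dx)
          w = np.exp(-np.pi * (s / (ALPHA * cutoff_mm)) ** 2) / (ALPHA * cutoff_mm)
          return np.convolve(profile, w * dx, mode="same")

      cutoff = 0.8                                 # mm, long-wavelength cutoff (lambda_c)
      dx = 0.0005                                  # mm, sampling interval
      x = np.arange(0.0, 8.0, dx)                  # 8 mm trace
      mid = slice(int(3.0 / dx), int(5.0 / dx))    # central region, away from filter edge effects

      for wavelength in (0.1, 0.3, 0.5, 0.8, 2.5):                 # mm
          z = np.sin(2 * np.pi * x / wavelength)                    # unit-amplitude sinusoid
          roughness = z - gaussian_mean_line(z, dx, cutoff)         # roughness = profile - mean line
          measured = np.ptp(roughness[mid]) / 2.0
          theory = 1.0 - np.exp(-np.pi * (ALPHA * cutoff / wavelength) ** 2)
          print(f"lambda = {wavelength:.1f} mm: simulated {measured:.3f}, theoretical {theory:.3f}")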

  1. An operational approach to standard nuclear process model (SNPM) and SAP nuclear software implementation at Slovenske Elektrarne

    International Nuclear Information System (INIS)

    Warren, C.C.

    2010-01-01

    Benchmarking efforts in the fall of 2006 showed significant performance gaps in multiple measured processes between the Slovenske Elektrarne (SE) nuclear organization and the highest performing nuclear organizations in the world. While overall performance of the SE nuclear fleet was good and in the second quartile, when compared to the worldwide population of Pressurized Water Reactors (PWR), SE leadership set new goals to improve safety and operational performance to the first decile of the worldwide PWR Fleet. To meet these goals the SE nuclear team initiated a project to identify and implement the Best Practice nuclear processes in multiple areas. The benchmarking process identified the Standard Nuclear Performance Model (SNPM), used in the US nuclear fleet, as the industry best practice process model. The Slovenske Elektrarne nuclear management team used various change management techniques to clearly establish the case for organizational and process change within the nuclear organization. The project organization established by the SE nuclear management team relied heavily on functional line organization personnel to gain early acceptance of the project goals and methods thereby reducing organizational opposition to the significant organizational and process changes. The choice of a standardized process model used, all or in part, by approximately one third of the nuclear industry worldwide greatly facilitated the development and acceptance of the changes. Use of a nuclear proven templated software platform significantly reduced development and testing efforts for the resulting fully integrated solution. In the spring of 2007 SE set in motion a set of initiatives that has resulted in a significant redesign of most processes related to nuclear plant maintenance and continuous improvement. Significant organizational structure changes have been designed and implemented to align the organization to the SNPM processes and programs. The completion of the initial

  2. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standards software should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  3. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit.

    Science.gov (United States)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-03-21

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  4. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit

    International Nuclear Information System (INIS)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-01-01

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  5. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Illinois Jurisdictions

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.; Friedrich, Michele

    2002-05-01

    ASHRAE Standard 90.1-1999 was developed in an effort to set minimum requirements for energy efficient design and construction of new commercial buildings. This report assesses the benefits and costs of adopting this standard as the building energy code in Illinois. Energy and economic impacts are estimated using BLAST combined with a Life-Cycle Cost approach to assess corresponding economic costs and benefits.

  6. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  7. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-2001 as the Commercial Building Energy Code in Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Winiarski, David W.; Belzer, David B.; Richman, Eric E.

    2004-09-30

    ASHRAE Standard 90.1-2001 Energy Standard for Buildings except Low-Rise Residential Buildings (hereafter referred to as ASHRAE 90.1-2001 or 90.1-2001) was developed in an effort to set minimum requirements for the energy efficient design and construction of new commercial buildings. The State of Tennessee is considering adopting ASHRAE 90.1-2001 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered in this report. Both qualitative and quantitative benefits and costs are assessed. Energy and economic impacts are estimated using the Building Loads Analysis and System Thermodynamics (BLAST) simulations combined with a Life-Cycle Cost (LCC) approach to assess corresponding economic costs and benefits. Tennessee currently has ASHRAE Standard 90A-1980 as the statewide voluntary/recommended commercial energy standard; however, it is up to the local jurisdiction to adopt this code. Because 90A-1980 is the recommended standard, many of the requirements of ASHRAE 90A-1980 were used as a baseline for simulations.
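
    The Life-Cycle Cost comparison used in analyses of this kind amounts to weighing the incremental first cost of meeting the stricter standard against the present value of the resulting energy cost savings over the study period. A minimal sketch with placeholder numbers (not figures from the Tennessee report):

      def present_value(annual_saving, discount_rate, years):
          """Present value of a constant annual saving over `years` at `discount_rate`."""
          return sum(annual_saving / (1 + discount_rate) ** t for t in range(1, years + 1))

      # Hypothetical per-building inputs (placeholders only)
      incremental_first_cost = 25_000.0     # extra construction cost to meet the newer standard ($)
      annual_energy_saving = 4_000.0        # simulated energy cost saving vs. the baseline ($/yr)

      pv_savings = present_value(annual_energy_saving, discount_rate=0.05, years=20)
      print(f"PV of savings: ${pv_savings:,.0f}; net LCC benefit: ${pv_savings - incremental_first_cost:,.0f}")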

  8. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  9. Validation of quality control tests of a multi leaf collimator using electronic portal image devices and commercial software; Validacion de unas pruebas de control de calidad del colimador multilamina utilizando dispositivos electronicos de imagen portal y una aplicacion comercial

    Energy Technology Data Exchange (ETDEWEB)

    Latorre-Musoll, A.; Jornet Sala, N.; Carrasco de Fez, P.; Edualdo Puell, T.; Ruiz Martinez, A.; Ribas Morales, M.

    2013-07-01

    We describe a daily quality control procedure for the multileaf collimator (MLC) based on electronic portal imaging devices and commercial software. We designed tests that compare portal images of a set of static and dynamic MLC configurations to a set of reference images using commercial portal dosimetry software. Reference images were acquired using the same set of MLC configurations after calibration of the MLC. To assess the sensitivity for detecting MLC underperformance, we modified the MLC configurations by inserting a range of leaf position and speed errors. Distance measurements on portal images correlated with leaf position errors down to 0.1 mm in static MLC configurations. Dose differences between portal images correlated both with speed errors down to 0.5% of the nominal leaf velocities and with leaf position errors down to 0.1 mm in dynamic MLC configurations. The proposed quality control procedure can assess static and dynamic MLC configurations with high sensitivity and reliability. (Author)

  10. SU-G-JeP2-06: Dosimetric and Workflow Evaluation of First Commercial Synthetic CT Software for Clinical Use in Pelvis

    Energy Technology Data Exchange (ETDEWEB)

    Tyagi, N; Zhang, J; Happersett, L; Kadbi, M; Mechalakos, J; Deasy, J; Hunt, M [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2016-06-15

    Purpose: To evaluate commercial synthetic CT (syn-CT) software for use in prostate radiotherapy. Methods: Twenty prostate patients underwent CT and MR simulation scans in treatment position on a 3T Philips scanner. The MR protocol consisted of a T2w turbo spin-echo for soft tissue contrast, a 2D balanced fast field echo (b-FFE) for fiducial identification, a dual-echo 3D FFE B0 map for distortion analysis and a 3D mDIXON FFE sequence to generate the syn-CT. Two echoes are acquired during the mDIXON scan, allowing water, fat, and in-phase images to be derived using the frequency shift of the fat and water protons. Tissues were classified as air, adipose, water, trabecular/spongy bone and compact/cortical bone, and assigned specific bulk HU values. Bone structures are segmented based on a pelvis bone atlas. The accuracy of the syn-CT for patient treatment planning was analyzed by transferring the original plan and structures from the CT to the syn-CT via rigid registration and recalculating dose. In addition, new IMRT plans were generated on the syn-CT using structures contoured on MR and transferred to the syn-CT. The accuracy of fiducial-based localization at the treatment machine, performed using the syn-CT or DRRs generated from the syn-CT, was assessed by comparison to orthogonal kV radiographs or CBCT. Results: The dosimetric comparison between CT and syn-CT was within 0.5% for all structures. The de-novo optimized plans generated on the syn-CT met our institutional clinical objectives for target and normal structures. Patient-induced susceptibility distortion based on the B0 maps was within 1 mm and 0.4 mm in the body and prostate, respectively. The rectal and bladder outlines on the syn-CT were deemed sufficient for assessing rectal and bladder filling on the CBCT at the time of treatment. CBCT localization showed a median error of < ±1 mm in the LR, AP and SI directions. Conclusion: MRI-derived syn-CT can be used clinically in an MR-alone planning and treatment process for prostate. Drs. Deasy, Hunt and Tyagi have Master
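
    The bulk-density assignment at the heart of the syn-CT generation - classifying each voxel as air, adipose, water, spongy bone or cortical bone and assigning one HU value per class - can be illustrated generically. The class labels and HU values below are assumptions for illustration only, not the vendor's calibration:

      import numpy as np

      AIR, ADIPOSE, WATER, SPONGY_BONE, CORTICAL_BONE = range(5)
      bulk_hu = {AIR: -1000, ADIPOSE: -90, WATER: 0, SPONGY_BONE: 200, CORTICAL_BONE: 800}

      # Placeholder 3D tissue-class label map, as would be produced by MR segmentation
      rng = np.random.default_rng(2)
      label_map = rng.integers(0, 5, size=(16, 256, 256))

      # Build the synthetic CT by looking up the bulk HU value for each voxel's tissue class
      lut = np.array([bulk_hu[c] for c in range(5)], dtype=np.int16)
      syn_ct = lut[label_map]
      print(syn_ct.shape, int(syn_ct.min()), int(syn_ct.max()))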

  11. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

    Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application. Traditional forms of technical education pay little attention to creativity - often encouraging overly rationalistic ways of thinking which stifle the ability to innovate. Professional software developers are often drowned in commercial drudgery and overwhelmed by work pressure and deadlines. The topic that will both ensure success in the market and revitalize their work lives is never addressed. This book sets out the new field of software innovation. It organizes the existing scientific research into eight simple heuristics - guiding principles for organizing a system developer's work-life so that it focuses on innovation.

  12. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Sparkman, D R; Lagdon, R

    2004-01-01

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate their nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, the public and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: (1) ensures the software processes developed to address nuclear safety in design, operation, construction and maintenance of its facilities are safe; (2) considers the larger system that uses the software and its impacts; and (3) ensures that software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety related systems. They must also ensure that fail-safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  13. Comparison of a commercial blood cross-matching kit to the standard laboratory method for establishing blood transfusion compatibility in dogs.

    Science.gov (United States)

    Guzman, Leo Roa; Streeter, Elizabeth; Malandra, Allison

    2016-01-01

    To evaluate the accuracy of a commercial blood transfusion cross-match kit when compared to the standard laboratory method for establishing blood transfusion compatibility. A prospective observational in vitro study performed from July 2009 to July 2013. Private referral veterinary center. Ten healthy dogs, 11 anemic dogs, and 24 previously transfused dogs. None. Forty-five dogs were enrolled in a prospective study in order to compare the standard blood transfusion cross-match technique to a commercial blood transfusion cross-matching kit. These dogs were divided into 3 different groups that included 10 healthy dogs (control group), 11 anemic dogs in need of a blood transfusion, and 24 sick dogs that were previously transfused. Thirty-five dogs diagnosed with anemia secondary to multiple disease processes were cross-matched using both techniques. All dogs cross-matched via the kit had a compatible major and minor result, whereas 16 dogs out of 45 (35%) had an incompatible cross-match result when the standard laboratory technique was performed. The average time to perform the commercial kit was 15 minutes and this was 3 times shorter than the manual cross-match laboratory technique that averaged 45-50 minutes to complete. While the gel-based cross-match kit is quicker and less technically demanding than standard laboratory cross-match procedures, microagglutination and low-grade hemolysis are difficult to identify by using the gel-based kits. This could result in transfusion reactions if the gel-based kits are used as the sole determinant of blood compatibility prior to transfusion. Based on our results, the standard manual cross-match technique remains the gold standard test to determine blood transfusion compatibility. © Veterinary Emergency and Critical Care Society 2016.

  14. Commercial Law Reform in territories subject to International Administration. Kosovo & Iraq. Different standards of legitimacy and accountability?

    Directory of Open Access Journals (Sweden)

    Alejandro Carballo Leyda

    2008-01-01

    Full Text Available The paper will address questions of legality and accountability of the legislative functions exerted by international territorial administrations in the field of commercial law in two recent scenarios that are theoretically different: a UN-authorized mission under Chapter VII of the UN Charter and that of a strictly Occupying Power. No attempt will be made to study other important and interrelated issues, such as the problematic privatizations carried out in Kosovo and Iraq, which do not seem to be compatible with the obligation of administration of public assets (Art. 55 of the 1907 Hague Regulations). The paper will first provide a brief overview of the deep economic legislative reformation that took place in Iraq and Kosovo during the very early stages. Most of the scholarly literature focused on criminal law and human rights aspects, leaving aside commercial law reforms; yet, those profound commercial reforms have resulted in a drastic economic transformation from a planned, centrally controlled, socialist system into a liberal, market-oriented, capitalist economy. The radical nature of those changes raises the question of their conformity with relevant international law and the need for public accountability. Part III will then explore the sources of legality invoked so far (namely UN Mandates, International Humanitarian Law, and authority invested by local intervention) by the academic world, experts and intervening actors as the basis for the commercial reformation in Kosovo and Iraq, and whether the actual results comply with the discretion vested in the temporal administrations by those sources. Finally, in Part IV problems of judicial review and public accountability in relation to the law-making function of those international administrations in Iraq and Kosovo will be considered.

  15. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  16. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  17. Implementation and Testing of the JANUS Standard with SSC Pacific’s Software-Defined Acoustic Modem

    Science.gov (United States)

    2017-12-01

    ADMINISTRATIVE INFORMATION: The work described in this report was performed for the Office of Naval Research (ONR) Forward Deployed Energy and... standard in April 2017 under Standardization Agreement (STANAG) 4748 to define a set of undersea acoustic processes, terms, and conditions for common... embedded processor platform. (Hardware components referenced: transducer, power amplifier, receiver (3-stage), digital-to-analog converter (DAC), analog-to-digital converter (ADC), digital...)

  18. Bit-level differential power analysis attack on implementations of advanced encryption standard software running inside a PIC18F2420 microcontroller

    CSIR Research Space (South Africa)

    Mpalane, K

    2015-12-01

    Full Text Available Bit-level Differential Power Analysis Attack on implementations of Advanced Encryption Standard software running inside a PIC18F2420 Microcontroller. Kealeboga Mpalane (Department of Computer Science, North West University, Mafikeng, South Africa); H.D. Tsague (Council for Scientific and Industrial Research, Pretoria, South Africa); Naison Gasela and E.M. Esiefa (Department of Computer Science, North West University, Mafikeng, South Africa).

  19. Angular dependence of TL and OSL responses of Al₂O₃:C commercial detectors in standard beta radiation beams

    Energy Technology Data Exchange (ETDEWEB)

    Antonio, Patricia L.; Caldas, Linda V.E., E-mail: patrilan@ipen.br, E-mail: lcaldas@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2014-07-01

    The luminescent response of radiation detectors was evaluated by means of the thermoluminescence (TL) and optically stimulated luminescence (OSL) phenomena, for verification of its application in radiation dosimetry. An angular dependence study was performed in this work, using Al₂O₃:C commercial detectors, which were exposed to the radiation beams of a ⁹⁰Sr+⁹⁰Y source from a beta radiation secondary standard system. The detectors were irradiated with an angle variation from -60° to +60°, and the results obtained using the TL and OSL techniques were within the international recommendation limits. (author)

  20. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Michigan

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Belzer, David B.; Halverson, Mark A.; Richman, Eric E.; Winiarski, David W.

    2002-09-30

    The state of Michigan is considering adopting ASHRAE 90.1-1999 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered. Both qualitative and quantitative benefits are assessed. The energy simulation and economic results suggest that adopting ASHRAE 90.1-1999 would provide positive net benefits to the state relative to the building and design requirements currently in place.

  1. Grading Class I Preparations in Preclinical Dental Education: E4D Compare Software vs. the Traditional Standard.

    Science.gov (United States)

    Sly, Marilia M; Barros, Juliana A; Streckfus, Charles F; Arriaga, Dianna M; Patel, Shalizeh A

    2017-12-01

    The aim of this study was to compare the effectiveness of a novel assessment software system with the traditional grading protocol used in the University of Texas School of Dentistry at Houston operative dentistry preclinical curriculum. In the study, conducted in 2016, 98 Class I preparations were evaluated both traditionally and digitally by two teams of calibrated preclinical faculty members (two evaluators for each team). Scores from each faculty pair were averaged for the traditional and the digital grading systems, and the scores for the two grading systems were compared. The analysis found no significant difference between the two grading systems with respect to isthmus width (p=0.073) and remaining marginal ridge (p=0.5841), but there was a significant difference with respect to pulpal floor depth assessment. The software offers a self-assessment tool for students to perfect their psychomotor skills while promoting independence and immediate feedback.

  2. Comparison of the Calibration Standards of Three Commercially Available Multiplex Kits for Human Cytokine Measurement to WHO Standards Reveals Striking Differences

    Directory of Open Access Journals (Sweden)

    Ivan M. Roitt

    2008-01-01

    Full Text Available Serum parameters as indicators for the efficacy of therapeutic drugs are currently the focus of intensive research. The induction of certain cytokines (or cytokine patterns) is known to be related to the status of the immune response, e.g. in regulating the TH1/TH2 balance. Regarding their potential value as surrogate parameters in clinical trials, and subsequently for the assignment of treatment efficacy, the accurate and reliable determination of cytokines in patient serum is mandatory. Because serum samples are precious and limited, test methods - like the xMAP multiplex technology - that allow for the simultaneous determination of a variety of cytokines from only a small sample aliquot can offer great advantages. We here have compared multiplex kits from three different manufacturers and found striking differences upon standardization using WHO standards for selected cytokines. We therefore extended our xMAP multiplex measurement investigations to an ex-vivo situation by testing serum samples and found that the cytokine amounts measured were critically influenced by the actual kit used. The presented data indicate that statements regarding the quantitative determination of cytokines - and therefore their use as biomarkers - in serum samples have to be interpreted with caution.

  3. Improvement in botanical standardization of commercial freeze-dried herbal extracts by using the combination of antioxidant capacity and constituent marker concentrations.

    Science.gov (United States)

    Ninfali, Paolino; Gennari, Lorenzo; Biagiotti, Enrica; Cangi, Francesca; Mattoli, Luisa; Maidecchi, Anna

    2009-01-01

Botanical extracts are standardized to one or more marker compounds (MCs). This standardization provides a certain level of quality control, but not complete quality assurance. Thus, industries are looking for other satisfactory systems to improve standardization. This study focuses on the standardization of herbal medicines by combining 2 parameters: the concentration of the MC and antioxidant capacity. Antioxidant capacity was determined with the oxygen radical absorbance capacity (ORAC) method and the concentrations of the MCs by high-performance liquid chromatography. Total phenols were also determined by the Folin-Ciocalteu method. The ORAC values, expressed as micromol Trolox equivalents/100 g (ORAC %), of 12 commercial herbal extracts were related to the ORAC values of the respective pure MCs at the concentrations at which the MCs occur in products (ORAC-MC %). The ORAC % values of 11 extracts were higher than those of the respective MCs and the ratios ORAC-MC %/ORAC % ranged from 0.007 to 0.7, whereas in the case of Olea europaea leaves, the same ratio was 1.36. The ORAC parameters and their ratios, as well as the linear relationship between ORAC-MC % and ORAC %, are described and discussed as tools for improving the standardization of herbal products and detecting modifications due to herb processing and storage.

  4. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

Final report for Folsom Labs’ Solar Permit Generator project, which has been successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  5. WebM as an alternative to H.264? : Investigation of the usage of open source software and open standards

    NARCIS (Netherlands)

    Staalduinen, M. van; Prins, M.J.

    2011-01-01

WebM is a new multimedia format often positioned as an open and free-to-use alternative to the industry standard H.264. H.264 is currently the most popular web-video format, used for broadcast video, VoD but also Catch-Up TV services. Comparisons between WebM's VP8 video format and the H.264 format

  6. Commercial reference shape standards use in the study of particle shape effect on laser diffraction particle size analysis.

    Science.gov (United States)

    Kelly, Richard N; Kazanjian, Jacqueline

    2006-05-26

The purpose of this paper is to describe the use of LGC Promochem AEA 1001 to AEA 1003 monosized fiber-analog shape standards in the study of the effect of particle shape on laser diffraction (LD) particle size analysis (psa). The psa of the AEA standards was conducted using LD psa systems from Beckman Coulter, Horiba, and Malvern Instruments. Flow speed settings, sample refractive index values, and sample cell types were varied to examine the extent to which the shape effect on LD psa results is modified by these variables. The volume and number probability plots resulting from these measurements were each characterized by a spread in the particle size distribution that roughly extended from the breadth to the longest dimension of the particles. For most of the selected sample refractive index values, the volume probability plots were characterized by apparent bimodal distributions. The results, therefore, provide experimental verification of the conclusions from theoretical studies of LD psa system response to monosized elliptical particles, in which this apparent bimodality was the predicted result in the case of flow-oriented particles. The data support the findings from previous studies conducted over the past 10 years that have called into question the validity of the equivalent spherical volume diameter theory and the random particle orientation model, and therefore the value of applying them to the interpretation of LD psa results from measurements made on nonspherical particles.

  7. Comparison of Kayzero for Windows and k0-IAEA software packages for k0 standardization in neutron activation analysis

    Czech Academy of Sciences Publication Activity Database

    Kubešová, Marie; Kučera, Jan

    2011-01-01

Roč. 654, č. 1 (2011), s. 206-212 ISSN 0168-9002 R&D Projects: GA ČR GA202/09/0363 Institutional research plan: CEZ:AV0Z10480505 Keywords: neutron activation analysis * k0 standardization * Kayzero for Windows program * k0-IAEA program Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.207, year: 2011

  8. First-trimester risk calculation for trisomy 13, 18, and 21: comparison of the screening efficiency between 2 locally developed programs and commercial software.

    Science.gov (United States)

    Sørensen, Steen; Momsen, Günther; Sundberg, Karin; Friis-Hansen, Lennart; Jørgensen, Finn Stener

    2011-07-01

    Reliable individual risk calculation for trisomy (T) 13, 18, and 21 in first-trimester screening depends on good estimates of the medians for fetal nuchal translucency thickness (NT), free β-subunit of human chorionic gonadotropin (hCGβ), and pregnancy-associated plasma protein-A (PAPP-A) in maternal plasma from unaffected pregnancies. Means and SDs of these parameters in unaffected and affected pregnancies are used in the risk calculation program. Unfortunately, our commercial program for risk calculation (Astraia) did not allow use of local medians. We developed 2 alternative risk calculation programs to assess whether the screening efficacies for T13, T18, and T21 could be improved by using our locally estimated medians. We established these estimates from 19 594 women with singleton pregnancies and from 100 pregnant women carrying a fetus affected with trisomy (11 with T13, 23 with T18, and 66 with T21). All measured values were recalculated to a multiple of the median (MoM) and log(10) transformed; the mean and SD were calculated for each group. At a given risk cutoff value, we observed a slight improvement in detection rate (DR) for T13, T18, and T21 for a slightly higher false-positive rate (FPR) compared with the commercial program. The lower FPR in the commercial program was caused mainly by an inaccuracy in the PAPP-A median. Center-specific medians for NT, hCGβ, and PAPP-A should be used in risk calculation programs to ensure high DRs and low FPRs for all 3 trisomies at a given risk cutoff.
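A common implementation of the screening arithmetic this record alludes to converts each marker to a multiple of the median (MoM), log10-transforms it, and turns Gaussian densities for affected and unaffected pregnancies into a likelihood ratio that scales the maternal age prior. The sketch below follows that generic scheme; the means, SDs, and prior odds are invented placeholders, not the estimates from this study.

```python
import math

def gaussian_pdf(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def trisomy_risk(measured, medians, affected_params, unaffected_params, prior_odds):
    """Combine log10 MoM values of several markers into a posterior risk.

    measured/medians: dicts keyed by marker name (e.g. 'NT', 'hCGb', 'PAPP-A').
    *_params: dicts of (mean, sd) of log10 MoM in each population.
    prior_odds: age-related prior odds of an affected pregnancy.
    Markers are treated as independent here, which is a simplification.
    """
    lr = 1.0
    for marker, value in measured.items():
        log_mom = math.log10(value / medians[marker])
        lr *= (gaussian_pdf(log_mom, *affected_params[marker]) /
               gaussian_pdf(log_mom, *unaffected_params[marker]))
    posterior_odds = prior_odds * lr
    return posterior_odds / (1.0 + posterior_odds)   # risk as a probability

# Hypothetical illustration (all numbers invented):
risk = trisomy_risk(
    measured={'PAPP-A': 0.4, 'hCGb': 2.1},
    medians={'PAPP-A': 1.0, 'hCGb': 1.0},            # values already in MoM
    affected_params={'PAPP-A': (-0.35, 0.30), 'hCGb': (0.30, 0.27)},
    unaffected_params={'PAPP-A': (0.0, 0.25), 'hCGb': (0.0, 0.25)},
    prior_odds=1 / 250,
)
print(f"estimated risk: 1 in {round(1 / risk)}")
```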

  9. First-trimester risk calculation for trisomy 13, 18, and 21: comparison of the screening efficiency between 2 locally developed programs and commercial software

    DEFF Research Database (Denmark)

    Sørensen, Steen; Momsen, Günther; Sundberg, Karin

    2011-01-01

Reliable individual risk calculation for trisomy (T) 13, 18, and 21 in first-trimester screening depends on good estimates of the medians for fetal nuchal translucency thickness (NT), free β-subunit of human chorionic gonadotropin (hCGβ), and pregnancy-associated plasma protein-A (PAPP-A) in maternal plasma from unaffected pregnancies. Means and SDs of these parameters in unaffected and affected pregnancies are used in the risk calculation program. Unfortunately, our commercial program for risk calculation (Astraia) did not allow use of local medians. We developed 2 alternative risk calculation programs to assess whether the screening efficacies for T13, T18, and T21 could be improved by using our locally estimated medians.

  10. Comparison of ultraviolet A light protection standards in the United States and European Union through in vitro measurements of commercially available sunscreens.

    Science.gov (United States)

    Wang, Steven Q; Xu, Haoming; Stanfield, Joseph W; Osterwalder, Uli; Herzog, Bernd

    2017-07-01

    The importance of adequate ultraviolet A light (UVA) protection has become apparent in recent years. The United States and Europe have different standards for assessing UVA protection in sunscreen products. We sought to measure the in vitro critical wavelength (CW) and UVA protection factor (PF) of commercially available US sunscreen products and see if they meet standards set by the United States and the European Union. Twenty sunscreen products with sun protection factors ranging from 15 to 100+ were analyzed. Two in vitro UVA protection tests were conducted in accordance with the 2011 US Food and Drug Administration final rule and the 2012 International Organization for Standardization method for sunscreen effectiveness testing. The CW of the tested sunscreens ranged from 367 to 382 nm, and the UVA PF of the products ranged from 6.1 to 32. Nineteen of 20 sunscreens (95%) met the US requirement of CW >370 nm. Eleven of 20 sunscreens (55%) met the EU desired ratio of UVA PF/SPF > 1:3. The study only evaluated a small number of sunscreen products. The majority of tested sunscreens offered adequate UVA protection according to US Food and Drug Administration guidelines for broad-spectrum status, but almost half of the sunscreens tested did not pass standards set in the European Union. Copyright © 2017. Published by Elsevier Inc.
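The critical wavelength metric referenced here is commonly defined as the wavelength below which 90% of the area under the absorbance spectrum between 290 and 400 nm lies, with "broad spectrum" status requiring CW above 370 nm. A minimal sketch of that integral, assuming absorbance has been measured on an evenly spaced wavelength grid (the spectrum below is invented for illustration):

```python
import numpy as np

def critical_wavelength(wavelengths_nm, absorbance, fraction=0.9):
    """Return the wavelength at which the cumulative integral of absorbance
    (290-400 nm) first reaches `fraction` of the total area."""
    wl = np.asarray(wavelengths_nm, dtype=float)
    a = np.asarray(absorbance, dtype=float)
    mask = (wl >= 290) & (wl <= 400)
    wl, a = wl[mask], a[mask]
    # Trapezoidal cumulative integral over the spectrum
    cumulative = np.concatenate(([0.0],
                                 np.cumsum(np.diff(wl) * 0.5 * (a[1:] + a[:-1]))))
    target = fraction * cumulative[-1]
    return float(np.interp(target, cumulative, wl))

# Hypothetical spectral shape (illustration only, not measured data):
wl = np.arange(290, 401, 1)
absorbance = np.exp(-((wl - 310) / 60.0) ** 2)
print(critical_wavelength(wl, absorbance))
```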

  11. Evaluation of the botanical origin of commercial dry bee pollen load batches using pollen analysis: a proposal for technical standardization

    Directory of Open Access Journals (Sweden)

    Ortrud M. Barth

    2010-12-01

Full Text Available High quality of bee pollen is required for commercial purposes. In order to provide the consumer with the best identification of the botanical and floral origin of the product, 25 bee pollen batches were investigated using two techniques of pollen grain preparation. The first identified pollen loads of different colors in two grams of each well-mixed batch, and the second identified pollen grains in a pool made of all the pollen loads contained in two grams. The best result was obtained with the latter technique, in which a pollen grain suspension was dropped onto a microscope slide and circa 500 pollen grains were counted per sample. This analysis resulted in the recognition of monofloral and bifloral pollen batches, while the use of the first technique resulted in all samples receiving a heterofloral diagnosis. High quality is demanded for the commercialization of bee pollen. In order to provide the consumer with the best identification of the botanical and floral origin of the product, 25 batches of bee pollen were investigated using two different techniques for preparing the pollen grains. The first started from the identification, by color, of the pollen loads contained in two grams of each well-mixed batch. The second aimed to identify the pollen grains of a pool of all the pollen loads contained in two grams of each sample. The best result was obtained with the latter technique, in which a suspension of pollen grains was dropped onto a microscope slide and about 500 pollen grains were counted per sample. This analysis resulted in the recognition of monofloral and bifloral batches of bee pollen, whereas with the first technique all samples received a heterofloral diagnosis.

  12. JPL Robotics Laboratory computer vision software library

    Science.gov (United States)

    Cunningham, R.

    1984-01-01

The past ten years of research on computer vision have matured into a powerful real-time system comprised of standardized commercial hardware, computers, and pipeline-processing laboratory prototypes, supported by an extensive set of image processing algorithms. The software system was constructed to be transportable through the choice of a popular high-level language (PASCAL) and a widely used computer (VAX-11/750). It comprises a whole realm of low-level and high-level processing software that has proven to be versatile for applications ranging from factory automation to space satellite tracking and grappling.

  13. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  14. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  15. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  16. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  17. Implementation of the k{sub 0}-standardization Method for an Instrumental Neutron Activation Analysis: Use-k{sub 0}-IAEA Software as a Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Kim, Hark Rho [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Ho, Manh Dung [Nuclear Research Institute, Dalat (Viet Nam)

    2006-03-15

Under the RCA post-doctoral program, from May 2005 through February 2006, there was an opportunity to review the present work being carried out in the Neutron Activation Analysis Laboratory, HANARO Center, KAERI. The scope of this research included a calibration of the counting system, a characterization of the irradiation facility, and a validation of the established k{sub 0}-NAA procedure. The k{sub 0}-standardization method for neutron activation analysis (k{sub 0}-NAA), which is becoming increasingly popular and widespread, is an absolute calibration technique in which the nuclear data are replaced by compound nuclear constants that are experimentally determined. The k{sub 0}-IAEA software distributed by the IAEA in 2005 was used as a demonstration for this work. The NAA no. 3 irradiation hole in the HANARO research reactor and the gamma-ray spectrometers No. 1 and 5 in the NAA Laboratory were used.

  18. Implementation of the k0-standardization Method for an Instrumental Neutron Activation Analysis: Use-k0-IAEA Software as a Demonstration

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Kim, Hark Rho; Ho, Manh Dung

    2006-03-01

Under the RCA post-doctoral program, from May 2005 through February 2006, there was an opportunity to review the present work being carried out in the Neutron Activation Analysis Laboratory, HANARO Center, KAERI. The scope of this research included a calibration of the counting system, a characterization of the irradiation facility, and a validation of the established k0-NAA procedure. The k0-standardization method for neutron activation analysis (k0-NAA), which is becoming increasingly popular and widespread, is an absolute calibration technique in which the nuclear data are replaced by compound nuclear constants that are experimentally determined. The k0-IAEA software distributed by the IAEA in 2005 was used as a demonstration for this work. The NAA no. 3 irradiation hole in the HANARO research reactor and the gamma-ray spectrometers No. 1 and 5 in the NAA Laboratory were used

  19. An open, interoperable, transdisciplinary approach to a point cloud data service using OGC standards and open source software.

    Science.gov (United States)

    Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley

    2017-04-01

    High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. Present work focusses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for
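As a concrete illustration of the kind of server-side tooling the abstract describes, a PDAL pipeline can crop a stored point cloud and write a derivative product that a WPS wrapper could hand back to a web client. This is a generic sketch using PDAL's documented Python bindings, not NCI's actual service code; the file names and bounding box are invented.

```python
import json
import pdal  # PDAL Python bindings

# A minimal PDAL pipeline: read a LAZ tile, crop it to a bounding box,
# and write the result as a compressed LAZ derivative product.
pipeline_def = {
    "pipeline": [
        "input_tile.laz",                                   # hypothetical input file
        {"type": "filters.crop",
         "bounds": "([488000, 489000], [7455000, 7456000])"},
        {"type": "writers.las",
         "compression": "laszip",
         "filename": "cropped_subset.laz"},                 # hypothetical output
    ]
}

pipeline = pdal.Pipeline(json.dumps(pipeline_def))
count = pipeline.execute()    # number of points processed
print(f"{count} points written")
```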

  20. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  1. First-trimester risk calculation for trisomy 13, 18, and 21: comparison of the screening efficiency between 2 locally developed programs and commercial software

    DEFF Research Database (Denmark)

    Sørensen, Steen; Momsen, Günther; Sundberg, Karin

    2011-01-01

Reliable individual risk calculation for trisomy (T) 13, 18, and 21 in first-trimester screening depends on good estimates of the medians for fetal nuchal translucency thickness (NT), free β-subunit of human chorionic gonadotropin (hCGβ), and pregnancy-associated plasma protein-A (PAPP-A) in maternal plasma from unaffected pregnancies. Means and SDs of these parameters in unaffected and affected pregnancies are used in the risk calculation program. Unfortunately, our commercial program for risk calculation (Astraia) did not allow use of local medians. We developed 2 alternative risk calculation programs to assess whether the screening efficacies for T13, T18, and T21 could be improved by using our locally estimated medians.

  2. First-trimester risk calculation for trisomy 13, 18, and 21: comparison of the screening efficiency between 2 locally developed programs and commercial software

    DEFF Research Database (Denmark)

    Sørensen, Steen; Momsen, Günther; Sundberg, Karin

    2011-01-01

Reliable individual risk calculation for trisomy (T) 13, 18, and 21 in first-trimester screening depends on good estimates of the medians for fetal nuchal translucency thickness (NT), free β-subunit of human chorionic gonadotropin (hCGβ), and pregnancy-associated plasma protein-A (PAPP-A) in maternal plasma from unaffected pregnancies. Means and SDs of these parameters in unaffected and affected pregnancies are used in the risk calculation program. Unfortunately, our commercial program for risk calculation (Astraia) did not allow use of local medians. We developed 2 alternative risk calculation programs to assess whether the screening efficacies for T13, T18, and T21 could be improved by using our locally estimated medians.

  3. Emerging Radio and Manet Technology Study: Research Support for a Survey of State-of-the-art Commercial and Military Hardware/Software for Mobile Ad Hoc Networks

    Science.gov (United States)

    2014-10-01

    Architecture 2b: Tablet with External Router Existing products such as the ZuniConnect Travel Router or the ASUS 6-in-1 Wireless-N150 Mobile Router can...configuration and network topology information via a standard browser interface. The company claims scalability to 1000’s of units by removing routing...Systems (PS) is a US company based in New York City. They offer a proprietary mobile ad-hoc networking solution (Wave Relay) that is an adaptive mesh

  4. A Framework for Instituting Software Metrics in Small Software Organizations

    OpenAIRE

    Hisham M. Haddad; Nancy C. Ross; Donald E. Meredith

    2012-01-01

    The role of metrics in software quality is well-recognized; however, software metrics are yet to be standardized and integrated into development practices across the software industry. Literature reports indicate that software companies with less than 50 employees may represent up to 85% of the software organizations in several countries, including the United States. While process, project, and product metrics share a common goal of contributing to software quality and reliability, utilizatio...

  5. Usability in open source software development

    DEFF Research Database (Denmark)

    Andreasen, M. S.; Nielsen, H. V.; Schrøder, S. O.

    2006-01-01

Open Source Software (OSS) development has gained significant importance in the production of software products. Open Source Software developers have produced systems with a functionality that is competitive with similar proprietary software developed by commercial software organizations. Yet OSS

  6. Development of image quality assurance measures of the ExacTrac localization system using commercially available image evaluation software and hardware for image-guided radiotherapy.

    Science.gov (United States)

    Stanley, Dennis N; Papanikolaou, Nikos; Gutiérrez, Alonso N

    2014-11-08

Quality assurance (QA) of the image quality for image-guided localization systems is crucial to ensure accurate visualization and localization of target volumes. In this study, a methodology was developed to assess and evaluate the constancy of the high-contrast spatial resolution, dose, energy, contrast, and geometrical accuracy of the BrainLAB ExacTrac system. An in-house fixation device was constructed to hold the QCkV-1 phantom firmly and reproducibly against the face of the flat panel detectors. Two image sets per detector were acquired using ExacTrac preset console settings over a period of three months. The image sets were analyzed in PIPSpro and the following metrics were recorded: high-contrast spatial resolution (f30, f40, f50 (lp/mm)), noise, and contrast-to-noise ratio. Geometrical image accuracy was evaluated by assessing the length between two predetermined points of the QCkV-1 phantom. Dose and kVp were recorded using the Unfors RaySafe Xi R/F Detector. The kVp and dose were evaluated for the following: Cranial Standard (CS) (80 kV, 80 mA, 80 ms), Thorax Standard (TS) (120 kV, 160 mA, 160 ms), Abdomen Standard (AS) (120 kV, 160 mA, 130 ms), and Pelvis Standard (PS) (120 kV, 160 mA, 160 ms). With regard to high-contrast spatial resolution, the mean values of the f30 (lp/mm), f40 (lp/mm) and f50 (lp/mm) for the left detector were 1.39 ± 0.04, 1.24 ± 0.05, and 1.09 ± 0.04, respectively, while for the right detector they were 1.38 ± 0.04, 1.22 ± 0.05, and 1.09 ± 0.05, respectively. Mean CNRs for the left and right detectors were 148 ± 3 and 143 ± 4, respectively. For geometrical accuracy, both detectors had a measured image length of the QCkV-1 of 57.9 ± 0.5 mm. The left detector showed dose measurements of 20.4 ± 0.2 μGy (CS), 191.8 ± 0.7 μGy (TS), 154.2 ± 0.7 μGy (AS), and 192.2 ± 0.6 μGy (PS), while the right detector showed 20.3 ± 0.3 μGy (CS), 189.7 ± 0.8 μGy (TS), 151.0 ± 0.7 μGy (AS), and 189.7 ± 0.8 μGy (PS), respectively. For X
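The contrast-to-noise ratio reported for such QA images is typically computed from mean pixel values inside a contrast insert and an adjacent background region, divided by the background noise. A minimal sketch under that common definition (the ROI coordinates and data are placeholders, and this is not the PIPSpro implementation):

```python
import numpy as np

def contrast_to_noise_ratio(image, signal_roi, background_roi):
    """CNR = |mean(signal) - mean(background)| / std(background).

    image: 2D array of pixel values from the flat-panel detector.
    *_roi: (row_slice, col_slice) tuples selecting rectangular regions.
    """
    signal = image[signal_roi]
    background = image[background_roi]
    return abs(signal.mean() - background.mean()) / background.std(ddof=1)

# Illustration on synthetic data (a real QA check would use the acquired image):
rng = np.random.default_rng(0)
img = rng.normal(100.0, 2.0, size=(512, 512))
img[200:260, 200:260] += 30.0    # simulated contrast insert
cnr = contrast_to_noise_ratio(img,
                              (slice(200, 260), slice(200, 260)),
                              (slice(300, 360), slice(300, 360)))
print(round(cnr, 1))
```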

  7. Characterization of Individual Isopropylated and tert-Butylated Triarylphosphate (ITP and TBPP) Isomers in Several Commercial Flame Retardant Mixtures and House Dust Standard Reference Material SRM 2585.

    Science.gov (United States)

    Phillips, Allison L; Hammel, Stephanie C; Konstantinov, Alex; Stapleton, Heather M

    2017-11-21

    Since the phase-out of pentaBDE in the early 2000s, replacement flame-retardant mixtures including Firemaster 550 (FM 550), Firemaster 600 (FM 600), and organophosphate aryl ester technical mixtures have been increasingly used to treat polyurethane foam in residential upholstered furniture. These mixtures contain isomers of isopropylated and tert-butylated triarylphosphate esters (ITPs and TBPPs), which have similar or greater neuro- and developmental toxicity compared to BDE 47 in high-throughput assays. Additionally, human exposure to ITPs and TBPPs has been demonstrated to be widespread in several recent studies; however, the relative composition of these mixtures has remained largely uncharacterized. Using available authentic standards, the present study quantified the contribution of individual ITP and TBPP isomers in four commercial flame retardant mixtures: FM 550, FM 600, an ITP mixture, and a TBPP mixture. Findings suggest similarities between FM 550 and the ITP mixture, with 2-isopropylphenyl diphenyl phosphate (2IPPDPP), 2,4-diisopropylphenyl diphenyl phosphate (24DIPPDPP), and bis(2-isopropylphenyl) phenyl phosphate (B2IPPPP) being the most prevalent ITP isomers in both mixtures. FM 600 differed from FM 550 in that it contained TBPP isomers instead of ITP isomers. These analytes were also detected and quantified in a house dust standard reference material, SRM 2585, demonstrating their environmental relevance.

  8. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  9. Comparison of Protein Value of Commercial Baby Food with Homemade Baby Food and Casein Standard in Rats as the Refference point

    Directory of Open Access Journals (Sweden)

    Z. Asemi

    2008-10-01

Full Text Available Background and Objectives: Evaluation of protein quality in food is of great importance due to the biological and economical impacts of food proteins. This study was conducted with the aim of comparing the protein quality of homemade food (a mixture of macaroni and soybean) with commercial baby food (Cerelac Wheat) using casein as the reference point. Methods: This study was conducted on 64 twenty-one-day-old male Wistar rats. The rats were divided into 8 groups, and each group was put on a different diet regimen. The diet regimens were as follows: 2 homemade food + Cerelac test diets, 1 casein + methionine standard diet, 1 protein-free basal diet, 2 test diets, 1 standard diet, and 1 basal diet. The purpose of the protein-free diet was to evaluate True Protein Digestibility (TPD). Net Protein Ratio (NPR) and Protein Efficiency Ratio (PER) were investigated using the basal diet. Protein intake and weight gain were determined for calculating NPR and PER. Nitrogen intake and fecal nitrogen were determined to calculate TPD. Comparisons of TPD, NPR, and PER among the groups were analyzed by ANOVA and Tukey methods. Results: TPD values of the standard, Cerelac, and homemade food diets were 92.8±4, 87±8, and 85.4±3.2; NPR values were 4.3±0.4, 4.3±0.9, and 3.8±0.6; and PER values were 3±0.2, 2.5±0.4, and 1.7±0.1, respectively. The statistical differences between TPD and PER values were significant (p < 0.05), whereas NPR differences were insignificant (p > 0.05). Conclusion: These results show that TPD and PER of homemade foods are lower than those of Cerelac, while their NPR is acceptable. Keywords: Protein; Cerelac; Macaroni; Soybeans.
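The three protein-quality indices compared in this record follow standard definitions: TPD relates absorbed to ingested nitrogen, PER is weight gain per unit of protein eaten, and NPR additionally credits the weight loss of a protein-free control group. A small sketch of those formulas (the sample numbers are invented, not the study's data):

```python
def true_protein_digestibility(n_intake_g, fecal_n_g):
    """TPD (%) = (nitrogen intake - fecal nitrogen) / nitrogen intake * 100."""
    return (n_intake_g - fecal_n_g) / n_intake_g * 100.0

def protein_efficiency_ratio(weight_gain_g, protein_intake_g):
    """PER = body-weight gain per gram of protein consumed."""
    return weight_gain_g / protein_intake_g

def net_protein_ratio(weight_gain_g, weight_loss_protein_free_g, protein_intake_g):
    """NPR = (gain of test group + loss of protein-free group) / protein intake."""
    return (weight_gain_g + weight_loss_protein_free_g) / protein_intake_g

# Hypothetical figures for one test group:
print(true_protein_digestibility(n_intake_g=12.0, fecal_n_g=1.5))          # ~87.5
print(protein_efficiency_ratio(weight_gain_g=75.0, protein_intake_g=30.0))  # 2.5
print(net_protein_ratio(75.0, 15.0, 30.0))                                  # 3.0
```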

  10. Physicochemical standardization, HPTLC profiling, and biological evaluation of Aśvagandhādyariṣṭa: A comparative study of three famous commercial brands

    Science.gov (United States)

    Singh, Mandeep; Kaur, Navdeep; Paul, Atish Tulsiram

    2014-01-01

Background: Aśvagandhādyariṣṭa is a polyherbal formulation that is available commercially as an over-the-counter drug. There are three famous brands that are available in the market. However, there are no comparative reports on the physicochemical, chromatographic, and biological profiles of Aśvagandhādyariṣṭa manufactured by these famous companies. Aims: The present study deals with the physicochemical standardization, high performance thin layer chromatography (HPTLC) profiling, and biological evaluation of Aśvagandhādyariṣṭa. Materials and Methods: Samples of Aśvagandhādyariṣṭa manufactured by three leading companies were purchased from Jalandhar, Punjab. The physicochemical standardization of the samples was carried out in accordance with the Ayurvedic Pharmacopoeia of India (API). Authenticated Eisenia foetida were procured from Ujjwal Ujala Vermiculture Group, Amritsar. The anthelmintic activity, 1,1-diphenyl-2-picrylhydrazyl scavenging, and hydrogen peroxide scavenging ability of Aśvagandhādyariṣṭa were determined. Statistical Analysis Used: The data of anthelmintic activity were expressed as mean ± standard error of the mean of six earthworms in each group. The statistical analysis was carried out using one-way analysis of variance, followed by Dunnett's t-test, with differences at P < 0.05 considered significant. Results: ASA-DAB showed the best antioxidant activity in both in vitro assays at the concentration of 100 μg/ml. Conclusions: The ability of this formulation to scavenge free radicals supports its medicinal claim as an antistress formulation. The anthelmintic potential of this formulation helps us conclude that it can also be considered as a general tonic because it provides relief from helminths. PMID:25538352

  11. Efficacy of standard versus enhanced features in a Web-based commercial weight-loss program for obese adults, part 2: randomized controlled trial.

    Science.gov (United States)

    Collins, Clare E; Morgan, Philip J; Hutchesson, Melinda J; Callister, Robin

    2013-07-22

    Commercial Web-based weight-loss programs are becoming more popular and increasingly refined through the addition of enhanced features, yet few randomized controlled trials (RCTs) have independently and rigorously evaluated the efficacy of these commercial programs or additional features. To determine whether overweight and obese adults randomized to an online weight-loss program with additional support features (enhanced) experienced a greater reduction in body mass index (BMI) and increased usage of program features after 12 and 24 weeks compared to those randomized to a standard online version (basic). An assessor-blinded RCT comparing 301 adults (male: n=125, 41.5%; mean age: 41.9 years, SD 10.2; mean BMI: 32.2 kg/m(2), SD 3.9) who were recruited and enrolled offline, and randomly allocated to basic or enhanced versions of a commercially available Web-based weight-loss program for 24 weeks. Retention at 24 weeks was greater in the enhanced group versus the basic group (basic 68.5%, enhanced 81.0%; P=.01). In the intention-to-treat analysis of covariance with imputation using last observation carried forward, after 24 weeks both intervention groups had reductions in key outcomes with no difference between groups: BMI (basic mean -1.1 kg/m(2), SD 1.5; enhanced mean -1.3 kg/m(2), SD 2.0; P=.29), weight (basic mean -3.3 kg, SD 4.7; enhanced mean -4.0 kg, SD 6.2; P=.27), waist circumference (basic mean -3.1 cm, SD 4.6; enhanced mean -4.0 cm, SD 6.2; P=.15), and waist-to-height ratio (basic mean -0.02, SD 0.03; enhanced mean -0.02, SD 0.04, P=.21). The enhanced group logged in more often at both 12 and 24 weeks, respectively (enhanced 12-week mean 34.1, SD 28.1 and 24-week mean 43.1, SD 34.0 vs basic 12-week mean 24.6, SD 25.5 and 24-week mean 31.8, SD 33.9; P=.002). The addition of personalized e-feedback in the enhanced program provided limited additional benefits compared to a standard commercial Web-based weight-loss program. However, it does support greater
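The intention-to-treat analysis described here imputes missing follow-up values by last observation carried forward (LOCF) before the analysis of covariance. A minimal sketch of that imputation step with pandas (the data frame below is invented, not trial data):

```python
import pandas as pd

# Long-format trial data: one row per participant per assessment week.
# Missing BMI values at later visits are filled by carrying the last
# observed value forward within each participant (LOCF).
df = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2],
    "week":        [0, 12, 24, 0, 12, 24],
    "bmi":         [32.1, 31.0, None, 33.5, None, None],   # invented values
})

df = df.sort_values(["participant", "week"])
df["bmi_locf"] = df.groupby("participant")["bmi"].ffill()
print(df)
```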

  12. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  13. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not need the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  14. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  15. Cost effectiveness of primary care referral to a commercial provider for weight loss treatment, relative to standard care: a modelled lifetime analysis.

    Science.gov (United States)

    Fuller, N R; Carter, H; Schofield, D; Hauner, H; Jebb, S A; Colagiuri, S; Caterson, I D

    2014-08-01

    Because of the high prevalence of overweight and obesity, there is a need to identify cost-effective approaches for weight loss in primary care and community settings. To evaluate the long-term cost effectiveness of a commercial weight loss programme (Weight Watchers) (CP) compared with standard care (SC), as defined by national guidelines. A Markov model was developed to calculate the incremental cost-effectiveness ratio (ICER), expressed as the cost per quality-adjusted life year (QALY) over the lifetime. The probabilities and quality-of-life utilities of outcomes were extrapolated from trial data using estimates from the published literature. A health sector perspective was adopted. Over a patient's lifetime, the CP resulted in an incremental cost saving of AUD 70 per patient, and an incremental 0.03 QALYs gained per patient. As such, the CP was found to be the dominant treatment, being more effective and less costly than SC (95% confidence interval: dominant to 6225 per QALY). Despite the CP delaying the onset of diabetes by ∼10 months, there was no significant difference in the incidence of type 2 diabetes, with the CP achieving <0.1% fewer cases than SC over the lifetime. The modelled results suggest that referral to community-based interventions may provide a highly cost-effective approach for those at high risk of weight-related comorbidities.
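The incremental cost-effectiveness ratio reported here is the standard ratio of the difference in lifetime costs to the difference in QALYs between the two strategies, with "dominance" declared when the new strategy is both cheaper and more effective. A small sketch of that decision rule (the numbers are placeholders, not the trial's values):

```python
def incremental_cost_effectiveness(cost_new, qaly_new, cost_old, qaly_old):
    """Return 'dominant', 'dominated', or the ICER for a new strategy."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant"      # cheaper and at least as effective
    if d_cost >= 0 and d_qaly <= 0:
        return "dominated"     # costlier and no more effective
    return d_cost / d_qaly     # cost per QALY gained

# Hypothetical lifetime totals per patient:
print(incremental_cost_effectiveness(cost_new=1930.0, qaly_new=14.53,
                                     cost_old=2000.0, qaly_old=14.50))
```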

  16. Quantifying the performance of two different types of commercial software programs for 3D patient dose reconstruction for prostate cancer patients: Machine log files vs. machine log files with EPID images.

    Science.gov (United States)

    Kadoya, Noriyuki; Kon, Yoshio; Takayama, Yoshiki; Matsumoto, Takuya; Hayashi, Naoki; Katsuta, Yoshiyuki; Ito, Kengo; Chiba, Takahito; Dobashi, Suguru; Takeda, Ken; Jingu, Keiichi

    2018-01-01

    We clarified the reconstructed 3D dose difference between two different commercial software programs (Mobius3D v2.0 and PerFRACTION v1.6.4). Five prostate cancer patients treated with IMRT (74 Gy/37 Fr) were studied. Log files and cine EPID images were acquired for each fraction. 3D patient dose was reconstructed using log files (Mobius3D) or log files with EPID imaging (PerFRACTION). The treatment planning dose was re-calculated on homogeneous and heterogeneous phantoms, and log files and cine EPID images were acquired. Measured doses were compared with the reconstructed point doses in the phantom. Next, we compared dosimetric metrics (mean dose for PTV, rectum, and bladder) calculated by Mobius3D and PerFRACTION for all fractions from five patients. Dose difference at isocenter between measurement and reconstructed dose for two software programs was within 3.0% in both homogeneous and heterogeneous phantoms. Moreover, the dose difference was larger using skip arc plan than that using full arc plan, especially for PerFRACTION (e.g., dose difference at isocenter for PerFRACTION: 0.34% for full arc plan vs. -4.50% for skip arc plan in patient 1). For patients, differences in dosimetric parameters were within 1% for almost all fractions. PerFRACTION had wider range of dose difference between first fraction and the other fractions than Mobius3D (e.g., maximum difference: 0.50% for Mobius3D vs. 1.85% for PerFRACTION), possibly because EPID may detect some types of MLC positioning errors such as miscalibration errors or mechanical backlash which cannot be detected by log files, or that EPID data might include image acquisition failure and image noise. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents a V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example, independence philosophy, software safety analysis concepts, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, including the concepts of the existing industrial standards IEEE Std 1012 and IEEE Std 1059. This technical report covers the scope of the V and V guideline, the guideline framework as part of the acceptance criteria, V and V activities and task entrance as part of the V and V activity and exit criteria, review and audit, testing, QA records of V and V material and configuration management, software verification and validation plan production, etc., and safety-critical software V and V methodology. (author). 11 refs

  18. Comparison of Protein Value of Commercial Baby Food with Homemade Baby Food and Casein Standard in Rats as the Refference point

    Directory of Open Access Journals (Sweden)

    Z Asemi

    2012-05-01

    Full Text Available

    Background and Objectives

Evaluation of protein quality in food is of great importance due to the biological and economical impacts of food proteins. This study was conducted with the aim of comparing the protein quality of homemade food (a mixture of macaroni and soybean) with commercial baby food (Cerelac Wheat) using casein as the reference point.

     

    Methods

This study was conducted on 64 twenty-one-day-old male Wistar rats. The rats were divided into 8 groups, and each group was put on a different diet regimen. The diet regimens were as follows: 2 homemade food + Cerelac test diets, 1 casein + methionine standard diet, 1 protein-free basal diet, 2 test diets, 1 standard diet, and 1 basal diet. The purpose of the protein-free diet was to evaluate True Protein Digestibility (TPD). Net Protein Ratio (NPR) and Protein Efficiency Ratio (PER) were investigated using the basal diet. Protein intake and weight gain were determined for calculating NPR and PER. Nitrogen intake and fecal nitrogen were determined to calculate TPD. Comparisons of TPD, NPR, and PER among the groups were analyzed by ANOVA and Tukey methods.

     

    Results

TPD values of the standard, Cerelac, and homemade food diets were 92.8±4, 87±8, and 85.4±3.2; NPR values were 4.3±0.4, 4.3±0.9, and 3.8±0.6; and PER values were 3±0.2, 2.5±0.4, and 1.7±0.1, respectively. The statistical differences between TPD and PER values were significant (p < 0.05), whereas NPR differences were insignificant (p > 0.05).

     

    Conclusion

These results show that TPD and PER of homemade foods are lower than those of Cerelac, while their NPR is acceptable.

  19. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  20. Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation

    Science.gov (United States)

    Morris, A. Terry

    2007-01-01

    Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.
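The "secondary structure" the abstract refers to is the clique (junction) tree obtained by moralizing the acyclic dependency graph, triangulating it, collecting its maximal cliques, and joining them by a maximum-weight spanning tree over shared variables. A generic sketch of that construction with networkx is below; the toy dependency edges are hypothetical quality attributes, not the model extracted from ISO/IEC 9126-1 in the paper.

```python
import itertools
import networkx as nx

def clique_tree(dag: nx.DiGraph) -> nx.Graph:
    """Build a clique (junction) tree from an acyclic dependency model."""
    # 1) Moralize: marry co-parents of every node, then drop edge directions.
    moral = dag.to_undirected()
    for node in dag:
        for u, v in itertools.combinations(dag.predecessors(node), 2):
            moral.add_edge(u, v)
    # 2) Triangulate with a simple min-degree elimination ordering.
    tri, work = moral.copy(), moral.copy()
    while work.number_of_nodes() > 0:
        node = min(work, key=work.degree)
        fill = list(itertools.combinations(work.neighbors(node), 2))
        tri.add_edges_from(fill)
        work.add_edges_from(fill)
        work.remove_node(node)
    # 3) Maximal cliques become candidate tree nodes; edges are weighted by
    #    the size of the shared separator set.
    cliques = [frozenset(c) for c in nx.find_cliques(tri)]
    cg = nx.Graph()
    cg.add_nodes_from(cliques)
    for c1, c2 in itertools.combinations(cliques, 2):
        sep = c1 & c2
        if sep:
            cg.add_edge(c1, c2, weight=len(sep), separator=sep)
    # 4) A maximum-weight spanning tree of the clique graph is the clique tree.
    return nx.maximum_spanning_tree(cg)

# Toy dependency model (invented attribute names, for illustration only):
dag = nx.DiGraph([("functionality", "quality"), ("reliability", "quality"),
                  ("usability", "quality"), ("maintainability", "reliability")])
for a, b, data in clique_tree(dag).edges(data=True):
    print(sorted(a), "--", sorted(b), "separator:", sorted(data["separator"]))
```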

  1. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  2. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  3. Development of standardized approaches to reporting of minimal residual disease data using a reporting software package designed within the European LeukemiaNet

    DEFF Research Database (Denmark)

    Østergaard, M; Nyvold, Charlotte Guldborg; Jovanovic, J V

    2011-01-01

    and presentation that complicate multicenter clinical trials. To address these issues, we designed a highly flexible MRD-reporting software program, in which data from various qPCR platforms can be imported, processed, and presented in a uniform manner to generate intuitively understandable reports. The software...

  4. Dose-response evaluation of the standardized ileal digestible tryptophan : lysine ratio to maximize growth performance of growing-finishing gilts under commercial conditions.

    Science.gov (United States)

    Gonçalves, M A D; Tokach, M D; Bello, N M; Touchette, K J; Goodband, R D; DeRouchey, J M; Woodworth, J C; Dritz, S S

    2017-11-16

Environmental regulations as well as economic incentives have resulted in greater use of synthetic amino acids in swine diets. Tryptophan is typically the second limiting amino acid in corn-soybean meal-based diets. However, using corn-based co-products emphasizes the need to evaluate the pig's response to increasing Trp concentrations. Therefore, the objective of these studies was to evaluate the dose-response to increasing standardized ileal digestible (SID) Trp : Lys on growth performance of growing-finishing gilts housed under large-scale commercial conditions. Dietary treatments consisted of SID Trp : Lys of 14.5%, 16.5%, 18.0%, 19.5%, 21.0%, 22.5% and 24.5%. The study was conducted in four experiments of 21 days duration each, and used corn-soybean meal-based diets with 30% distillers dried grains with solubles. A total of 1166, 1099, 1132 and 975 gilts (PIC 337×1050, initially 29.9±2.0 kg, 55.5±4.8 kg, 71.2±3.4 kg and 106.2±3.1 kg BW, mean±SD) were used. Within each experiment, pens of gilts were blocked by BW and assigned to one of the seven dietary treatments, with six pens per treatment and 20 to 28 gilts/pen. First, generalized linear mixed models were fit to data from each experiment to characterize performance. Next, data were modeled across experiments to fit competing linear and non-linear dose-response models and to estimate SID Trp : Lys break points or maximums for performance. Competing models included broken-line linear (BLL), broken-line quadratic and quadratic polynomial (QP). For average daily gain (ADG), increasing the SID Trp : Lys increased growth rate in a quadratic manner. The estimated SID Trp : Lys for gilts ranged from a minimum of 16.9% for maximum G : F to 23.5% for maximum ADG.
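A broken-line linear (BLL) dose-response model of the kind compared in this record fits a plateau with a single breakpoint below which the response declines linearly. A generic sketch with scipy is shown below; the simulated pen data are purely illustrative and the fitted breakpoint is not the study's estimate.

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_line_linear(x, plateau, breakpoint, slope):
    """Response equals the plateau above the breakpoint; declines linearly below it."""
    return np.where(x < breakpoint, plateau - slope * (breakpoint - x), plateau)

# Simulated pen-level ADG (g/d) against SID Trp:Lys (%) -- invented data.
rng = np.random.default_rng(1)
ratio = np.repeat([14.5, 16.5, 18.0, 19.5, 21.0, 22.5, 24.5], 6)
adg = broken_line_linear(ratio, 1000.0, 19.0, 25.0) + rng.normal(0, 10, ratio.size)

params, _ = curve_fit(broken_line_linear, ratio, adg, p0=[1000.0, 18.0, 20.0])
plateau, breakpoint, slope = params
print(f"estimated breakpoint: {breakpoint:.1f}% SID Trp:Lys")
```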

  5. Comparative Analysis of Selected High Frequency Words Found in Commercial Spelling Series and Misspelled in Students' Writing to a Standard Measure of Word Frequency.

    Science.gov (United States)

    Hagerty, Patricia Jo

    A major purpose of this study was to determine whether a selected number of current, commercially prepared spelling series used high frequency words for their word lists. A second purpose was to determine whether students misspelled high frequency words in their writing. Eleven commercially prepared spelling series were selected according to the…

  6. Report on the Observance of Standards and Codes, Accounting and Auditing : Module B - Institutional Framework for Corporate Financial Reporting, B.1 Commercial Enterprises (including SMEs)

    OpenAIRE

    World Bank

    2017-01-01

    The purpose of this report is to gain an understanding of the general financial reporting and audit requirements for commercial enterprises in a jurisdiction as established by law or other regulation (for example, companies’ act). Commercial enterprises are defined as companies established with a profit-making objective that do not issue equity and debt on a public exchange, are not financ...

  7. Professional Issues In Software Engineering

    CERN Document Server

    Bott, Frank; Eaton, Jack; Rowland, Diane

    2000-01-01

A comprehensive text covering all the issues that software engineers now have to take into account apart from the technical side of things. Includes information on the legal, professional and commercial context in which they work.

  8. An Empirical Study of a Free Software Company

    OpenAIRE

    Pakusch, Cato

    2010-01-01

Free software has matured well into the commercial software market, yet little qualitative research exists which accurately describes the state of commercial free software today. For this thesis, an instrumental case study was performed on a prominent free software company in Norway. The study found that the commercial free software market is largely driven by social networks, which have a social capital of their own that attracts more people, who in turn become members of the

  9. Design of software for calculation of shielding based on various standards radiodiagnostic calculation; Diseno de un software para el calculo de blindajes en radiodiagnostico basado en varios estandares de calculo

    Energy Technology Data Exchange (ETDEWEB)

    Falero, B.; Bueno, P.; Chaves, M. A.; Ordiales, J. M.; Villafana, O.; Gonzalez, M. J.

    2013-07-01

The aim of this study was to develop a software application that performs shielding calculations for radiology rooms depending on the type of equipment. The calculation is done by the user selecting either the method proposed in Guide 5.11, the methods of Reports 144 and 147, or the methodology given by the Portuguese Health Ministry. (Author)

  10. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from free-of-charge software to unlicensed software. Not all of these notions are correct, so the concept of open source software needs to be introduced, covering its history, licenses and how to choose a license, as well as the considerations involved in choosing among the available open source software. Keywords: License, Open Source, HAKI

  11. Impacts of Reinsurance Operations on Significant Items of the Financial Statements of Commercial Insurance Companies According to Czech Accounting Legislation and International Accounting Standards

    Directory of Open Access Journals (Sweden)

    Jana Gláserová

    2015-01-01

Full Text Available The principal aim of the paper is to determine the impact of reinsurance operations on certain significant items of the financial statements of commercial insurance companies, in accordance with the relevant accounting legislation. Reinsurance operations in fact affect the profit of a commercial insurance company as reported in the financial statements. A prerequisite for fulfilling the objective of the paper is an analysis of the accounting legislation for reinsurance operations in commercial insurance companies. Attention is also devoted to the method of accounting for reinsurance operations and their specific reporting in various parts of the financial statements of commercial insurance companies. A partial aim of this paper is to identify significant differences in the accounting of commercial insurance companies, based on a comparison of accounting practices for the issues examined in accordance with IAS/IFRS. In the conclusion, the authors address the latest developments and the steps necessary for adopting the concept of IFRS 4 Phase II and completing its application to the accounts of commercial insurance companies.

  12. Development of a methodology and software for analysis of energy and economic feasibility of introducing natural gas facilities in residential an commercial sector; Desenvolvimento de metodologia e de software para analise de viabilidade energetica e economica da introducao de instalacoes para gas natural no setor residencial e comercial

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, Marcos Fabio de; Torres, Ednildo Andrade [Universidade Federal da Bahia (UFBA), Salvador, BA (Brazil). Escola Politecnica. Lab. de Energia e Gas; Santos, Carlos Antonio Cabral dos [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil). Lab. de Energia Solar; Campos, Michel Fabianski [PETROBRAS, Rio de Janeiro, RJ (Brazil). RedeGasEnergia

    2004-07-01

With the increasing share of natural gas in the global and national energy matrices, and the constant search for an alternative energy source with acceptable environmental behavior, studies that enable the expansion of the use of this fuel in the various energy sectors (industrial, commercial, residential, automotive, among others) become ever more necessary. Of these sectors, the residential one most needs innovations and/or technological adaptations in order to take a substantial share of natural gas demand. The objective of this work is to establish a suitable methodology for analyzing the energy and economic feasibility of introducing natural gas installations in the residential and commercial sectors, as well as to implement a software tool that facilitates decision making, from drawing up the floor plan of the development to choosing the appropriate material for the piping, in addition to showing the technical and economic feasibility of using natural gas to supply all the energy needs of the building, or its joint use with electricity or with LPG. The methodology is supported mainly by the first and second laws of thermodynamics, in addition to the Brazilian technical standards that govern this sector of civil construction, taking into consideration the fixed and variable costs of the construction and the energy involved. Based on the literature, it is expected that the introduction of natural gas installations in the residential and commercial sectors will prove technically and economically feasible, thereby increasing the demand for this fuel and consequently its share in the national energy matrix. (author)

  13. Agent Standards Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the work herein proposed is the development of standards for software autonomous agents. These standards are essential to achieve software...

  14. Forever software

    NARCIS (Netherlands)

    Rensink, Arend; Margaria, Tiziana; Steffen, Bernhard

    2014-01-01

    Any attempt to explain software engineering to a lay audience soon falls back on analogy: building software is like building a bridge, a car, a television set. A large part of the established practice within software engineering is also based on this premise. However, the analogy is false in some

  15. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  16. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be

  17. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  18. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  19. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for Ruby product quality measurement tool Rubocop to measure Ruby product quality characteristics defined in ISO 2502n standard series. This paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  20. jMRUI plugin software (jMRUI2XML) to allow automated MRS processing and XML-based standardized output

    Czech Academy of Sciences Publication Activity Database

    Mocioiu, V.; Ortega-Martorell, S.; Olier, I.; Jabłoński, Michal; Starčuková, Jana; Lisboa, P.; Arús, C.; Julia-Sapé, M.

    2015-01-01

    Roč. 28, S1 (2015), S518 ISSN 0968-5243. [ESMRMB 2015. Annual Scientific Meeting /32./. 01.09.2015-03.09.2015, Edinburgh] Institutional support: RVO:68081731 Keywords : MR Spectroscopy * signal processing * jMRUI * software development * XML Subject RIV: BH - Optics, Masers, Lasers

  1. Software System for the Calibration of X-Ray Measuring Instruments

    International Nuclear Information System (INIS)

    Gaytan-Gallardo, E.; Tovar-Munoz, V. M.; Cruz-Estrada, P.; Vergara-Martinez, F. J.; Rivero-Gutierrez, T.

    2006-01-01

    A software system that facilitates the calibration of X-ray measuring instruments used in medical applications is presented. The Secondary Standard Dosimetry Laboratory (SSDL) of the Nuclear Research National Institute in Mexico (ININ in Spanish) supports activities concerned with ionizing radiation in the medical area. One of these activities is the calibration of X-ray measuring instruments, in terms of air kerma or exposure, by the substitution method in an X-ray beam at a point where the rate has been determined by means of a standard ionization chamber. To automate this process, a software system has been developed; the calibration system is composed of an X-ray unit, a Dynalizer IIIU X-ray meter by RADCAL, a commercial data acquisition card, the software system, and the units to be tested and calibrated. A quality control plan has been applied in the development of the software system, ensuring that quality assurance procedures and standards are being followed
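
    In its simplest form, the substitution method mentioned above reduces to the ratio between the reference air-kerma rate established with the standard chamber and the corrected reading of the instrument under test at the same point in the beam. The sketch below only illustrates that arithmetic; the function name and the single lumped correction factor are assumptions for the example, not part of the SSDL system.

    # Illustrative calibration coefficient by the substitution method.
    def calibration_coefficient(reference_kerma_rate, instrument_reading,
                                correction_factor=1.0):
        """N = reference air-kerma rate / corrected instrument reading."""
        if instrument_reading <= 0:
            raise ValueError("instrument reading must be positive")
        return reference_kerma_rate / (instrument_reading * correction_factor)

    # Example: reference rate 10.0 mGy/h, device reads 9.6 mGy/h.
    print(calibration_coefficient(10.0, 9.6))   # ~1.04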

  2. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability
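
    As a minimal illustration of testing against an interface rather than a graphical user interface, the sketch below (invented for this example, not taken from the article) defines an abstract interface and a contract test that any implementation must pass.

    # Interface-based testing sketch: the test exercises the contract of an
    # interface, so every implementation can be checked in the same way.
    import unittest
    from abc import ABC, abstractmethod

    class TemperatureSensor(ABC):
        @abstractmethod
        def read_celsius(self) -> float:
            """Return the current temperature in degrees Celsius."""

    class FakeSensor(TemperatureSensor):
        def read_celsius(self) -> float:
            return 21.5

    class SensorContractTest(unittest.TestCase):
        def make_sensor(self) -> TemperatureSensor:
            return FakeSensor()          # swap in any implementation under test

        def test_reading_is_physical(self):
            value = self.make_sensor().read_celsius()
            self.assertIsInstance(value, float)
            self.assertGreater(value, -273.15)   # above absolute zero

    if __name__ == "__main__":
        unittest.main()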

  3. Method of V ampersand V for safety-critical software in NPPs

    International Nuclear Information System (INIS)

    Kim, Jang-Yeol; Lee, Jang-Soo; Kwon, Kee-Choon

    1997-01-01

    Safety-critical software is software used in systems in which a failure could affect personal or equipment safety or result in large financial or social loss. Examples of systems using safety-critical software are systems such as plant protection systems in nuclear power plants (NPPs), process control systems in chemical plants, and medical instruments such as the Therac-25 medical accelerator. This paper presents verification and validation (V&V) methodology for safety-critical software in NPP safety systems. In addition, it addresses issues related to NPP safety systems, such as independence parameters, software safety analysis (SSA) concepts, commercial off-the-shelf (COTS) software evaluation criteria, and interrelationships among software and system assurance organizations. It includes the concepts of existing industrial standards on software V&V, Institute of Electrical and Electronics Engineers (IEEE) Standards 1012 and 1059. This safety-critical software V&V methodology covers V&V scope, a regulatory framework as part of its acceptance criteria, V&V activities and task entrance and exit criteria, reviews and audits, testing and quality assurance records of V&V material, configuration management activities related to V&V, and software V&V (SVV) plan (SVVP) production

  4. EMMC guidance on quality assurance for academic materials modelling software engineering

    OpenAIRE

    European Materials Modelling Council

    2015-01-01

    Proposed recommendations for software development in LEIT projects. This document presents the advice of software owners, commercial and academic, on what academic software could do to generate better quality software, ready to be used by third parties.

  5. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  6. Different Views of Software Agent Paradigm

    OpenAIRE

    Anand Kumar Pandey; Rashmi Pandey

    2012-01-01

    Agent modelling in software engineering is a relatively young area, and there are, as yet, no standard methodologies, development tools, or software architectures. Although our work emphasizes representing agent specifications using different views, these views are not oriented toward software engineering in the sense of providing a modelling notation that directly supports software development. There are, however, some software frameworks using different views of software agents that are beginni...

  7. CONRAD Software Architecture

    Science.gov (United States)

    Guzman, J. C.; Bennett, T.

    2008-08-01

    The Convergent Radio Astronomy Demonstrator (CONRAD) is a collaboration between the computing teams of two SKA pathfinder instruments, MeerKAT (South Africa) and ASKAP (Australia). Our goal is to produce the required common software to operate, process and store the data from the two instruments. Both instruments are synthesis arrays composed of a large number of antennas (40 - 100) operating at centimeter wavelengths with wide-field capabilities. Key challenges are the processing of high volume of data in real-time as well as the remote mode of operations. Here we present the software architecture for CONRAD. Our design approach is to maximize the use of open solutions and third-party software widely deployed in commercial applications, such as SNMP and LDAP, and to utilize modern web-based technologies for the user interfaces, such as AJAX.

  8. A study on the establishment of safety assessment guidelines of commercial grade item dedication in digitalized safety systems

    International Nuclear Information System (INIS)

    Hwang, H. S.; Kim, B. R.; Oh, S. H.

    1999-01-01

    Because the components used in the safety-related systems of nuclear power plants are becoming obsolete, the number of suppliers qualified for the nuclear QA program is decreasing, and maintenance costs are increasing, utilities have been considering the use of commercial grade digital computers as an alternative for resolving such issues. However, commercial digital computers use embedded pre-existing software, including operating system software, that is not developed under a nuclear grade QA program. Thus, it is necessary for utilities to establish processes for dedicating digital commercial grade items. A regulatory body also needs guidance to evaluate digital commercial products properly. This paper surveys the regulations and their regulatory guides, which establish the requirements for commercial grade item dedication, as well as industry standards and guidance applicable to safety-related systems. This paper provides some guidelines to be applied in evaluating the safety of digital upgrades and new digital plant protection systems in Korea

  9. Modernization of tank floor scanning system (TAFLOSS) Software

    International Nuclear Information System (INIS)

    Mohd Fitri Abd Rahman; Jaafar Abdullah; Zainul A Hassan

    2002-01-01

    The main objective of the project is to develop new user-friendly software that combines the second-generation software (developed in-house) with commercial software. This paper describes the development of computer codes for analysing the initial data and fitting an exponential curve. The curve fitting uses the least-squares technique. The software that has been developed is capable of giving results comparable to those of the commercial software. (Author)
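
    A least-squares exponential fit of the kind described can be sketched in a few lines by linearising the model y = A*exp(-k*t) and fitting a straight line to ln(y). This is only an illustration of the technique with synthetic data; it is not the TAFLOSS code.

    # Least-squares fit of y = A * exp(-k * t) via a linear fit to ln(y).
    import numpy as np

    t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([10.0, 7.4, 5.5, 4.1, 3.0, 2.2])   # synthetic measurements

    slope, intercept = np.polyfit(t, np.log(y), 1)  # slope = -k, intercept = ln(A)
    A, k = np.exp(intercept), -slope

    print(f"fitted model: y = {A:.3f} * exp(-{k:.3f} * t)")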

  10. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

    Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not require the initial cost of commercial software and enables libraries to have greater control over their working environmen...

  11. Software qualification in safety applications

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    2000-01-01

    The developers of safety-critical instrumentation and control systems must qualify the design of the components used, including the software in the embedded computer systems, in order to ensure that the component can be trusted to perform its safety function under the full range of operating conditions. There are well known ways to qualify analog systems using the facts that: (1) they are built from standard modules with known properties; (2) design documents are available and described in a well understood language; (3) the performance of the component is constrained by physics; and (4) physics models exist to predict the performance. These properties are not generally available for qualifying software, and one must fall back on extensive testing and qualification of the design process. Neither of these is completely satisfactory. The research reported here is exploring an alternative approach that is intended to permit qualification for an important subset of instrumentation software. The research goal is to determine if a combination of static analysis and limited testing can be used to qualify a class of simple, but practical, computer-based instrumentation components for safety application. These components are of roughly the complexity of a motion detector alarm controller. This goal is accomplished by identifying design constraints that enable meaningful analysis and testing. Once such design constraints are identified, digital systems can be designed to allow for analysis and testing, or existing systems may be tested for conformance to the design constraints as a first step in a qualification process. This will considerably reduce the cost and monetary risk involved in qualifying commercial components for safety-critical service
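
    One way to picture "design constraints that enable meaningful analysis" is a static check that flags constructs defeating analysis, such as potentially unbounded loops or recursion. The toy check below operates on Python source purely for illustration; it is not the qualification tooling discussed in the report.

    # Toy static check for two analysis-friendly design constraints:
    # no while-loops (potentially unbounded) and no direct recursion.
    import ast
    import textwrap

    SOURCE = """
    def alarm(level, threshold):
        while True:          # violates the bounded-loop constraint
            if level > threshold:
                return "ALARM"
    """

    def check_constraints(source):
        findings = []
        tree = ast.parse(textwrap.dedent(source))
        for node in ast.walk(tree):
            if isinstance(node, ast.While):
                findings.append(f"line {node.lineno}: while-loop may be unbounded")
            if isinstance(node, ast.FunctionDef):
                called = {c.func.id for c in ast.walk(node)
                          if isinstance(c, ast.Call) and isinstance(c.func, ast.Name)}
                if node.name in called:
                    findings.append(f"line {node.lineno}: {node.name} is recursive")
        return findings

    print("\n".join(check_constraints(SOURCE)) or "no violations")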

  12. Commercial Toilets

    Science.gov (United States)

    Whether you are looking to reduce water use in a new facility or replace old, inefficient toilets in commercial restrooms, a WaterSense labeled flushometer-valve toilet is a high-performance, water-efficient option worth considering.

  13. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is suggestive to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
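
    The software model just described - an irregular, directed acyclic graph of processing modules with random node weights - can be sketched directly. The generator below is a minimal illustration under assumed parameters (it does not enforce connectivity) and is not the authors' model.

    # Random DAG of processing modules with random work-loads.  Edges only go
    # from lower to higher indices, which guarantees acyclicity.
    import random

    random.seed(42)
    NUM_MODULES = 8
    EDGE_PROB = 0.35          # assumed edge density, not a value from the paper

    weights = {m: random.uniform(1.0, 10.0) for m in range(NUM_MODULES)}
    edges = [(i, j) for i in range(NUM_MODULES)
             for j in range(i + 1, NUM_MODULES) if random.random() < EDGE_PROB]

    # Trivial mapping strategy: assign modules to two processors round-robin
    # in index order (which is already a topological order).
    mapping = {m: m % 2 for m in range(NUM_MODULES)}

    print("edges:", edges)
    for proc in (0, 1):
        load = sum(w for m, w in weights.items() if mapping[m] == proc)
        print(f"processor {proc}: load {load:.1f}")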

  14. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements-and an effective system for managing them-the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text?now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  15. Space Commercialization

    Science.gov (United States)

    Martin, Gary L.

    2011-01-01

    A robust and competitive commercial space sector is vital to continued progress in space. The United States is committed to encouraging and facilitating the growth of a U.S. commercial space sector that supports U.S. needs, is globally competitive, and advances U.S. leadership in the generation of new markets and innovation-driven entrepreneurship. The policy goals are to: energize competitive domestic industries to participate in global markets and advance the development of satellite manufacturing, satellite-based services, space launch, terrestrial applications, and increased entrepreneurship; purchase and use commercial space capabilities and services to the maximum practical extent; actively explore the use of inventive, nontraditional arrangements for acquiring commercial space goods and services to meet United States Government requirements, including measures such as public-private partnerships; refrain from conducting United States Government space activities that preclude, discourage, or compete with U.S. commercial space activities; and pursue potential opportunities for transferring routine, operational space functions to the commercial space sector where beneficial and cost-effective.

  16. Sandia software guidelines, Volume 4: Configuration management

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    This volume is one in a series of Sandia Software Guidelines for use in producing quality software within Sandia National Laboratories. This volume is based on the IEEE standard and guide for software configuration management. The basic concepts and detailed guidance on implementation of these concepts are discussed for several software project types. Example planning documents for both projects and organizations are included.

  17. Differences in serum thyroglobulin measurements by 3 commercial immunoradiometric assay kits and laboratory standardization using Certified Reference Material 457 (CRM-457).

    Science.gov (United States)

    Lee, Ji In; Kim, Ji Young; Choi, Joon Young; Kim, Hee Kyung; Jang, Hye Won; Hur, Kyu Yeon; Kim, Jae Hyeon; Kim, Kwang-Won; Chung, Jae Hoon; Kim, Sun Wook

    2010-09-01

    Serum thyroglobulin (Tg) is essential in the follow-up of patients with differentiated thyroid carcinoma (DTC). However, interchangeability and standardization between Tg assays have not yet been achieved, even with the development of an international Tg standard (Certified Reference Material 457 [CRM-457]). Serum Tg from 30 DTC patients and serially diluted CRM-457 were measured using 3 different immunoradiometric assays (IRMA-1, IRMA-2, IRMA-3). The intraclass correlation coefficient (ICC) method was used to describe the concordance of each IRMA with CRM-457. The serum Tg values measured by the 3 different IRMAs correlated well (r > .85). The IRMA calibrated against CRM-457 showed the best ICC (p(1) = .98) for CRM-457. Hospitals caring for patients with DTC should either set their own cutoffs for IRMAs for Tg based on their patient pools, or adopt IRMAs standardized to CRM-457 and calibrate their laboratory using CRM-457.

  18. Software Reviews.

    Science.gov (United States)

    McGrath, Diane

    1990-01-01

    Reviews two programs: (1) "The Weather Machine" on understanding weather and weather forecasting and (2) "The Mystery of the Hotel Victoria" on problem solving in mathematics. Presents the descriptions, advantages, and weaknesses of the software. (YP)

  19. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  20. Software Reviews.

    Science.gov (United States)

    Slatta, Richard W. And Others

    1987-01-01

    Describes a variety of computer software. Subjects reviewed include history simulations and wordprocessing programs. Some of the eleven packages reviewed are Thog, North Utilities, HBJ Writer, Textra, Pro-cite, and Simulation Construction Kit. (BSR)

  1. Software Reviews.

    Science.gov (United States)

    Wulfson, Stephen, Ed.

    1990-01-01

    Reviewed are six computer software packages including "Lunar Greenhouse,""Dyno-Quest,""How Weather Works,""Animal Trackers,""Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)

  2. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  3. Software archeology: a case study in software quality assurance and design

    Energy Technology Data Exchange (ETDEWEB)

    Macdonald, John M [Los Alamos National Laboratory; Lloyd, Jane A [Los Alamos National Laboratory; Turner, Cameron J [COLORADO SCHOOL OF MINES

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  4. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) is around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...

  5. Ground control station software design for micro aerial vehicles

    Science.gov (United States)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the hardware and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All work was conducted on a quadrocopter model, a commonly accessible commercial construction. The article contains a characterization of the research object and the basics of operating micro aerial vehicles (MAV), and presents the components of the ground control station model. It also describes the communication standards used for building the station model. The further part of the work concerns the software of the product - the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions, communication, and control processes of the UAV. The process of creating the software and the field tests of the station model are also presented in the article.
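
    A ground control station of this kind typically receives fixed-layout telemetry frames from the vehicle. The parser below is a generic, hypothetical illustration: the 16-byte frame layout is invented for the example and is not the protocol used by the GIMSO application.

    # Hypothetical telemetry frame parser for a ground control station.
    import struct

    FRAME_FORMAT = "<BfffhB"   # id, lat, lon, altitude, battery_mV, flags
    FRAME_SIZE = struct.calcsize(FRAME_FORMAT)

    def parse_frame(raw):
        if len(raw) != FRAME_SIZE:
            raise ValueError(f"expected {FRAME_SIZE} bytes, got {len(raw)}")
        vid, lat, lon, alt, batt, flags = struct.unpack(FRAME_FORMAT, raw)
        return {"vehicle": vid, "lat": lat, "lon": lon, "alt_m": alt,
                "battery_mV": batt, "armed": bool(flags & 0x01)}

    sample = struct.pack(FRAME_FORMAT, 1, 53.13, 23.16, 87.5, 11100, 0x01)
    print(parse_frame(sample))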

  6. Weighting Factors for the Commercial Building Prototypes Used in the Development of ANSI/ASHRAE/IESNA Standard 90.1-2010

    Energy Technology Data Exchange (ETDEWEB)

    Jarnagin, Ronald E.; Bandyopadhyay, Gopal K.

    2010-01-21

    Detailed construction data from the McGraw Hill Construction Database was used to develop construction weights by climate zones for use with DOE Benchmark Buildings and for the ASHRAE Standard 90.1-2010 development. These construction weights were applied to energy savings estimates from simulation of the benchmark buildings to establish weighted national energy savings.

  7. VMStools: Open-source software for the processing, analysis and visualization of fisheries logbook and VMS data

    DEFF Research Database (Denmark)

    Hintzen, Niels T.; Bastardie, Francois; Beare, Doug

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook (...

  8. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook
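
    The coupling of VMS position records to logbook trips that VMStools performs in R can be pictured with a small pandas sketch. The column names, the data and the matching rule (vessel identifier plus departure/arrival time window) are assumptions made for this example and do not follow the VMStools data formats exactly.

    # Illustrative coupling of VMS pings to logbook trips by vessel and time.
    import pandas as pd

    vms = pd.DataFrame({
        "vessel": ["V1", "V1", "V2"],
        "time": pd.to_datetime(["2012-03-01 04:00", "2012-03-01 10:00",
                                "2012-03-01 05:00"]),
        "lat": [54.1, 54.3, 55.0],
        "lon": [6.2, 6.5, 3.9],
    })
    trips = pd.DataFrame({
        "vessel": ["V1", "V2"],
        "trip_id": ["T100", "T200"],
        "depart": pd.to_datetime(["2012-03-01 02:00", "2012-03-01 03:00"]),
        "arrive": pd.to_datetime(["2012-03-01 20:00", "2012-03-01 06:00"]),
    })

    merged = vms.merge(trips, on="vessel")
    merged = merged[(merged.time >= merged.depart) & (merged.time <= merged.arrive)]
    print(merged[["vessel", "trip_id", "time", "lat", "lon"]])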

  9. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  10. Safety Review related to Commercial Grade Digital Equipment in Safety System

    International Nuclear Information System (INIS)

    Yu, Yeongjin; Park, Hyunshin; Yu, Yeongjin; Lee, Jaeheung

    2013-01-01

    Upgrades or replacement of I&C systems in safety systems typically involve digital equipment developed in accordance with non-nuclear standards. However, the use of commercial grade digital equipment can introduce vulnerabilities such as software common-mode failure, electromagnetic interference, and unanticipated problems. Although guidelines and standards for the dedication of commercial grade digital equipment are available, applying these methods to commercial grade digital equipment in safety systems presents some difficulties. This paper focuses on KINS regulatory guides and relevant documents for the dedication of commercial grade digital equipment and presents safety review experiences related to such equipment in safety systems. Dedication, including the evaluation of critical characteristics, is required to use commercial grade digital equipment in a safety system in accordance with KEPIC ENB 6370 and EPRI TR-106439. The dedication process should be controlled under a configuration management process, and appropriate methods, criteria, and evaluation results should be provided to verify the acceptability of the commercial digital equipment used for the safety function

  11. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    This article presents MIAWARE, a software for Medical Image Analysis With Automated Reporting Engine, which was designed and developed for doctor/radiologist assistance. It allows the user to analyze an image stack from a computed axial tomography scan of the lungs (thorax) and, at the same time, to mark all...... is automatically generated. Furthermore, MIAWARE software is accompanied by an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through a specially developed ontology. As a result...... pathologies on images and report their characteristics. The reporting process is normalized - radiologists cannot describe pathological changes with their own words, but can only use terms from a specific vocabulary set provided by the software. Consequently, a normalized radiological report...

  12. lessons and challenges from software quality assessment

    African Journals Online (AJOL)

    DJFLEX

    We discussed these lessons and challenges across two measurable characteristics, namely quality of design (life cycle ... KEYWORDS: Software, Software Quality, Quality Standard, Characteristics, Assessment, Challenges, Lessons. 1. ... F. Bakpo, Department of Computer Science, University of Nigeria, Nsukka, Nigeria ...

  13. 40 CFR 799.2155 - Commercial hexane.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 31 2010-07-01 2010-07-01 true Commercial hexane. 799.2155 Section 799... Test Rules § 799.2155 Commercial hexane. (a) Identification of test substance. (1) “Commercial hexane...; CAS No. 96-37-7). ASTM D 1836, formally entitled “Standard Specification for Commercial Hexanes,” is...

  14. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  15. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data-processing and control, developed for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion on the proposed revision of FASTBUS Standard Routines. (orig.)

  16. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...
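
    One concrete instance of the software-based measurement idea is computing an equivalent continuous sound level from a recording made through a sound level meter with a calibrated AC output. The sketch below assumes the calibration yields a known pascals-per-count factor; the factor and the synthetic signal are placeholders, not values from the book.

    # Equivalent continuous level (Leq) from a calibrated digital recording.
    import numpy as np

    P_REF = 20e-6                  # reference pressure, 20 micropascals
    PA_PER_COUNT = 1.0e-4          # assumed calibration factor (Pa per count)

    fs = 48_000
    t = np.arange(fs) / fs
    counts = 2000 * np.sin(2 * np.pi * 1000 * t)   # synthetic 1 kHz tone

    pressure = counts * PA_PER_COUNT               # convert counts to pascals
    leq = 10 * np.log10(np.mean(pressure**2) / P_REF**2)
    print(f"Leq = {leq:.1f} dB re 20 uPa")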

  17. Nested Cohort - R software package

    Science.gov (United States)

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.

  18. Software Reviews.

    Science.gov (United States)

    Science Software Quarterly, 1984

    1984-01-01

    Provides extensive reviews of computer software, examining documentation, ease of use, performance, error handling, special features, and system requirements. Includes statistics, problem-solving (TK Solver), label printing, database management, experimental psychology, Encyclopedia Britannica biology, and DNA-sequencing programs. A program for…

  19. Educational Software.

    Science.gov (United States)

    Northwest Regional Educational Lab., Portland, OR.

    The third session of IT@EDU98 consisted of five papers on educational software and was chaired by Tran Van Hao (University of Education, Ho Chi Minh City, Vietnam). "Courseware Engineering" (Nguyen Thanh Son, Ngo Ngoc Bao Tran, Quan Thanh Tho, Nguyen Hong Lam) briefly describes the use of courseware. "Machine Discovery Theorems in Geometry: A…

  20. Coordination Implications of Software Coupling in Open Source Projects

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos; Ågerfalk, Pär

    2010-01-01

    The effect of software coupling on the quality of software has been studied quite widely since the seminal paper on software modularity by Parnas [1]. However, the effect of the increase in software coupling on the coordination of the developers has not been researched as much. In commercial

  1. First International Workshop on Variability in Software Architecture (VARSA 2011)

    NARCIS (Netherlands)

    Galster, Matthias; Avgeriou, Paris; Weyns, Danny; Mannisto, Tomi

    2011-01-01

    Variability is the ability of a software artifact to be changed for a specific context. Mechanisms to accommodate variability include software product lines, configuration wizards and tools in commercial software, configuration interfaces of software components, or the dynamic runtime composition of

  2. DFI Computer Modeling Software (CMS)

    Energy Technology Data Exchange (ETDEWEB)

    Cazalet, E.G.; Deziel, L.B. Jr.; Haas, S.M.; Martin, T.W.; Nesbitt, D.M.; Phillips, R.L.

    1979-10-01

    The database management system used to create, edit and store model data and solutions for the LEAP system is described. The software is written entirely in FORTRAN-G for the IBM 370 series of computers and provides an interface to the commercial database system SYSTEM-2000.

  3. Astronomers as Software Developers

    Science.gov (United States)

    Pildis, Rachel A.

    2016-01-01

    Astronomers know that their research requires writing, adapting, and documenting computer software. Furthermore, they often have to learn new computer languages and figure out how existing programs work without much documentation or guidance and with extreme time pressure. These are all skills that can lead to a software development job, but recruiters and employers probably won't know that. I will discuss all the highly useful experience that astronomers may not know that they already have, and how to explain that knowledge to others when looking for non-academic software positions. I will also talk about some of the pitfalls I have run into while interviewing for jobs and working as a developer, and encourage you to embrace the curiosity employers might have about your non-standard background.

  4. User systems guidelines for software projects

    Energy Technology Data Exchange (ETDEWEB)

    Abrahamson, L. (ed.)

    1986-04-01

    This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)

  5. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). The company has also acquired extensive knowledge and practical experience with digital long-term preservation technologies. This wide spectrum of activities puts us in a position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic responses to inputs - make computer programs rich compared to documents or linear multimedia. The article opens the questions that stand at the beginning of the road to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects are covered in proper balance. The following questions are asked: why preserve computer programs permanently at all; who should do this and for whom; when should we think about permanent program preservation; what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...); and where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  6. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  7. International Liability Issues for Software Quality

    National Research Council Canada - National Science Library

    Mead, Nancy

    2003-01-01

    This report focuses on international law related to cybercrime, international information security standards, and software liability issues as they relate to information security for critical infrastructure applications...

  8. Geoportale del Consorzio LaMMA Disseminazione di dati meteo in near real-time tramite standard OGC e software Open Source

    Directory of Open Access Journals (Sweden)

    Simone Giannechini

    2014-02-01

    Full Text Available This paper describes the spatial data infrastructure (SDI) used by the LaMMA Consortium - Environmental Modelling and Monitoring Laboratory for Sustainable Development of Tuscany Region - for sharing, viewing and cataloguing (metadata and related information) all geospatial data that are daily processed and used operationally in many meteorological and environmental applications. The SDI was developed using Open Source technologies; moreover, the geospatial data have been published through protocols based on OGC (Open Geospatial Consortium) standards such as WMS, WFS and CSW. GeoServer was used for disseminating geospatial data and maps through the OGC WMS and WFS protocols, while GeoNetwork was used as the cataloguing and search portal, also through the CSW protocol; finally, MapStore was used to implement the mash-up front-end. The innovative aspect of this portal is that it currently ingests, fuses and disseminates geospatial data related to the MetOc field from various sources in near real-time, in a comprehensive manner that allows users to create added-value visualizations for the support of operational use cases as well as to access and download the underlying data (where applicable).
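
    Because the portal publishes its layers through the standard OGC WMS interface, a map image can be fetched with a plain HTTP GetMap request built from the standard WMS 1.1.1 parameters. In the sketch below, the endpoint URL and the layer name are placeholders, not the actual LaMMA service configuration.

    # OGC WMS GetMap request using only standard protocol parameters.
    import requests

    WMS_ENDPOINT = "https://example.org/geoserver/wms"   # placeholder URL

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "meteo:temperature_2m",   # hypothetical layer name
        "SRS": "EPSG:4326",
        "BBOX": "9.0,42.0,12.5,44.5",       # roughly Tuscany, lon/lat order
        "WIDTH": "512",
        "HEIGHT": "512",
        "FORMAT": "image/png",
    }

    response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    with open("map.png", "wb") as fh:
        fh.write(response.content)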

  9. Prospective observer and software-based assessment of magnetic resonance imaging quality in head and neck cancer: Should standard positioning and immobilization be required for radiation therapy applications?

    Science.gov (United States)

    Ding, Yao; Mohamed, Abdallah S R; Yang, Jinzhong; Colen, Rivka R; Frank, Steven J; Wang, Jihong; Wassal, Eslam Y; Wang, Wenjie; Kantor, Michael E; Balter, Peter A; Rosenthal, David I; Lai, Stephen Y; Hazle, John D; Fuller, Clifton D

    2015-01-01

    The purpose of this study was to investigate the potential of a head and neck magnetic resonance simulation and immobilization protocol for reducing motion-induced artifacts and positional variance for radiation therapy applications. Two groups (group 1, 17 patients; group 2, 14 patients) of patients with head and neck cancer were included under a prospective, institutional review board-approved protocol and signed informed consent. A 3.0-T magnetic resonance imaging (MRI) scanner was used for anatomic and dynamic contrast-enhanced acquisitions, with a standard diagnostic MRI setup for group 1 and radiation therapy immobilization devices for group 2 patients. The impact of magnetic resonance simulation/immobilization was evaluated qualitatively by 2 observers in terms of motion artifacts and positional reproducibility, and quantitatively using 3-dimensional deformable registration to track the intrascan maximum motion displacement of voxels inside 7 manually segmented regions of interest. The image quality of group 2 (29 examinations) was rated significantly better than that of group 1 (50 examinations) by both observers in terms of motion minimization and imaging reproducibility. The quality of head and neck MRI in terms of motion-related artifacts and positional reproducibility was greatly improved by the use of radiation therapy immobilization devices. Consequently, immobilization with external and intraoral fixation in MRI examinations is required for radiation therapy applications. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  10. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
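
    The access-control rule described above - write permission tied to user roles, with revision tracking on every change - can be pictured with a small sketch. The role names and the rule set are invented for illustration and are not the GOPDb configuration.

    # Role-based write permission with simple revision tracking.
    ROLE_PERMISSIONS = {
        "viewer":  {"read"},
        "planner": {"read", "write"},
        "admin":   {"read", "write", "approve"},
    }

    def can(role, action):
        return action in ROLE_PERMISSIONS.get(role, set())

    def update_timeline(role, record, changes):
        if not can(role, "write"):
            raise PermissionError(f"role '{role}' may not modify timelines")
        updated = {**record, **changes}
        updated["revision"] = record.get("revision", 0) + 1   # track revisions
        return updated

    print(update_timeline("planner", {"task": "fueling", "revision": 3},
                          {"start": "T-04:00"}))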

  11. A Checklist to Evaluate Mapping Software.

    Science.gov (United States)

    Werner, Robert; Young, James

    1991-01-01

    Presents a checklist for evaluating commercially available mapping software. Analyzes software features by general categories including range of map types, availability and flexibility of data files, and program evaluation. Discusses ease of operation, the manual, tutorial, screens and help, error handling, design flexibility, hard copy output,…

  12. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    OpenAIRE

    Dr.A.R.Pon Periyasamy; Mrs A.Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products; ideally, performance should be excellent, with no defects. Software quality metrics are a subset of software metrics that focus on the quality aspects of the product, process, and project. A software defect prediction model helps in early detection of defects and contributes to t...
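
    A defect-prediction model of this kind is typically a classifier trained on module-level metrics. The sketch below uses scikit-learn with a tiny invented data set; real studies use public defect data sets and many more metrics, and the feature values here are placeholders.

    # Defect-prediction sketch: module metrics in, defect-prone flag out.
    from sklearn.tree import DecisionTreeClassifier

    # features per module: [lines of code, cyclomatic complexity, past changes]
    X = [[120,  4,  2],
         [800, 25, 14],
         [300,  9,  3],
         [950, 31, 20],
         [150,  5,  1],
         [600, 18, 11]]
    y = [0, 1, 0, 1, 0, 1]          # 1 = defect reported in a later release

    model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
    print(model.predict([[700, 22, 9]]))   # classify a new module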

  13. RANS Simulations using OpenFOAM Software

    Science.gov (United States)

    2016-01-01

    OpenFOAM offers the potential to perform large-scale CFD simulations without incurring the significant licence fees concomitant with using commercial software. The use of the OpenFOAM software suite for the performance of Reynolds

  14. DAMARIS – a flexible and open software platform for NMR spectrometer control

    OpenAIRE

    Gädke, Achim; Rosenstihl, Markus; Schmitt, Christopher; Stork, Holger; Nestle, Nikolaus

    2016-01-01

    Home-built NMR spectrometers with self-written control software have a long tradition in porous media research. Advantages of such spectrometers are not just lower costs but also more flexibility in developing new experiments (while commercial NMR systems are typically optimized for standard applications such as spectroscopy, imaging or quality control applications). Increasing complexity of computer operating systems, higher expectations with respect to user-friendliness and graphical use...

  15. RTSPM: real-time Linux control software for scanning probe microscopy.

    Science.gov (United States)

    Chandrasekhar, V; Mehta, M M

    2013-01-01

    Real time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.
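
    The scan logic of such software - raster the tip over a grid and, at each point, run a feedback step that holds the probe signal at a setpoint - can be sketched independently of the real-time kernel. The sketch below is a toy simulation with an invented probe model; it is not the RTSPM code.

    # Toy scanning-probe control loop: raster scan with proportional feedback.
    import math

    SETPOINT = 1.0      # desired probe signal (arbitrary units)
    GAIN = 0.5          # assumed proportional feedback gain

    def surface_height(x, y):
        return 0.2 * math.sin(x) * math.cos(y)     # synthetic topography

    def probe_signal(z_tip, z_surface):
        return math.exp(-(z_tip - z_surface))      # grows as the tip approaches

    def scan(nx=5, ny=5):
        z, image = 1.0, []
        for j in range(ny):
            row = []
            for i in range(nx):
                x, y = i * 0.5, j * 0.5
                for _ in range(20):                # feedback iterations per pixel
                    error = SETPOINT - probe_signal(z, surface_height(x, y))
                    z -= GAIN * error              # move the tip to restore setpoint
                row.append(round(z, 3))            # recorded "topography"
            image.append(row)
        return image

    for row in scan():
        print(row)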

  16. Commercial LANDSAT?

    Science.gov (United States)

    Private industry should assume responsibility either for the United States' land satellite (LANDSAT) system or for both the land and the weather satellite systems, recommends the Land Remote Sensing Satellite Advisory Committee. The committee (Eos, June 29, 1982, p. 553), composed of representatives from academia, industry, and government, has a working group that is evaluating the potential for commercialization of remote sensing satellites.The recommendations call for industry ownership or operation of either or both of the remote sensing systems, but only up to and including the holding of raw, unprocessed data. The National Aeronautics and Space Administration (NASA) currently operates LANDSAT but will be relinquishing its responsibility to the National Oceanic and Atmospheric Administration (NOAA) on January 31. NOAA already operates the U.S. civilian weather satellite service, which includes the NOAA-5, NOAA-6, and the Geostationary Operational Environmental (GOES) satellites (Eos, June 2, 1981, p. 522).

  17. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality reusability and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameters orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance navigation and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library s design. This allows the libraries to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as other environments like the GN&C analyst s simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
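
    A hedged sketch of what a "standardized math basis" can look like: a tiny vector module with one naming convention, one argument order, and one error-handling policy applied uniformly. This illustrates the idea only; it is not the NASA FSW math library, whose actual API is not reproduced here.

    # Mini vector library with uniform conventions: names are vec3_<op>,
    # vectors come first in argument lists, and invalid input raises ValueError.
    import math

    def _check(v):
        if len(v) != 3:
            raise ValueError("Vec3 must have exactly 3 components")

    def vec3_add(a, b):
        _check(a); _check(b)
        return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

    def vec3_dot(a, b):
        _check(a); _check(b)
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    def vec3_unit(a):
        _check(a)
        norm = math.sqrt(vec3_dot(a, a))
        if norm == 0.0:
            raise ValueError("cannot normalize a zero vector")
        return (a[0] / norm, a[1] / norm, a[2] / norm)

    print(vec3_unit(vec3_add((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))))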

  18. Principled Construction of Software Safety Cases

    OpenAIRE

    Hawkins, Richard; Habli, Ibrahim; Kelly, Tim

    2013-01-01

    A small, manageable number of common software safety assurance principles can be observed from software assurance standards and industry best practice. We briefly describe these assurance principles and explain how they can be used as the basis for creating software safety arguments.

  19. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. Sandia Software Guidelines, Volume 2. Documentation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standards for software documentation, this volume provides guidance in the selection of an adequate document set for a software project and example formats for many types of software documentation. A tutorial on life cycle documentation is also provided. Extended document thematic outlines and working examples of software documents are available on electronic media as an extension of this volume.

  1. Regional vegetation management standards for commercial ...

    African Journals Online (AJOL)

    The intensity of vegetation management required to produce significant growth benefits decreased with increasing altitude, as did the area that needed to be kept free from competing vegetation. In contrast ... Keywords: across-site comparisons; altitude; cost-benefit comparisons; site specific; weed control; weeding intensity

  2. Regional vegetation management standards for commercial pine ...

    African Journals Online (AJOL)

    Trial sites were selected across different physiographic regions such that a range of altitudinal, climatic and environmental gradients were represented. ... Two of the trials were situated at lower-altitude sites (900 m and 1 000 m above sea level [asl]), one at a mid-altitude site (1 267 m asl), and one at a higher-altitude site (1 ...

  3. An engineering context for software engineering

    OpenAIRE

    Riehle, Richard D.

    2008-01-01

    New engineering disciplines are emerging in the late Twentieth and early Twenty-first Century. One such emerging discipline is software engineering. The engineering community at large has long harbored a sense of skepticism about the validity of the term software engineering. During most of the fifty-plus years of software practice, that skepticism was probably justified. Professional education of software developers often fell short of the standard expected for conventional engineers; so...

  4. Commercial applications

    Science.gov (United States)

    The near-term (one- to five-year) needs of domestic and foreign commercial suppliers of radiochemicals and radiopharmaceuticals for electromagnetically separated stable isotopes are assessed. Only isotopes purchased to make products for sale and profit are considered. Radiopharmaceuticals produced from enriched stable isotopes supplied by the Calutron facility at ORNL are used in about 600,000 medical procedures each year in the United States. A temporary or permanent disruption of the supply of stable isotopes to the domestic radiopharmaceutical industry could curtail, if not eliminate, the use of such diagnostic procedures as the thallium heart scan, the gallium cancer scan, the gallium abscess scan, and the low-radiation-dose thyroid scan. An alternative source of enriched stable isotopes exists in the USSR. Alternative starting materials could, in theory, eventually be developed for both the thallium and gallium scans. The development of a new technology for these purposes, however, would take at least five years and would be expensive. Hence, any disruption of the supply of enriched isotopes from ORNL and the resulting unavailability of critical nuclear medicine procedures would have a dramatic negative effect on the level of health care in the United States.

  5. Producing and supporting sharable software

    International Nuclear Information System (INIS)

    Johnstad, H.; Nicholls, J.

    1987-02-01

    A survey is reported that addressed the question of shareable software for the High Energy Physics community. Statistics are compiled for the responses of 54 people attending a conference on the subject of shareable software to a questionnaire which addressed the usefulness of shareable software, preference of programming language, and source management tools. The results reflect a continued need for shareable software in the High Energy Physics community and a desire that this effort be performed in coordination. A strong mandate is also claimed for large facilities to support the community with software and for these facilities to act as distribution points. Considerable interest is expressed in languages other than FORTRAN, the desire for standards or rules in programming is expressed, and a need is identified for source management tools.

  6. Commercial lumber, round timbers, and ties

    Science.gov (United States)

    David E. Kretschmann

    2010-01-01

    When sawn, a log yields round timber, ties, or lumber of varying quality. This chapter presents a general discussion of grading, standards, and specifications for these commercial products. In a broad sense, commercial lumber is any lumber that is bought or sold in the normal channels of commerce. Commercial lumber may be found in a variety of forms, species, and types...

  7. Year 2000 commercial issues

    International Nuclear Information System (INIS)

    Kratz, M.P.J.; Booth, R.T.

    1998-01-01

    This presentation focused on commercial aspects of the Y2K problem, including: (1) special communication issues, (2) outsourcing transactions, (3) joint ventures and the significance for the oil and gas industry, and (4) contingency planning. Communication issues involve interaction with suppliers and vendors of critical systems, liability for Y2K communications (misrepresentation, defamation, promissory estoppel, statutory liability), securities disclosure (Canadian and US SEC requirements), protected communications, and protection for Year 2000 statements. Outsourcing problems highlighted include the resistance of suppliers to assume responsibility for Y2K problem remediation, factors which support and negate supplier responsibility, the scope of suppliers' obligations, and warranties in respect of third-party software. Regarding joint ventures, questions concerning limitations on liability, supply warranties, stand-by arrangements, stockpiling inventory, indemnities, confidentiality, operator compensation versus operator risk, and insurance were raised and addressed. Among contingency planning issues, the questions of a Y2K legal audit and the disclosure aspects of contingency planning were the featured concerns. figs

  8. Balancing energy conservation and occupant needs in ventilation rate standards for Big Box stores and other commercial buildings in California. Issues related to the ASHRAE 62.1 Indoor Air Quality Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Apte, Mike G. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-10-31

    This report considers the question of whether the California Energy Commission should incorporate the ASHRAE 62.1 ventilation standard into the Title 24 ventilation rate (VR) standards, thus allowing buildings to follow the Indoor Air Quality Procedure. This, in contrast to the current prescriptive standard, allows the option of using ventilation rate as one of several strategies, which might include source reduction and air cleaning, to meet specified targets of indoor air concentrations and occupant acceptability. The research findings reviewed in this report suggest that a revised approach to a ventilation standard for commercial buildings is necessary, because the current prescriptive ASHRAE 62.1 Ventilation Rate Procedure (VRP) apparently does not provide occupants with either sufficiently acceptable or sufficiently health-protective air quality. One possible solution would be a dramatic increase in the minimum ventilation rates (VRs) prescribed by a VRP. This solution, however, is not feasible for at least three reasons: the current need to reduce energy use rather than increase it further, the problem of polluted outdoor air in many cities, and the apparent limited ability of increased VRs to reduce all indoor airborne contaminants of concern (per Hodgson (2003)). Any feasible solution is thus likely to include methods of pollutant reduction other than increased outdoor air ventilation, e.g., source reduction or air cleaning. The alternative 62.1 Indoor Air Quality Procedure (IAQP) offers multiple possible benefits in this direction over the VRP, but seems too limited by insufficient specifications and inadequate available data to provide adequate protection for occupants. Ventilation system designers rarely choose to use it, finding it too arbitrary and requiring the use of much non-engineering judgment and information that is not readily available. This report suggests strategies to revise the current ASHRAE IAQP to reduce its current limitations. These

  9. National Software Reference Library (NSRL)

    Science.gov (United States)

    National Software Reference Library (NSRL) (PC database for purchase)   A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.

  10. Experiment to evaluate software safety

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-01-01

    The process of licensing nuclear power plants for operation consists of mandatory steps featuring detailed examination of the instrumentation and control system by the safety authorities, including its software. The criticality of this software obliges the manufacturer to develop it in accordance with the IEC 880 standard, 'Computer software in nuclear power plant safety systems', issued by the International Electrotechnical Commission. The evaluation approach, a two-stage assessment, is described in detail. In this context, the IPSN (Institute of Protection and Nuclear Safety), the technical support body of the safety authority, uses the MALPAS tool to analyse the quality of the programs. (R.P.). 4 refs

  11. Free software and open source databases

    Directory of Open Access Journals (Sweden)

    Napoleon Alexandru SIRITEANU

    2006-01-01

    The emergence of free/open source software (FS/OSS) enterprises seeks to push software development out of the academic stream into the commercial mainstream, and as a result, end-user applications such as open source database management systems (PostgreSQL, MySQL, Firebird) are becoming more popular. Companies like Sybase, Oracle, Sun, and IBM are increasingly implementing open source strategies and porting programs/applications to the Linux environment. Open source software is redefining the software industry in general and database development in particular.

  12. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
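
    The GKF itself is an ANSI C library with user-supplied subfunctions; as a language-neutral reminder of the linear predict/update cycle it generalizes (an illustrative NumPy sketch, not the GKF API, with made-up matrices and noise values), one filter step can be written as:

        import numpy as np

        def kalman_step(x, P, z, F, H, Q, R):
            """One linear Kalman filter cycle: propagate the state, then fuse a measurement."""
            # Propagation (time update)
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            # Measurement update
            S = H @ P_pred @ H.T + R                # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new

        # Toy 1D constant-velocity example (position measured, velocity estimated)
        F = np.array([[1.0, 1.0], [0.0, 1.0]])
        H = np.array([[1.0, 0.0]])
        Q = 0.01 * np.eye(2)
        R = np.array([[0.5]])
        x, P = np.zeros(2), np.eye(2)
        x, P = kalman_step(x, P, np.array([1.2]), F, H, Q, R)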

  13. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required, and scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
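
    As a hedged sketch of the plugin-plus-script idea described above (the actual system's interfaces are not given in this record, so all names below are hypothetical), a minimal Python version might register instrument drivers behind a common interface and let short imperative scripts drive them:

        class InstrumentPlugin:
            """Minimal common interface that hardware plugins would implement."""
            def set(self, parameter, value):
                raise NotImplementedError
            def read(self, parameter):
                raise NotImplementedError

        class DummyThermometer(InstrumentPlugin):
            """Stand-in plugin; a real one would talk to an interface library."""
            def __init__(self):
                self._temperature = 4.2
            def set(self, parameter, value):
                if parameter == "setpoint":
                    self._temperature = value
            def read(self, parameter):
                return self._temperature

        registry = {"thermometer": DummyThermometer()}

        def run_script(lines, registry):
            """Execute simple imperative commands like 'set thermometer setpoint 1.5'."""
            for line in lines:
                cmd, name, parameter, *rest = line.split()
                device = registry[name]
                if cmd == "set":
                    device.set(parameter, float(rest[0]))
                elif cmd == "read":
                    print(name, parameter, device.read(parameter))

        run_script(["set thermometer setpoint 1.5", "read thermometer temperature"], registry)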

  14. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required, and scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.

  15. Commercialization of vein contrast enhancement

    Science.gov (United States)

    Lovhoiden, Gunnar; Deshmukh, Harshal; Vrancken, Carlos; Zhang, Yong; Zeman, Herbert D.; Weinberg, Devin

    2003-07-01

    An ongoing clinical study of an experimental infrared (IR) device, the Vein Contrast Enhancer (VCE), which visualizes surface veins for medical access, indicates that a commercial device with the performance of the existing VCE would have significant clinical utility for even a very skilled phlebotomist. A proof-of-principle prototype VCE device has now been designed and constructed that captures IR images of surface veins with a commercial CCD camera, transfers the images to a PC for real-time software image processing to enhance the vein contrast, and projects the enhanced images back onto the skin with a modified commercial LCD projector. The camera and projector are mounted on precision slides, allowing for precise mechanical alignment of the two optical axes and for measuring the effects of axis misalignment. Precision alignment of the captured and projected images over the entire field of view is accomplished electronically by software adjustments of the translation, scaling, and rotation of the enhanced images before they are projected back onto the skin. This proof-of-principle prototype will be clinically tested, and the experience gained will lead to the development of a commercial device, OnTarget!, that is compact, easy to use, and will visualize accessible veins in almost all subjects needing venipuncture.
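
    The electronic alignment step described above amounts to applying a 2D affine transform (translation, scaling, rotation) to the enhanced image before projection. A schematic NumPy version of such a transform acting on pixel coordinates is shown below; the parameter values are invented for illustration and are not calibration data from the device.

        import numpy as np

        def affine_matrix(tx, ty, scale, angle_deg):
            """Build a 3x3 matrix combining rotation, isotropic scaling, and translation."""
            a = np.deg2rad(angle_deg)
            c, s = np.cos(a), np.sin(a)
            return np.array([
                [scale * c, -scale * s, tx],
                [scale * s,  scale * c, ty],
                [0.0,        0.0,       1.0],
            ])

        def transform_points(points, matrix):
            """Apply the affine transform to an (N, 2) array of pixel coordinates."""
            homogeneous = np.hstack([points, np.ones((len(points), 1))])
            return (homogeneous @ matrix.T)[:, :2]

        corners = np.array([[0, 0], [639, 0], [639, 479], [0, 479]], dtype=float)
        print(transform_points(corners, affine_matrix(tx=12.0, ty=-8.0, scale=1.02, angle_deg=0.5)))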

  16. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems built up from standard software components, in the same way that a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, including in safety-related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is, and what is particular about the reliability assessment of such software. Two techniques commonly used in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of software reliability in such systems are discussed. Finally, some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)

  17. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software life cycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. The book comprises 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  18. Open source software to control Bioflo bioreactors.

    Directory of Open Access Journals (Sweden)

    David A Burdge

    Bioreactors are designed to support highly controlled environments for the growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible, free, and open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system and will not require the installation of LabVIEW.
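
    The abstract mentions that simple protocols are entered as CSV scripts executed by Python-based code. The exact BiofloSoftware file format is not given in this record, so the sketch below is only a hypothetical illustration of the idea: each row holds a time offset, a controller name, and a setpoint.

        import csv
        import io

        # Hypothetical protocol: minutes, controller, setpoint (not the BiofloSoftware format)
        protocol_csv = """minutes,controller,setpoint
        0,agitation,200
        30,temperature,37.0
        120,agitation,400
        """

        def load_protocol(text):
            """Parse the CSV into a list of (minutes, controller, setpoint) steps."""
            rows = csv.DictReader(io.StringIO(text.replace(" ", "")))
            return [(float(r["minutes"]), r["controller"], float(r["setpoint"])) for r in rows]

        def execute(steps, send_setpoint):
            """Walk the protocol in time order, handing each setpoint to the controller."""
            for minutes, controller, setpoint in sorted(steps):
                send_setpoint(controller, setpoint)  # a real system would also wait until 'minutes'

        execute(load_protocol(protocol_csv), lambda c, v: print(f"{c} -> {v}"))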

  19. Open Source Software to Control Bioflo Bioreactors

    Science.gov (United States)

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for the growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible, free, and open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system and will not require the installation of LabVIEW. PMID:24667828

  20. Open source software to control Bioflo bioreactors.

    Science.gov (United States)

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for the growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible, free, and open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system and will not require the installation of LabVIEW.

  1. ScanImage: Flexible software for operating laser scanning microscopes

    Science.gov (United States)

    Pologruto, Thomas A; Sabatini, Bernardo L; Svoboda, Karel

    2003-01-01

    Background Laser scanning microscopy is a powerful tool for analyzing the structure and function of biological specimens. Although numerous commercial laser scanning microscopes exist, some of the more interesting and challenging applications demand custom design. A major impediment to custom design is the difficulty of building custom data acquisition hardware and writing the complex software required to run the laser scanning microscope. Results We describe a simple, software-based approach to operating a laser scanning microscope without the need for custom data acquisition hardware. Data acquisition and control of laser scanning are achieved through standard data acquisition boards. The entire burden of signal integration and image processing is placed on the CPU of the computer. We quantitate the effectiveness of our data acquisition and signal conditioning algorithm under a variety of conditions. We implement our approach in an open source software package (ScanImage) and describe its functionality. Conclusions We present ScanImage, software to run a flexible laser scanning microscope that allows easy custom design. PMID:12801419
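
    Placing the burden of signal integration on the CPU, as described above, essentially means averaging (binning) the raw digitizer samples acquired during each pixel dwell time. The following simplified NumPy sketch shows that step only; the sample counts are illustrative and this is not ScanImage's actual algorithm.

        import numpy as np

        def bin_samples_to_pixels(samples, samples_per_pixel, pixels_per_line, lines):
            """Average contiguous blocks of digitizer samples into image pixels."""
            needed = samples_per_pixel * pixels_per_line * lines
            frame = samples[:needed].reshape(lines, pixels_per_line, samples_per_pixel)
            return frame.mean(axis=2)

        # Fake acquisition: 512x512 pixels, 4 digitizer samples per pixel dwell time
        raw = np.random.poisson(lam=5.0, size=512 * 512 * 4).astype(float)
        image = bin_samples_to_pixels(raw, samples_per_pixel=4, pixels_per_line=512, lines=512)
        print(image.shape)  # (512, 512)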

  2. 75 FR 17080 - Energy Conservation Standards for Walk-in Coolers and Walk-in Freezers: Public Meeting and...

    Science.gov (United States)

    2010-04-05

    ... energy savings (NES) and net present value (NPV) at various standard levels. There is one national impact...) a discount rate that reflects the real consumer cost of capital and puts the LCC in present-value... average values, and can be combined with Crystal Ball (a commercially available software program) to...
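
    The notice above refers to net present value (NPV) and life-cycle cost (LCC) calculations at a given discount rate. As a generic, illustrative reminder of that arithmetic (not DOE's actual model, inputs, or the Crystal Ball tooling), a discounted cash-flow sum can be computed as follows; the cash flows and discount rate are invented.

        def net_present_value(cash_flows, discount_rate):
            """Discount a list of yearly cash flows (year 0 first) back to present value."""
            return sum(cf / (1.0 + discount_rate) ** year for year, cf in enumerate(cash_flows))

        # Hypothetical example: $500 extra equipment cost now, $80/year energy savings for 10 years
        flows = [-500.0] + [80.0] * 10
        print(round(net_present_value(flows, discount_rate=0.05), 2))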

  3. Vertical bone measurements from cone beam computed tomography images using different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

    This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the greatest with XoranCat (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)
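
    As a simplified stand-in for the statistical comparison described above (the study used one-way ANOVA with a Dunnett post-hoc test against the gold standard), the sketch below computes per-package mean differences from a gold standard and runs a plain one-way ANOVA with SciPy; the measurement values are fabricated for illustration, not the study's data.

        import numpy as np
        from scipy import stats

        # Fabricated linear measurements (mm) for five mandible regions
        gold = np.array([18.2, 15.1, 13.4, 12.8, 11.9])
        software = {
            "XoranCAT":   np.array([18.5, 15.3, 13.6, 13.1, 12.2]),
            "OnDemand3D": np.array([18.1, 15.0, 13.3, 12.7, 11.8]),
            "KDIS3D":     np.array([18.0, 15.0, 13.2, 12.7, 11.7]),
        }

        # Mean signed difference from the gold standard per package
        for name, values in software.items():
            print(name, round(float(np.mean(values - gold)), 2), "mm")

        # One-way ANOVA across the software-derived measurements
        f_stat, p_value = stats.f_oneway(*software.values())
        print("ANOVA p =", round(p_value, 3))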

  4. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  5. Auditing Community Software Development

    Directory of Open Access Journals (Sweden)

    Mészáros Gergely

    2015-12-01

    In accordance with European efforts related to Critical Information Infrastructure Protection, a special department called LRL-IBEK, designated under the Disaster Management, has been formed in Hungary. While the specific security issues of commercial applications are well understood and regulated by widely applied standards, an increasing share of information systems is developed partly or entirely in a different way, by the community. In this paper, different issues of the open development style are discussed with regard to the high requirements of Critical Information Infrastructures, and possible countermeasures are suggested for the identified problems.

  6. Ubuntuism, commodification, and the software dialectic

    OpenAIRE

    Chege, Mike

    2008-01-01

    “Free as in speech, but not free as in beer,” is the refrain made famous by Richard Stallman, the standard-bearer of the free software movement. However, many free software advocates seem to be of the opinion that the purity of free software is somehow tainted by any preoccupation with money or profit. Inevitably, this has implications for the economic sustainability of free software, for without a source of income, how can free software hope to survive? The challenge of finding a way to ensu...

  7. TOGAF usage in outsourcing of software development

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2013-12-01

    TOGAF is an Enterprise Architecture framework that provides a method for developing Enterprise Architecture, called the Architecture Development Method (ADM). The purpose of this paper is to examine whether TOGAF ADM can be used for developing software application architecture. Because software application architecture is one of the disciplines in the application development life cycle, it is important to find out how the enterprise architecture development method can support application architecture development. Having an open standard that can be used in application architecture development could help in the outsourcing of software development. If ADM could be used for software application architecture development, then we could consider its usability in the outsourcing of software development.

  8. High-performance commercial building systems

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, Stephen

    2003-10-01

    This report summarizes key technical accomplishments resulting from the three-year PIER-funded R&D program, "High Performance Commercial Building Systems" (HPCBS). The program targets the commercial building sector in California, an end-use sector that accounts for about one-third of all California electricity consumption and an even larger fraction of peak demand, at a cost of over $10B/year. Commercial buildings also have a major impact on occupant health, comfort and productivity. Building design and operations practices that influence energy use are deeply engrained in a fragmented, risk-averse industry that is slow to change. Although California's aggressive standards efforts have resulted in new buildings designed to use less energy than those constructed 20 years ago, the actual savings realized are still well below technical and economic potentials. The broad goal of this program is to develop and deploy a set of energy-saving technologies, strategies, and techniques, and improve processes for designing, commissioning, and operating commercial buildings, while improving health, comfort, and performance of occupants, all in a manner consistent with sound economic investment practices. Results are to be broadly applicable to the commercial sector for different building sizes and types, e.g. offices and schools, for different classes of ownership, both public and private, and for owner-occupied as well as speculative buildings. The program aims to facilitate significant electricity use savings in the California commercial sector by 2015, while assuring that these savings are affordable and promote high quality indoor environments. The five linked technical program elements contain 14 projects with 41 distinct R&D tasks. Collectively they form a comprehensive Research, Development, and Demonstration (RD&D) program with the potential to capture large savings in the commercial building sector, providing significant economic benefits to

  9. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, are items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. Advance accurate and trusted software cost estimates for both procured and in-house software and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5

  10. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    Science.gov (United States)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  11. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    Science.gov (United States)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing of an enterprise's internal software development activity is a common means of starting a new software business serving a vertical software market. It combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business, which makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to the development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  12. Artificial intelligence approaches to software engineering

    Science.gov (United States)

    Johannes, James D.; Macdonald, James R.

    1988-01-01

    Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not-so-well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce costs continues. Software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risks in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented here support the software development life cycle by changing current practices and methods: these should be replaced by better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps whose interfaces are standardized, supported, and checked by automated procedures that provide error detection, produce documentation, and ultimately support the actual design of complex programs.

  13. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
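
    As a loose analogy to monitoring an executing program against pre-specified requirements constraints (the SAGE language itself is not reproduced here, and the constraint names below are invented), a minimal Python monitor might check declared invariants against successive program states:

        # Hypothetical constraint monitor; real SAGE constraints would be richer than these lambdas.
        constraints = {
            "tank_level_within_bounds": lambda state: 0.0 <= state["tank_level"] <= 100.0,
            "pump_off_when_empty": lambda state: state["tank_level"] > 0.0 or not state["pump_on"],
        }

        def check_constraints(state, constraints):
            """Return the names of all requirements constraints violated by this state."""
            return [name for name, predicate in constraints.items() if not predicate(state)]

        trace = [
            {"tank_level": 42.0, "pump_on": True},
            {"tank_level": 0.0, "pump_on": True},   # violates the second constraint
        ]
        for step, state in enumerate(trace):
            for violation in check_constraints(state, constraints):
                print(f"step {step}: violated {violation}")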

  14. 76 FR 28662 - Industrial, Commercial, and Institutional Boilers and Process Heaters and Commercial and...

    Science.gov (United States)

    2011-05-18

    ...; FRL-9308-6] RIN 2060-AQ25; 2060-AO12 Industrial, Commercial, and Institutional Boilers and Process Heaters and Commercial and Industrial Solid Waste Incineration Units AGENCY: Environmental Protection... Sources: Industrial, Commercial, and Institutional Boilers and Process Heaters'' and ``Standards of...

  15. 78 FR 11996 - Energy Efficiency Program for Commercial and Industrial Equipment: Commercial and Industrial Pumps

    Science.gov (United States)

    2013-02-21

    ...: Commercial and Industrial Pumps AGENCY: Office of Energy Efficiency and Renewable Energy, Department of... standards for commercial and industrial pumps published on February 1, 2013, is extended to May 2, 2013... relating to commercial and industrial pumps is extended to May 2, 2013. ADDRESSES: Any comments submitted...

  16. The Synthetic Biology Open Language (SBOL) provides a community standard for communicating designs in synthetic biology.

    Science.gov (United States)

    Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M

    2014-06-01

    The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.
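
    For a flavor of what serializing a design component to XML involves, the Python sketch below emits a schematic fragment; the element and attribute names are invented for illustration and are not the actual SBOL schema, namespaces, or vocabulary.

        import xml.etree.ElementTree as ET

        # Schematic only: element and attribute names are made up, not SBOL's real vocabulary.
        root = ET.Element("DesignCollection")
        component = ET.SubElement(root, "DnaComponent", attrib={"id": "example_promoter"})
        ET.SubElement(component, "displayName").text = "Example promoter"
        ET.SubElement(component, "sequence").text = "ttgacagctagctcagtcct"

        print(ET.tostring(root, encoding="unicode"))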

  17. Building a virtual ligand screening pipeline using free software: a survey.

    Science.gov (United States)

    Glaab, Enrico

    2016-03-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.

  18. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer... Standard for Software Unit Testing'' with the clarifications and exceptions stated in Section C, ``Staff...

  19. A within-trial cost-effectiveness analysis of primary care referral to a commercial provider for weight loss treatment, relative to standard care—an international randomised controlled trial

    Science.gov (United States)

    Fuller, N R; Colagiuri, S; Schofield, D; Olson, A D; Shrestha, R; Holzapfel, C; Wolfenstetter, S B; Holle, R; Ahern, A L; Hauner, H; Jebb, S A; Caterson, I D

    2013-01-01

    Background: Due to the high prevalence of overweight and obesity there is a need to identify cost-effective approaches for weight loss in primary care and community settings. Objective: We evaluated the cost effectiveness of two weight loss programmes of 1-year duration, either standard care (SC) as defined by national guidelines, or a commercial provider (Weight Watchers) (CP). Design: This analysis was based on a randomised controlled trial of 772 adults (87% female; age 47.4±12.9 years; body mass index 31.4±2.6 kg m⁻²) recruited by health professionals in primary care in Australia, the United Kingdom and Germany. Both a health-sector and a societal perspective were adopted to calculate the cost per kilogram of weight loss and the ICER, expressed as the cost per quality-adjusted life year (QALY). Results: The cost per kilogram of weight loss was USD 122, 90 and 180 for the CP in Australia, the United Kingdom and Germany, respectively. For SC the cost was USD 138, 151 and 133, respectively. From a health-sector perspective, the ICER for the CP relative to SC was USD 18 266, 12 100 and 40 933 for Australia, the United Kingdom and Germany, respectively. Corresponding societal ICER figures were USD 31 663, 24 996 and 51 571. Conclusion: The CP was a cost-effective approach from a health funder and societal perspective. Despite participants in the CP group attending two to three times more meetings than the SC group, the CP was still cost effective even when these added patient travel costs were included. This study indicates that it is cost effective for general practitioners (GPs) to refer overweight and obese patients to a CP, which may be better value than expending public funds on GP visits to manage this problem. PMID:22929209
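
    The incremental cost-effectiveness ratio (ICER) quoted above is, in general terms, the difference in costs divided by the difference in QALYs between the two arms. The numbers below are invented placeholders, not the trial's data, and serve only to show the arithmetic.

        def icer(cost_intervention, cost_comparator, qaly_intervention, qaly_comparator):
            """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
            return (cost_intervention - cost_comparator) / (qaly_intervention - qaly_comparator)

        # Hypothetical illustration only (costs in USD and QALY gains are made up)
        print(round(icer(cost_intervention=800.0, cost_comparator=620.0,
                         qaly_intervention=0.040, qaly_comparator=0.030)))  # 18000 per QALY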

  20. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy; maintenance of software for long-term operating behavior. (HP)

  1. Controlling Software Piracy.

    Science.gov (United States)

    King, Albert S.

    1992-01-01

    Explains what software manufacturers are doing to combat software piracy, recommends how managers should deal with this problem, and provides a role-playing exercise to help students understand the issues in software piracy. (SR)

  2. Bioactivity and laundering resistance of five commercially available, factory-treated permethrin-impregnated fabrics for the prevention of mosquito-borne diseases: the need for a standardized testing and licensing procedure.

    Science.gov (United States)

    Faulde, Michael K; Pages, Frederic; Uedelhoven, Waltraud

    2016-04-01

    Personal protective measures against hematophagous vectors constitute the first line of defense against arthropod-borne diseases. In this regard, a major advance has been the development of residual insecticides that can be impregnated into clothing. Currently, however, information on specific treatment procedures, initial insecticide concentrations, arthropod toxicity, residual activity, and laundering resistance is either fragmentary or non-existent, and no World Health Organization Pesticides Evaluation Scheme or other guidelines exist for the standardized testing and licensing of insecticide-treated clothing. The aim of this study was to analyze the insecticide content, contact toxicity, laundering resistance, and residual activity of five commercially available and commonly used permethrin-treated fabrics (Insect Shield, ExOfficio, Sol's Monarch T-shirts, battle dress uniforms (BDUs), and Labonal socks) against vector-competent Aedes aegypti, Anopheles stephensi, and Culex pipiens mosquitoes under laboratory conditions. Prior to laundering, permethrin concentrations ranged from 4300 to 870 mg/m², whereas after 100 defined machine launderings the remaining permethrin content fell to between 1800 and 20 mg/m², a percentage permethrin loss of 58.1 to 98.5%. The highest 99% knockdown (KD99) efficacy of permethrin was detected in Ae. aegypti, followed by An. stephensi and Cx. pipiens, demonstrating that Ae. aegypti is the most sensitive species and Cx. pipiens the least sensitive. After 100 launderings, the remaining biocidal efficacy differed markedly among the five brands, with KD99 times varying from 38.8 ± 2.9 to >360 min for Ae. aegypti, from 44 ± 3.5 to >360 min for An. stephensi, and from 98 ± 10.6 to >360 min for Cx. pipiens. Overall, the ranking of the residual biocidal efficacies within the five brands tested was as follows: BDU ≈ Labonal > Sol's Monarch > ExOfficio > Insect Shield. When applying German Armed Forces

  3. Building Energy Management Open Source Software

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Saifur [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States)

    2017-08-25

    Funded by the U.S. Department of Energy in November 2013, a Building Energy Management Open Source Software (BEMOSS) platform was engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. According to the Energy Information Administration (EIA), small (5,000 square feet or smaller) and medium-sized (between 5,001 and 50,000 square feet) commercial buildings constitute about 95% of all commercial buildings in the U.S. These buildings typically do not have Building Automation Systems (BAS) to monitor and control building operation. While commercial BAS solutions exist, including those from Siemens, Honeywell, Johnson Controls and many more, they are not cost effective in the context of small- and medium-sized commercial buildings, and they typically work only with specific controller products from the same company. BEMOSS targets small- and medium-sized commercial buildings to address this gap.

  4. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  5. Software Metrics: Measuring Haskell

    OpenAIRE

    Ryder, Chris; Thompson, Simon

    2005-01-01

    Software metrics have been used in software engineering as a mechanism for assessing code quality and for targeting software development activities, such as testing or refactoring, at areas of a program that will most benefit from them. Haskell has many tools for software engineering, such as testing, debugging and refactoring tools, but software metrics have mostly been neglected. The work presented in this paper identifies a collection of software metrics for use with Haskell programs. Thes...

  6. Software systems as cities

    OpenAIRE

    Wettel, Richard; Lanza, Michele

    2010-01-01

    Software understanding takes up a large share of the total cost of a software system. The high costs attributed to software understanding activities are caused by the size and complexity of software systems, by the continuous evolution that these systems are subject to, and by the lack of physical presence which makes software intangible. Reverse engineering helps practitioners deal with the intrinsic complexity of software, by providing a broad range of patterns and techniques. One of...

  7. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  8. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Recommend that DoN create a software acquisition specialty, mandate basic schooling for software acquisition specialists, close certain acquisition loopholes that permit poor development practices...

  9. Software Release Management

    National Research Council Canada - National Science Library

    Hoek, Andre van der; Hall, Richard S; Heimbigner, Dennis; Wolf, Alexander L

    1996-01-01

    .... Both developers and users of such software are affected by these complications. Developers need to accurately document complex and changing dependencies among the systems constituting the software...

  10. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  11. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden ''backdoors'' is crucial to a project's success. In this context, ''authentication'' is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects
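
    As a sketch of the kind of extensible, rule-based scan such a tool performs (illustrative only: ROSE itself is a C/C++ compiler infrastructure, and the rule set below is hypothetical), a minimal checker written against Python's standard ast module might look like this:

      # Illustrative rule-based scan for suspicious constructs, using Python's
      # standard-library ast module as a stand-in; it only shows the shape of
      # an extensible "rule" check, not the ROSE API.
      import ast

      SUSPICIOUS_CALLS = {"eval", "exec", "system"}  # hypothetical rule set

      def find_suspicious_calls(source: str):
          """Return (line, name) pairs for calls that match the rule set."""
          findings = []
          for node in ast.walk(ast.parse(source)):
              if isinstance(node, ast.Call):
                  func = node.func
                  name = getattr(func, "id", getattr(func, "attr", None))
                  if name in SUSPICIOUS_CALLS:
                      findings.append((node.lineno, name))
          return findings

      sample = "import os\nos.system('ls /tmp')\nprint('ok')\n"
      print(find_suspicious_calls(sample))  # -> [(2, 'system')]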

  12. Speech to Text Software Evaluation Report

    CERN Document Server

    Martins Santo, Ana Luisa

    2017-01-01

    This document compares the out-of-the-box performance of three commercially available speech recognition packages: Vocapia VoxSigma TM, Google Cloud Speech, and Limecraft Transcriber. A set of evaluation criteria and test methods for speech recognition software is defined. The evaluation of these packages in noisy environments is also included for testing purposes. Recognition accuracy was compared across noisy environments and languages. Testing in an "ideal" non-noisy environment (a quiet room) was also performed for comparison.
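
    The report's exact scoring formula is not given here, but a common way to quantify recognition accuracy is the word error rate (WER); a minimal Python sketch, with a made-up reference/hypothesis pair, is:

      # Word error rate (WER): word-level edit distance divided by the number
      # of reference words. The example strings below are made up.
      def wer(reference: str, hypothesis: str) -> float:
          ref, hyp = reference.split(), hypothesis.split()
          d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
          for i in range(len(ref) + 1):
              d[i][0] = i
          for j in range(len(hyp) + 1):
              d[0][j] = j
          for i in range(1, len(ref) + 1):
              for j in range(1, len(hyp) + 1):
                  cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                  d[i][j] = min(d[i - 1][j] + 1,        # deletion
                                d[i][j - 1] + 1,        # insertion
                                d[i - 1][j - 1] + cost) # substitution
          return d[len(ref)][len(hyp)] / max(len(ref), 1)

      print(wer("switch on the beam dump", "switch on beam dumps"))  # 0.4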

  13. Reflections on Courses for Software Language Engineering

    NARCIS (Netherlands)

    Bagge, A.H.; Lämmel, R.; Zaytsev, V.; Demuth, B.; Stikkolorum, D.

    2014-01-01

    Software Language Engineering (SLE) has emerged as a field in computer science research and software engineering, but it has yet to become entrenched as part of the standard curriculum at universities. Many places have a compiler construction (CC) course and a programming languages (PL) course, but

  14. TTCN-3 for Distributed Testing Embedded Software

    NARCIS (Netherlands)

    Blom, Stefan; Deiß, Thomas; Ioustinova, Natalia; Kontio, Ari; van de Pol, Jan Cornelis; Rennoch, Axel; Sidorova, Natalia; Virbitskaite, I.; Voronkov, A.

    TTCN-3 is a standardized language for specifying and executing test suites that is particularly popular for testing embedded systems. Prior to testing embedded software in a target environment, the software is usually tested in the host environment. Executing in the host environment often affects

  15. ARROW (Version 2) Commercial Software Validation and Configuration Control

    International Nuclear Information System (INIS)

    HEARD, F.J.

    2000-01-01

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington

  16. Commercial Off-The-Shelf (COTS) Avionics Software Study

    National Research Council Canada - National Science Library

    Krodel, Jim

    2001-01-01

    .... The motivation is even a bit beyond monetary resources as the scarcity of highly trained personnel that can develop such systems has also provided fuel to the attractiveness of considering reuse...

  17. ARROW (Version 2) Commercial Software Validation and Configuration Control

    Energy Technology Data Exchange (ETDEWEB)

    HEARD, F.J.

    2000-02-10

    ARROW (Version 2), a compressible flow piping network modeling and analysis computer program from Applied Flow Technology, was installed for use at the U.S. Department of Energy Hanford Site near Richland, Washington.

  18. Software Testing Techniques and Strategies

    OpenAIRE

    Isha,; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and to cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. Further, it describes different software testing techniques and different software testing strategies.

  19. 7 CFR 51.2278 - U.S. Commercial.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial. 51.2278 Section 51.2278 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Standards for Shelled English Walnuts (Juglans Regia) Grades § 51.2278 U.S. Commercial. “U.S. Commercial...

  20. Commercial microwave space power

    International Nuclear Information System (INIS)

    Siambis, J.; Gregorwich, W.; Walmsley, S.; Shockey, K.; Chang, K.

    1991-01-01

    This paper reports on central commercial space power, in which power is generated via large-scale solar arrays and distributed to satellites via docking, tethering, or beamed power such as microwave or laser beams; this approach is being investigated as a potentially advantageous alternative to present-day technology, where each satellite carries its own power generating capability. The cost, size, and weight of electrical power service, together with overall mission requirements and flexibility, are the principal selection criteria, with the case of standard solar array panels mounted on the satellite as the reference point. This paper presents and investigates a current-technology design point for beamed microwave commercial space power. The design point requires that 25 kW be delivered to the user load with 30% overall system efficiency. The key elements of the design point are: an efficient rectenna at the user end; a high-gain, low-beamwidth, efficient antenna at the central space power station end; and a reliable and efficient CW microwave tube. Design trades to optimize the proposed near-term design point and to explore the characteristics of future systems were performed. Future development for making the beamed microwave space power approach more competitive against docking and tethering is discussed
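
    A small worked calculation (assuming the stated 30% figure is the end-to-end efficiency from generation to the user load) makes the design point concrete:

      # Worked arithmetic for the stated design point (assumption: 30% is the
      # end-to-end efficiency from generated power to power at the user load).
      delivered_kw = 25.0
      efficiency = 0.30
      generated_kw = delivered_kw / efficiency
      print(round(generated_kw, 1))  # ~83.3 kW must be generated at the station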

  1. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K Caglayan, J.C. Knight, L.D. Lee, D.F...J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software
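
    The independence assumption examined in the studies cited above can be illustrated with simple arithmetic (the failure probabilities below are hypothetical): if failures of n versions were truly independent, the probability that all of them fail on the same input would be the product of the individual failure probabilities, and an observed coincident-failure rate above that product indicates correlated errors:

      # Hypothetical numbers only: compare an observed coincident-failure rate
      # with the rate predicted under the independence assumption.
      def coincident_failure_prob(p: float, n: int) -> float:
          return p ** n

      p, n = 1e-3, 3                             # per-version failure prob., versions
      predicted = coincident_failure_prob(p, n)  # ~1e-09 under independence
      observed = 2e-7                            # hypothetical measured rate
      print(predicted, observed / predicted)     # excess factor ~200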

  2. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  3. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  4. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  5. Technical Support Document for Version 3.4.0 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, Robert G.; Richman, Eric E.; Schultz, Robert W.; Winiarski, David W.

    2007-09-14

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  6. Technical Support Document for Version 3.9.0 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, R. G.; Richman, Eric E.; Schultz, Ralph W.; Winiarski, David W.

    2011-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC are no longer included, but those sections remain in this document for reference purposes.

  7. Technical Support Document for Version 3.9.1 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, Robert G.; Richman, Eric E.; Schultz, Robert W.; Winiarski, David W.

    2012-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC and version 3.9.0 support for 2000 and 2001 IECC are no longer included, but those sections remain in this document for reference purposes.

  8. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  9. Simulation Testing of Embedded Flight Software

    Science.gov (United States)

    Shahabuddin, Mohammad; Reinholtz, William

    2004-01-01

    Virtual Real Time (VRT) is a computer program for testing embedded flight software by computational simulation in a workstation, in contradistinction to testing it in its target central processing unit (CPU). The disadvantages of testing in the target CPU include the need for an expensive test bed, the necessity for testers and programmers to take turns using the test bed, and the lack of software tools for debugging in a real-time environment. By virtue of its architecture, most of the flight software of the type in question is amenable to development and testing on workstations, for which there is an abundance of commercially available debugging and analysis software tools. Unfortunately, the timing of a workstation differs from that of a target CPU in a test bed. VRT, in conjunction with closed-loop simulation software, provides a capability for executing embedded flight software on a workstation in a close-to-real-time environment. A scale factor is used to convert between execution time in VRT on a workstation and execution on a target CPU. VRT includes high-resolution operating- system timers that enable the synchronization of flight software with simulation software and ground software, all running on different workstations.
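
    A minimal sketch of the scale-factor idea described above (the factor value and the workload are hypothetical):

      # Map elapsed workstation time to estimated target-CPU time via a scale
      # factor, as described above. The factor value here is hypothetical.
      import time

      SCALE_FACTOR = 4.0  # assumed: target CPU ~4x slower than the workstation

      def estimated_target_time(workstation_seconds: float) -> float:
          return workstation_seconds * SCALE_FACTOR

      start = time.perf_counter()
      sum(i * i for i in range(100_000))   # stand-in for a flight-software task
      elapsed = time.perf_counter() - start
      print(f"workstation: {elapsed:.4f} s, "
            f"estimated target: {estimated_target_time(elapsed):.4f} s")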

  10. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new trends and emerging approaches? What are open issues? Still, we struggle to answer these questions about the current state of SPI and related research. In this article, we present results from an updated systematic mapping study to shed light on the field of SPI, to develop a big... Beyond these specific topics, the study results also show an increasing interest in secondary studies with the purpose of aggregating and structuring SPI-related knowledge. Finally, the present study helps direct future research by identifying under-researched topics awaiting further investigation.

  11. A Model for Joint Software Reviews

    Science.gov (United States)

    1998-10-01

    or more specified systems. Figure 5: Quality Characteristics [ISO/IEC 9126-1, 1996]. Despite the lack of prior study and classification of IR issues...Company. [ISO/IEC 9126-1, 1996] Information Technology - Software quality characteristics and metrics - Part 1: Quality characteristics and sub-characteristics, Standard (No. ISO/IEC 9126-1). [ISO/IEC 12207, 1995] Information Technology - Software Life Cycle Processes, Standard (No. ISO/IEC 12207

  12. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or diminish the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  13. Commercial Skills Test Information Management System final report and self-sustainability plan : [technology brief].

    Science.gov (United States)

    2014-04-01

    The Commercial Skills Test Information Management System (CSTIMS) was developed to address the fraudulent issuance of commercial driver's licenses (CDLs) across the United States. CSTIMS was developed as a Web-based, software-as-a-service system to...

  14. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process use to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  15. Coordinate Standard Measurement Development

    Energy Technology Data Exchange (ETDEWEB)

    Hanshaw, R.A.

    2000-02-18

    A Shelton Precision Interferometer Base, which is used for calibration of coordinate standards, was improved through hardware replacement, software geometry error correction, and reduction of vibration effects. Substantial increases in resolution and reliability, as well as reduction in sampling time, were achieved through hardware replacement; vibration effects were reduced substantially through modification of the machine component dampening and software routines; and the majority of the machine's geometry error was corrected through software geometry error correction. Because of these modifications, the uncertainty of coordinate standards calibrated on this device has been reduced dramatically.

  16. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  17. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  18. Optical Software to Calculate Terrestrial Planet Finder Contrast Including Polarization Effects, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — BRO will provide commercially available optics software that dependably calculates image plane irradiance to the precision required by TPF missions. Calculations...

  19. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  20. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    ...) of Senate Report 106-50, is given for reference in Table 1-1 of the body of this report. This paper recommends a set of software acquisition and software engineering best practices that addresses the issues raised in the Senate Report...

  1. Amalgamation of Personal Software Process in Software ...

    African Journals Online (AJOL)

    Today, concern for quality has become an international movement. Even though most industrial organizations have now adopted modern quality principles, the software community has continued to rely on testing as the principal quality management method. Different decades have different trends in software engineering.

  2. Techniques and tools for software qualification in KNICS

    International Nuclear Information System (INIS)

    Cha, Kyung H.; Lee, Yeong J.; Cheon, Se W.; Kim, Jang Y.; Lee, Jang S.; Kwon, Kee C.

    2004-01-01

    This paper describes techniques and tools for qualifying safety software in the Korea Nuclear Instrumentation and Control System (KNICS). Safety software is developed and applied for a Reactor Protection System (RPS), an Engineered Safety Features and Component Control System (ESF-CCS), and a safety Programmable Logic Controller (PLC) in the KNICS. Requirements and design specifications of the safety software are written in both natural language and formal specification languages. Statechart is used for formal specification of the software of the ESF-CCS and the safety PLC, while NuSCR is used for formal specification of the software of the RPS. pSET (POSCON Software Engineering Tool) has been developed as a software development tool and utilized for IEC 61131-3 based PLC programming. The qualification of the safety software consists of software verification and validation (V&V) throughout the software life cycle, software safety analysis, software configuration management, software quality assurance, and COTS (Commercial-Off-The-Shelf) dedication. The criteria and requirements for qualifying the safety software have been established in accordance with Software Review Plan (SRP)/Branch Technical Position (BTP)-14, IEEE Std. 7-4.3.2-1998, NUREG/CR-6463, IEEE Std. 1012-1998, and so on. Figure 1 summarizes the qualification techniques and tools for the safety software

  3. Design and Implementation of a Mobile Phone Locator Using Software Defined Radio

    National Research Council Canada - National Science Library

    Larsen, Ian P

    2007-01-01

    ...) signal using software defined radio and commodity computer hardware. Using software designed by the GNU free software project as a base, standard GSM packets were transmitted and received over the air, and their arrival times detected...

  4. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  5. Commercial Buildings Characteristics, 1992

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-29

    Commercial Buildings Characteristics 1992 presents statistics about the number, type, and size of commercial buildings in the United States as well as their energy-related characteristics. These data are collected in the Commercial Buildings Energy Consumption Survey (CBECS), a national survey of buildings in the commercial sector. The 1992 CBECS is the fifth in a series conducted since 1979 by the Energy Information Administration. Approximately 6,600 commercial buildings were surveyed, representing the characteristics and energy consumption of 4.8 million commercial buildings and 67.9 billion square feet of commercial floorspace nationwide. Overall, the amount of commercial floorspace in the United States increased an average of 2.4 percent annually between 1989 and 1992, while the number of commercial buildings increased an average of 2.0 percent annually.

  6. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    Science.gov (United States)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
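
    As an illustration of the frequency-swept (chirp) waveform mentioned above (a NumPy sketch only; it does not use the gr-MRI or GNU Radio APIs, and the sample rate and duration are assumed), a 500 kHz linear sweep can be generated as follows:

      # NumPy sketch of a linear frequency-swept (chirp) pulse of the kind
      # described above; sample rate and duration are assumed values.
      import numpy as np

      fs = 2e6           # sample rate in Hz (assumed)
      duration = 5e-3    # pulse length in s (assumed)
      bandwidth = 500e3  # total sweep in Hz (matches the 500 kHz figure above)

      n = int(round(fs * duration))
      t = np.arange(n) / fs
      f0 = -bandwidth / 2.0                 # start frequency at baseband
      k = bandwidth / duration              # sweep rate in Hz/s
      phase = 2 * np.pi * (f0 * t + 0.5 * k * t**2)
      pulse = np.exp(1j * phase)            # complex baseband chirp
      print(pulse.size)                     # 10000 samples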

  7. Software agents for the dissemination of remote terrestrial sensing data

    Science.gov (United States)

    Toomey, Christopher N.; Simoudis, Evangelos; Johnson, Raymond W.; Mark, William S.

    1994-01-01

    Remote terrestrial sensing (RTS) data is constantly being collected from a variety of space-based and earth-based sensors. The collected data, and especially 'value-added' analyses of the data, are finding growing application for commercial, government, and scientific purposes. The scale of this data collection and analysis is truly enormous; e.g., by 1995, the amount of data available in just one sector, NASA space science, will reach 5 petabytes. Moreover, the amount of data, and the value of analyzing the data, are expected to increase dramatically as new satellites and sensors become available (e.g., NASA's Earth Observing System satellites). Lockheed and other companies are beginning to provide data and analysis commercially. A critical issue for the exploitation of collected data is the dissemination of data and value-added analyses to a diverse and widely distributed customer base. Customers must be able to use their computational environment (eventually the National Information Infrastructure) to obtain timely and complete information, without having to know the details of where the relevant data resides and how it is accessed. Customers must be able to routinely use standard, widely available (and, therefore, low cost) analyses, while also being able to readily create on demand highly customized analyses to make crucial decisions. The diversity of user needs creates a difficult software problem: how can users easily state their needs, while the computational environment assumes the responsibility of finding (or creating) relevant information, and then delivering the results in a form that users understand? A software agent is a self-contained, active software module that contains an explicit representation of its operational knowledge. This explicit representation allows agents to examine their own capabilities in order to modify their goals to meet changing needs and to take advantage of dynamic opportunities. In addition, the explicit representation

  8. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Gaycken, Goetz; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from SVN. We then proceeded with a migration of its code base from SVN to git. As the SVN repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository and the policy of onl...

  9. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of $12$ separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into Git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repositor...

  10. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  11. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  12. Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control

    Science.gov (United States)

    Schumann, Johann; Mbaya, Timmy; Menghoel, Ole

    2011-01-01

    Modern aircraft, both piloted fly-by-wire commercial aircraft as well as UAVs, more and more depend on highly complex safety critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have happened due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We will focus on the approach to develop reliable and robust health models for the combined software and sensor systems.
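
    A minimal two-node example of the diagnostic reasoning such a Bayesian network performs (all probabilities below are made up for illustration) is the posterior probability of a software fault given an observed sensor discrepancy:

      # Bayes-rule update of belief in a software fault after observing a
      # sensor discrepancy. All probabilities are illustrative only.
      p_fault = 0.01              # prior P(software fault)
      p_disc_given_fault = 0.9    # P(discrepancy | fault)
      p_disc_given_ok = 0.05      # P(discrepancy | no fault), e.g. sensor noise

      p_disc = (p_disc_given_fault * p_fault
                + p_disc_given_ok * (1 - p_fault))
      p_fault_given_disc = p_disc_given_fault * p_fault / p_disc
      print(round(p_fault_given_disc, 3))   # ~0.154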

  13. Development of Radio Frequency Antenna Radiation Simulation Software

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Rozaimah Abd Rahim; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2014-01-01

    Antennas are widely used nationwide for radio frequency propagation, especially for communication systems. Radio frequency covers the electromagnetic spectrum from 10 kHz to 300 GHz and is non-ionizing. Exposure of human beings to this radiation carries a radiation hazard risk. This software is under development using LabVIEW for radio frequency exposure calculation. In the first phase of development, the software is intended to calculate the possible maximum exposure for quick base station assessment, using prediction methods. The software can also be used for educational purposes. Some results of this software are compared with the commercial IXUS and freeware NEC software. (author)

  14. 7 CFR 51.1435 - U.S. Commercial Pieces.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial Pieces. 51.1435 Section 51.1435 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1435 U.S. Commercial Pieces. The...

  15. 7 CFR 51.1433 - U.S. Commercial Halves.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial Halves. 51.1433 Section 51.1433 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1433 U.S. Commercial Halves. The...

  16. Taking advantage of ground data systems attributes to achieve quality results in testing software

    Science.gov (United States)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  17. A NEW EXHAUST VENTILATION SYSTEM DESIGN SOFTWARE

    Directory of Open Access Journals (Sweden)

    H. Asilian Mahabady

    2007-09-01

    Full Text Available A Microsoft Windows based ventilation software package was developed to reduce the time-consuming and tedious procedure of exhaust ventilation system design and to assure accurate and reliable air pollution control related calculations. The package is tentatively named Exhaust Ventilation Design Software and was developed in the VB6 programming environment. The most important features of Exhaust Ventilation Design Software that are ignored in formerly developed packages are collector design and fan dimension data calculations. Automatic system balancing is another feature of this package. The Exhaust Ventilation Design Software design algorithm is based on two methods: balance by design (static pressure balance) and design by blast gate. The most important section of the software is a spreadsheet designed on the basis of American Conference of Governmental Industrial Hygienists calculation sheets. Exhaust Ventilation Design Software is developed so that engineers familiar with the American Conference of Governmental Industrial Hygienists datasheets can easily employ it for ventilation system design. Other sections include a collector design section (settling chamber, cyclone, and packed tower), a fan geometry and dimension data section, a unit converter section (to help engineers deal with units), a hood design section, and a Persian HTML help. Psychrometric correction is also considered in Exhaust Ventilation Design Software. In the Exhaust Ventilation Design Software development process, efforts were focused on improving the GUI (graphical user interface) and on the use of programming standards in software design. The reliability of the software has been evaluated, and the results show acceptable accuracy.
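
    A sketch of the balance-by-design (static pressure balance) check mentioned above, using the usual standard-air relation VP = (V/4005)^2 with V in ft/min and VP in inches of water gauge; the duct velocities, loss factors, and the 20% rule of thumb below are illustrative assumptions, not taken from the paper:

      # Compare the pressure losses of two merging branches; if they differ by
      # more than a chosen tolerance, the design (duct size or flow) is revised
      # rather than relying on dampers. Data and tolerance are hypothetical.
      def velocity_pressure(v_fpm: float) -> float:
          # standard-air relation: VP [in. w.g.] = (V [ft/min] / 4005)^2
          return (v_fpm / 4005.0) ** 2

      def branch_loss(v_fpm: float, loss_factor: float) -> float:
          # total branch loss modeled as a dimensionless factor times VP
          return loss_factor * velocity_pressure(v_fpm)

      branch_a = branch_loss(3500, 2.1)   # in. w.g.
      branch_b = branch_loss(3200, 2.4)
      ratio = max(branch_a, branch_b) / min(branch_a, branch_b)
      needs_redesign = ratio > 1.2        # assumed ~20% imbalance tolerance
      print(round(branch_a, 2), round(branch_b, 2), round(ratio, 2), needs_redesign)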

  18. Software for Data Acquisition AMC Module with PCI Express Interface

    CERN Document Server

    Szachowalow, S; Makowski, D; Butkowski, L

    2010-01-01

    The Free Electron Laser in Hamburg (FLASH) and the X-Ray Free Electron Laser (XFEL) are linear accelerators that require a complex and accurate Low Level Radio Frequency (LLRF) control system. Currently working systems are based on the aging Versa Module Eurocard (VME) architecture. One of the alternatives to the VME bus is the Advanced Telecommunications and Computing Architecture (ATCA) standard. The ATCA based LLRF controller mainly consists of a few ATCA carrier boards and several Advanced Mezzanine Cards (AMC). AMC modules are available in a variety of functions, such as ADC, DAC, data storage, data links, and even CPU cards. This paper focuses on the software that allows the user to collect and plot data from the commercially available TAMC900 board.

  19. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  20. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  1. Improving Software Reliability Forecasting

    NARCIS (Netherlands)

    Burtsy, Bernard; Albeanu, Grigore; Boros, Dragos N.; Popentiu, Florin; Nicola, V.F.

    1996-01-01

    This work investigates some methods for software reliability forecasting. A supermodel is presented as a suitable tool for predicting reliability during software project development. Also, time series forecasting for cumulative interfailure time is proposed and illustrated.
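
    As a naive illustration of time series forecasting of cumulative interfailure time (not the paper's supermodel; the failure data below are made up), one can fit a simple trend and extrapolate one step ahead:

      # Fit a quadratic trend to cumulative interfailure times and extrapolate
      # the next value; the interfailure data are hypothetical.
      import numpy as np

      interfailure = np.array([2.0, 3.5, 3.0, 5.0, 6.5, 8.0, 9.5])  # hours, made up
      cumulative = np.cumsum(interfailure)
      k = np.arange(1, len(cumulative) + 1)

      coeffs = np.polyfit(k, cumulative, deg=2)        # simple trend model
      next_cumulative = np.polyval(coeffs, len(k) + 1)
      # predicted next interfailure time:
      print(round(float(next_cumulative - cumulative[-1]), 2))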

  2. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  3. Paladin Software Support Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The Paladin Software Support Environment (SSE) occupies 2,241 square feet. It contains the hardware and software tools required to support the Paladin Automatic Fire...

  4. Software service history report

    Science.gov (United States)

    2002-01-01

    The safe and reliable operation of software within civil aviation systems and equipment has historically been assured through the application of rigorous design assurance applied during the software development process. Increasingly, manufacturers ar...

  5. TicTimer software for measuring tic suppression [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Jonathan K. Black

    2017-12-01

    Full Text Available Woods and Himle developed a standardized tic suppression paradigm (TSP) for the experimental setting, to quantify the effects of intentional tic suppression in Tourette syndrome. The present article describes a Java program that automates record keeping and reward dispensing during the several experimental conditions of the TSP. The software can optionally be connected to a commercial reward token dispenser to further automate reward delivery to the participant. The timing of all tics, 10-second tic-free intervals, and dispensed rewards is recorded in plain text files for later analysis. Expected applications include research on Tourette syndrome and related disorders.
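
    A minimal sketch of the record-keeping logic described above (in Python, whereas the actual TicTimer is a Java program; the event times, reward policy details, and file name are hypothetical) logs each tic and credits one reward per completed 10-second tic-free interval:

      # Log tic times and rewards to a plain text file; one reward is credited
      # for every completed 10-second span without a tic. Inputs are made up.
      TIC_FREE_INTERVAL = 10.0  # seconds

      def process_session(tic_times, session_length, log_path="session_log.txt"):
          events = [(t, "tic") for t in tic_times]
          rewards = []
          last_event = 0.0
          for t in sorted(tic_times) + [session_length]:
              # credit one reward per completed tic-free span before this event
              span_start = last_event
              while t - span_start >= TIC_FREE_INTERVAL:
                  span_start += TIC_FREE_INTERVAL
                  rewards.append((span_start, "reward"))
              last_event = t
          with open(log_path, "w") as log:
              for time_s, kind in sorted(events + rewards):
                  log.write(f"{time_s:.1f}\t{kind}\n")
          return len(rewards)

      print(process_session([3.0, 27.5], session_length=60.0))  # 5 rewards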

  6. An Introduction to Flight Software Development: FSW Today, FSW 2010

    Science.gov (United States)

    Gouvela, John

    2004-01-01

    Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and new development projects including Cockpit Avionics Upgrade are applied to projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high quality software. It will propose what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, modeling and simulation software. Specific challenges that have been met include the introduction and integration of Commercial Off the Shelf (COTS) Real Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object oriented UML design, iterative development using independent components, as well as rapid prototyping . In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes. Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by

  7. Software engineering measurement

    CERN Document Server

    Munson, PhD, John C

    2003-01-01

    By demonstrating how to develop simple experiments for the empirical validation of theoretical research and showing how to convert measurement data into meaningful and valuable information, this text fosters more precise use of software measurement in the computer science and software engineering literature. Software Engineering Measurement shows you how to convert your measurement data to valuable information that can be used immediately for software process improvement.

  8. Agent Building Software

    Science.gov (United States)

    2000-01-01

    AgentBuilder is a software component developed under an SBIR contract between Reticular Systems, Inc., and Goddard Space Flight Center. AgentBuilder allows software developers without experience in intelligent agent technologies to easily build software applications using intelligent agents. Agents are components of software that will perform tasks automatically, with no intervention or command from a user. AgentBuilder reduces the time and cost of developing agent systems and provides a simple mechanism for implementing high-performance agent systems.

  9. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  10. Software quality challenges.

    OpenAIRE

    Fitzpatrick, Ronan; Smith, Peter; O'Shea, Brendan

    2004-01-01

    This paper sets out a number of challenges facing the software quality community. These challenges relate to the broader view of quality and the consequences for software quality definitions. These definitions are related to eight perspectives of software quality in an end-to-end product life cycle. Research and study of software quality has traditionally focused on product quality for management information systems and this paper considers the challenge of defining additional quality factors...

  11. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.
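
    The distinction drawn here between dynamic testing and static techniques can be made concrete with a minimal sketch. The function and the test cases below are hypothetical, invented only to illustrate "execution of part of a software system for the purpose of detecting errors"; code reading and static analysis would instead inspect this source without running it.

```python
# Minimal sketch of dynamic testing: execute a small unit of code against
# expected results so that errors surface as test failures.
import unittest

def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(value, high))

class ClampTest(unittest.TestCase):
    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(42, 0, 10), 10)

if __name__ == "__main__":
    unittest.main()
```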

  12. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic... impact in software production. As is also apparent from the call for papers, these perspectives focus on qualities such as reuse, adaptability, and maintainability...

  13. On commercial media bias

    OpenAIRE

    Germano, Fabrizio

    2008-01-01

    Within the spokes model of Chen and Riordan (2007) that allows for non-localized competition among arbitrary numbers of media outlets, we quantify the effect of concentration of ownership on quality and bias of media content. A main result shows that too few commercial outlets, or better, too few separate owners of commercial outlets can lead to substantial bias in equilibrium. Increasing the number of outlets (commercial and non-commercial) tends to bring down this bias; but the strongest ef...

  14. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, and was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
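
    As a hedged illustration of the kind of analysis described (correlating team-skill measures with product-quality metrics across projects), the sketch below computes Pearson's r. The numbers, column names, and the choice of defect density as the quality metric are invented for illustration and are not data from the cited study.

```python
# Illustrative correlation of a team-skill score with a quality metric
# across projects (values are made up, not from the cited study).
from statistics import mean
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

team_skill = [2.1, 3.4, 1.8, 4.0, 2.9, 3.7]      # e.g. averaged experience rating
defect_density = [5.2, 3.1, 6.0, 1.9, 4.0, 2.4]  # e.g. defects per KLOC

print(f"r = {pearson_r(team_skill, defect_density):.2f}")  # expect a negative value
```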

  15. Prototype Software Assurance Framework (SAF): Introduction and Overview

    Science.gov (United States)

    2017-04-05

    weaknesses in managing cybersecurity risk cannot be overstated. Our field experience indicates that few acquisition and development programs currently imple... should provide. The purpose of the Requirements area of the SAF is to produce, analyze, and manage security requirements for the customer, product... including custom-developed software, commercial-off-the-shelf software, and open source software) to establish their criticality. Risk Management Plan

  16. Software variability management

    NARCIS (Netherlands)

    Bosch, J; Nord, RL

    2004-01-01

    During recent years, the amount of variability that has to be supported by a software artefact is growing considerably and its management is evolving into a major challenge during development, usage, and evolution of software artefacts. Successful management of variability in software leads to

  17. Software Language Evolution

    NARCIS (Netherlands)

    Vermolen, S.D.

    2012-01-01

    Software plays a critical role in our daily life. Vast amounts of money are spent on more and more complex systems. All software, regardless of whether it controls a plane or the game on your phone, is never finished. Software changes when it contains bugs or when new functionality is added. This process of

  18. Software Engineering for Portability.

    Science.gov (United States)

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  19. Astronomical Software Directory Service

    Science.gov (United States)

    Hanisch, R. J.; Payne, H.; Hayes, J.

    1998-01-01

    This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching.

  20. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  1. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions beg the question: why do companies choose a strategy based on giving away software assets? Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Instead, in this paper we investigate empirically what the companies’ incentives are by means of an exploratory case study of three organizations in different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  2. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.
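
    As a rough illustration of the software side described here (translating recorded brain signals into device output commands), the sketch below thresholds the power of a windowed signal to emit a binary command. It is a toy pipeline under invented parameters, not the API of any particular BCI platform; real systems filter the signal, extract richer features, and classify with trained models.

```python
# Toy BCI-style pipeline: estimate signal power in a window and map it
# to a device command. Purely illustrative.
import numpy as np

def band_power(window: np.ndarray) -> float:
    """Mean squared amplitude of one window of samples."""
    return float(np.mean(window ** 2))

def to_command(power: float, threshold: float) -> str:
    """Translate a feature value into a device output command."""
    return "MOVE" if power > threshold else "IDLE"

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 2000)   # stand-in for one recorded channel
window_size = 250                     # e.g. one second of samples at 250 Hz

for start in range(0, len(signal), window_size):
    window = signal[start:start + window_size]
    print(to_command(band_power(window), threshold=1.1))
```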

  3. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open inter-operable software development and software reuse.

  4. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering the code showed that results based only on architecture and design documents would have been misleading.

  5. Cost Optimization Through Open Source Software

    Directory of Open Access Journals (Sweden)

    Mark VonFange

    2010-12-01

    Full Text Available The cost of information technology (IT) as a percentage of overall operating and capital expenditures is growing as companies modernize their operations and as IT becomes an increasingly indispensable part of company resources. The price tag associated with IT infrastructure is a heavy one, and, in today's economy, companies need to look for ways to reduce overhead while maintaining quality operations and staying current with technology. With its advancements in availability, usability, functionality, choice, and power, free/libre open source software (F/LOSS) provides a cost-effective means for the modern enterprise to streamline its operations. iXsystems wanted to quantify the benefits associated with the use of open source software at their company headquarters. This article is the outgrowth of our internal analysis of using open source software instead of commercial software in all aspects of company operations.

  6. Global Software Development : - Software Architecture - Organization - Communication

    OpenAIRE

    Førde, Dan Sørensen

    2003-01-01

    Our globalized world has an impact on almost every area of our lives. Globalization affects businesses running around the globe and forces employees and managers to think of new ways of doing business. Globalization in the software development industry increased through the 1990s and is still increasing. The Internet makes collaboration possible, and developers no longer need to be co-located to work together on a common software development project. The ...

  7. Analysis of Schedule Determination in Software Program Development and Software Development Estimation Models

    Science.gov (United States)

    1988-09-01

    [Fragment of the report's front matter recovered from the record: table-of-contents entries such as "Software Development Standards", "Use of Management Principles", and "Software Programmer Ability", plus fragments of Mitre project data tables describing project factors.]

  8. Commercialization in Innovation Management

    DEFF Research Database (Denmark)

    Sløk-Madsen, Stefan Kirkegaard; Ritter, Thomas; Sornn-Friese, Henrik

    For any firm, the ultimate purpose of new product development is the commercialization of the new offerings. Despite its regular use in the product innovation and general management science literature, commercialization is only loosely defined and applied. This lack of conceptual clarity about... definitions and interpretations of commercialization. We offer a process-oriented definition of commercialization that is theoretically founded in the capability-based view of the firm. We also outline an agenda for future theoretical development and empirical research on commercialization aimed at advancing...

  9. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.
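
    A minimal sketch of the kind of mechanical check described above (code verified against a coding standard by a static analyzer). The single rule enforced here, a cap on function length, is a hypothetical stand-in for one rule of a real standard; production analyzers check many rules with far more sophistication.

```python
# Tiny static check: flag functions longer than a configured limit.
# A stand-in for one rule of a coding standard.
import ast
import sys

MAX_FUNCTION_LINES = 60  # hypothetical limit

def check_function_length(source: str, filename: str = "<string>") -> list[str]:
    findings = []
    tree = ast.parse(source, filename=filename)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            length = node.end_lineno - node.lineno + 1
            if length > MAX_FUNCTION_LINES:
                findings.append(
                    f"{filename}:{node.lineno}: {node.name} is {length} lines "
                    f"(limit {MAX_FUNCTION_LINES})"
                )
    return findings

if __name__ == "__main__":
    for path in sys.argv[1:]:
        with open(path, encoding="utf-8") as fh:
            for finding in check_function_length(fh.read(), path):
                print(finding)
```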

  10. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – the Software Innovation Research Lab (SIRL) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  11. Deployment of Open Standards in the Public Administration

    Directory of Open Access Journals (Sweden)

    Dorin IRIMESCU

    2008-01-01

    Full Text Available Open Source Software is receiving increasing attention in the public administration. The aim of the paper is to discuss the deployment of open source software for office automation and to present a synthesis of its up-to-date status. It is intended to sensitize provosts and policy makers regarding the value and benefits of open standards in public administration. The article explains why anyone would choose an open standard format for office documents instead of the obsolete binary formats. The responsibility of the public sector to protect permanent, open, and free access to public documents is emphasized. Switching IT systems to open source and open standards can solve these problems with significant financial benefits. One available open source software solution in the field, the Open Office suite, is presented as a viable and free alternative to commercial products. The article next reviews the existing competing open standards, OpenDocument and OpenXML. Finally, the measures and effort required to make a non-invasive migration to open technologies are presented.
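
    The practical meaning of an open document standard can be shown with a short sketch: an OpenDocument text file is a ZIP archive whose main content lives in content.xml, so any tool can read it with only standard libraries. The file name used below is hypothetical; this is an illustration of the format's openness, not part of the article.

```python
# Read the text content of an OpenDocument (.odt) file using only the
# standard library; possible because the format is open, zipped XML.
import zipfile
import xml.etree.ElementTree as ET

TEXT_NS = "{urn:oasis:names:tc:opendocument:xmlns:text:1.0}"

def extract_paragraphs(odt_path: str) -> list[str]:
    with zipfile.ZipFile(odt_path) as archive:
        content = archive.read("content.xml")
    root = ET.fromstring(content)
    # Collect the visible text of every <text:p> element.
    return ["".join(p.itertext()) for p in root.iter(f"{TEXT_NS}p")]

if __name__ == "__main__":
    for paragraph in extract_paragraphs("report.odt"):  # hypothetical file
        print(paragraph)
```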

  12. Eprints Institutional Repository Software: A Review

    Directory of Open Access Journals (Sweden)

    Mike R. Beazley

    2011-01-01

    Full Text Available Setting up an institutional repository (IR) can be a daunting task. There are many software packages out there, some commercial, some open source, all of which offer different features and functionality. This article will provide some thoughts about one of these software packages: Eprints. Eprints was one of the first IR software packages to appear and has been available for 10 years. It is under continual development by its creators at the University of Southampton and the current version is v3.2.3. Eprints is open-source, meaning that anyone can download and make use of the software for free and the software can be modified however the user likes. This presents clear advantages for institutions with smaller budgets and also for institutions that have programmers on staff. Eprints requires some additional software to run: Linux, Apache, MySQL, and Perl. This software is all open-source and already present on the servers of many institutions. There is now a version of Eprints that will run on Windows servers as well, which will make the adoption of Eprints even easier for some. In brief, Eprints is an excellent choice for any institution looking to get an IR up and running quickly and easily. Installation is straightforward, as is the initial configuration. Once the IR is up and running, users may upload documents and provide the necessary metadata for the records by filling out a simple web form. Embargoes on published documents are handled elegantly by the software, and the software links to the SHERPA/RoMEO database so authors can easily verify their rights regarding IR submissions. Eprints has some drawbacks, which will be discussed later in the review, but on the whole it is easy to recommend to anyone looking to start an IR. However, it is less clear that an institution with an existing IR based on another software package should migrate to Eprints.
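
    To complement the review, here is a hedged sketch of harvesting record metadata from an institutional repository over OAI-PMH, the machine interface that EPrints installations commonly expose. The repository base URL is hypothetical and the endpoint path can vary by installation; this is generic OAI-PMH, not an EPrints-specific API.

```python
# Harvest Dublin Core titles from an institutional repository via OAI-PMH.
# The repository URL below is hypothetical; adjust to a real endpoint.
import urllib.request
import xml.etree.ElementTree as ET

DC_NS = "{http://purl.org/dc/elements/1.1/}"

def list_titles(base_url: str) -> list[str]:
    url = f"{base_url}?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as response:
        root = ET.fromstring(response.read())
    return [t.text or "" for t in root.iter(f"{DC_NS}title")]

if __name__ == "__main__":
    for title in list_titles("https://repository.example.edu/cgi/oai2"):
        print(title)
```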

  13. An evaluation and acceptance of COTS software for FPGA-based controllers in NPPs

    International Nuclear Information System (INIS)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom; Kim, Jang-Yeol; Choi, Jong Gyun

    2016-01-01

    Highlights: • All direct/indirect COTS SW should be dedicated. • FPGA synthesis tools are important for the safety of new digital I&Cs. • No standards/reports are yet available to deal with the indirect SW – FPGA synthesis tools. • This paper proposes a new evaluation/acceptance process and criteria for indirect SW. - Abstract: FPGA (Field-Programmable Gate Array) has received much attention from the nuclear industry as an alternative platform to PLC (Programmable Logic Controller)-based digital I&C (Instrumentation & Control). The software aspect of FPGA development encompasses several commercial tools, such as logic synthesis and P&R (Place & Route), which should first be dedicated in accordance with domestic standards based on EPRI NP-5652. Even though the state-of-the-art supplementary report EPRI TR-1025243 addresses the issue, the dedication of indirect COTS (Commercial Off-The-Shelf) SW such as FPGA logic synthesis tools remains disputed. This paper proposes an acceptance process and evaluation criteria specific to COTS SW rather than commercial-grade direct items. It specifically incorporates indirect COTS SW and provides categorized evaluation criteria for acceptance, along with an explicit linkage between acceptance methods (Verification and Validation techniques) and those criteria. We applied the evaluation and acceptance process to a commercial FPGA logic synthesis tool being used to develop a new FPGA-based digital I&C system in Korea and confirmed its applicability.

  14. DEVELOPING EVALUATION INSTRUMENT FOR MATHEMATICS EDUCATIONAL SOFTWARE

    Directory of Open Access Journals (Sweden)

    Wahyu Setyaningrum

    2012-02-01

    Full Text Available The rapid increase and availability of mathematics software, either for classroom or individual learning activities, presents a challenge for teachers. It has been argued that many products are limited in quality. Some of the more commonly used software products have been criticized for poor content, activities which fail to address some learning issues, poor graphics presentation, inadequate documentation, and other technical problems. The challenge for schools is to ensure that the educational software used in classrooms is appropriate and effective in supporting intended outcomes and goals. This paper aimed to develop an instrument for evaluating mathematics educational software in order to help teachers select appropriate software. The instrument considers educational aspects, including content, teaching and learning skill, interaction, and feedback and error correction, as well as technical aspects of educational software, including design, clarity, assessment and documentation, cost, and hardware and software interdependence. The instrument uses a checklist approach, one of the simpler and more effective methods of assessing the quality of educational software: the user places a tick against each criterion that is met. The criteria in this instrument are adapted and extended from standard evaluation instruments in several references. Keywords: mathematics educational software, educational aspect, technical aspect.
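
    A small sketch of how such a checklist instrument could be represented and scored in software. The criterion names paraphrase the aspects listed in the abstract, and the scoring rule (fraction of ticked criteria per aspect) is an assumption, not the instrument defined in the paper.

```python
# Sketch of a checklist-based evaluation instrument: each criterion gets a
# tick and the score is the fraction of criteria satisfied per aspect.
CHECKLIST = {
    "educational": ["content accuracy", "teaching and learning skill",
                    "interaction", "feedback and error correction"],
    "technical": ["design", "clarity", "assessment and documentation",
                  "cost", "hardware/software interdependence"],
}

def score(ticks: dict[str, set[str]]) -> dict[str, float]:
    """Return the proportion of ticked criteria for each aspect."""
    return {
        aspect: len(ticks.get(aspect, set()) & set(criteria)) / len(criteria)
        for aspect, criteria in CHECKLIST.items()
    }

example_ticks = {
    "educational": {"content accuracy", "interaction"},
    "technical": {"design", "clarity", "cost"},
}
print(score(example_ticks))  # {'educational': 0.5, 'technical': 0.6}
```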

  15. Evaluating Software Complexity Based on Decision Coverage

    Directory of Open Access Journals (Sweden)

    Mustafa AL-HAJJAJI

    2012-01-01

    Full Text Available It is becoming increasingly difficult to ignore the complexity of software products. Software metrics are proposed to help indicate the quality, size, complexity, etc. of software products. In this paper, software metrics related to complexity are developed and evaluated. A dataset of many open source projects is built to assess the value of the developed metrics. Comparisons and correlations are conducted among the different tested projects. A classification is proposed to classify software code into different levels of complexity. The results showed that measuring the complexity of software products based on decision coverage gives a significant indicator of the degree of complexity of those products. However, such an indicator is not exclusive, as there are many other complexity indicators that can be measured in software products. In addition, we conducted a comparison among several available metric tools that can collect software complexity metrics. Results among those different tools were not consistent. Such a comparison shows the need for a unified standard for measuring and collecting complexity attributes.
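
    A hedged sketch in the spirit of a decision-based complexity count: walk a module's syntax tree and count decision points (branches, loops, boolean operators). This is a simple proxy under my own assumptions, not the exact metric defined in the paper.

```python
# Count decision points in Python source as a rough decision-based
# complexity indicator. A proxy measure only.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp, ast.Try, ast.BoolOp)

def decision_count(source: str) -> int:
    tree = ast.parse(source)
    return sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10 and x % 2 == 0:
            return "big even"
    return "other"
"""
print(decision_count(sample))  # 4: one for-loop, two ifs, one boolean operator
```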

  16. Business engineering. Generic Software Architecture in an Object Oriented View

    Directory of Open Access Journals (Sweden)

    Mihaela MURESAN

    2006-01-01

    Full Text Available The generic software architecture offers a solution for the information system's development and implementation. A generic software/non-software model could be developed by integrating the enterprise blueprint concept (Zachman) and the object-oriented paradigm (Coad's archetype concept). The standardization of the generic software architecture for various specific software components could be a direction of crucial importance, guaranteeing the quality of the model and increasing the efficiency of the design, development and implementation of the software. This approach is also useful for the implementation of ERP systems designed to fit the user’s particular requirements.

  17. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  18. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)
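
    The core idea of spectral deconvolution, fitting the whole measured spectrum as a weighted sum of known component spectra rather than isolating single peaks, can be sketched as a least-squares problem. The synthetic library spectra and noise model below are invented for illustration and do not represent SDAT's algorithm.

```python
# Sketch of spectral deconvolution: solve for component weights that best
# reproduce the full measured spectrum (least squares). Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
channels = 256
x = np.arange(channels)

# Two synthetic library spectra (stand-ins for detector responses of two nuclides).
lib_a = np.exp(-0.5 * ((x - 80) / 6.0) ** 2)
lib_b = np.exp(-0.5 * ((x - 150) / 9.0) ** 2) + 0.002 * (channels - x)

library = np.column_stack([lib_a, lib_b])          # shape (channels, components)
true_weights = np.array([120.0, 45.0])
measured = library @ true_weights + rng.poisson(5, channels)  # add noisy background

estimated, *_ = np.linalg.lstsq(library, measured, rcond=None)
print("estimated component weights:", np.round(estimated, 1))
```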

  19. Metrics. [measurement for effective software development and management

    Science.gov (United States)

    Mcgarry, Frank

    1991-01-01

    A development status evaluation is presented for practical software performance measurement, or 'metrics', in which major innovations have recently occurred. Metrics address such aspects of software performance as whether a software project is on schedule, how many errors can be expected from it, whether the methodology being used is effective and the relative quality of the software employed. Metrics may be characterized as explicit, analytical, and subjective. Attention is given to the bases for standards and the conduct of metrics research.

  20. Comparison of Software Quality Metrics for Object-Oriented System

    OpenAIRE

    Amit Sharma; Sanjay Kumar Dubey

    2012-01-01

    According to the IEEE standard glossary of software engineering, Object-Oriented design is becoming more important in the software development environment, and software metrics are essential in software engineering for measuring software complexity and estimating size, quality and project effort. There are various approaches through which we can find the software cost estimation and predicates on various kinds of deliverable items. The tools used for measuring the estimations are lines of code, func...
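
    A hedged sketch of collecting two common object-oriented size indicators, lines of code and methods per class, from Python source. Real metric tools cover far more measures and, as the abstracts in this section note, often disagree on definitions; the counting rules below are my own simplifications.

```python
# Collect two simple OO metrics from Python source: non-blank lines of code
# and number of methods per class. Illustrative only.
import ast

def oo_metrics(source: str) -> dict:
    tree = ast.parse(source)
    loc = len([line for line in source.splitlines() if line.strip()])
    methods_per_class = {
        node.name: sum(isinstance(child, (ast.FunctionDef, ast.AsyncFunctionDef))
                       for child in node.body)
        for node in ast.walk(tree) if isinstance(node, ast.ClassDef)
    }
    return {"loc": loc, "methods_per_class": methods_per_class}

sample = """
class Account:
    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        self.balance -= amount
"""
print(oo_metrics(sample))  # {'loc': 5, 'methods_per_class': {'Account': 2}}
```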