WorldWideScience

Sample records for standard commercial software

  1. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Science.gov (United States)

    2010-10-01

    .../contractor proposes its standard commercial software license, those applicable portions thereof consistent... its standard commercial software license until after this purchase order/contract has been issued, or at or after the time the computer software is delivered, such license shall nevertheless be deemed...

  2. Commercial off-the-shelf software dedication process based on the commercial grade survey of supplier

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lee, J. S.; Chon, S. W.; Lee, G. Y.; Park, J. K.

    2000-01-01

    The Commercial Off-The-Shelf (COTS) software dedication process can apply a combination of methods, like the hardware commercial grade item dedication process. In general, these methods are: method 1 (special test and inspection), method 2 (commercial grade survey of supplier), method 3 (source verification), and method 4 (acceptable supplier/item performance record). In this paper, the suggested procedure-oriented dedication process for COTS software, based on method 2, is consistent with EPRI/TR-106439 and NUREG/CR-6421 requirements. An additional tailoring policy based on codes and standards related to COTS software may also be found in the suggested commercial software dedication process. The suggested commercial software dedication process has been developed for a commercial I and C software dedicator who performs COTS qualification according to the dedication procedure

  3. Integrating commercial software in accelerator control- case study

    International Nuclear Information System (INIS)

    Pace, Alberto

    1994-01-01

    Using existing commercial software is the dream of any control system engineer, given the development cost reduction that can reach one order of magnitude. This dream often vanishes when the requirement appears for a uniform and consistent architecture across a wide number of components and applications, which makes it difficult to integrate several commercial packages that often impose different user-interface and communication standards. This paper will describe the approach and standards that have been chosen for the CERN ISOLDE control system, which have allowed several commercial packages to be integrated in the system as they are, permitting the software development cost to be reduced to a minimum. (author). 10 refs., 2 tabs., 9 figs

  4. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    requirements. This allows the projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment, such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, and it has had NASA-wide internal reviews by software engineering, quality, safety, and project management, as well as expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process. This will include an overview of the key personnel responsibilities and functions that must be performed for safety-critical software.

  5. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    Science.gov (United States)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX-based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background-corrected intensity at each location for each energy interval, from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, takes advantage of the XView toolkit for display and PostScript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and
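
    The pipeline described above reduces counts and exposure maps to intensity maps before the likelihood stage. As a minimal illustration of that step (not the EGRET production code), the sketch below reads hypothetical counts and exposure FITS files and writes an intensity map; the file names and the numpy/astropy tooling are assumptions for illustration only.

        import numpy as np
        from astropy.io import fits

        # Read counts (photons per pixel) and exposure (cm^2 s per pixel) maps.
        counts = fits.getdata("counts_map.fits").astype(float)
        exposure = fits.getdata("exposure_map.fits").astype(float)

        # Intensity = counts / exposure, guarding against unexposed pixels.
        with np.errstate(divide="ignore", invalid="ignore"):
            intensity = np.where(exposure > 0.0, counts / exposure, 0.0)

        fits.writeto("intensity_map.fits", intensity, overwrite=True)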

  6. 48 CFR 27.405-3 - Commercial computer software.

    Science.gov (United States)

    2010-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  7. Computer-assisted operational management of power plants in the field of tension between standard and individual software; IT-unterstuetzte Betriebsfuehrung von Kraftwerken. Im Spannungsfeld von Standard- und Individual-Software

    Energy Technology Data Exchange (ETDEWEB)

    Hippmann, Norbert [RWE Power AG, Essen (Germany). Sparte Steinkohle-/Gas-Kraftwerke

    2010-07-01

    Process routines in the operational management of power plants - particularly maintenance - are now largely planned, controlled and documented with the help of IT. Depending on corporate policy, IT support for routines is currently realised either with commercially available standard ERP software or with dedicated applications that have been specially developed for a given company. Whereas standard software has certain technical benefits (homogeneous databases, data integrity, standard user interface, no software interfaces, standard maintenance and service), customised applications have the undisputed advantage of offering the best possible mapping of company-specific process routines. By exploiting the full spectrum of IT enhancement options of its SAP system, RWE Power has largely combined the respective benefits of both standard and customised software, while also realising high-end user requirements that go beyond the mere standard. (orig.)

  8. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  9. 48 CFR 52.227-19 - Commercial Computer Software License.

    Science.gov (United States)

    2010-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  10. Reducing the risk of failure: Software Quality assurance standards and methods

    International Nuclear Information System (INIS)

    Elphick, J.; Cope, H.

    1992-01-01

    An effective Software Quality Assurance (SQA) program provides an overall approach to software engineering and the establishment of proven methods for the production of reliable software. And, in the authors' experience, the overall costs over the software life cycle are diminished with the application of quality methods. In their experience, the issues in implementing quality standards and practices are many. This paper addresses those issues as well as the lessons learned from developing and implementing a number of software quality assurance programs. Their experience includes the development and implementation of their own NRC-accepted SQA program and an SQA program for an engineering software developer, as well as developing SQA procedures, standards, and methods for utilities, medical and commercial clients. Some of the issues addressed in this paper are: setting goals and defining quality; applying the software life cycle; addressing organizational issues; providing flexibility and increasing productivity; producing effective documentation; maintaining quality records; imposing software configuration management; conducting reviews, audits, and controls; verification and validation; and controlling software procurement

  11. Prediction of ice accretion and anti-icing heating power on wind turbine blades using standard commercial software

    International Nuclear Information System (INIS)

    Villalpando, Fernando; Reggio, Marcelo; Ilinca, Adrian

    2016-01-01

    An approach to numerically simulate ice accretion on 2D sections of a wind turbine blade is presented. The method uses standard commercial ANSYS-Fluent and Matlab tools. The Euler-Euler formulation is used to calculate the water impingement on the airfoil, and a UDF (User Defined Function) has been devised to turn the airfoil's solid wall into a permeable boundary. Mayer's thermodynamic model is implemented in Matlab for computing ice thickness and for updating the airfoil contour. A journal file is executed to systematize the procedure: meshing, droplet trajectory calculation, thermodynamic model application for computing ice accretion, and the updating of airfoil contours. The proposed ice prediction strategy has been validated using iced airfoil contours obtained experimentally in the AMIL refrigerated wind tunnel (Anti-icing Materials International Laboratory). Finally, a numerical prediction method has been generated for anti-icing assessment, and its results compared with data obtained in this laboratory. - Highlights: • A methodology for ice accretion prediction using commercial software is proposed. • Euler model gives better prediction of airfoil water collection with detached flow. • A source term is used to change from a solid wall to a permeable wall in Fluent. • Energy needed for ice-accretion mitigation system is predicted.
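
    As a rough illustration of the accretion step in such a quasi-steady loop (not the thermodynamic model the authors implement in Matlab), the sketch below computes the ice thickness added in one time step from an assumed collection-efficiency profile; every numerical value and the Gaussian profile shape are assumptions for illustration only.

        import numpy as np

        rho_ice = 917.0             # kg/m^3, ice density
        lwc = 0.5e-3                # kg/m^3, liquid water content (assumed)
        v_inf = 70.0                # m/s, freestream speed (assumed)
        dt = 60.0                   # s, quasi-steady time step (assumed)
        freezing_fraction = 1.0     # rime ice: all impinging water freezes

        # Surface coordinate (normalized) with an assumed collection-efficiency
        # profile peaking near the leading edge (s = 0).
        s = np.linspace(-0.1, 0.1, 201)
        beta = 0.6 * np.exp(-(s / 0.02) ** 2)

        # Local ice thickness added in one step: dh = beta * LWC * V * n * dt / rho_ice
        dh = beta * lwc * v_inf * freezing_fraction * dt / rho_ice
        print(f"max ice growth this step: {dh.max() * 1000:.2f} mm at s = {s[np.argmax(dh)]:.3f}")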

  12. Improvement of gamma calibration procedures with commercial management software

    International Nuclear Information System (INIS)

    Lucena, Rodrigo F.; Potiens, Maria da Penha A.; Santos, Gelson P.; Vivolo, Vitor

    2007-01-01

    In this work, the gamma calibration procedure of the Instruments Calibration Laboratory (LCI) of IPEN-CNEN-SP was improved with the use of the commercial management software Autolab TM from Automa Company. That software was adapted for the laboratory's specific use in the calibration procedures. The evaluation of the uncertainties in the gamma calibration protocol was improved by the LCI staff, and all worksheets and the final calibration report layout were developed in commercial software such as Excel TM and Word TM from Microsoft TM. (author)

  13. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; hide

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  14. Software Quality Assurance and Controls Standard

    Science.gov (United States)

    2010-04-27

    Only fragments of the presentation are recoverable: assurance that work products and processes comply with predefined provisions and plans; a shift from document (plan) focus to process focus; alignment with the framework standard IS 12207 software life cycle (SLC) processes; and references to IEEE Software and Systems Engineering books and publications, the ABET curriculum, the Certified Software Development Professional credential, and ISO/IEC standards.

  15. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  16. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  17. Dilemmas within Commercial Involvement in Open Source Software

    DEFF Research Database (Denmark)

    Ciesielska, Malgorzata; Westenholz, Ann

    2016-01-01

    Purpose – The purpose of this paper is to contribute to the literature about the commercial involvement in open source software, levels of this involvement and consequences of attempting to mix various logics of action. Design/methodology/approach – This paper uses the case study approach based on mixed methods: literature reviews and news searches, electronic surveys, qualitative interviews and observations. It combines discussions from several research projects as well as previous publications to present the scope of commercial choices within open source software and their consequences ... to free-riding. There are six levels of commercial involvement in open source communities, and each of them is characterized by a different dilemma. Originality/value – The paper sheds light on the various levels of involvement of business in the open source movement and emphasizes that the popularized “open ...

  18. 78 FR 17875 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2013-03-25

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB59 Commercial Driver's License Testing and Commercial Learner's.... The 2011 final rule amended the commercial driver's license (CDL) knowledge and skills testing standards and established new minimum Federal standards for States to issue the commercial learner's permit...

  19. 77 FR 26989 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2012-05-08

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB02 Commercial Driver's License Testing and Commercial Learner's... effective on July 8, 2011. That final rule amended the commercial driver's license (CDL) knowledge and skills testing standards and established new minimum Federal standards for States to issue the commercial...

  20. Standards Interoperability: Application of Contemporary Software Safety Assurance Standards to the Evolution of Legacy Software

    National Research Council Canada - National Science Library

    Meacham, Desmond J

    2006-01-01

    .... The proposed formal model is then applied to the requirements for RTCA DO-178B and MIL-STD-498 as representative examples of contemporary and legacy software standards. The results provide guidance on how to achieve airworthiness certification for modified legacy software, whilst maximizing the use of software products from the previous development.

  1. A company perspective on software engineering standards

    International Nuclear Information System (INIS)

    Steer, R.W.

    1988-01-01

    Software engineering standards, as implemented via formal policies and procedures, have historically been used in the nuclear industry, especially for codes used in the design, analysis, or operation of the plant. Over the past two decades, a significant amount of software has been put in place to perform these functions, while the overall software life cycle has become better understood, more and different computer systems have become available, and industry has become increasingly aware of the advantages gained when these procedures are used in the development and maintenance of this large amount of software. The use of standards and attendant procedures is thus becoming increasingly important as more computerization is taking place, both in the design and the operation of the plant. It is difficult to categorize software used in activities related to nuclear plants in a simple manner. That difficulty is due to the diversity of those uses, with attendant diversity in the methods and procedures used in the production of the software, compounded by a changing business climate in which significant software engineering expertise is being applied to a broader range of applications on a variety of computing systems. The use of standards in the various phases of the production of software thus becomes more difficult as well. This paper discusses the various types of software and the importance of software standards in the development of each of them

  2. ESSCOTS for Learning: Transforming Commercial Software into Powerful Educational Tools.

    Science.gov (United States)

    McArthur, David; And Others

    1995-01-01

    Gives an overview of Educational Support Systems based on commercial off-the-shelf software (ESSCOTS), and discusses the benefits of developing such educational software. Presents results of a study that revealed the learning processes of middle and high school students who used a geographical information system. (JMV)

  3. Diversification and Challenges of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1994-01-01

    The author poses certain questions in this paper: 'In the future, should there be just one software engineering standards set? If so, how can we work towards that goal? What are the challenges of internationalizing standards?' Based on the author's personal view, the statement of his position is as follows: 'There should NOT be just one set of software engineering standards in the future. At the same time, there should NOT be the proliferation of standards, and the number of sets of standards should be kept to a minimum. It is important to understand the diversification of the areas which are spanned by the software engineering standards.' The author goes on to describe the diversification of processes, the diversification in the national and international character of standards organizations, the diversification of the professional organizations producing standards, the diversification of the types of businesses and industries, and the challenges of internationalizing standards.

  4. Future of Software Engineering Standards

    Science.gov (United States)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.

  5. Archival standards in archival open access software, and appropriate software for internal archival centers

    Directory of Open Access Journals (Sweden)

    Abdolreza Izadi

    2016-12-01

    Full Text Available The purpose of this study is to examine the descriptive metadata standards in archival open source software, to determine the most appropriate descriptive metadata standard(s) and to identify the software support for these standards. The approach of the present study is a combination of methods: library research, the Delphi method and a descriptive survey are used. The data gathering instrument in the library study is the fiche, in the Delphi method the questionnaire, and in the descriptive survey the checklist. The statistical population contains 5 archival open source software packages. The findings suggest that 5 metadata standards, consisting of EAD, ISAD, EAC-CPF, ISAAR & ISDF, were diagnosed by the Delphi panel members as the most appropriate descriptive metadata standards to use for archival software. Moreover, ICA-ATOM and Archivist toolkit, in terms of support for the standards found suitable, were diagnosed as the most appropriate archival software.

  6. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
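
    To make the idea of testing prediction quality concrete, the sketch below computes two accuracy metrics commonly used for baseline energy models, NMBE and CV(RMSE), from metered and predicted energy series; the metric choice and the sample numbers are assumptions for illustration, not the report's actual test protocol.

        import numpy as np

        metered = np.array([1020.0, 980.0, 1105.0, 990.0, 1230.0, 1180.0])     # kWh per period
        predicted = np.array([1000.0, 1010.0, 1080.0, 1005.0, 1195.0, 1210.0])  # model output

        residuals = metered - predicted
        # Normalized mean bias error and coefficient of variation of the RMSE, in percent.
        nmbe = residuals.sum() / (len(metered) * metered.mean()) * 100.0
        cv_rmse = np.sqrt((residuals ** 2).mean()) / metered.mean() * 100.0

        print(f"NMBE = {nmbe:.2f} %, CV(RMSE) = {cv_rmse:.2f} %")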

  7. Standard software for CAMAC

    International Nuclear Information System (INIS)

    Lenkszus, F.R.

    1978-01-01

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  8. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC-programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  9. Regional vegetation management standards for commercial pine ...

    African Journals Online (AJOL)

    Although the understanding gained from these trials allowed for the development of vegetation management standards, their operational and economic viability need to be tested on a commercial basis. Four pine trials were thus initiated to test the applicability of these standards when utilised on a commercial scale. Two of ...

  10. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  11. Measuring the Software Product Quality during the Software Development Life-Cycle: An ISO Standards Perspective

    OpenAIRE

    Rafa E. Al-Qutaish

    2009-01-01

    Problem statement: The International Organization for Standardization (ISO) published a set of international standards related to software engineering, such as ISO 12207 and ISO 9126. However, there is a set of cross-references between the two standards. Approach: ISO 9126 on software product quality and ISO 12207 on software life cycle processes had been analyzed to investigate the relationships between them and to make a mapping from the ISO 9126 quality characteristics to the ISO 1...

  12. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  13. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    Science.gov (United States)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning and diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of the mandible were reconstructed using both commercial Materialise Mimics and open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. Both models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were obtained between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that produced using the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise the operational cost.
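
    As an illustration of the surface-comparison step, the sketch below computes a symmetric Hausdorff distance between two vertex sets standing in for the Mimics and MITK models; the random points and the scipy-based implementation are assumptions for illustration, not the study's actual workflow.

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(0)
        model_a = rng.normal(size=(5000, 3))                                # vertices of model A (mm)
        model_b = model_a + rng.normal(scale=0.05, size=model_a.shape)      # vertices of model B (mm)

        def directed_hausdorff_max(src, dst):
            """Largest nearest-neighbour distance from src points to dst points."""
            dists, _ = cKDTree(dst).query(src)
            return dists.max()

        hausdorff = max(directed_hausdorff_max(model_a, model_b),
                        directed_hausdorff_max(model_b, model_a))
        print(f"symmetric Hausdorff distance: {hausdorff:.3f} mm")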

  14. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards is a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
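
    A small worked example (with illustrative numbers, not DOE rulemaking inputs) shows how the chosen discount rate drives the present value of energy-cost savings and hence the LCC comparison:

        annual_savings = 120.0            # $ per year in operating-cost savings (assumed)
        incremental_first_cost = 900.0    # $ extra purchase price at the higher efficiency level (assumed)
        lifetime_years = 12               # equipment lifetime (assumed)

        def present_value(annual_amount, rate, years):
            """Present value of a level annual amount discounted at `rate`."""
            return sum(annual_amount / (1.0 + rate) ** t for t in range(1, years + 1))

        for rate in (0.03, 0.07):
            pv_savings = present_value(annual_savings, rate, lifetime_years)
            net_lcc_impact = pv_savings - incremental_first_cost
            print(f"discount rate {rate:.0%}: PV of savings = ${pv_savings:,.0f}, "
                  f"net LCC impact = ${net_lcc_impact:,.0f}")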

  15. Buying in to bioinformatics: an introduction to commercial sequence analysis software.

    Science.gov (United States)

    Smith, David Roy

    2015-07-01

    Advancements in high-throughput nucleotide sequencing techniques have brought with them state-of-the-art bioinformatics programs and software packages. Given the importance of molecular sequence data in contemporary life science research, these software suites are becoming an essential component of many labs and classrooms, and as such are frequently designed for non-computer specialists and marketed as one-stop bioinformatics toolkits. Although beautifully designed and powerful, user-friendly bioinformatics packages can be expensive and, as more arrive on the market each year, it can be difficult for researchers, teachers and students to choose the right software for their needs, especially if they do not have a bioinformatics background. This review highlights some of the currently available and most popular commercial bioinformatics packages, discussing their prices, usability, features and suitability for teaching. Although several commercial bioinformatics programs are arguably overpriced and overhyped, many are well designed, sophisticated and, in my opinion, worth the investment. Whether you are just beginning your foray into molecular sequence analysis or are an experienced genomicist, I encourage you to explore proprietary software bundles. They have the potential to streamline your research, increase your productivity, energize your classroom and, if anything, add a bit of zest to the often dry detached world of bioinformatics. © The Author 2014. Published by Oxford University Press.

  16. Round table discussion: Quality control and standardization of nuclear medicine software

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    In summary the round table came to the following important conclusions: Nuclear medicine software systems need better documentation, especially regarding details of algorithms and limitations, and user friendliness could be considerably improved. Quality control of software is an integral part of quality assurance in nuclear medicine and should be performed at all levels of the software. Quality control of applications software should preferably be performed with assistance of generally accepted software phantoms. A basic form of standardization was welcomed and partly regarded as essential by all participants. Some areas such as patient study files could be standardized in the near future, whereas other areas such as the standardization of clinical applications programs or acquisition protocols still present major difficulties. An international cooperation in the field of standardization of software and other topics has already been started on the European level and should be continued and supported. (orig.)

  17. Software evolution and maintenance

    CERN Document Server

    Tripathy, Priyadarshi

    2014-01-01

    Software Evolution and Maintenance: A Practitioner's Approach is an accessible textbook for students and professionals, which collates the advances in software development and provides the most current models and techniques in maintenance. It explains two maintenance standards (IEEE/EIA 1219 and ISO/IEC 14764), discusses several commercial reverse and domain engineering toolkits, and notes that slides for instructors are available online. The information is based on the IEEE SWEBOK (Software Engineering Body of Knowledge).

  18. Performance assessment of the commercial CFD software for the prediction of the PWR internal flow - Corrected version

    International Nuclear Information System (INIS)

    Lee, Gong Hee; Bang, Young Seok; Woo, Sweng Woong; Cheong, Ae Ju; Kim, Do Hyeong; Kang, Min Ku

    2013-01-01

    As computer hardware technology develops, license applicants for nuclear power plants use commercial CFD software with the aim of reducing the excessive conservatism associated with using simplified and conservative analysis tools. Even if some CFD software developers and users think that state-of-the-art CFD software can be used to solve reasonably at least the single-phase nuclear reactor safety problems, there are still limitations and uncertainties in the calculation results. From a regulatory perspective, the Korea Institute of Nuclear Safety (KINS) has been conducting the performance assessment of commercial CFD software for nuclear reactor safety problems. In this study, in order to examine the prediction performance of commercial CFD software with the porous model in the analysis of the scale-down APR+ (Advanced Power Reactor Plus) internal flow, simulations were conducted with the on-board numerical models in ANSYS CFX R.14 and FLUENT R.14. It was concluded that, depending on the CFD software, the internal flow distribution of the scale-down APR+ was locally somewhat different. Although there was a limitation in estimating the prediction performance of the commercial CFD software due to the limited number of measured data, CFX R.14 showed more reasonable predicted results in comparison with FLUENT R.14. Meanwhile, due to the difference of discretization methodology, FLUENT R.14 required more computational memory than CFX R.14 for the same grid system. Therefore the CFD software suitable to the available computational resource should be selected for massive parallel computation. (authors)

  19. 75 FR 32983 - Commercial Driver's License (CDL) Standards: Exemption

    Science.gov (United States)

    2010-06-10

    ...-28480] Commercial Driver's License (CDL) Standards: Exemption AGENCY: Federal Motor Carrier Safety... commercial driver's license (CDL) as required by current regulations. FMCSA reviewed NAAA's application for... demonstrate alternatives its members would employ to ensure that their commercial motor vehicle (CMV) drivers...

  20. Customizing Standard Software as a Business Model in the IT Industry

    DEFF Research Database (Denmark)

    Kautz, Karlheinz; Rab, Sameen M.; Sinnet, Michael

    2011-01-01

    This research studies a new business model in the IT industry, the customization of standard software as the sole foundation for a software company’s earnings. Based on a theoretical background which combines the concepts of inter-organizational networks and open innovation, we provide an interpretive case study of a small software company which customizes a standard product. We investigate the company’s interactions with a large global software company which is the producer of the original software product and with other companies which are involved in the software customization process. We ... primarily on complex, formal partnerships, in which also opportunistic behavior occurs and where informal relations are invaluable sources of knowledge. In addition, the original software producer’s view and treatment of these companies has a vital impact on the customizing company’s practice, which ...

  1. Software measurement standards for areal surface texture parameters: part 2—comparison of software

    International Nuclear Information System (INIS)

    Harris, P M; Smith, I M; Giusca, C; Leach, R K; Wang, C

    2012-01-01

    A companion paper in this issue describes reference software for the evaluation of areal surface texture parameters, focusing on the definitions of the parameters and giving details of the numerical algorithms employed in the software to implement those definitions. The reference software is used as a benchmark against which software in a measuring instrument can be compared. A data set is used as input to both the software under test and the reference software, and the results delivered by the software under test are compared with those provided by the reference software. This paper presents a comparison of the results returned by the reference software with those reported by proprietary software for surface texture measurement. Differences between the results can be used to identify where algorithms and software for evaluating the parameters differ. They might also be helpful in identifying where parameters are not sufficiently well-defined in standards. (paper)
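
    As a minimal illustration of such a comparison (not the reference software itself), the sketch below evaluates one areal parameter, Sa (the arithmetical mean height), on a synthetic height map and checks the difference against a reference result; the synthetic surface, the placeholder reference value and the tolerance are assumptions for illustration only.

        import numpy as np

        # Synthetic height map standing in for a measured surface (heights in micrometres).
        x = np.linspace(0.0, 1.0, 256)
        y = np.linspace(0.0, 1.0, 256)
        xx, yy = np.meshgrid(x, y)
        z = 0.5 * np.sin(2 * np.pi * 5 * xx) + 0.2 * np.cos(2 * np.pi * 3 * yy)

        def sa(height_map):
            """Arithmetical mean height: mean absolute deviation from the mean plane."""
            return np.abs(height_map - height_map.mean()).mean()

        value_under_test = sa(z)
        # In a real comparison, reference_value comes from running the reference
        # software on the same data set; the value here is only a placeholder.
        reference_value = value_under_test
        tolerance = 1e-6
        difference = abs(value_under_test - reference_value)
        print(f"Sa = {value_under_test:.4f}, |difference| = {difference:.2e}, "
              f"within tolerance: {difference <= tolerance}")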

  2. Features of commercial computer software systems for medical examiners and coroners.

    Science.gov (United States)

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  3. Company's Unusual Plan to Package Commercial Software with Business Textbooks Produces a Measure of Success.

    Science.gov (United States)

    Watkins, Beverly T.

    1992-01-01

    Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)

  4. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  5. Issues and relationships among software standards for nuclear safety applications. Version 2.0

    International Nuclear Information System (INIS)

    Scott, J.A.; Preckshot, G.G.; Lawrence, J.D.; Johnson, G.L.

    1996-01-01

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the development of draft regulatory guides for selected software engineering standards. This report describes the results of the initial task in this work. The selected software standards and a set of related software engineering standards were reviewed, and the resulting preliminary elements of the regulatory positions are identified in this report. The importance of a thorough understanding of the relationships among standards useful for developing safety-related software is emphasized. The relationship of this work to the update of the Standard Review Plan is also discussed

  6. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  7. Practical support for Lean Six Sigma software process definition using IEEE software engineering standards

    CERN Document Server

    Land, Susan K; Walz, John W

    2012-01-01

    Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards addresses the task of meeting the specific documentation requirements in support of Lean Six Sigma. This book provides a set of templates supporting the documentation required for basic software project control and management and covers the integration of these templates for their entire product development life cycle. Find detailed documentation guidance in the form of organizational policy descriptions, integrated set of deployable document templates, artifacts required in suppo

  8. Software for the IAEA Occupational Radiation Protection Standards

    International Nuclear Information System (INIS)

    Mocaun, N.M.; Paul, F.; Griffith, R.V.; Gustafsson, M.; Webb, G.A.M.; Enache, A.

    2000-01-01

    The software version of the International Basic Safety Standards (BSS) for Protection against Ionizing Radiation and for the Safety of Radiation Sources, jointly sponsored by the Food and Agriculture Organization of the United Nations, International Atomic Energy Agency, International Labour Organization, Nuclear Energy Agency of the Organization for Economic Co-operation and Development, Pan American Health Organization and World Health Organization, was issued on diskette (SS115 software version) by the IAEA in 1997. This Windows-based software was written in Visual Basic and is designed to provide the user with a powerful and flexible retrieval system to access the 364-page BSS. The code enables the user to search the BSS, including 22 tables and 254 topics, directly through the 'contents' tree. Access is based on keywords, the subject index or cross referencing between portions of the document dealing with different aspects of the same issue or concept. Definitions of important terms used in the Standards can be found by accessing the Glossary. Text and data can be extracted using familiar copy, paste and print features. Publication of three Safety Guides on Occupational Radiation Protection, with co-sponsorship of the IAEA and the International Labour Office, is planned for the second half of 1999. The same system will be used to provide these on diskette or CD-ROM (ORPGUIDE version 4.1). The new software will include the Safety Guides: Occupational Radiation Protection, Assessment of Occupational Exposure due to Intakes of Radionuclides, and Assessment of Occupational Exposure due to External Sources of Radiation, as well as the BSS and the Safety Fundamentals, Radiation Protection and the Safety of Radiation Sources. The capabilities of the new software have been expanded to include free-form text search and cross referencing of the five documents which will comprise the guidance of the IAEA and its co-sponsors on Occupational Radiation Protection. It is envisioned that the

  9. Quench Simulation of Superconducting Magnets with Commercial Multiphysics Software

    CERN Document Server

    AUTHOR|(SzGeCERN)751171; Auchmann, Bernhard; Jarkko, Niiranen; Maciejewski, Michal

    The simulation of quenches in superconducting magnets is a multiphysics problem of highest complexity. Operated at 1.9 K above absolute zero, the material properties of superconductors and superfluid helium vary by several orders of magnitude over a range of only 10 K. The heat transfer from metal to helium goes through different transfer and boiling regimes as a function of temperature, heat flux, and transferred energy. Electrical, magnetic, thermal, and fluid dynamic effects are intimately coupled, yet live on vastly different time and spatial scales. While the physical models may be the same in all cases, it is an open debate whether the user should opt for commercial multiphysics software like ANSYS or COMSOL, write customized models based on general purpose network solvers like SPICE, or implement the physics models and numerical solvers entirely in custom software like the QP3, THEA, and ROXIE codes currently in use at the European Organisation for Nuclear Research (CERN). Each approach has its strengt...
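
    To give a feel for what such tools compute, the sketch below runs a deliberately oversimplified 1D quench model: explicit heat diffusion along a conductor with Joule heating switched on wherever the temperature exceeds an assumed current-sharing temperature. Constant effective properties are used for simplicity, whereas the strongly temperature-dependent properties of the real problem are exactly what makes dedicated or multiphysics solvers necessary; all values here are assumptions for illustration only.

        import numpy as np

        n, length = 201, 1.0                   # grid nodes, conductor length (m)
        dx = length / (n - 1)
        T = np.full(n, 1.9)                    # K, operating temperature
        T[n // 2 - 2 : n // 2 + 3] = 12.0      # small initial normal zone

        t_cs = 9.0                             # K, current-sharing temperature (assumed)
        rho, cp, k = 8960.0, 10.0, 300.0       # density, heat capacity, conductivity (rough constants)
        q_joule = 5.0e6                        # W/m^3, Joule heating in the normal zone (assumed)

        alpha = k / (rho * cp)                 # thermal diffusivity (m^2/s)
        dt = 0.2 * dx * dx / alpha             # time step within explicit stability limit

        for _ in range(2000):
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
            heating = np.where(T > t_cs, q_joule, 0.0)
            T += dt * (alpha * lap + heating / (rho * cp))

        normal_zone_m = (T > t_cs).sum() * dx
        print(f"normal zone after {2000 * dt * 1e3:.1f} ms: {normal_zone_m * 100:.1f} cm, "
              f"peak temperature: {T.max():.1f} K")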

  10. MRI/TRUS fusion software-based targeted biopsy: the new standard of care?

    Science.gov (United States)

    Manfredi, M; Costa Moretti, T B; Emberton, M; Villers, A; Valerio, M

    2015-09-01

    The advent of multiparametric MRI has made it possible to change the way in which prostate biopsy is done, allowing biopsies to be directed to suspicious lesions rather than taken randomly. The subject of this review relates to a computer-assisted strategy, the MRI/US fusion software-based targeted biopsy, and to its performance compared to other sampling methods. Different devices with different methods to register MR images to live TRUS are currently in use to allow software-based targeted biopsy. The main clinical indications of MRI/US fusion software-based targeted biopsy are re-biopsy in men with persistent suspicion of prostate cancer after a first negative standard biopsy and the follow-up of patients under active surveillance. Some studies have compared MRI/US fusion software-based targeted versus standard biopsy. In men at risk with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease as compared to standard biopsy; some studies have also shown decreased detection of insignificant disease. Only two studies directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and the diagnostic ability seems to be in favor of the software approach. To date, no study comparing software-based targeted biopsy against in-bore MRI biopsy is available. The new software-based targeted approach seems to have the characteristics to be added in the standard pathway for achieving accurate risk stratification. Once reproducibility and cost-effectiveness are verified, the actual issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.

  11. Software methodologies for the SSC

    International Nuclear Information System (INIS)

    Loken, S.C.

    1990-01-01

    This report describes some of the considerations that will determine how software is developed for the SSC. The author begins with a review of the general computing problem for SSC experiments and recent experiences in software engineering for the present generation of experiments. This leads to a discussion of the software technologies that will be critical for the SSC experiments. He describes the emerging software standards and commercial products that may be useful in addressing the SSC needs. He concludes with some comments on how collaborations and the SSC Lab should approach the software development issue

  12. Software life cycle management standards real-world solutions and scenarios for savings

    CERN Document Server

    Wright, David

    2011-01-01

    Software Life Cycle Management Standards will help you apply ISO/IEC 19770 to your business and enjoy the rewards it offers. David Wright calls on his vast experience to explain how the Standard applies to the whole of the software life cycle, not just the software asset management aspects. His informative guide gives up-to-date information using practical examples, clear diagrams and entertaining anecdotes.

  13. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  14. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
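
    As a toy illustration of the two core steps the abstract mentions, the sketch below matches measured peak energies against a small nuclide library and then estimates a concentration by the relative (comparator) method; the library entries, peak list, tolerance and count rates are assumptions for illustration, not the program's data.

        # Assumed gamma-line library (keV -> nuclide) and a hypothetical peak list.
        peak_library = {661.7: "Cs-137", 1173.2: "Co-60", 1368.6: "Na-24"}
        measured_peaks_kev = [661.5, 1368.9, 2000.0]
        tolerance_kev = 1.0

        for peak in measured_peaks_kev:
            match = next((nuclide for energy, nuclide in peak_library.items()
                          if abs(energy - peak) <= tolerance_kev), "unidentified")
            print(f"{peak:8.1f} keV -> {match}")

        # Relative (comparator) method: concentration from count rates of the sample
        # and of a standard with known concentration, treated alike.
        std_concentration_ppm = 50.0
        sample_count_rate, sample_mass_g = 1200.0, 1.05
        std_count_rate, std_mass_g = 950.0, 1.00
        concentration = (std_concentration_ppm
                         * (sample_count_rate / sample_mass_g)
                         / (std_count_rate / std_mass_g))
        print(f"estimated concentration: {concentration:.1f} ppm")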

  15. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix.

  16. Multi-institutional Validation Study of Commercially Available Deformable Image Registration Software for Thoracic Images

    International Nuclear Information System (INIS)

    Kadoya, Noriyuki; Nakajima, Yujiro; Saito, Masahide; Miyabe, Yuki; Kurooka, Masahiko; Kito, Satoshi; Fujita, Yukio; Sasaki, Motoharu; Arai, Kazuhiro; Tani, Kensuke; Yagi, Masashi; Wakita, Akihisa; Tohyama, Naoki; Jingu, Keiichi

    2016-01-01

    Purpose: To assess the accuracy of the commercially available deformable image registration (DIR) software for thoracic images at multiple institutions. Methods and Materials: Thoracic 4-dimensional (4D) CT images of 10 patients with esophageal or lung cancer were used. Datasets for these patients were provided by DIR-lab (dir-lab.com) and included a coordinate list of anatomic landmarks (300 bronchial bifurcations) that had been manually identified. Deformable image registration was performed between the peak-inhale and -exhale images. Deformable image registration error was determined by calculating the difference at each landmark point between the displacement calculated by DIR software and that calculated by the landmark. Results: Eleven institutions participated in this study: 4 used RayStation (RaySearch Laboratories, Stockholm, Sweden), 5 used MIM Software (Cleveland, OH), and 3 used Velocity (Varian Medical Systems, Palo Alto, CA). The ranges of the average absolute registration errors over all cases were as follows: 0.48 to 1.51 mm (right-left), 0.53 to 2.86 mm (anterior-posterior), 0.85 to 4.46 mm (superior-inferior), and 1.26 to 6.20 mm (3-dimensional). For each DIR software package, the average 3-dimensional registration error (range) was as follows: RayStation, 3.28 mm (1.26-3.91 mm); MIM Software, 3.29 mm (2.17-3.61 mm); and Velocity, 5.01 mm (4.02-6.20 mm). These results demonstrate that there was moderate variation among institutions, although the DIR software was the same. Conclusions: We evaluated the commercially available DIR software using thoracic 4D-CT images from multiple centers. Our results demonstrated that DIR accuracy differed among institutions because it was dependent on both the DIR software and procedure. Our results could be helpful for establishing prospective clinical trials and for the widespread use of DIR software. In addition, for clinical care, we should try to find the optimal DIR procedure using thoracic 4D
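    The registration-error metric used in this study, the difference between the displacement predicted by the DIR software and the displacement given by the manually identified landmark pairs, is straightforward to reproduce. A minimal sketch, assuming the landmark coordinates and the DIR-predicted displacements are available as NumPy arrays in millimetres:

        import numpy as np

        def dir_registration_errors(landmarks_inhale, landmarks_exhale, predicted_displacements):
            """
            landmarks_inhale, landmarks_exhale : (N, 3) corresponding landmark coordinates in mm
            predicted_displacements            : (N, 3) displacement vectors reported by the DIR software
                                                 at the inhale landmark positions, in mm
            """
            reference = landmarks_exhale - landmarks_inhale        # displacement defined by the manual landmarks
            error = predicted_displacements - reference            # per-landmark registration error
            mean_abs_rl_ap_si = np.mean(np.abs(error), axis=0)     # mean absolute error per axis (RL, AP, SI)
            mean_3d = np.mean(np.linalg.norm(error, axis=1))       # mean 3-dimensional error
            return mean_abs_rl_ap_si, mean_3d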

  17. IEEE [Institute of Electrical and Electronics Engineers] standards and nuclear software quality engineering

    International Nuclear Information System (INIS)

    Daughtrey, T.

    1988-01-01

    Significant new nuclear-specific software standards have recently been adopted under the sponsorship of the American Nuclear Society and the American Society of Mechanical Engineers. The interest of the US Nuclear Regulatory Commission has also been expressed through their issuance of NUREG/CR-4640. These efforts all indicate a growing awareness of the need for thorough, referenceable expressions of the way to build in and evaluate quality in nuclear software. A broader professional perspective can be seen in the growing number of software engineering standards sponsored by the Institute of Electrical and Electronics Engineers (IEEE) Computer Society. This family of standards represents a systematic effort to capture professional consensus on quality practices throughout the software development life cycle. The only omission, the implementation phase, is treated by accepted American National Standards Institute or de facto standards for programming languages.

  18. [Development of a software standardizing optical density with operation settings related to several limitations].

    Science.gov (United States)

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software that standardizes optical density, normalizes the procedures and results of standardization, and effectively solves several problems that arise during standardization of indirect ELISA results. The software was designed on the basis of the I-STOD method, with operation settings that address the problems one might encounter during standardization. The Matlab GUI was used as the development tool. The software was tested with detection results for sera from persons living in schistosomiasis japonica endemic areas. I-STOD V1.0 (Windows XP/Windows 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational performance of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilutions for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is a professional software package based on I-STOD. It can be easily operated and can effectively standardize the test results of indirect ELISA.
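    As an illustration of the general idea of standardizing optical density against a calibration curve (not the actual I-STOD algorithm, whose details are not given in the abstract), the following sketch fits a simple log-linear standard curve to calibrator readings and flags samples whose OD falls outside the reliable range of the curve, mirroring the dilution problem mentioned above. All names and numbers are hypothetical.

        import numpy as np

        def fit_standard_curve(calibrator_conc, calibrator_od):
            """Least-squares fit of OD against log10(concentration). The real I-STOD
            model may use a different curve shape; this is only illustrative."""
            slope, intercept = np.polyfit(np.log10(calibrator_conc), calibrator_od, 1)
            return slope, intercept

        def standardize_samples(sample_od, slope, intercept, od_low, od_high):
            """Map sample ODs back to concentrations; None marks samples outside the
            applicable OD range, which would need re-testing at another dilution."""
            results = []
            for od in sample_od:
                if od < od_low or od > od_high:
                    results.append(None)
                else:
                    results.append(10.0 ** ((od - intercept) / slope))
            return results

        # Hypothetical calibrator series and samples
        slope, intercept = fit_standard_curve([1, 10, 100, 1000], [0.12, 0.45, 0.98, 1.65])
        print(standardize_samples([0.05, 0.60, 1.40, 2.10], slope, intercept, od_low=0.12, od_high=1.65))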

  19. An IMRT dose distribution study using commercial verification software

    International Nuclear Information System (INIS)

    Grace, M.; Liu, G.; Fernando, W.; Rykers, K.

    2004-01-01

    Full text: The introduction of IMRT requires users to confirm that the isodose distributions and relative doses calculated by their planning system match the doses delivered by their linear accelerators. To this end the commercially available software VeriSoft™ (PTW-Freiburg, Germany) was trialled to determine if the tools and functions it offered would be of benefit to this process. The CMS Xio (Computer Medical System) treatment planning system was used to generate IMRT plans that were delivered with an upgraded Elekta SL15 linac. Kodak EDR2 film sandwiched in RW3 solid water (PTW-Freiburg, Germany) was used to measure the IMRT fields delivered with 6 MV photons. The isodoses and profiles measured with the film generally agreed to within ±3% or ±3 mm with the planned doses; in some regions (outside the IMRT field) the match fell to within ±5%. The isodose distributions of the planning system and the film can be compared on screen, which allows electronic records of the comparison to be kept if so desired. The features and versatility of this software have been of benefit to our IMRT QA program. Furthermore, the VeriSoft™ software allows for quick, accurate and automated planar film analysis. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
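    The ±3% / ±3 mm figures quoted above correspond to the usual combined dose-difference / distance-to-agreement criteria. A much-simplified sketch of such a check on two co-registered 2D dose arrays is shown below; it is not the VeriSoft algorithm, and the square search window and global normalisation are deliberate simplifications.

        import numpy as np

        def agreement_map(planned, measured, pixel_mm, dose_tol=0.03, dta_mm=3.0):
            """A pixel passes if the local dose difference is within dose_tol of the maximum
            planned dose, or if any planned pixel within dta_mm carries a dose that close
            to the measured value (crude distance-to-agreement)."""
            norm = planned.max()
            passes = np.abs(measured - planned) <= dose_tol * norm
            radius = int(round(dta_mm / pixel_mm))
            ny, nx = planned.shape
            for j in range(ny):
                for i in range(nx):
                    if passes[j, i]:
                        continue
                    window = planned[max(0, j - radius):j + radius + 1,
                                     max(0, i - radius):i + radius + 1]
                    passes[j, i] = np.any(np.abs(window - measured[j, i]) <= dose_tol * norm)
            return passes

        # Hypothetical 10 x 10 cm field sampled at 1 mm, with a small noisy perturbation
        plan = np.ones((100, 100))
        film = plan + np.random.normal(0.0, 0.01, plan.shape)
        print(f"{100.0 * agreement_map(plan, film, pixel_mm=1.0).mean():.1f}% of pixels agree")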

  20. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software.

    Directory of Open Access Journals (Sweden)

    Jan Egger

    Full Text Available In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and its own C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow to generate an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. Then, the user can perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic. The aim, finally, is not to confine ourselves to the available commercial software but to look for other options that might improve the workflow.

  1. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software

    Science.gov (United States)

    Egger, Jan; Gall, Markus; Tax, Alois; Ücal, Muammer; Zefferer, Ulrike; Li, Xing; von Campe, Gord; Schäfer, Ute; Schmalstieg, Dieter; Chen, Xiaojun

    2017-01-01

    In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and its own C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow to generate an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. Then, the user can perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic. The aim, finally, is not to confine ourselves to the available commercial software but to look for other options that might improve the workflow. PMID:28264062
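    The geometric steps named in the workflow above (mirroring across the mid-sagittal plane, Laplacian smoothing and Delaunay triangulation) can be prototyped outside MeVisLab as well. The sketch below, which assumes the defect-patch points are available as an (N, 3) NumPy array and triangulates their (y, z) projection, is only a rough stand-in for the actual data-flow network and C++ module.

        import numpy as np
        from scipy.spatial import Delaunay

        def mirror_across_sagittal(points, x_mid):
            """Reflect skull points across the plane x = x_mid to seed the implant curvature."""
            mirrored = points.copy()
            mirrored[:, 0] = 2.0 * x_mid - mirrored[:, 0]
            return mirrored

        def triangulate_patch(patch_points):
            """2D Delaunay triangulation of the (y, z) projection of the defect patch."""
            return Delaunay(patch_points[:, 1:3]).simplices        # (M, 3) vertex indices

        def laplacian_smooth(points, triangles, iterations=10, lam=0.5):
            """Simple umbrella-operator Laplacian smoothing of the patch vertices."""
            pts = points.copy()
            neighbours = [set() for _ in range(len(pts))]
            for a, b, c in triangles:
                neighbours[a].update((b, c))
                neighbours[b].update((a, c))
                neighbours[c].update((a, b))
            for _ in range(iterations):
                new = pts.copy()
                for i, nb in enumerate(neighbours):
                    if nb:
                        new[i] = (1.0 - lam) * pts[i] + lam * pts[list(nb)].mean(axis=0)
                pts = new
            return pts

        # Hypothetical usage on random stand-in data
        patch = np.random.rand(200, 3) * 100.0
        tris = triangulate_patch(patch)
        smoothed = laplacian_smooth(patch, tris)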

  2. Overview of the ANS [American Nuclear Society] mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A.O.

    1991-01-01

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs

  3. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  4. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    Science.gov (United States)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.

  5. Software architecture standard for simulation virtual machine, version 2.0

    Science.gov (United States)

    Sturtevant, Robert; Wessale, William

    1994-01-01

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all the simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.

  6. Outsourcing the development of specific application software using the ESA software engineering standards the SPS software Interlock System

    CERN Document Server

    Denis, B

    1995-01-01

    CERN is considering outsourcing as a solution to the reduction of staff. The need to re-engineer the SPS Software Interlock System provided an opportunity to explore the applicability of outsourcing to our specific controls environment, and the ESA PSS-05 standards were selected for the requirements specification, the development, the control and monitoring, and the project management. The software produced by the contractor is now fully operational. After outlining the scope and the complexity of the project, a discussion of the ESA PSS-05 standards will be presented: the choice, the way these standards improve the outsourcing process, the quality induced, but also the need to adapt them and their limitations in the definition of the customer-supplier relationship. The success factors and the difficulties of development under contract will also be discussed. The maintenance aspect and the impact on in-house developments will finally be addressed.

  7. National Software Capacity: Near-Term Study

    Science.gov (United States)

    1990-05-01

    "sweatshops" [Singhal 90]. Because they work for below-market wages, they allow software development costs in the commercial sector to be reduced or ... arrangements. Presently, the command/management director is far too often at a technological disadvantage because of the job assignment structure. ... de facto commercial standards on the supply of both raw and skilled labor needs to be evaluated in light of the purely technological disadvantages or ...

  8. 78 FR 73589 - Energy Conservation Program: Energy Conservation Standards for Commercial and Industrial Electric...

    Science.gov (United States)

    2013-12-06

    ... Conservation Program: Energy Conservation Standards for Commercial and Industrial Electric Motors; Proposed... Conservation Program: Energy Conservation Standards for Commercial and Industrial Electric Motors AGENCY... proposes energy conservation standards for a number of different groups of electric motors that DOE has not...

  9. Software database creature for investment property measurement according to international standards

    Science.gov (United States)

    Ponomareva, S. V.; Merzliakova, N. A.

    2018-05-01

    The article deals with investment property measurement and accounting problems at the international, national and enterprise levels. The need to create the software for investment property measurement according to International Accounting Standards was substantiated. The necessary software functions and the processes were described.

  10. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  11. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible; we will implement an architecturally driven design process; this architectural process will be implemented using Object Technology; we aim for platform independence, try to take advantage of distributed computing, and will use industry standards, commercial software and profit from HEP developments; we will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1.

  12. 75 FR 52378 - Transfer of Commercial Standard Mail Parcels to Competitive Product List

    Science.gov (United States)

    2010-08-25

    ..., 2010, the United States Postal Service[reg] filed with the Postal Regulatory Commission a Request of the United States Postal Service to transfer commercial Standard Mail Parcels from the Mail... POSTAL SERVICE Transfer of Commercial Standard Mail Parcels to Competitive Product List AGENCY...

  13. Accounting treatment of software development costs according to applicable accounting standards

    Directory of Open Access Journals (Sweden)

    Dilyana Markova

    2017-05-01

    Full Text Available The growth of the software sector worldwide is ahead of the creation and updating of the accounting standards that regulate the reporting of the products and services it creates. Applicable standards are interpreted differently across countries, which leads to incomplete reports. This imposes the adoption and application of interpretations that give specific guidelines and rules on the accounting treatment of R&D expenditure at each phase of the software project life cycle and on disclosure of the information in the financial statements.
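    The phase-dependent treatment discussed above can be expressed as a small decision rule. The sketch below is a toy illustration in the spirit of IAS 38 (research-phase costs expensed, development-phase costs capitalised only when the recognition criteria are all met); it is not drawn from the article itself.

        def classify_software_cost(phase, amount, recognition_criteria_met=False):
            """Return ('expense' | 'capitalise', amount) for a software project cost item.
            Research-phase expenditure is always expensed; development-phase expenditure is
            capitalised only when the recognition criteria (technical feasibility, intention
            and ability to complete and use or sell, probable future benefits, adequate
            resources, reliable measurement) are all satisfied."""
            if phase == "development" and recognition_criteria_met:
                return "capitalise", amount
            return "expense", amount

        print(classify_software_cost("research", 12000))
        print(classify_software_cost("development", 45000, recognition_criteria_met=True))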

  14. Commercialization and Standardization Progress Towards an Optical Communications Earth Relay

    Science.gov (United States)

    Edwards, Bernard L.; Israel, David J.

    2015-01-01

    NASA is planning to launch the next generation of a space based Earth relay in 2025 to join the current Space Network, consisting of Tracking and Data Relay Satellites in space and the corresponding infrastructure on Earth. While the requirements and architecture for that relay satellite are unknown at this time, NASA is investing in communications technologies that could be deployed to provide new communications services. One of those new technologies is optical communications. The Laser Communications Relay Demonstration (LCRD) project, scheduled for launch in 2018 as a hosted payload on a commercial communications satellite, is a critical pathfinder towards NASA providing optical communications services on the next generation space based relay. This paper will describe NASA efforts in the on-going commercialization of optical communications and the development of inter-operability standards. Both are seen as critical to making optical communications a reality on future NASA science and exploration missions. Commercialization is important because NASA would like to eventually be able to simply purchase an entire optical communications terminal from a commercial provider. Inter-operability standards are needed to ensure that optical communications terminals developed by one vendor are compatible with the terminals of another. International standards in optical communications would also allow the space missions of one nation to use the infrastructure of another.

  15. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  16. Noise data management using commercially available data-base software

    International Nuclear Information System (INIS)

    Damiano, B.; Thie, J.A.

    1988-01-01

    A data base has been created using commercially available software to manage the data collected by an automated noise data acquisition system operated by Oak Ridge National Laboratory at the Fast Flux Test Facility (FFTF). The data base was created to store, organize, and retrieve selected features of the nuclear and process signal noise data, because the large volume of data collected by the automated system makes manual data handling and interpretation based on visual examination of noise signatures impractical. Compared with manual data handling, use of the data base allows the automatically collected data to be utilized more fully and effectively. The FFTF noise data base uses the Oracle Relational Data Base Management System implemented on a desktop personal computer
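    The kind of feature store described above maps naturally onto a small relational schema. The sketch below uses SQLite rather than the Oracle system mentioned in the abstract, and the table and column names are purely illustrative.

        import sqlite3

        conn = sqlite3.connect("noise_features.db")
        conn.execute("""
            CREATE TABLE IF NOT EXISTS noise_features (
                id              INTEGER PRIMARY KEY,
                signal_name     TEXT NOT NULL,
                acquired_at     TEXT NOT NULL,
                rms_amplitude   REAL,
                peak_freq_hz    REAL,
                coherence_ref   REAL
            )""")
        conn.execute(
            "INSERT INTO noise_features (signal_name, acquired_at, rms_amplitude, peak_freq_hz, coherence_ref) "
            "VALUES (?, ?, ?, ?, ?)",
            ("neutron_detector_01", "1987-06-01T12:00:00", 0.042, 0.8, 0.91))
        conn.commit()

        # Retrieve selected features instead of scanning raw signatures by eye
        for name, freq in conn.execute(
                "SELECT signal_name, peak_freq_hz FROM noise_features WHERE rms_amplitude > ?", (0.01,)):
            print(name, freq)
        conn.close()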

  17. Software testing in roughness calculation

    International Nuclear Information System (INIS)

    Chen, Y L; Hsieh, P F; Fu, W E

    2005-01-01

    A test method to determine the quality of the functions provided by software for roughness measurement is presented in this study. The function quality of the software requirements should be part of, and assessed through, the entire life cycle of the software package. The specific function, or output accuracy, is crucial for the analysis of the experimental data. For scientific applications, however, commercial software is usually embedded in a specific instrument, which is used for measurement or analysis during the manufacturing process. In general, the error ratio caused by the software becomes more apparent when dealing with relatively small quantities, like measurements in the nanometer-scale range. The model of 'using a data generator' proposed by NPL of the UK was applied in this study. An example of roughness software is tested and analyzed by the above-mentioned process. After selecting the 'reference results', the 'reference data' were generated by a programmable 'data generator'. The filter function with a 0.8 mm cutoff value, defined in ISO 11562, was tested with 66 sinusoidal datasets at different wavelengths. Test results from the commercial software and a CMS-written program were compared to the theoretical data calculated from the ISO standards. For the filter function in this software, the results showed a significant disagreement between the reference and test results. The short-cutoff feature for filtering at high frequencies does not function properly, while the long-cutoff feature has a maximum difference in the filtering ratio of more than 70% between wavelengths of 300 μm and 500 μm. In conclusion, the commercial software needs to be tested more extensively for specific applications, with appropriately designed reference datasets, to ensure its function quality.
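    The 'data generator' test strategy described above can be reproduced for the Gaussian profile filter of ISO 11562: generate sinusoidal reference profiles of known wavelength, filter them with a 0.8 mm cutoff, and compare the residual (roughness) amplitude with the analytic transmission characteristic. The sketch below is a generic reimplementation for illustration, not the CMS-written program referred to in the abstract.

        import math
        import numpy as np

        ALPHA = math.sqrt(math.log(2.0) / math.pi)          # Gaussian filter constant (ISO 11562)

        def gaussian_weights(dx_mm, cutoff_mm):
            """Discretised Gaussian weighting function, truncated at +/- 3 cutoff lengths."""
            half = 3 * int(round(cutoff_mm / dx_mm))
            x = np.arange(-half, half + 1) * dx_mm
            s = np.exp(-math.pi * (x / (ALPHA * cutoff_mm)) ** 2)
            return s / s.sum()

        def roughness_profile(profile, dx_mm, cutoff_mm):
            """Roughness = raw profile minus the Gaussian mean line."""
            return profile - np.convolve(profile, gaussian_weights(dx_mm, cutoff_mm), mode="same")

        def transmission_check(wavelength_mm, cutoff_mm=0.8, dx_mm=0.001, periods=40):
            """Measured vs. theoretical roughness transmission for a unit-amplitude sinusoid."""
            x = np.arange(0.0, periods * wavelength_mm, dx_mm)
            z = np.sin(2.0 * math.pi * x / wavelength_mm)
            r = roughness_profile(z, dx_mm, cutoff_mm)
            core = slice(len(x) // 4, 3 * len(x) // 4)       # avoid convolution edge effects
            measured = (r[core].max() - r[core].min()) / 2.0
            theoretical = 1.0 - math.exp(-math.pi * (ALPHA * cutoff_mm / wavelength_mm) ** 2)
            return measured, theoretical

        for wl in (0.3, 0.5, 0.8):                           # wavelengths in mm
            print(wl, transmission_check(wl))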

  18. Commercial software upgrades may significantly alter Perfusion CT parameter values in colorectal cancer

    International Nuclear Information System (INIS)

    Goh, Vicky; Shastry, Manu; Endozo, Raymondo; Groves, Ashley M.; Engledow, Alec; Peck, Jacqui; Reston, Jonathan; Wellsted, David M.; Rodriguez-Justo, Manuel; Taylor, Stuart A.; Halligan, Steve

    2011-01-01

    To determine how commercial software platform upgrades affect derived Perfusion CT parameters in colorectal cancer. Following ethical approval, 30 patients with suspected colorectal cancer underwent Perfusion CT using an integrated 64-detector PET/CT system before surgery. Analysis was performed using software based on modified distributed parameter analysis (Perfusion software version 4; Perfusion 4.0), then repeated using the previous version (Perfusion software version 3; Perfusion 3.0). Tumour blood flow (BF), blood volume (BV), mean transit time (MTT) and permeability surface area product (PS) were determined for identical regions of interest. Slice-by-slice and 'whole tumour' variance was assessed by Bland-Altman analysis. Mean BF, BV and PS were 20.4%, 59.5%, and 106% higher, and MTT was 14.3% shorter, for Perfusion 4.0 than for Perfusion 3.0. The mean differences (95% limits of agreement) were +13.5 (-44.9 to 72.0), +2.61 (-0.06 to 5.28), -1.23 (-6.83 to 4.36), and +14.2 (-4.43 to 32.8) for BF, BV, MTT and PS respectively. The within-subject coefficient of variation was 36.6%, 38.0%, 27.4% and 60.6% for BF, BV, MTT and PS respectively, indicating moderate to poor agreement. Software version upgrades of the same software platform may result in significantly different parameter values, requiring adjustments for cross-version comparison. (orig.)
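    The agreement statistics used in this comparison (Bland-Altman bias with 95% limits of agreement, and a within-subject coefficient of variation) can be computed as below. The within-subject CV formula shown is one common formulation for paired measurements and may not be exactly the one the authors used.

        import numpy as np

        def bland_altman(values_v3, values_v4):
            """Return (mean difference, (lower LoA, upper LoA), within-subject CV in %)
            for the same parameter computed with software versions 3.0 and 4.0."""
            v3 = np.asarray(values_v3, dtype=float)
            v4 = np.asarray(values_v4, dtype=float)
            diff = v4 - v3
            bias = diff.mean()
            sd = diff.std(ddof=1)
            limits = (bias - 1.96 * sd, bias + 1.96 * sd)
            pair_means = (v3 + v4) / 2.0
            ws_cv = 100.0 * np.sqrt(np.mean((diff / pair_means) ** 2) / 2.0)
            return bias, limits, ws_cv

        # Hypothetical blood-flow values (mL/100 g/min) for a few tumours
        print(bland_altman([60, 75, 82, 55, 90], [72, 88, 101, 70, 110]))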

  19. Challenges in Commercial Buildings | Buildings | NREL

    Science.gov (United States)

    ... systems. Assessing the energy and economic impacts of various technologies, giving priority to those that ... A standardized language for commercial building energy audit data that can be used by software developers to exchange data between audit tools, and can be required by building owners and audit program managers to ...

  20. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    Science.gov (United States)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program, which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out automatic navigation, guidance, flight controls, and electronic displays research. The techniques developed for time and cost reduction include automatic documentation aids, an automatic software configuration, and an all-software generation and validation system.

  1. The role of open-source software in innovation and standardization in radiology.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve; Nagy, Paul

    2005-11-01

    The use of open-source software (OSS), in which developers release the source code to applications they have developed, is popular in the software industry. This is done to allow others to modify and improve software (which may or may not be shared back to the community) and to allow others to learn from the software. Radiology was an early participant in this model, supporting OSS that implemented the ACR-National Electrical Manufacturers Association (now Digital Imaging and Communications in Medicine) standard for medical image communications. In radiology and in other fields, OSS has promoted innovation and the adoption of standards. Popular OSS is of high quality because access to source code allows many people to identify and resolve errors. Open-source software is analogous to the peer-review scientific process: one must be able to see and reproduce results to understand and promote what is shared. The authors emphasize that support for OSS need not threaten vendors; most vendors embrace and benefit from standards. Open-source development does not replace vendors but more clearly defines their roles, typically focusing on areas in which proprietary differentiators benefit customers and on professional services such as implementation planning and service. Continued support for OSS is essential for the success of our field.

  2. The Image of User Instructions: Comparing Users' Expectations of and Experiences with an Official and a Commercial Software Manual

    NARCIS (Netherlands)

    de Jong, Menno D.T.; Karreman, Joyce

    2017-01-01

    Purpose: The market for (paid-for) commercial software manuals is flourishing, while (free) official manuals are often assumed to be neglected by users. To investigate differences in user perceptions of commercial and official manuals, we conducted two studies: one focusing on user expectations and

  3. Standards guide for space and earth sciences computer software

    Science.gov (United States)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  4. A Model for Joint Software Reviews

    Science.gov (United States)

    1998-10-01

    CEPMAN 1, 1996; Gabb, 1997], and with the growing popularity of outsourcing, they are becoming more important in the commercial sector [ISO/IEC 12207 ... technical and management reviews [MIL-STD-498, 1996; ISO/IEC 12207, 1995]. Management reviews occur after technical reviews, and are focused on the cost ... characteristics, Standard (No. ISO/IEC 9126-1). [ISO/IEC 12207, 1995] Information Technology Software Life Cycle Processes, Standard (No. ISO/IEC 12207

  5. Two‐year experience with the commercial Gamma Knife Check software

    Science.gov (United States)

    Bhatnagar, Jagdish; Bednarz, Greg; Novotny, Josef; Flickinger, John; Lunsford, L. Dade; Huq, M. Saiful

    2016-01-01

    The Gamma Knife Check software is an FDA approved second check system for dose calculations in Gamma Knife radiosurgery. The purpose of this study was to evaluate the accuracy and the stability of the commercial software package as a tool for independent dose verification. The Gamma Knife Check software version 8.4 was commissioned for a Leksell Gamma Knife Perfexion and a 4C unit at the University of Pittsburgh Medical Center in May 2012. Independent dose verifications were performed using this software for 319 radiosurgery cases on the Perfexion and 283 radiosurgery cases on the 4C units. The cases on each machine were divided into groups according to their diagnoses, and an averaged absolute percent dose difference for each group was calculated. The percentage dose difference for each treatment target was obtained as the relative difference between the Gamma Knife Check dose and the dose from the tissue maximum ratio algorithm (TMR 10) from the GammaPlan software version 10 at the reference point. For treatment plans with imaging skull definition, results obtained from the Gamma Knife Check software using the measurement‐based skull definition method are used for comparison. The collected dose difference data were also analyzed in terms of the distance from the treatment target to the skull, the number of treatment shots used for the target, and the gamma angles of the treatment shots. The averaged percent dose differences between the Gamma Knife Check software and the GammaPlan treatment planning system are 0.3%, 0.89%, 1.24%, 1.09%, 0.83%, 0.55%, 0.33%, and 1.49% for the trigeminal neuralgia, acoustic neuroma, arteriovenous malformation (AVM), meningioma, pituitary adenoma, glioma, functional disorders, and metastasis cases on the Perfexion unit. The corresponding averaged percent dose differences for the 4C unit are 0.33%, 1.2%, 2.78% 1.99%, 1.4%, 1.92%, 0.62%, and 1.51%, respectively. The dose difference is, in general, larger for treatment targets in the
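    The per-target statistic underlying the figures above is simply the relative difference between the second-check dose and the TMR 10 dose at the reference point, averaged as absolute values within each diagnosis group. A minimal sketch, with made-up case data:

        from collections import defaultdict

        def percent_dose_difference(check_dose_gy, plan_dose_gy):
            """Relative difference at the reference point, in percent of the planned dose."""
            return 100.0 * (check_dose_gy - plan_dose_gy) / plan_dose_gy

        def mean_absolute_difference_by_diagnosis(cases):
            """cases: iterable of (diagnosis, check_dose_Gy, plan_dose_Gy) tuples."""
            grouped = defaultdict(list)
            for diagnosis, check, plan in cases:
                grouped[diagnosis].append(abs(percent_dose_difference(check, plan)))
            return {d: sum(v) / len(v) for d, v in grouped.items()}

        # Hypothetical example
        cases = [("trigeminal neuralgia", 80.2, 80.0),
                 ("acoustic neuroma", 12.9, 13.0),
                 ("AVM", 20.3, 20.0),
                 ("AVM", 19.8, 20.0)]
        print(mean_absolute_difference_by_diagnosis(cases))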

  6. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  7. Transport behaviour of commercially available 100-Ω standard resistors

    CSIR Research Space (South Africa)

    Schumacher, B

    2001-04-01

    Full Text Available Several types of commercial 100-Ω resistors can be used with the cryogenic current comparator to maintain the resistance unit, derived from the Quantized Hall Effect (QHE), and to disseminate this unit to laboratory resistance standards. Up...

  8. Employing industrial standards in software engineering for W7X

    Energy Technology Data Exchange (ETDEWEB)

    Kuehner, Georg [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany)], E-mail: kuehner@ipp.mpg.de; Bluhm, Torsten [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Heimann, Peter [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Hennig, Christine [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Kroiss, Hugo [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Krueger, Alexander [University of Applied Sciences, Schwedenschanze 135, 18435 Stralsund (Germany); Laqua, Heike; Lewerentz, Marc [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Maier, Josef [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Riemann, Heike; Schacht, Joerg; Spring, Anett; Werner, Andreas [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Teilinstitut Greifswald, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Zilker, Manfred [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2009-06-15

    The stellarator W7X is a large, complex experiment designed for continuous operation and planned to be operated for about 20 years. Software support is highly demanded for experiment preparation, operation and data analysis, which in turn induces serious non-functional requirements on the software quality, e.g. high availability, stability and maintainability vs. high flexibility concerning change of functionality, technology and personnel, as well as high versatility concerning the scale of system size and performance. These challenges are best met by exploiting industrial experience in quality management and assurance (QM/QA), e.g. focusing on top-down development methods, developing an integral functional system model, using UML as a diagramming standard, building vertical prototypes, support for distributed development, etc., which have been used for W7X, however on an 'as necessary' basis. Proceeding in this manner gave significant results for control, data acquisition, corresponding database structures and user applications over many years. As soon as production systems started using the software in the labs or on a prototype, the development activity demanded to be organized in a more rigorous process, mainly to provide stable operating conditions. Thus a process improvement activity was started for the stepwise introduction of quality-assuring processes with tool support, taking standards like CMMI and ISO-15504 (SPICE) as a guideline. Experiences obtained so far will be reported. We conclude that software engineering and quality assurance have to be an integral part of systems engineering right from the beginning of projects and be organized according to industrial standards to be prepared for the challenges of nuclear fusion research.

  9. 76 FR 67480 - Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB...

    Science.gov (United States)

    2011-11-01

    ...] Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB) Approval... Commercial Diving Operations Standard (29 CFR part 1910, subpart T). DATES: Comments must be submitted... existing Standard on Commercial Diving Operations (29 CFR part 1910, Subpart T).

  10. Application of industry-standard guidelines for the validation of avionics software

    Science.gov (United States)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  11. Improved detection of pulmonary nodules on energy-subtracted chest radiographs with a commercial computer-aided diagnosis software: comparison with human observers

    International Nuclear Information System (INIS)

    Szucs-Farkas, Zsolt; Patak, Michael A.; Yuksel-Hatz, Seyran; Ruder, Thomas; Vock, Peter

    2010-01-01

    To retrospectively analyze the performance of commercial computer-aided diagnosis (CAD) software in the detection of pulmonary nodules in original and energy-subtracted (ES) chest radiographs. Original and ES chest radiographs of 58 patients with 105 pulmonary nodules measuring 5-30 mm and images of 25 control subjects with no nodules were randomized. Five blinded readers first evaluated the original postero-anterior images alone and then together with the subtracted radiographs. In a second phase, original and ES images were analyzed by a commercial CAD program. CT was used as the reference standard. CAD results were compared to the readers' findings. True-positive (TP) and false-positive (FP) findings with CAD on subtracted and non-subtracted images were compared. Depending on the reader's experience, CAD detected between 11 and 21 nodules missed by readers. Human observers found three to 16 lesions missed by the CAD software. CAD used with ES images produced significantly fewer FPs than with non-subtracted images: 1.75 and 2.14 FPs per image, respectively (p=0.029). The difference for TP nodules was not significant (40 nodules on ES images and 34 lesions on non-subtracted radiographs, p = 0.142). CAD can improve lesion detection both on energy-subtracted and non-subtracted chest images, especially for less experienced readers. The CAD program marked fewer FPs on energy-subtracted images than on original chest radiographs. (orig.)
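    The comparison of false-positive marks per image on subtracted versus non-subtracted radiographs is a paired comparison across the same patients. The sketch below uses a Wilcoxon signed-rank test as the paired non-parametric test; the abstract does not state which test the authors actually used, so that choice, like the example counts, is an assumption.

        import numpy as np
        from scipy.stats import wilcoxon

        def compare_false_positives(fp_subtracted, fp_original):
            """fp_*: number of CAD false-positive marks per image, same patients in the same order."""
            fp_es = np.asarray(fp_subtracted, dtype=float)
            fp_or = np.asarray(fp_original, dtype=float)
            statistic, p_value = wilcoxon(fp_es, fp_or)   # paired, non-parametric comparison
            return fp_es.mean(), fp_or.mean(), p_value

        # Hypothetical per-image counts for eight patients
        print(compare_false_positives([1, 2, 0, 1, 3, 1, 2, 2], [2, 4, 1, 3, 5, 2, 3, 4]))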

  12. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    Directory of Open Access Journals (Sweden)

    Cevdet Kızıl

    2014-08-01

    Full Text Available The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. This study takes advantage of the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms were distributed to university students personally and via the internet. This paper includes significant research questions such as “Are accounting academicians informed and knowledgeable on the new Turkish commercial code and Turkish accounting standards?”, “Do accounting academicians integrate the new Turkish commercial code and Turkish accounting standards into their lectures?”, “How do modern accounting education methodology and technology coincide with the teaching of the new Turkish commercial code and Turkish accounting standards?”, “Do universities offer mandatory and elective courses which cover the new Turkish commercial code and Turkish accounting standards?” and “If such courses are offered, what are their names, percentage in the curriculum and degree of coverage?” The research contributes to the literature in several ways. Firstly, the new Turkish commercial code and Turkish accounting standards are current, significant topics for the accounting profession. Furthermore, accounting education provides a basis for implementation in the public and private sectors. Besides, one of the intentions of the new Turkish commercial code and Turkish accounting standards is to foster transparency. That is definitely a critical concept also in terms of mergers, acquisitions and investments. Stakeholders of today’s business world, such as investors, shareholders, entrepreneurs, auditors and government, are in need of more standardized global accounting principles. Thus, the revision and redesign of accounting education plays an important role. The points emphasized also clearly prove the necessity and functionality of this research.

  13. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  14. 76 FR 38153 - California State Nonroad Engine Pollution Control Standards; Commercial Harbor Craft Regulations...

    Science.gov (United States)

    2011-06-29

    ... Standards; Commercial Harbor Craft Regulations; Opportunity for Public Hearing and Comment AGENCY... engines on commercial harbor craft. CARB has requested that EPA issue a new authorization under... SUPPLEMENTARY INFORMATION: I. California's Commercial Harbor Craft Regulations. In a...

  15. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  16. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  17. Energy efficiency standards for residential and commercial equipment: Additional opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Rosenquist, Greg; McNeil, Michael; Iyer, Maithili; Meyers, Steve; McMahon, Jim

    2004-08-02

    Energy efficiency standards set minimum levels of energy efficiency that must be met by new products. Depending on the dynamics of the market and the level of the standard, the effect on the market for a given product may be small, moderate, or large. Energy efficiency standards address a number of market failures that exist in the buildings sector. Decisions about efficiency levels often are made by people who will not be responsible for the energy bill, such as landlords or developers of commercial buildings. Many buildings are occupied for their entire lives by very temporary owners or renters, each unwilling to make long-term investments that would mostly reward subsequent users. And sometimes what looks like apathy about efficiency merely reflects inadequate information or time invested to evaluate it. In addition to these sector-specific market failures, energy efficiency standards address the endemic failure of energy prices to incorporate externalities. In the U.S., energy efficiency standards for consumer products were first implemented in California in 1977. National standards became effective starting in 1988. By the end of 2001, national standards were in effect for over a dozen residential appliances, as well as for a number of commercial sector products. Updated standards will take effect in the next few years for several products. Outside the U.S., over 30 countries have adopted minimum energy performance standards. Technologies and markets are dynamic, and additional opportunities to improve energy efficiency exist. There are two main avenues for extending energy efficiency standards. One is upgrading standards that already exist for specific products. The other is adopting standards for products that are not covered by existing standards. In the absence of new and upgraded energy efficiency standards, it is likely that many new products will enter the stock with lower levels of energy efficiency than would otherwise be the case. Once in the stock

  18. Software System for the Calibration of X-Ray Measuring Instruments

    International Nuclear Information System (INIS)

    Gaytan-Gallardo, E.; Tovar-Munoz, V. M.; Cruz-Estrada, P.; Vergara-Martinez, F. J.; Rivero-Gutierrez, T.

    2006-01-01

    A software system that facilitates the calibration of X-ray measuring instruments used in medical applications is presented. The Secondary Standard Dosimetry Laboratory (SSDL) of the National Institute of Nuclear Research in Mexico (ININ, by its Spanish acronym) supports activities concerning ionizing radiation in the medical area. One of these activities is the calibration of X-ray measuring instruments, in terms of air kerma or exposure, by the substitution method in an X-ray beam at a point where the rate has been determined by means of a standard ionization chamber. To automate this process, a software system has been developed; the calibration system is composed of an X-ray unit, a Dynalizer IIIU X-ray meter by RADCAL, a commercial data acquisition card, the software system and the units to be tested and calibrated. A quality control plan has been applied in the development of the software system, ensuring that quality assurance procedures and standards are being followed.
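    The substitution method mentioned above reduces, at its core, to dividing the reference air-kerma value established with the standard chamber by the mean, density-corrected reading of the instrument under test. A simplified sketch with illustrative numbers, omitting the beam-quality bookkeeping a real SSDL system performs:

        def air_density_correction(temp_c, pressure_kpa, ref_temp_c=20.0, ref_pressure_kpa=101.325):
            """k_TP correction for a vented ionization chamber."""
            return ((273.15 + temp_c) / (273.15 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

        def calibration_coefficient(reference_kerma_mGy, readings, temp_c, pressure_kpa):
            """Substitution method: reference air kerma at the calibration point divided by
            the mean corrected reading of the instrument under test (mGy per reading unit)."""
            mean_reading = sum(readings) / len(readings)
            corrected_reading = mean_reading * air_density_correction(temp_c, pressure_kpa)
            return reference_kerma_mGy / corrected_reading

        # Hypothetical calibration run
        print(calibration_coefficient(10.0, [9.87, 9.90, 9.88], temp_c=22.5, pressure_kpa=99.8))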

  19. A ternary phase-field model incorporating commercial CALPHAD software and its application to precipitation in superalloys

    International Nuclear Information System (INIS)

    Wen, Y.H.; Lill, J.V.; Chen, S.L.; Simmons, J.P.

    2010-01-01

    A ternary phase-field model was developed that is linked directly to commercial CALPHAD software to provide quantitative thermodynamic driving forces. A recently available diffusion mobility database for ordered phases is also implemented to give a better description of the diffusion behavior in alloys. Because the targeted application of this model is the study of precipitation in Ni-based superalloys, a Ni-Al-Cr model alloy was constructed. A detailed description of this model is given in the paper. We have considered the misfit effects of the partitioning of the two solute elements. Transformation rules of the dual representation of the γ+γ′ microstructure by CALPHAD and by the phase field are established and the link with commercial CALPHAD software is described. Proof-of-concept tests were performed to evaluate the model and the results demonstrate that the model can qualitatively reproduce observed γ′ precipitation behavior. Uphill diffusion of Al is observed in a few diffusion couples, showing the significant influence of Cr on the chemical potential of Al. Possible applications of this model are discussed.
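    As a generic illustration of the phase-field machinery referred to above (and only that: the sketch below is a plain one-dimensional Allen-Cahn solver with a double-well potential, not the ternary, CALPHAD-coupled model of the paper), an explicit time-stepping loop looks like this:

        import numpy as np

        def allen_cahn_1d(n=200, steps=5000, dx=1.0, dt=0.05, mobility=1.0, kappa=2.0):
            """Explicit Allen-Cahn evolution of a non-conserved order parameter phi in 1D.
            Free energy density f(phi) = phi^2 (1 - phi)^2 plus a gradient penalty kappa."""
            phi = np.where(np.arange(n) < n // 2, 1.0, 0.0)      # initial sharp interface
            phi += 0.01 * np.random.rand(n)
            for _ in range(steps):
                lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2   # periodic Laplacian
                df_dphi = 2.0 * phi * (1.0 - phi) * (1.0 - 2.0 * phi)            # double-well derivative
                # in the paper's model the bulk driving force would instead come from CALPHAD
                phi = phi - dt * mobility * (df_dphi - kappa * lap)
            return phi

        profile = allen_cahn_1d()
        print(profile.min(), profile.max())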

  20. Inside a VAMDC data node—putting standards into practical software

    Science.gov (United States)

    Regandell, Samuel; Marquart, Thomas; Piskunov, Nikolai

    2018-03-01

    Access to molecular and atomic data is critical for many forms of remote sensing analysis across different fields. Many atomic and molecular databases are, however, highly specialised for their intended application, complicating querying and combining data between sources. The Virtual Atomic and Molecular Data Centre, VAMDC, is an electronic infrastructure that allows each database to register as a ‘node’. Through services such as VAMDC’s portal website, users can then access and query all nodes in a homogenised way. Today all major atomic and molecular databases are attached to VAMDC. This article describes the software tools we developed to help data providers create and manage a VAMDC node. It gives an overview of the VAMDC infrastructure and of the various standards it uses. The article then discusses the development choices made and how the standards are implemented in practice. It concludes with a full example of implementing a VAMDC node using a real-life case, as well as future plans for the node software.
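    The homogenised access described above is provided by the VAMDC-TAP protocol, in which each node exposes a synchronous query endpoint. The sketch below shows roughly what a client-side query could look like; the node URL is invented, and the exact endpoint path and parameter set should be taken from the node's registry entry rather than from this example.

        import urllib.parse
        import urllib.request

        def query_vamdc_node(node_tap_base_url, vss2_query):
            """Send a synchronous query to a VAMDC node and return the raw XSAMS (XML) response."""
            params = urllib.parse.urlencode({
                "LANG": "VSS2",          # VAMDC SQL subset used for the query
                "FORMAT": "XSAMS",       # standard XML output schema for atomic/molecular data
                "QUERY": vss2_query,
            })
            url = f"{node_tap_base_url}/sync?{params}"
            with urllib.request.urlopen(url, timeout=60) as response:
                return response.read()

        # Hypothetical usage against an invented node address
        # xsams = query_vamdc_node(
        #     "https://vamdc-node.example.org/tap",
        #     "SELECT ALL WHERE RadTransWavelength >= 5000.0 AND RadTransWavelength <= 5005.0")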

  1. Understanding the Perception of Very Small Software Companies towards the Adoption of Process Standards

    Science.gov (United States)

    Basri, Shuib; O'Connor, Rory V.

    This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards, and their willingness to engage with the new ISO/IEC 29110 standard in particular. In order to achieve this goal, a series of industry data collection studies was undertaken with a collection of VSEs. A twin-track approach of qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire) was undertaken. Data analysis was completed separately and the final results were merged using the coding mechanisms of grounded theory. This paper serves as a roadmap both for researchers wishing to understand the issues of process standards adoption by very small companies and for the software process standards community.

  2. Progress on standardization and automation in software development on W7X

    International Nuclear Information System (INIS)

    Kühner, Georg; Bluhm, Torsten; Heimann, Peter; Hennig, Christine; Kroiss, Hugo; Krom, Jon; Laqua, Heike; Lewerentz, Marc; Maier, Josef; Schacht, Jörg; Spring, Anett; Werner, Andreas; Zilker, Manfred

    2012-01-01

    Highlights: ► For W7X software development the use of ISO/IEC 15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction to priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X, which is subject to change throughout its projected lifetime, the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC 12207 provide a guideline for structuring development work and improving process and product quality. A considerable number of tools have emerged that support and automate parts of the development work. On W7X, progress has been made during the last years in exploiting the benefits of automation and management during software development: –Continuous build, integration and automated test of software artefacts. ∘Syntax checks and code quality metrics. ∘Documentation generation. ∘Feedback for developers by temporal statistics. –Versioned repository for build products (libraries, executables). –Separate snapshot and release repositories and automatic deployment. –Semi-automatic provisioning of applications. –Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized interfaces, providing the capability to construct arbitrary function aggregates for dedicated tests of quality attributes such as availability, reliability

  3. Progress on standardization and automation in software development on W7X

    Energy Technology Data Exchange (ETDEWEB)

    Kuehner, Georg, E-mail: kuehner@ipp.mpg.de [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Bluhm, Torsten [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Heimann, Peter [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Hennig, Christine [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Kroiss, Hugo [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Krom, Jon; Laqua, Heike; Lewerentz, Marc [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Maier, Josef [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Schacht, Joerg; Spring, Anett; Werner, Andreas [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Zilker, Manfred [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2012-12-15

    Highlights: ► For W7X software development the use of ISO/IEC 15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction to priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X, which is subject to change throughout its projected lifetime, the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC 12207 provide a guideline for structuring development work and improving process and product quality. A considerable number of tools have emerged that support and automate parts of the development work. On W7X, progress has been made during the last years in exploiting the benefits of automation and management during software development: –Continuous build, integration and automated test of software artefacts. ∘Syntax checks and code quality metrics. ∘Documentation generation. ∘Feedback for developers by temporal statistics. –Versioned repository for build products (libraries, executables). –Separate snapshot and release repositories and automatic deployment. –Semi-automatic provisioning of applications. –Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized

  4. Software-Defined Solutions for Managing Energy Use in Small to Medium Sized Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Peffer, Therese [Univ. of California, Berkeley, CA (United States); Council on International Education Exchange (CIEE), Portland, ME (United States); Blumstein, Carl [Council on International Education Exchange (CIEE), Portland, ME (United States); Culler, David [Univ. of California, Berkeley, CA (United States). Electrical Engineering and Computer Sciences (EECS); Modera, Mark [Univ. of California, Davis, CA (United States). Western Cooling Efficiency Center (WCEC); Meier, Alan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-09-10

    The Project uses state-of-the-art computer science to extend the benefits of Building Automation Systems (BAS), typically found in large buildings (>100,000 square feet), to medium-sized commercial buildings (<50,000 sq ft). The BAS developed in this project, termed OpenBAS, uses an open-source and open software architecture platform, user interface, and plug-and-play control devices to facilitate adoption of energy efficiency strategies in the commercial building sector throughout the United States. At the heart of this “turn-key” BAS is the platform with three types of controllers—thermostat, lighting controller, and general controller—that are easily “discovered” by the platform in a plug-and-play fashion. The user interface showcases the platform and provides the control system set-up, system status display, and a means of automatically mapping the control points in the system.

  5. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.
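
    The routine evaluations listed above are not tied to any single package; as a small, generic illustration (with made-up burn-rate numbers rather than program data), the snippet below computes descriptive statistics for two samples and a two-sample t-test of the kind described.

        import statistics
        from scipy import stats  # widely available; used here only for the t-test

        # Made-up burn-rate strand data (in/s) for two hypothetical propellant lots.
        lot_a = [0.368, 0.371, 0.365, 0.370, 0.369, 0.372]
        lot_b = [0.361, 0.366, 0.363, 0.362, 0.365, 0.360]

        # Descriptive statistics summarise each sample.
        for name, lot in (("A", lot_a), ("B", lot_b)):
            print(name, round(statistics.mean(lot), 4), round(statistics.stdev(lot), 4))

        # Inferential statistics: does the mean burn rate differ between lots?
        t_stat, p_value = stats.ttest_ind(lot_a, lot_b)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")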

  6. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Acronyms used in the report: LTS (Label Transition System), MUSE (Mining and Understanding Software Enclaves), RTEMS (Real-Time Executive for Multi-processor Systems), SaaS (Software as a Service), SSA (Static Single Assignment), SWE (Software Epistemology), UD/DU (Def-Use/Use-Def Chains, dataflow graph).

  7. Experience with case tools in the design of process-oriented software

    Science.gov (United States)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools, and the lack of Object-Oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  8. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents a V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example, independence philosophy, software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, including the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. This technical report covers the scope of the V and V guideline, the guideline framework as part of acceptance criteria, V and V activities with task entrance and exit criteria, review and audit, testing and QA records of V and V material and configuration management, software verification and validation plan production, etc., and safety-critical software V and V methodology. (author). 11 refs.

  9. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents a V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example, independence philosophy, software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, including the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. This technical report covers the scope of the V and V guideline, the guideline framework as part of acceptance criteria, V and V activities with task entrance and exit criteria, review and audit, testing and QA records of V and V material and configuration management, software verification and validation plan production, etc., and safety-critical software V and V methodology. (author). 11 refs

  10. 78 FR 54197 - Energy Efficiency Program for Commercial and Industrial Equipment: Energy Conservation Standards...

    Science.gov (United States)

    2013-09-03

    .... EERE-2013-BT-STD-0030] RIN 1904-AD01 Energy Efficiency Program for Commercial and Industrial Equipment: Energy Conservation Standards for Commercial Packaged Boilers AGENCY: Office of Energy Efficiency and..., Office of Energy Efficiency and Renewable Energy, Building Technologies Office, EE-2J, 1000 Independence...

  11. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  12. 76 FR 9817 - Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB...

    Science.gov (United States)

    2011-02-22

    ...] Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB) Approval... Commercial Diving Operations Standard (29 CFR part 1910, subpart T). DATES: Comments must be submitted... obtaining information (29 U.S.C. 657). Subpart T applies to diving and related support operations conducted...

  13. An Empirical Study of a Free Software Company

    OpenAIRE

    Pakusch, Cato

    2010-01-01

    Free software has matured well into the commercial software market, yet little qualitative research exists which accurately describes the state of commercial free software today. For this thesis, an instrumental case study was performed on a prominent free software company in Norway. The study found that the commercial free software market is largely driven by social networks, which have a social capital of their own that attracts more people, who in turn become members of the ...

  14. Safety Review related to Commercial Grade Digital Equipment in Safety System

    International Nuclear Information System (INIS)

    Yu, Yeongjin; Park, Hyunshin; Yu, Yeongjin; Lee, Jaeheung

    2013-01-01

    The upgrades or replacement of I and C systems in safety systems typically involve digital equipment developed in accordance with non-nuclear standards. However, the use of commercial grade digital equipment can introduce vulnerabilities such as software common-mode failure, electromagnetic interference and unanticipated problems. Although guidelines and standards for dedication methods of commercial grade digital equipment are provided, there are some difficulties in applying the methods to commercial grade digital equipment for safety systems. This paper focuses on KINS regulatory guides and relevant documents for dedication of commercial grade digital equipment and presents safety review experiences related to commercial grade digital equipment in safety systems. Dedication, including critical characteristics, is required to use commercial grade digital equipment in a safety system in accordance with KEPIC ENB 6370 and EPRI TR-106439. The dedication process should be controlled in a configuration management process. Appropriate methods, criteria and evaluation results should be provided to verify the acceptability of the commercial digital equipment used for safety functions.

  15. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    Science.gov (United States)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades attention has been paid to the interaction of foundation structures and subsoil and to the development of interaction models. Given that analytical solutions of subsoil-structure interaction can be deduced only for some simple shapes of load, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM – the finite element method). Numerical analysis provides greater possibilities for taking into account the real factors involved in the subsoil-structure interaction and was also used in this article. This makes it possible to design foundation structures more efficiently while keeping them reliable and secure. Currently there are several software packages that can deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software called MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, stress-strain analysis of the elastic half-space is done by means of Gauss numerical integration and the Jacobian of transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.
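
    MKPINTER itself is not reproduced here, but the numerical ingredient the abstract names, Gauss integration of the elastic half-space response with a Jacobian of transformation, can be sketched generically. The code below integrates the classical Boussinesq point-load kernel over a uniformly loaded rectangle using Gauss-Legendre quadrature; the footing dimensions and pressure are illustrative values only.

        import numpy as np

        def sigma_z_center(q, a, b, z, n=8):
            """Vertical stress at depth z under the centre of an a x b rectangle carrying
            uniform pressure q, by 2D Gauss-Legendre integration of the Boussinesq
            point-load kernel 3*z^3 / (2*pi*R^5)."""
            xg, wg = np.polynomial.legendre.leggauss(n)   # nodes and weights on [-1, 1]
            x, y = 0.5 * a * xg, 0.5 * b * xg             # map to the footing (Jacobian = a/2, b/2)
            wx, wy = 0.5 * a * wg, 0.5 * b * wg
            X, Y = np.meshgrid(x, y)
            WX, WY = np.meshgrid(wx, wy)
            R5 = (X**2 + Y**2 + z**2) ** 2.5
            kernel = 3.0 * z**3 / (2.0 * np.pi * R5)
            return q * np.sum(WX * WY * kernel)

        # Example: 2 m x 3 m footing, 100 kPa, stress 1.5 m below the centre.
        print(round(sigma_z_center(100.0, 2.0, 3.0, 1.5), 1), "kPa")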

  16. Technical report on the surface reconstruction of stacked contours by using the commercial software

    Science.gov (United States)

    Shin, Dong Sun; Chung, Min Suk; Hwang, Sung Bae; Park, Jin Seo

    2007-03-01

    After drawing and stacking contours of a structure identified in serially sectioned images, a three-dimensional (3D) image can be made by surface reconstruction. Usually, purpose-built software is written for the surface reconstruction, which requires medical doctors to obtain the help of computer engineers. In this research, therefore, surface reconstruction of stacked contours was attempted using commercial software. The purpose of this research is to enable medical doctors to perform surface reconstruction and make 3D images by themselves. The materials of this research were 996 anatomic images (1 mm intervals) of the left lower limb, which were made by serial sectioning of a cadaver. On Adobe Photoshop, contours of 114 anatomic structures were drawn and exported to Adobe Illustrator files. On Maya, the contours of each anatomic structure were stacked. On Rhino, superoinferior lines were drawn along all stacked contours to fill quadrangular surfaces between contours. On Maya, the contours were then deleted. 3D images of 114 anatomic structures were assembled with their original locations preserved. With the surface reconstruction technique developed in this research, medical doctors themselves could make 3D images from serially sectioned images such as CTs and MRIs.
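
    As a schematic of the geometric step described (filling quadrangular surfaces between consecutive stacked contours), the fragment below pairs corresponding points of two contours that have been resampled to the same point count into quad faces. It illustrates the idea only and is not the Photoshop/Maya/Rhino workflow itself.

        # Build quad faces between two stacked contours that have been resampled to the
        # same number of points; each quad joins point i of the lower contour to point i
        # of the upper contour (indices wrap around to close the loop).
        def quads_between(lower, upper):
            n = len(lower)
            assert len(upper) == n
            faces = []
            for i in range(n):
                j = (i + 1) % n
                faces.append((lower[i], lower[j], upper[j], upper[i]))
            return faces

        # Two tiny illustrative square contours 1 mm apart (x, y, z).
        c0 = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
        c1 = [(0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
        print(len(quads_between(c0, c1)), "quad faces")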

  17. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard a software product should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  18. Comparison of ISO 9000 and recent software life cycle standards to nuclear regulatory review guidance

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1998-01-01

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the assessment of certain quality and software life cycle standards to determine whether additional guidance for the U.S. nuclear regulatory context should be derived from the standards. This report describes the nature of the standards and compares the guidance of the standards to that of the recently updated Standard Review Plan

  19. 77 FR 30919 - Commercial Driver's License Testing and Commercial Learner's Permit Standards

    Science.gov (United States)

    2012-05-24

    ..., and 385 [Docket No. FMCSA-2007-27659] Commercial Driver's License Testing and Commercial Learner's... published a final rule titled ``Commercial Driver's License Testing and Commercial Learner's Permit... additional drivers, primarily those transporting certain tanks temporarily attached to the commercial motor...

  20. DIGITAL IMAGE CORRELATION FROM COMMERCIAL TO FOS SOFTWARE: A MATURE TECHNIQUE FOR FULL-FIELD DISPLACEMENT MEASUREMENTS

    Directory of Open Access Journals (Sweden)

    V. Belloni

    2018-05-01

    Full Text Available In the last few decades, there has been a growing interest in studying non-contact methods for full-field displacement and strain measurement. Among such techniques, Digital Image Correlation (DIC) has received particular attention, thanks to its ability to provide this information by comparing digital images of a sample surface before and after deformation. The method is now commonly adopted in the fields of civil, mechanical and aerospace engineering, and different companies and some research groups have implemented 2D and 3D DIC software. In this work a review of the status of DIC software is given first. Moreover, a free and open source 2D DIC software is presented, named py2DIC and developed in Python at the Geodesy and Geomatics Division of DICEA of the University of Rome “La Sapienza”; its potentialities were evaluated by processing the images captured during tensile tests performed in the Structural Engineering Lab of the University of Rome “La Sapienza” and comparing the results to those obtained using the commercial software Vic-2D developed by Correlated Solutions Inc, USA. The agreement of these results at the one-hundredth-of-a-millimetre level demonstrates the possibility of using this open source software as a valuable 2D DIC tool to measure full-field displacements on the investigated sample surface.
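
    The core operation behind 2D DIC codes of this kind is subset matching between reference and deformed images. Below is a compact, generic sketch of zero-normalised cross-correlation over a small integer-pixel search window; it is not code from py2DIC or Vic-2D, and a real DIC implementation would add sub-pixel refinement and strain computation.

        import numpy as np

        def zncc(a, b):
            """Zero-normalised cross-correlation of two equal-size subsets."""
            a = a - a.mean()
            b = b - b.mean()
            return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def match_subset(ref, cur, top, left, size, search):
            """Integer-pixel displacement of a reference subset within +/- search pixels."""
            tpl = ref[top:top + size, left:left + size]
            best, best_uv = -2.0, (0, 0)
            for dv in range(-search, search + 1):
                for du in range(-search, search + 1):
                    win = cur[top + dv:top + dv + size, left + du:left + du + size]
                    score = zncc(tpl, win)
                    if score > best:
                        best, best_uv = score, (du, dv)
            return best_uv  # sub-pixel refinement would follow in a real DIC code

        rng = np.random.default_rng(0)
        ref = rng.random((80, 80))
        cur = np.roll(ref, (2, 3), axis=(0, 1))        # synthetic rigid shift: 3 px in x, 2 px in y
        print(match_subset(ref, cur, 30, 30, 21, 5))   # expect (3, 2)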

  1. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    Science.gov (United States)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
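
    One validation step mentioned above, generating a frequency-swept RF waveform, can be illustrated without any SDR hardware. The snippet below synthesises a linear chirp spanning the quoted 500 kHz bandwidth with NumPy; the sample rate and pulse duration are arbitrary choices and this is not the gr-MRI code.

        import numpy as np

        fs = 5e6          # sample rate (arbitrary for this illustration)
        T = 2e-3          # pulse duration
        bw = 500e3        # swept bandwidth, as quoted in the abstract
        t = np.arange(0.0, T, 1.0 / fs)

        # Linear chirp at baseband: instantaneous frequency sweeps from -bw/2 to +bw/2.
        f0 = -bw / 2.0
        k = bw / T                                   # sweep rate in Hz/s
        pulse = np.exp(2j * np.pi * (f0 * t + 0.5 * k * t**2))

        inst_f = np.diff(np.unwrap(np.angle(pulse))) * fs / (2.0 * np.pi)
        print(round(inst_f.min() / 1e3), "to", round(inst_f.max() / 1e3), "kHz")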

  2. Method of V ampersand V for safety-critical software in NPPs

    International Nuclear Information System (INIS)

    Kim, Jang-Yeol; Lee, Jang-Soo; Kwon, Kee-Choon

    1997-01-01

    Safety-critical software is software used in systems in which a failure could affect personal or equipment safety or result in large financial or social loss. Examples of systems using safety-critical software are systems such as plant protection systems in nuclear power plants (NPPs), process control systems in chemical plants, and medical instruments such as the Therac-25 medical accelerator. This paper presents verification and validation (V&V) methodology for safety-critical software in NPP safety systems. In addition, it addresses issues related to NPP safety systems, such as independence parameters, software safety analysis (SSA) concepts, commercial off-the-shelf (COTS) software evaluation criteria, and interrelationships among software and system assurance organizations. It includes the concepts of existing industrial standards on software V&V, Institute of Electrical and Electronics Engineers (IEEE) Standards 1012 and 1059. This safety-critical software V&V methodology covers V&V scope, a regulatory framework as part of its acceptance criteria, V&V activities and task entrance and exit criteria, reviews and audits, testing and quality assurance records of V&V material, configuration management activities related to V&V, and software V&V (SVV) plan (SVVP) production

  3. Analysis of Potential Benefits and Costs of Adopting a Commercial Building Energy Standard in South Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.

    2005-03-04

    The state of South Dakota is considering adopting a commercial building energy standard. This report evaluates the potential costs and benefits to South Dakota residents from requiring compliance with the most recent edition of the ANSI/ASHRAE/IESNA 90.1-2001 Energy Standard for Buildings except Low-Rise Residential Buildings. These standards were developed in an effort to set minimum requirements for the energy efficient design and construction of new commercial buildings. The quantitative benefits and costs of adopting a commercial building energy code are modeled by comparing the characteristics of assumed current building practices with the most recent edition of the ASHRAE Standard, 90.1-2001. Both qualitative and quantitative benefits and costs are assessed in this analysis. Energy and economic impacts are estimated using results from a detailed building simulation tool (Building Loads Analysis and System Thermodynamics [BLAST] model) combined with a Life-Cycle Cost (LCC) approach to assess corresponding economic costs and benefits.
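
    The quantitative core of such an analysis is a comparison of incremental first cost against discounted energy savings. The toy calculation below shows that life-cycle-cost structure with entirely made-up numbers; the report's actual inputs come from BLAST simulations and published economic parameters.

        # Toy life-cycle cost comparison: code-compliant design vs. current practice.
        # All figures are illustrative placeholders, not values from the report.
        incremental_first_cost = 12000.0    # extra construction cost to meet the standard ($)
        annual_energy_savings = 1800.0      # energy cost saved per year ($)
        discount_rate = 0.05
        years = 25

        # Present value of an annuity of savings over the analysis period.
        pv_savings = annual_energy_savings * (1 - (1 + discount_rate) ** -years) / discount_rate
        net_lcc_benefit = pv_savings - incremental_first_cost
        print(f"PV of savings = ${pv_savings:,.0f}, net LCC benefit = ${net_lcc_benefit:,.0f}")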

  4. Software To Go: A Catalog of Software Available for Loan.

    Science.gov (United States)

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  5. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
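
    A minimal example of the kind of empirical baseline model the report evaluates, relating energy use to ambient temperature and operating schedule, is sketched below with ordinary least squares on synthetic data. Real EMIS baseline models are typically richer (change-point forms, time-of-week terms, and so on), so this is only a structural illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 24 * 60                                         # hourly data for 60 days
        t_out = 15 + 10 * rng.random(n)                     # outdoor temperature (C)
        occupied = (np.arange(n) % 24 >= 8) & (np.arange(n) % 24 < 18)

        # Synthetic "metered" energy: base load + cooling slope + occupancy effect + noise.
        energy = 50 + 3.0 * np.clip(t_out - 18, 0, None) + 20 * occupied + rng.normal(0, 2, n)

        # Baseline model: energy ~ 1 + max(t_out - 18, 0) + occupied
        X = np.column_stack([np.ones(n), np.clip(t_out - 18, 0, None), occupied.astype(float)])
        coef, *_ = np.linalg.lstsq(X, energy, rcond=None)
        residual = energy - X @ coef
        cvrmse = np.sqrt(np.mean(residual**2)) / energy.mean()   # common goodness-of-fit metric
        print(np.round(coef, 2), round(float(cvrmse), 4))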

  6. Leveraging Software Architectures through the ISO/IEC 42010 standard: A Feasibility Study

    NARCIS (Netherlands)

    Tamburri, D.A.; Lago, P.; Muccini, H.; Proper, E.; Lankhorst, M.; Schoenherr, M.

    2011-01-01

    The state of the practice in enterprise and software architecture learnt that relevant architectural aspects should be illustrated in multiple views, targeting the various concerns of different stakeholders. This has been expressed a.o. in the ISO/IEC 42010 Standard on architecture descriptions. In

  7. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)
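
    A test spectrum of this general type can be imitated generically as Gaussian photopeaks on a smooth continuum with Poisson counting noise. The sketch below is only illustrative; the peak positions, widths and areas are arbitrary and do not correspond to the reference spectra produced by the authors.

        import numpy as np

        rng = np.random.default_rng(7)
        channels = np.arange(4096)

        def peak(centre, fwhm, area):
            sigma = fwhm / 2.3548
            return area * np.exp(-0.5 * ((channels - centre) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

        # Smooth continuum plus three photopeaks (positions and areas are arbitrary).
        expected = 200.0 * np.exp(-channels / 1500.0)
        for centre, fwhm, area in [(662, 6.0, 5e4), (1173, 8.0, 2e4), (1332, 8.5, 1.8e4)]:
            expected += peak(centre, fwhm, area)

        spectrum = rng.poisson(expected)                   # counting statistics
        print(int(spectrum.sum()), "total counts")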

  8. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N. [and others]

    1997-08-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  9. Capacity Management as a Service for Enterprise Standard Software

    Directory of Open Access Journals (Sweden)

    Hendrik Müller

    2017-12-01

    Full Text Available Capacity management approaches optimize component utilization from a strong technical perspective. In fact, the quality of the involved services is considered only implicitly, by linking it to resource capacity values. This practice hinders the evaluation of design alternatives with respect to given service levels that are expressed in user-centric metrics, such as the mean response time for a business transaction. We argue that the utilized historical workload traces often contain a variety of performance-related information that allows for the integration of performance prediction techniques through machine learning. Since enterprise applications make extensive use of standard software that is shipped by large software vendors to a wide range of customers, standardized prediction models can be trained and provisioned as part of a capacity management service, which we propose in this article. Therefore, we integrate knowledge discovery activities into well-known capacity planning steps, which we adapt to the special characteristics of enterprise applications. Using a real-world example, we demonstrate how prediction models that were trained on a large scale of monitoring data enable cost-efficient measurement-based prediction techniques to be used in early design and redesign phases of planned or running applications. Finally, based on the trained model, we demonstrate how to simulate and analyze future workload scenarios. Using a Pareto approach, we were able to identify cost-effective design alternatives for an enterprise application whose capacity is being managed.
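
    The final step described, screening design alternatives with a Pareto approach once a performance model is available, can be sketched independently of any particular prediction model. The alternatives, response times and costs below are invented placeholders for values that would, in the article's setting, come from the trained model.

        # Each design alternative: (name, predicted mean response time in ms, monthly cost).
        # Values are invented; in the article they would come from the trained prediction model.
        alternatives = [
            ("2 app servers, small DB", 420.0, 1800.0),
            ("4 app servers, small DB", 310.0, 2600.0),
            ("4 app servers, large DB", 260.0, 3400.0),
            ("8 app servers, large DB", 255.0, 5200.0),
            ("2 app servers, large DB", 400.0, 2700.0),
        ]

        def pareto_front(options):
            """Keep options not dominated on (response time, cost); lower is better for both."""
            front = []
            for name, rt, cost in options:
                dominated = any(rt2 <= rt and c2 <= cost and (rt2 < rt or c2 < cost)
                                for _, rt2, c2 in options)
                if not dominated:
                    front.append((name, rt, cost))
            return front

        for name, rt, cost in pareto_front(alternatives):
            print(f"{name}: {rt:.0f} ms at ${cost:.0f}/month")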

  10. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet a wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches for the selection of software as well as for the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards shows a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be

  11. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

    In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as “the family of SHMS.” When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation

  12. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: choosing the right architecture for the market – laboratory, military, or commercial; hardware platforms – FPGAs, GPPs, specialized and hybrid devices; standardization efforts to ens...

  13. Usability in open source software development

    DEFF Research Database (Denmark)

    Andreasen, M. S.; Nielsen, H. V.; Schrøder, S. O.

    2006-01-01

    Open Source Software (OSS) development has gained significant importance in the production of software products. Open Source Software developers have produced systems with a functionality that is competitive with similar proprietary software developed by commercial software organizations. Yet OSS...

  14. RTSPM: real-time Linux control software for scanning probe microscopy.

    Science.gov (United States)

    Chandrasekhar, V; Mehta, M M

    2013-01-01

    Real time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.
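
    The real-time task such software performs is essentially a fast feedback loop that holds a measured probe signal at a setpoint by adjusting the z actuator. Below is a deliberately simplified, generic positional-form PI loop, not the RTSPM code; the toy probe response and gain values are invented for illustration.

        def pi_feedback(measure, setpoint, kp, ki, dt, steps):
            """Positional-form PI loop: the z command is kp*error + ki*integral(error)."""
            integral, z = 0.0, 0.0
            for _ in range(steps):
                error = setpoint - measure(z)
                integral += error * dt
                z = kp * error + ki * integral   # new z-piezo command
            return z

        # Toy probe response: the measured signal rises linearly with z (purely illustrative).
        measure = lambda z: 0.5 * z
        z_final = pi_feedback(measure, setpoint=0.6, kp=0.5, ki=100.0, dt=1e-3, steps=2000)
        print(round(z_final, 3))   # settles near z = 1.2, where measure(z) equals the setpoint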

  15. An off-the-shelf guider for the Palomar 200-inch telescope: interfacing amateur astronomy software with professional telescopes for an easy life

    Science.gov (United States)

    Clarke, Fraser; Lynn, James; Thatte, Niranjan; Tecza, Matthias

    2014-08-01

    We have developed a simple but effective guider for use with the Oxford-SWIFT integral field spectrograph on the Palomar 200-inch telescope. The guider uses mainly off-the-shelf components, including commercial amateur astronomy software to interface with the CCD camera, calculate guiding corrections, and send guide commands to the telescope. The only custom piece of software is a driver providing an interface between the Palomar telescope control system and the industry-standard 'ASCOM' system. Using existing commercial software provided a very cheap guider, and the approach could easily be adapted to any other professional telescope
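
    The guiding calculation itself is conceptually simple: locate the guide-star centroid on each frame and convert its offset from a reference position into a scaled telescope correction. The sketch below shows that step generically; it is not the authors' ASCOM driver or the commercial package they used, and the plate scale and gain are invented values.

        import numpy as np

        def centroid(img):
            """Intensity-weighted centroid (x, y) of a background-subtracted star image."""
            img = np.clip(img - np.median(img), 0, None)
            y, x = np.indices(img.shape)
            total = img.sum()
            return float((x * img).sum() / total), float((y * img).sum() / total)

        def guide_correction(img, ref_xy, arcsec_per_pixel, gain=0.7):
            """Offset of the star from its reference position, scaled to arcseconds."""
            cx, cy = centroid(img)
            dx, dy = cx - ref_xy[0], cy - ref_xy[1]
            return -gain * dx * arcsec_per_pixel, -gain * dy * arcsec_per_pixel

        # Synthetic star displaced by (+1.5, -0.8) pixels from the reference at (32, 32).
        yy, xx = np.indices((64, 64))
        star = np.exp(-((xx - 33.5) ** 2 + (yy - 31.2) ** 2) / (2 * 2.0 ** 2)) + 0.01
        print(np.round(guide_correction(star, (32.0, 32.0), arcsec_per_pixel=0.3), 3))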

  16. Modernization of tank floor scanning system (TAFLOSS) Software

    International Nuclear Information System (INIS)

    Mohd Fitri Abd Rahman; Jaafar Abdullah; Zainul A Hassan

    2002-01-01

    The main objective of the project is to develop new user-friendly software that combines the second-generation software (developed in-house) and commercial software. This paper describes the development of computer codes for analysing the initial data and plotting exponential curve fits. The curve-fitting method used is the least-squares technique. The software that has been developed is capable of giving results comparable to the commercial software. (Author)
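
    The least-squares exponential fit mentioned can be reproduced in a few lines by linearising y = A·exp(b·x) with a logarithm. The snippet below is a generic illustration on synthetic data, not the TAFLOSS code.

        import numpy as np

        # Synthetic decay-like data following y = A * exp(b * x) with a little noise.
        rng = np.random.default_rng(3)
        x = np.linspace(0.0, 10.0, 25)
        y = 5.0 * np.exp(-0.35 * x) * (1 + 0.02 * rng.standard_normal(x.size))

        # Linearise: ln(y) = ln(A) + b*x, then solve by ordinary least squares.
        coeff = np.polyfit(x, np.log(y), 1)
        b_fit, A_fit = coeff[0], float(np.exp(coeff[1]))
        print(round(A_fit, 2), round(b_fit, 3))   # close to A = 5.0, b = -0.35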

  17. A Randomised Controlled Trial of the Use of a Piece of Commercial Software for the Acquisition of Reading Skills

    Science.gov (United States)

    Khan, Muhammad Ahmad; Gorard, Stephen

    2012-01-01

    We report here the overall results of a cluster randomised controlled trial of the use of computer-aided instruction with 672 Year 7 pupils in 23 secondary school classes in the north of England. A new piece of commercial software, claimed on the basis of publisher testing to be effective in improving reading after just six weeks of use in the…

  18. A study on the establishment of safety assessment guidelines of commercial grade item dedication in digitalized safety systems

    International Nuclear Information System (INIS)

    Hwang, H. S.; Kim, B. R.; Oh, S. H.

    1999-01-01

    Because the components used in safety-related systems of nuclear power plants are becoming obsolete, the number of suppliers qualified for the nuclear QA program is decreasing, and maintenance costs are increasing, utilities have been considering the use of commercial grade digital computers as an alternative for resolving such issues. However, commercial digital computers use embedded pre-existing software, including operating system software, which is not developed under a nuclear grade QA program. Thus, it is necessary for utilities to establish processes for dedicating digital commercial grade items. A regulatory body also needs guidance to evaluate digital commercial products properly. This paper surveys the regulations and their regulatory guides, which establish the requirements for commercial grade item dedication, as well as industry standards and guidance applicable to safety-related systems. This paper provides some guidelines to be applied in evaluating the safety of digital upgrades and new digital plant protection systems in Korea

  19. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  20. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time simulation of nuclear simulator software is presented in this paper. The feasibility prototype was established with the existing software of the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that the approach can enable computer-based predictive simulation, due to both the remarkable improvement in real-time performance and the reduced effort for real-time implementation under standard PC hardware and Real-Time Linux environments

  1. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook

  2. 76 FR 39018 - Commercial Driver's License Testing and Commercial Learner's Permit Standards; Corrections

    Science.gov (United States)

    2011-07-05

    ... [Docket No. FMCSA-2007-27659] RIN 2126-AB02 Commercial Driver's License Testing and Commercial Learner's..., 2011, that will be effective on July 8, 2011. This final rule amends the commercial driver's license... to issue the commercial learner's permit (CLP). Since the final rule was published, FMCSA identified...

  3. Comparison of Standard 90.1-2007 and the 2009 IECC with Respect to Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.; Bartlett, Rosemarie; Halverson, Mark A.

    2009-12-11

    The U.S. Department of Energy’s (DOE’s) Building Energy Codes Program (BECP) has been asked by some states and energy code stakeholders to address the comparability of the 2009 International Energy Conservation Code® (IECC) as applied to commercial buildings and ANSI/ASHRAE/IESNA Standard 90.1-2007 (hereinafter referred to as Standard 90.1-07). An assessment of comparability will help states respond to and implement conditions specified in the State Energy Program (SEP) Formula Grants American Recovery and Reinvestment Act Funding Opportunity, Number DE-FOA-0000052, and eliminate the need for the states individually or collectively to perform comparative studies of the 2009 IECC and Standard 90.1-07. The funding opportunity announcement contains the following conditions: (2) The State, or the applicable units of local government that have authority to adopt building codes, will implement the following: (A) A residential building energy code (or codes) that meets or exceeds the most recent International Energy Conservation Code, or achieves equivalent or greater energy savings. (B) A commercial building energy code (or codes) throughout the State that meets or exceeds the ANSI/ASHRAE/IESNA Standard 90.1-2007, or achieves equivalent or greater energy savings . (C) A plan to achieve 90 percent compliance with the above energy codes within eight years. This plan will include active training and enforcement programs and annual measurement of the rate of compliance. With respect to item (B) above, many more states, regardless of the edition date, directly adopt the IECC than Standard 90.1-07. This is predominately because the IECC is a model code and part of a coordinated set of model building codes that state and local government have historically adopted to regulate building design and construction. This report compares the 2009 IECC to Standard 90.1-07 with the intent of helping states address whether the adoption and application of the 2009 IECC for commercial

  4. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    Science.gov (United States)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, this publically available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible due to three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data, and the management strategies the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  5. 76 FR 17573 - Energy Conservation Standards for Commercial Refrigeration Equipment: Public Meeting and...

    Science.gov (United States)

    2011-03-30

    ... INFORMATION: I. Statutory Authority II. History of Standards Rulemaking for Commercial Refrigeration Equipment... feedback from interested parties on its analytical framework, models, and preliminary results. II. History... equipment installed in the field, such as in grocery stores and restaurants. DOE also carries out additional...

  6. The Commercial Open Source Business Model

    Science.gov (United States)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  7. Space and Missile Systems Center Standard: Software Development

    Science.gov (United States)

    2015-01-16

    waterfall development lifecycle models. Source: Adapted from (IEEE 610.12). See (IEEE 1074) for more information. Software ... spiral, and waterfall lifecycle models.) 2. The developer shall record the selected software development lifecycle model(s) in the Software ... through, i.e., waterfall lifecycle model, the following requirements apply with the interpretation that the software is developed as a single build.

  8. Vertical bone measurements from cone beam computed tomography images using different software packages

    International Nuclear Information System (INIS)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz

    2015-01-01

    This article aimed at comparing the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  9. Vertical bone measurements from cone beam computed tomography images using different software packages

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia

    2015-03-01

    This article aimed at comparing the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)

  10. Comparison of Chinese and European k0 software

    International Nuclear Information System (INIS)

    Sasajima, Fumio

    2004-01-01

    The determination of elements by neutron activation analysis is commonly done by the relative method using comparison standard materials. However, when a simultaneous multi-element analysis of an unknown sample is carried out, this method requires the advance preparation of a reference material for each element present, their simultaneous irradiation with the sample, and measurement under the same conditions. It is indeed a demanding technique, with laborious work such as arranging reference materials. On the other hand, the k0 method does not usually require reference materials, and allows easier and more accurate simultaneous multi-element analysis; therefore it is widely practiced in many countries, including European nations. This report describes two kinds of k0 software (KAYZERO/SOLCOI, ADVNAA), their characteristics, and the results of analyses of environmental standard samples (NIST 1632c, NIES No.8, JB-3) using those software packages. As a result, both software packages achieved an accuracy within about 10% in the analysis of all but a few elements. They have both drawbacks and advantages in their characteristics and features, although it might not be reasonable to compare two products with different development purposes. (author)

  11. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not require the initial cost of commercial software and enables libraries to have greater control over their working environmen...

  12. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

Final report for Folsom Labs’ Solar Permit Generator project, which was successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  13. XML as a standard I/O data format in scientific software development

    International Nuclear Information System (INIS)

    Song Tianming; Yang Jiamin; Yi Rongqing

    2010-01-01

XML is an open standard data format with strict syntax rules that is widely used in large-scale software development. It was adopted as the I/O file format in the development of SpectroSim, a simulation and data-processing system for the soft x-ray spectrometer used in ICF experiments. XML data that describe spectrometer configurations, schema code that defines the syntax rules for the XML, and a report-generation technique for visualizing the XML data are introduced. The characteristics of XML, such as its capability to express structured information, its self-descriptive nature and the automation of visualization, are explained with examples, and its feasibility as a standard scientific I/O data file format is discussed. (authors)
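
    As a rough illustration of XML as an I/O format of the kind described above, the short C++ sketch below writes a spectrometer configuration file; the element and attribute names are hypothetical and are not those actually used by SpectroSim.

      // Minimal sketch: writing a spectrometer configuration as structured,
      // self-descriptive XML. Element/attribute names are invented for
      // illustration and do not reproduce the SpectroSim file format.
      #include <fstream>
      #include <iostream>

      int main() {
          std::ofstream out("spectrometer_config.xml");
          out << "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
              << "<spectrometer name=\"softXray01\">\n"
              << "  <grating linesPerMm=\"1200\"/>\n"
              << "  <slitWidth unit=\"um\">25.0</slitWidth>\n"
              << "  <energyRange unit=\"eV\" min=\"100\" max=\"1500\"/>\n"
              << "</spectrometer>\n";
          std::cout << "Wrote spectrometer_config.xml\n";
          return 0;
      }

    A schema (e.g., an XSD file) can then enforce the syntax rules mentioned in the abstract before the configuration is accepted by the simulation code.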

  14. A cognitive mobile BTS solution with software-defined radioelectric sensing.

    Science.gov (United States)

    Muñoz, Jorge; Alonso, Javier Vales; García, Francisco Quiñoy; Costas, Sergio; Pillado, Marcos; Castaño, Francisco Javier González; Sánchez, Manuel García; Valcarce, Roberto López; Bravo, Cristina López

    2013-02-05

Private communications inside large vehicles such as ships may be effectively provided using standard cellular systems. In this paper we propose a new solution based on software-defined radio with electromagnetic sensing support. Software-defined radio allows low-cost development and, potentially, added-value services not available in commercial cellular networks. The platform of reference, OpenBTS, only supports single-channel cells. Our proposal, however, has the ability to change the BTS channel frequency without disrupting ongoing communications. This ability should be mandatory in vehicular environments, where neighbouring cell configurations may change rapidly, so a moving cell must be reconfigured in real time to avoid interference. Full details of the frequency occupancy sensing and channel reselection procedures are provided in this paper. Moreover, a procedure for fast terminal detection is proposed. This may be decisive in emergency situations, e.g., if someone falls overboard. Different tests confirm the feasibility of our proposal and its compatibility with commercial GSM terminals.

  15. BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through the modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and different tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  16. Ground control station software design for micro aerial vehicles

    Science.gov (United States)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

This article describes the process of designing the hardware and software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All the work was conducted on a quadrocopter model, a commonly available commercial construction. The article characterizes the research object, covers the basics of operating micro aerial vehicles (MAV) and presents the components of the ground control station model. It also describes the communication standards used in building the station model. A further part of the work concerns the software of the product - the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions, communication and control processes of the UAV. The process of creating the software and the field tests of the station model are also presented in the article.

  17. EVENT GENERATION OF STANDARD MODEL HIGGS DECAY TO DIMUON PAIRS USING PYTHIA SOFTWARE

    CERN Document Server

    Yusof, Adib

    2015-01-01

My project for the CERN Summer Student Programme 2015 is on event generation of Standard Model Higgs decay to dimuon pairs using the Pythia software. Briefly, Pythia, or specifically Pythia 8.1, is a program for the generation of high-energy physics events that is able to describe collisions at any given energy between elementary particles such as electrons, positrons, protons and antiprotons. It contains theory and models for a number of physics aspects, including hard and soft interactions, parton distributions, initial-state and final-state parton showers, multiparton interactions, fragmentation and decay. All programming code for this version is written in C++ (the previous version used FORTRAN) and can be linked to the ROOT software for displaying output in the form of histograms. For my project, I need to generate events of Standard Model Higgs boson decay into muon-antimuon pairs (H → μ+μ-) to study the expected significance for this particular process at a centre-of-mass energy of 13 TeV...
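
    For orientation, a minimal Pythia 8 main program for this kind of study might look like the sketch below; the readString settings are standard Pythia 8 flags, but the exact configuration used in the project is not reproduced here.

      // Sketch: generating SM Higgs events at 13 TeV with the Higgs forced to
      // decay to mu+ mu-. Assumes a standard Pythia 8 installation; compile and
      // link against the Pythia 8 library.
      #include "Pythia8/Pythia.h"
      #include <iostream>
      using namespace Pythia8;

      int main() {
          Pythia pythia;
          pythia.readString("Beams:eCM = 13000.");     // 13 TeV proton-proton collisions
          pythia.readString("HiggsSM:all = on");       // SM Higgs production processes
          pythia.readString("25:onMode = off");        // switch off all H decay channels...
          pythia.readString("25:onIfMatch = 13 -13");  // ...except H -> mu+ mu-
          pythia.init();

          int nMuons = 0;
          for (int iEvent = 0; iEvent < 1000; ++iEvent) {
              if (!pythia.next()) continue;
              // Count final-state muons as a crude sanity check of the decay channel.
              for (int i = 0; i < pythia.event.size(); ++i)
                  if (pythia.event[i].idAbs() == 13 && pythia.event[i].isFinal()) ++nMuons;
          }
          std::cout << "Final-state muons seen: " << nMuons << "\n";
          return 0;
      }

    The generated muon four-momenta could then be histogrammed (for example via ROOT) to estimate the dimuon invariant-mass distribution and the expected significance.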

  18. 76 FR 3517 - Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial...

    Science.gov (United States)

    2011-01-20

    ... Standards of Performance for Fossil-Fuel-Fired, Electric Utility, Industrial-Commercial-Institutional, and... following: Category NAICS \\1\\ Examples of regulated entities Industry 221112 Fossil fuel-fired electric utility steam generating units. Federal Government 22112 Fossil fuel-fired electric utility steam...

  19. Academic and Non-Profit Accessibility to Commercial Remote Sensing Software

    Science.gov (United States)

    O'Connor, A. S.; Farr, B.

    2013-12-01

Remote sensing as a topic of teaching and research at the university and college level continues to grow. As more data are made freely available and software becomes easier to use, more and more academic and non-profit institutions are turning to remote sensing to solve their tough, large-spatial-scale problems. Exelis Visual Information Solutions (VIS) has been supporting teaching and research endeavors for over 30 years, with a special emphasis over the last 5 years, with scientifically proven software and accessible training materials. The Exelis VIS academic program extends to US and Canadian 2-year and 4-year colleges and universities with tools for analyzing aerial and satellite multispectral and hyperspectral imagery, airborne LiDAR and synthetic aperture radar. The Exelis VIS academic programs, built on the ENVI platform, enable labs and classrooms to be outfitted with software and make the software accessible to students. The ENVI software gives students hands-on experience with remote sensing software, offers an easy teaching platform for professors and provides researchers with scientifically vetted software they can trust. Training materials are provided at no additional cost and can serve either as a basis for course curriculum development or for self-paced learning. Non-profit organizations like The Nature Conservancy (TNC) and CGIAR have deployed ENVI and IDL enterprise-wide licensing, allowing researchers all over the world to have cost-effective access to COTS software for their research. Exelis VIS has also contributed licenses to the NASA DEVELOP program. Exelis VIS is committed to supporting the academic and NGO community with affordable enterprise licensing, access to training materials, and technical expertise to help researchers tackle today's Earth and planetary science big data challenges.

  20. Commercial Mobile Alert Service (CMAS) Scenarios

    Science.gov (United States)

    2012-05-01

Commercial Mobile Alert Service (CMAS) Scenarios. The WEA Project Team, May 2012. Special Report CMU/SEI-2012-SR-020, CERT® Division, Software Engineering Institute. Work funded and supported by Homeland Security under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center.

  1. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  2. Software Intensive Systems

    National Research Council Canada - National Science Library

    Horvitz, E; Katz, D. J; Rumpf, R. L; Shrobe, H; Smith, T. B; Webber, G. E; Williamson, W. E; Winston, P. H; Wolbarsht, James L

    2006-01-01

    .... Additionally, recommend that DoN invest in software engineering, particularly as it complements commercial industry developments and promotes the application of systems engineering methodology...

  3. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis without the previous limitations of sample space and reagent use. However, non-commercial, user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow standard curves of multiple assays to be drawn from the same n-fold diluted samples. The Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression to the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software package that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
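
    As a generic illustration of the calculation underlying standard-curve relative quantification (not DAG Expression's own implementation), the sketch below inverts a fitted standard curve and normalizes against a reference gene; all numbers are placeholders.

      // Sketch: standard-curve relative quantification for qPCR.
      // A curve fitted to serially diluted samples gives Cq = slope*log10(Q) + intercept;
      // inverting it yields the relative quantity Q of an unknown sample, which is
      // then normalized by the quantity of a reference gene.
      #include <cmath>
      #include <iostream>

      double quantityFromCq(double cq, double slope, double intercept) {
          return std::pow(10.0, (cq - intercept) / slope);  // invert the standard curve
      }

      int main() {
          // Hypothetical curve parameters (a slope near -3.32 corresponds to ~100% efficiency).
          double targetQ    = quantityFromCq(24.8, -3.32, 35.0);  // gene of interest
          double referenceQ = quantityFromCq(21.5, -3.40, 33.0);  // reference gene
          std::cout << "Normalized expression = " << targetQ / referenceQ << "\n";
          return 0;
      }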

  4. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  5. Rapid Prototyping of Standard Compliant Visible Light Communications System

    OpenAIRE

    Gavrincea, Ciprian; Baranda, Jorge; Henarejos, Pol

    2014-01-01

    This article describes the implementation of a prototype visible light communications system based on the IEEE 802.15.7 standard using low-cost commercial off-the-shelf analog devices. The aim of this article is to show that this standard provides a framework that could promote the introduction of applications into the market. Thus, these specifications could be further developed, reducing the gap between the industry and research communities. The implemented prototype makes use of software d...

  6. Automated software development tools in the MIS (Management Information Systems) environment

    Energy Technology Data Exchange (ETDEWEB)

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  7. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  8. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

A general-purpose fatigue analysis software system to predict the fatigue lives of mechanical components and structures was developed. This software has several characteristic features, including functions for locating weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system that can assist users in obtaining more appropriate results. The software can be used in an environment consisting of commercial finite element packages. Using the developed software, fatigue analyses for an SAE keyhole specimen and an automobile knuckle were carried out. It was observed that the results agreed well with those from commercial packages

  9. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  10. A Comparison of Two Commercial Volumetry Software Programs in the Analysis of Pulmonary Ground-Glass Nodules: Segmentation Capability and Measurement Accuracy

    Science.gov (United States)

    Kim, Hyungjin; Lee, Sang Min; Lee, Hyun-Ju; Goo, Jin Mo

    2013-01-01

    Objective To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. Materials and Methods In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. Results The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. Conclusion LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs. PMID:23901328

  11. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    International Nuclear Information System (INIS)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo

    2013-01-01

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  12. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo [Dept. of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul National University Medical Research Center, Seoul (Korea, Republic of)

    2013-08-15

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  13. Testing of Software Routine to Determine Deviate and Cumulative Probability: ModStandardNormal Version 1.0

    International Nuclear Information System (INIS)

    A.H. Monib

    1999-01-01

The purpose of this calculation is to document that the software routine ModStandardNormal Version 1.0, which is a Visual Fortran 5.0 module, provides correct results for a normal distribution up to five significant figures (three significant figures at the function tails) for a specified range of input parameters. The software routine may be used for quality-affecting work. Two types of output are generated by ModStandardNormal: a deviate, x, given a cumulative probability, p, between 0 and 1; and a cumulative probability, p, given a deviate, x, between -8 and 8. This calculation supports Performance Assessment under Technical Product Development Plan TDP-EBS-MD-000006 (Attachment I, DIRS 3) and is written in accordance with the AP-3.12Q Calculations procedure (Attachment I, DIRS 4)
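
    For illustration only (this is not the Visual Fortran module itself), the two output types described above (a cumulative probability p for a given deviate x, and a deviate x for a given p) can be sketched as follows.

      // Sketch of the two operations a standard-normal routine must provide:
      // cumulative probability p from deviate x, and deviate x from p (by bisection
      // over the documented range [-8, 8]). An independent illustration, not the
      // ModStandardNormal code.
      #include <cmath>
      #include <iostream>

      double cumulative(double x) {            // p = Phi(x)
          return 0.5 * std::erfc(-x / std::sqrt(2.0));
      }

      double deviate(double p) {               // x = Phi^{-1}(p), p in (0, 1)
          double lo = -8.0, hi = 8.0;
          for (int i = 0; i < 200; ++i) {      // bisection to machine precision
              double mid = 0.5 * (lo + hi);
              if (cumulative(mid) < p) lo = mid; else hi = mid;
          }
          return 0.5 * (lo + hi);
      }

      int main() {
          std::cout << "Phi(1.96)     = " << cumulative(1.96) << "\n";  // ~0.975
          std::cout << "Phi^-1(0.975) = " << deviate(0.975)   << "\n";  // ~1.96
          return 0;
      }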

  14. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

The application of NQA-1 Quality Assurance Standards to computer software programs is a recent development at the Oak Ridge National Laboratory. One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was established to formulate and utilize efficient and comprehensive methods for determining the sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software

  15. Quantitative 177Lu-SPECT/CT imaging and validation of a commercial dosimetry software

    International Nuclear Information System (INIS)

    D'Ambrosio, L.; Aloj, L.; Morisco, A.; Aurilio, M.; Prisco, A.; Di Gennaro, F.; Lastoria, S.; Madesani, D.

    2015-01-01

Full text of publication follows. Aim: 3D dosimetry is an appealing yet complex application of SPECT/CT in patients undergoing radionuclide therapy. In this study we have developed a quantitative imaging protocol and we have validated commercially available dosimetry software (Dosimetry Tool-kit Package, GE Healthcare) in patients undergoing 177Lu-DOTATATE therapy. Materials and methods: the Dosimetry Tool-kit uses multiple SPECT/CT and/or whole-body planar datasets for quantifying changes in radiopharmaceutical uptake over time to determine residence times. This software includes tools for performing reconstruction of SPECT/CT data, registration of all scans to a common reference, segmentation of the different organs, creation of time-activity curves, curve fitting and calculation of residence times. All acquisitions were performed using a hybrid dual-head SPECT/CT camera (Discovery 670, GE Healthcare) equipped with a medium-energy collimator, using a triple-energy window. SPECT images were reconstructed using an iterative reconstruction algorithm with attenuation, scatter and collimator depth-dependent three-dimensional resolution recovery correction. Camera sensitivity and dead time were evaluated. The accuracy of activity quantification was assessed on a large homogeneous source with the addition of attenuating/scattering medium. A NEMA/IEC body phantom was used to measure the recovery coefficient, which the software does not take into account. The residence times for organs at risk were calculated in five patients. OLINDA/EXM software was used to calculate absorbed doses. Results: the 177Lu sensitivity factor was 13 counts/MBq/s. Dead time was <3% with 1.11 GBq in the field of view. The measured activity was consistent with the decay-corrected calibrated activity for large volumes (>100 cc). The recovery coefficient varied from 0.71 (26.5 ml) to 0.16 (2.5 ml) in the absence of background activity and from 0.58 to 0.13 with a source-to-background activity concentration ratio of 20:1. The
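
    As a hedged illustration of how a volume-dependent recovery coefficient of the kind measured above is typically applied (partial-volume correction of a small-volume activity estimate), the sketch below reuses the phantom values quoted in the abstract; the measured activities and the correction step itself are illustrative and are not part of the described toolkit.

      // Sketch: correcting a SPECT activity estimate for partial-volume losses with a
      // recovery coefficient (RC). The RC values echo the no-background phantom results
      // quoted above (0.71 at 26.5 ml, 0.16 at 2.5 ml); the measured activities are
      // invented placeholders.
      #include <iostream>

      double correctedActivity(double measuredMBq, double recoveryCoefficient) {
          return measuredMBq / recoveryCoefficient;  // undo the partial-volume underestimation
      }

      int main() {
          std::cout << "26.5 ml sphere: " << correctedActivity(7.1, 0.71) << " MBq\n";
          std::cout << " 2.5 ml sphere: " << correctedActivity(1.6, 0.16) << " MBq\n";
          return 0;
      }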

  16. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  17. Software configuration management

    CERN Document Server

    Keyes, Jessica

    2004-01-01

    Software Configuration Management discusses the framework from a standards viewpoint, using the original DoD MIL-STD-973 and EIA-649 standards to describe the elements of configuration management within a software engineering perspective. Divided into two parts, the first section is composed of 14 chapters that explain every facet of configuration management related to software engineering. The second section consists of 25 appendices that contain many valuable real world CM templates.

  18. CT and MR perfusion can discriminate severe cerebral hypoperfusion from perfusion absence: evaluation of different commercial software packages by using digital phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Uwano, Ikuko; Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Advanced Medical Research Center, Morioka (Japan); Christensen, Soren [University of Melbourne, Royal Melbourne Hospital, Departments of Neurology and Radiology, Victoria (Australia); Oestergaard, Leif [Aarhus University Hospital, Department of Neuroradiology, Center for Functionally Integrative Neuroscience, DK, Aarhus C (Denmark); Ogasawara, Kuniaki; Ogawa, Akira [Iwate Medical University, Department of Neurosurgery, Morioka (Japan)

    2012-05-15

    Computed tomography perfusion (CTP) and magnetic resonance perfusion (MRP) are expected to be usable for ancillary tests of brain death by detection of complete absence of cerebral perfusion; however, the detection limit of hypoperfusion has not been determined. Hence, we examined whether commercial software can visualize very low cerebral blood flow (CBF) and cerebral blood volume (CBV) by creating and using digital phantoms. Digital phantoms simulating 0-4% of normal CBF (60 mL/100 g/min) and CBV (4 mL/100 g/min) were analyzed by ten software packages of CT and MRI manufacturers. Region-of-interest measurements were performed to determine whether there was a significant difference between areas of 0% and areas of 1-4% of normal flow. The CTP software detected hypoperfusion down to 2-3% in CBF and 2% in CBV, while the MRP software detected that of 1-3% in CBF and 1-4% in CBV, although the lower limits varied among software packages. CTP and MRP can detect the difference between profound hypoperfusion of <5% from that of 0% in digital phantoms, suggesting their potential efficacy for assessing brain death. (orig.)
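
    A rough sketch of how such a digital phantom can be laid out is given below: a synthetic map with bands at 0-4% of a normal CBF of 60 mL/100 g/min. In the actual study the phantoms encode the time-concentration curves from which the software computes CBF and CBV; the code here only illustrates the target values and is not the phantom-generation code used by the authors.

      // Sketch: a synthetic CBF map with five vertical bands at 0%, 1%, 2%, 3% and 4%
      // of a normal CBF of 60 mL/100 g/min, for probing the lower detection limit of
      // perfusion software. Illustrative only.
      #include <iostream>
      #include <vector>

      int main() {
          const double normalCBF = 60.0;                 // mL/100 g/min
          const int width = 50, height = 50;
          std::vector<std::vector<double>> phantom(height, std::vector<double>(width, 0.0));

          for (int y = 0; y < height; ++y)
              for (int x = 0; x < width; ++x) {
                  int percent = x / 10;                  // bands of 10 pixels: 0..4 %
                  phantom[y][x] = normalCBF * percent / 100.0;
              }

          std::cout << "Band values (mL/100 g/min): ";
          for (int band = 0; band < 5; ++band)
              std::cout << phantom[0][band * 10] << (band < 4 ? ", " : "\n");
          return 0;
      }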

  19. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  20. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not require the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should get involved in its development. They should have basic knowledge about the selection, installation and main...

  1. DAMARIS – a flexible and open software platform for NMR spectrometer control

    OpenAIRE

    Gädke, Achim; Rosenstihl, Markus; Schmitt, Christopher; Stork, Holger; Nestle, Nikolaus

    2016-01-01

    Home-built NMR spectrometers with self-written control software have a long tradition in porous media research. Advantages of such spectrometers are not just lower costs but also more flexibility in developing new experiments (while commercial NMR systems are typically optimized for standard applications such as spectroscopy, imaging or quality control applications). Increasing complexity of computer operating systems, higher expectations with respect to user-friendliness and graphical use...

  2. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially for the ones that use a file-based system for storing information rather than having it stored in a more efficient and safer environment such as databases or spreadsheet software like Excel. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  3. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in

  4. Numerical simulation of nonequilibrium flows by using the state-to-state approach in commercial software

    Science.gov (United States)

    Kunova, O. V.; Shoev, G. V.; Kudryavtsev, A. N.

    2017-01-01

Nonequilibrium flows of a two-component oxygen mixture O2/O behind a shock wave are studied with due allowance for state-to-state vibrational and chemical kinetics. The system of gas-dynamic equations is supplemented with kinetic equations including the contributions of VT (TV) exchange and dissociation processes. A method for the numerical solution of this system using the ANSYS Fluent commercial software package is proposed; it is used in combination with the authors' code that takes the nonequilibrium kinetics into account. The computed results are compared with parameters obtained by solving the problem in the shock-fitting formulation. The vibrational temperature is compared with experimental data. The numerical tool proposed in the present paper is applied to study the flow around a cylinder.

  5. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit.

    Science.gov (United States)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-03-21

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  6. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit

    International Nuclear Information System (INIS)

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-01-01

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a

  7. High-performance commercial building systems

    Energy Technology Data Exchange (ETDEWEB)

    Selkowitz, Stephen

    2003-10-01

    This report summarizes key technical accomplishments resulting from the three year PIER-funded R&D program, ''High Performance Commercial Building Systems'' (HPCBS). The program targets the commercial building sector in California, an end-use sector that accounts for about one-third of all California electricity consumption and an even larger fraction of peak demand, at a cost of over $10B/year. Commercial buildings also have a major impact on occupant health, comfort and productivity. Building design and operations practices that influence energy use are deeply engrained in a fragmented, risk-averse industry that is slow to change. Although California's aggressive standards efforts have resulted in new buildings designed to use less energy than those constructed 20 years ago, the actual savings realized are still well below technical and economic potentials. The broad goal of this program is to develop and deploy a set of energy-saving technologies, strategies, and techniques, and improve processes for designing, commissioning, and operating commercial buildings, while improving health, comfort, and performance of occupants, all in a manner consistent with sound economic investment practices. Results are to be broadly applicable to the commercial sector for different building sizes and types, e.g. offices and schools, for different classes of ownership, both public and private, and for owner-occupied as well as speculative buildings. The program aims to facilitate significant electricity use savings in the California commercial sector by 2015, while assuring that these savings are affordable and promote high quality indoor environments. The five linked technical program elements contain 14 projects with 41 distinct R&D tasks. Collectively they form a comprehensive Research, Development, and Demonstration (RD&D) program with the potential to capture large savings in the commercial building sector, providing significant economic benefits to

  8. Commercial lumber

    Science.gov (United States)

    Kent A. McDonald; David E. Kretschmann

    1999-01-01

    In a broad sense, commercial lumber is any lumber that is bought or sold in the normal channels of commerce. Commercial lumber may be found in a variety of forms, species, and types, and in various commercial establishments, both wholesale and retail. Most commercial lumber is graded by standardized rules that make purchasing more or less uniform throughout the country...

  9. Need for standardization of methodology and components in commercial radioimmunoassay kits

    Energy Technology Data Exchange (ETDEWEB)

    Wood, W G; Marschner, I; Scriba, P C [Muenchen Univ. (Germany, F.R.). Medizinische Klinik Innenstadt

    1977-01-01

The problems arising from the increasing use of commercial kits in radioimmunoassay (RIA) and related fields are discussed. The problems differ according to the substance under test. The quality of individual components is often good, although the methodology is often not optimal and contains short-cuts which, although commercially attractive, can lead to erroneous values and poor sensitivity and precision. Minor modification of the methodology often leads to major improvements in sensitivity and precision, as has been demonstrated in the case of three digoxin kits employing antibody-coated tube techniques and four kits for thyrotropin (TSH) using different techniques. It has also been noted that in many quality control sera imported from the USA no values have been ascribed to European kits for the components listed, thus reducing these sera to the function of precision control. The conclusion from this study is that standardization of kit components and assay methods is desirable in order to allow comparison of results between laboratories using different kits.

  10. Experience implementing energy standards for commercial buildings and its lessons for the Philippines

    Energy Technology Data Exchange (ETDEWEB)

    Busch, John; Deringer, Joseph

    1998-10-01

    Energy efficiency standards for buildings have been adopted in over forty countries. This policy mechanism is pursued by governments as a means of increasing energy efficiency in the buildings sector, which typically accounts for about a third of most nations' energy consumption and half of their electricity consumption. This study reports on experience with implementation of energy standards for commercial buildings in a number of countries and U.S. states. It is conducted from the perspective of providing useful input to the Government of the Philippines' (GOP) current effort at implementing their building energy standard. While the impetus for this work is technical assistance to the Philippines, the intent is to shed light on the broader issues attending implementation of building energy standards that would be applicable there and elsewhere. The background on the GOP building energy standard is presented, followed by the objectives for the study, the approach used to collect and analyze information about other jurisdictions' implementation experience, results, and conclusions and recommendations.

  11. Commercialization of NESSUS: Status

    Science.gov (United States)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

    A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the on-going commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general purpose probabilistic finite element computer program using state of the art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.

  12. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-2001 as the Commercial Building Energy Code in Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Winiarski, David W.; Belzer, David B.; Richman, Eric E.

    2004-09-30

    ASHRAE Standard 90.1-2001 Energy Standard for Buildings except Low-Rise Residential Buildings (hereafter referred to as ASHRAE 90.1-2001 or 90.1-2001) was developed in an effort to set minimum requirements for the energy efficient design and construction of new commercial buildings. The State of Tennessee is considering adopting ASHRAE 90.1-2001 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered in this report. Both qualitative and quantitative benefits and costs are assessed. Energy and economic impacts are estimated using the Building Loads Analysis and System Thermodynamics (BLAST) simulations combined with a Life-Cycle Cost (LCC) approach to assess corresponding economic costs and benefits. Tennessee currently has ASHRAE Standard 90A-1980 as the statewide voluntary/recommended commercial energy standard; however, it is up to the local jurisdiction to adopt this code. Because 90A-1980 is the recommended standard, many of the requirements of ASHRAE 90A-1980 were used as a baseline for simulations.
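
    A minimal sketch of the life-cycle cost comparison underlying such an analysis is shown below: the present value of annual energy-cost savings is weighed against the incremental first cost of building to the more stringent standard. All figures are placeholders and do not reflect the report's results.

      // Sketch: simple life-cycle cost (LCC) test for an energy-code upgrade.
      // Net benefit = present value of annual energy-cost savings - incremental first cost.
      // The discount rate, study period and dollar figures are placeholders only.
      #include <cmath>
      #include <iostream>

      double presentValue(double annualSaving, double discountRate, int years) {
          double pv = 0.0;
          for (int t = 1; t <= years; ++t)
              pv += annualSaving / std::pow(1.0 + discountRate, t);
          return pv;
      }

      int main() {
          const double incrementalFirstCost = 12000.0;  // extra construction cost ($)
          const double annualEnergySaving   = 1500.0;   // yearly energy-cost saving ($)
          double pv = presentValue(annualEnergySaving, 0.05, 25);
          std::cout << "PV of savings: $" << pv
                    << ", net LCC benefit: $" << (pv - incrementalFirstCost) << "\n";
          return 0;
      }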

  13. An adaptive software defined radio design based on a standard space telecommunication radio system API

    Science.gov (United States)

    Xiong, Wenhao; Tian, Xin; Chen, Genshe; Pham, Khanh; Blasch, Erik

    2017-05-01

Software-defined radio (SDR) has become a popular tool for the implementation and testing of communications performance. The advantages of the SDR approach include a reconfigurable design, adaptive response to changing conditions, efficient development, and a highly versatile implementation. In order to realize the benefits of SDR, the space telecommunication radio system (STRS) was proposed by the NASA Glenn Research Center (GRC) along with a standard application program interface (API) structure. Each component of the system uses a well-defined API to communicate with other components. The benefit of a standard API is that it relaxes the platform limitations of each component, allowing additional options. For example, the waveform-generating process can be supported by a field programmable gate array (FPGA), a personal computer (PC), or an embedded system. As long as the API defines the requirements, the selected waveform generator will work with the complete system. In this paper, we demonstrate the design and development of an adaptive SDR following the STRS and standard API protocol. We introduce, step by step, the SDR testbed system, including the controlling graphical user interface (GUI), database, GNU Radio hardware control, and universal software radio peripheral (USRP) transceiving front end. In addition, a performance evaluation is shown on the effectiveness of the SDR approach for space telecommunication.
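
    To illustrate the idea of components that interact only through a well-defined API, the sketch below defines a hypothetical waveform-component interface; the names and methods merely mimic the spirit of the STRS component structure and are not the actual STRS API.

      // Hypothetical illustration of an STRS-style component boundary: the platform
      // talks to any waveform implementation (FPGA, PC or embedded back end) only
      // through this abstract interface. All names are invented for illustration.
      #include <iostream>
      #include <memory>
      #include <string>

      class WaveformComponent {                           // abstract component API
      public:
          virtual ~WaveformComponent() = default;
          virtual void configure(const std::string& key, const std::string& value) = 0;
          virtual void start() = 0;
          virtual void stop() = 0;
      };

      class GnuRadioWaveform : public WaveformComponent { // one possible back end
      public:
          void configure(const std::string& key, const std::string& value) override {
              std::cout << "set " << key << " = " << value << "\n";
          }
          void start() override { std::cout << "streaming to the USRP front end\n"; }
          void stop()  override { std::cout << "stopped\n"; }
      };

      int main() {
          std::unique_ptr<WaveformComponent> wf = std::make_unique<GnuRadioWaveform>();
          wf->configure("center_frequency_hz", "915000000");
          wf->start();
          wf->stop();
          return 0;
      }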

  14. The need for standardization of methodology and components in commercial radioimmunoassay kits

    International Nuclear Information System (INIS)

    Wood, W.G.; Marschner, I.; Scriba, P.C.

    1978-01-01

    The problems arising from the increasing use of commercial kits in radioimmunoassay (RIA) and related fields are discussed. These problems differ according to the substance under test. The quality of individual reagents is often good, but the methodology is often not optimal and may contain short-cuts which, although commercially attractive, can lead to erroneous values and poor sensitivity and precision. Minor modifications in the methodology often lead to big improvements in sensitivity and precision. This has been demonstrated in three digoxin kits employing antibody-coated tube techniques and in four kits for thyrotropin (TSH) using different techniques. It has also been noted that with many quality-control sera imported from the USA no values are ascribed to European kits for the components listed, thus reducing these sera to the function of precision control. The study underlines the need to standardize kit components and assay methods to enable the results obtained by different laboratories with different kits to be compared. (author)

  15. Technical Support Document for Version 3.4.0 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, Robert G.; Richman, Eric E.; Schultz, Robert W.; Winiarski, David W.

    2007-09-14

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  16. Biosafety decisions and perceived commercial risks: The role of GM-free private standards

    OpenAIRE

    Gruère, Guillaume; Sengupta, Debdatta

    2009-01-01

    "We herein investigate the observed discrepancy between real and perceived commercial risks associated with the use of genetically modified (GM) products in developing countries. We focus particularly on the effects of GM-free private standards set up by food companies in Europe and other countries on biotechnology and biosafety policy decisions in food-exporting developing countries. Based on field visits made to South Africa, Namibia, and Kenya in June 2007, and secondary information from t...

  17. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open inter-operable software development and software reuse.

  18. Building a virtual ligand screening pipeline using free software: a survey.

    Science.gov (United States)

    Glaab, Enrico

    2016-03-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
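
    A pipeline of the kind described can be driven by a thin script that chains the individual command-line tools; the sketch below shows such a driver, with placeholder executable names and flags rather than the command lines of any specific screening tool.

      # Thin pipeline driver chaining two command-line stages (ligand preparation and
      # docking). The executable names and flags are placeholders, not the command
      # lines of any specific screening tool.
      import subprocess
      from pathlib import Path

      def run(cmd):
          print("running:", " ".join(cmd))
          subprocess.run(cmd, check=True)          # stop the pipeline if a stage fails

      def screen(ligand_dir, receptor, out_dir):
          out_dir = Path(out_dir)
          out_dir.mkdir(exist_ok=True)
          for ligand in sorted(Path(ligand_dir).glob("*.smi")):
              prepared = out_dir / (ligand.stem + ".prepared")
              docked = out_dir / (ligand.stem + ".docked")
              run(["prepare_ligand", "-i", str(ligand), "-o", str(prepared)])   # placeholder
              run(["dock", "--receptor", receptor, "--ligand", str(prepared),
                   "--out", str(docked)])                                       # placeholder
          return len(list(out_dir.glob("*.docked")))

      # screen("ligands/", "receptor.pdb", "results/")   # paths are illustrative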

  19. Standardization of Software Application Development and Governance

    Science.gov (United States)

    2015-03-01

    of their systems or applications. DOD systems do not have the luxury of replacing systems at the same pace as commercial companies. DOD has to...is not that the commercial market purposefully sells products that are not complete, but having a 100% complete product requires extensive testing...develop applications for Google's Android and Apple's iOS devices. Both these companies have SDKs online as well as a number of resources available

  20. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions beg the question of why companies choose a strategy based on giving away software assets. Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Conversely, in this paper we investigate empirically what the companies’ incentives are by means of an exploratory case study of three organizations in different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  1. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    Science.gov (United States)

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  2. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
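
    Since mzML is plain XML, even the standard library of a scripting language can take a quick look inside a file; the sketch below counts the spectra in an mzML file this way and is only an illustration of the format, not of the ProteoWizard C++ API.

      # Quick look inside an mzML file with the Python standard library only. This
      # illustrates that mzML is plain XML with <spectrum> elements; it is not the
      # ProteoWizard API, which is a C++ library with its own readers and writers.
      import xml.etree.ElementTree as ET

      def count_spectra(path):
          n = 0
          for _, elem in ET.iterparse(path):               # stream to keep memory low
              if elem.tag.rsplit("}", 1)[-1] == "spectrum": # ignore the XML namespace
                  n += 1
              elem.clear()                                 # free the parsed subtree
          return n

      # print(count_spectra("example.mzML"))               # file name is illustrative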

  3. New Modelling Capabilities in Commercial Software for High-Gain Antennas

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Lumholt, Michael; Meincke, Peter

    2012-01-01

    characterization of the reflectarray element, an initial phaseonly synthesis, followed by a full optimization procedure taking into account the near-field from the feed and the finite extent of the array. Another interesting new modelling capability is made available through the DIATOOL software, which is a new...... type of EM software tool aimed at extending the ways engineers can use antenna measurements in the antenna design process. The tool allows reconstruction of currents and near fields on a 3D surface conformal to the antenna, by using the measured antenna field as input. The currents on the antenna...... surface can provide valuable information about the antenna performance or undesired contributions, e.g. currents on a cable,can be artificially removed. Finally, the CHAMP software will be extended to cover reflector shaping and more complex materials,which combined with a much faster execution speed...

  4. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  5. Fuzzy system for risk analysis in software projects through the attributes of quality standards iso 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    Full Text Available With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of service quality that addresses qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), a Likert scale, the GQM method (Goal-Question-Metric, an indicator of software quality) and Boehm's project risk analysis model were used to assess the quality of services and support decision-making, according to the demand and requests for software development. With the aim of improving the quality of the services provided, the application is used to integrate the team, to follow the life cycle of a project from its initial phase, and to assist in comparisons against the proposed schedule during requirements elicitation.
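
    As a minimal illustration of the fuzzy-logic ingredient described above, the sketch below maps a quality score and a schedule deviation to a fuzzy risk level with triangular membership functions and two Mamdani-style rules; the membership ranges and rules are invented and are not those of the paper.

      # Minimal fuzzy-rule sketch: map a quality score (0-10) and a schedule deviation
      # (%) to a fuzzy risk level. Membership ranges and rules are invented for
      # illustration and are not those used in the paper.

      def tri(x, a, b, c):
          """Triangular membership function rising from a, peaking at b, falling to c."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def risk(quality, delay_pct):
          low_q, high_q = tri(quality, 0, 2, 5), tri(quality, 4, 8, 10)
          small_d, large_d = tri(delay_pct, -1, 0, 20), tri(delay_pct, 10, 40, 100)
          # Two Mamdani-style rules (AND = min):
          #   IF quality is low AND delay is large THEN risk is high
          #   IF quality is high AND delay is small THEN risk is low
          return {"high": round(min(low_q, large_d), 2),
                  "low": round(min(high_q, small_d), 2)}

      print(risk(quality=3.0, delay_pct=30))   # -> {'high': 0.67, 'low': 0.0}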

  6. Coordination Implications of Software Coupling in Open Source Projects

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos; Ågerfalk, Pär

    2010-01-01

    The effect of software coupling on the quality of software has been studied quite widely since the seminal paper on software modularity by Parnas [1]. However, the effect of the increase in software coupling on the coordination of the developers has not been researched as much. In commercial

  7. First International Workshop on Variability in Software Architecture (VARSA 2011)

    NARCIS (Netherlands)

    Galster, Matthias; Avgeriou, Paris; Weyns, Danny; Mannisto, Tomi

    2011-01-01

    Variability is the ability of a software artifact to be changed for a specific context. Mechanisms to accommodate variability include software product lines, configuration wizards and tools in commercial software, configuration interfaces of software components, or the dynamic runtime composition of

  8. ISAC's Gating-ML 2.0 data exchange standard for gating description.

    Science.gov (United States)

    Spidlen, Josef; Moore, Wayne; Brinkman, Ryan R

    2015-07-01

    The lack of software interoperability with respect to gating has traditionally been a bottleneck preventing the use of multiple analytical tools and reproducibility of flow cytometry data analysis by independent parties. To address this issue, ISAC developed Gating-ML, a computer file format to encode and interchange gates. Gating-ML 1.5 was adopted and published as an ISAC Candidate Recommendation in 2008. Feedback during the probationary period from implementors, including major commercial software companies, instrument vendors, and the wider community, has led to a streamlined Gating-ML 2.0. Gating-ML has been significantly simplified and is therefore easier for software tools to support. To aid developers, free, open source reference implementations, compliance tests, and detailed examples are provided to stimulate further commercial adoption. ISAC has approved Gating-ML as a standard ready for deployment in the public domain and encourages its support within the community as it is at a mature stage of development having undergone extensive review and testing, under both theoretical and practical conditions. © 2015 International Society for Advancement of Cytometry.
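
    Computationally, a gate of the kind Gating-ML encodes reduces to a region applied as a boolean mask over the event matrix; the sketch below applies a simple rectangle gate in that spirit, without parsing the actual Gating-ML XML schema.

      # Computationally, a rectangle gate is a per-dimension region applied as a
      # boolean mask over the event matrix. This sketch applies such a gate to a tiny
      # invented event matrix; it does not parse the actual Gating-ML XML schema.
      import numpy as np

      events = np.array([[2.0e4, 1.0e3],     # columns: two scatter channels (invented)
                         [5.0e4, 8.0e3],
                         [9.0e4, 2.0e4]])

      gate = {0: (3.0e4, 8.0e4),             # dimension index -> (min, max)
              1: (5.0e3, 1.5e4)}

      mask = np.ones(len(events), dtype=bool)
      for dim, (lo, hi) in gate.items():
          mask &= (events[:, dim] >= lo) & (events[:, dim] < hi)

      print("events inside the gate:", np.flatnonzero(mask))   # -> [1]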

  9. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  10. VMStools: Open-source software for the processing, analysis and visualization of fisheries logbook and VMS data

    DEFF Research Database (Denmark)

    Hintzen, Niels T.; Bastardie, Francois; Beare, Doug

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook...... fishing from other activities, provide high-resolution maps of both fishing effort and landings, interpolate vessel tracks, calculate indicators of fishing impact as listed under the Data Collection Framework at different spatio-temporal scales. Finally data can be transformed into other existing formats......, for example to populate regional databases like FishFrame. This paper describes workflow examples of these features while online material allows a head start to perform these analyses. This software incorporates state-of-the-art VMS and logbook analysing methods standardizing the process towards obtaining pan...
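
    Two of the steps mentioned, interpolating vessel positions between VMS pings and gridding them into an effort map, can be illustrated outside the R package as follows; the coordinates and grid resolution are invented and the straight-line interpolation is a simplification of the package's methods.

      # Crude NumPy illustration (not the VMStools R package) of two described steps:
      # interpolating positions between successive VMS pings and binning them into a
      # grid as a simple effort map. Coordinates and grid resolution are invented.
      import numpy as np

      pings = np.array([[3.20, 52.10],       # lon, lat of successive VMS positions
                        [3.45, 52.30],
                        [3.80, 52.35]])

      def interpolate_track(pings, points_per_leg=10):
          legs = []
          for p0, p1 in zip(pings[:-1], pings[1:]):
              t = np.linspace(0.0, 1.0, points_per_leg, endpoint=False)[:, None]
              legs.append(p0 + t * (p1 - p0))          # straight-line interpolation
          return np.vstack(legs + [pings[-1:]])

      track = interpolate_track(pings)
      effort, lon_edges, lat_edges = np.histogram2d(
          track[:, 0], track[:, 1], bins=[4, 4], range=[[3.0, 4.0], [52.0, 52.5]])
      print(effort)                                    # interpolated positions per cell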

  11. Increasing software testability with standard access and control interfaces

    Science.gov (United States)

    Nikora, Allen P; Some, Raphael R.; Tamir, Yuval

    2003-01-01

    We describe an approach to improving the testability of complex software systems with software constructs modeled after the hardware JTAG bus, used to provide visibility and controlability in testing digital circuits.

  12. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  13. On the Ambiguity of Commercial Open Source

    Directory of Open Access Journals (Sweden)

    Lucian Luca

    2006-01-01

    Full Text Available Open source and commercial applications used to be two separate worlds. The former was the work of amateurs who had little interest in making a profit, while the latter was only profit oriented and was produced by big companies. Nowadays open source is a threat and an opportunity to serious businesses of all kinds, generating good profits while delivering low cost products to customers. The competition between commercial and open source software has impacted the industry and the society as a whole. But in recent years, the markets for commercial and open source software have been converging rapidly, and it is interesting to summarize and discuss the implications of this new paradigm, taking into account arguments for and against it.

  14. Free software and open source databases

    Directory of Open Access Journals (Sweden)

    Napoleon Alexandru SIRITEANU

    2006-01-01

    Full Text Available The emergence of free/open source software (FS/OSS) enterprises seeks to push software development out of the academic stream into the commercial mainstream, and as a result, end-user applications such as open source database management systems (PostgreSQL, MySQL, Firebird) are becoming more popular. Companies like Sybase, Oracle, Sun, IBM are increasingly implementing open source strategies and porting programs/applications into the Linux environment. Open source software is redefining the software industry in general and database development in particular.

  15. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is suggestive to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
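
    The model described, tasks as a weighted directed acyclic graph mapped onto processing elements, can be sketched as below; the greedy least-loaded assignment is a generic heuristic used only for illustration, not the mapping approach proposed in the paper.

      # Sketch of the described model: tasks as a random DAG with node weights, mapped
      # onto processing elements (PEs). The greedy least-loaded assignment is a generic
      # heuristic for illustration, not the mapping approach proposed in the paper.
      import random

      random.seed(1)
      n_tasks, n_pes = 8, 3
      weights = {t: random.uniform(1.0, 5.0) for t in range(n_tasks)}
      # Edges only from lower to higher task index, which guarantees acyclicity.
      edges = [(i, j) for i in range(n_tasks) for j in range(i + 1, n_tasks)
               if random.random() < 0.3]

      # With this edge construction, 0..n-1 is already a topological order.
      load = [0.0] * n_pes
      assignment = {}
      for task in range(n_tasks):
          pe = min(range(n_pes), key=lambda p: load[p])   # pick the least-loaded PE
          assignment[task] = pe
          load[pe] += weights[task]

      print("edges:", edges)
      print("task -> PE:", assignment)
      print("per-PE load:", [round(l, 2) for l in load])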

  16. Assessing the Army’s Software Patch Management Process

    Science.gov (United States)

    2016-03-04

    software maker or to antivirus vendors (Zetter, 2014). Fixing such a vulnerability within the zero-day period requires teamwork across multiple... Commercial-Off-the-Shelf Software...

  17. cPath: open source software for collecting, storing, and querying biological pathways

    Directory of Open Access Journals (Sweden)

    Gross Benjamin E

    2006-11-01

    Full Text Available Background: Biological pathways, including metabolic pathways, protein interaction networks, signal transduction pathways, and gene regulatory networks, are currently represented in over 220 diverse databases. These data are crucial for the study of specific biological processes, including human diseases. Standard exchange formats for pathway information, such as BioPAX, CellML, SBML and PSI-MI, enable convenient collection of this data for biological research, but mechanisms for common storage and communication are required. Results: We have developed cPath, an open source database and web application for collecting, storing, and querying biological pathway data. cPath makes it easy to aggregate custom pathway data sets available in standard exchange formats from multiple databases, present pathway data to biologists via a customizable web interface, and export pathway data via a web service to third-party software, such as Cytoscape, for visualization and analysis. cPath is software only, and does not include new pathway information. Key features include: a built-in identifier mapping service for linking identical interactors and linking to external resources; built-in support for PSI-MI and BioPAX standard pathway exchange formats; a web service interface for searching and retrieving pathway data sets; and thorough documentation. The cPath software is freely available under the LGPL open source license for academic and commercial use. Conclusion: cPath is a robust, scalable, modular, professional-grade software platform for collecting, storing, and querying biological pathways. It can serve as the core data handling component in information systems for pathway visualization, analysis and modeling.

  18. EMMC guidance on quality assurance for academic materials modelling software engineering

    OpenAIRE

    European Materials Modelling Council

    2015-01-01

    Proposed recommendations for software development in LEIT projects. This document presents the advice of software owners, commercial and academic, on what academic software could do to generate better quality software, ready to be used by third parties.

  19. The Synthetic Biology Open Language (SBOL) provides a community standard for communicating designs in synthetic biology.

    Science.gov (United States)

    Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M

    2014-06-01

    The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.

  20. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  1. Developing evidence-based prescriptive ventilation rate standards for commercial buildings in California: a proposed framework

    Energy Technology Data Exchange (ETDEWEB)

    Mendell, Mark J.; Fisk, William J.

    2014-02-01

    Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each
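
    The first step of the proposed approach can be illustrated numerically: for each adverse outcome, find the lowest ventilation rate at which the increase over the outcome level at the high reference ventilation rate stays within its threshold. The response curves and thresholds below are invented, and taking the maximum across outcomes as the overall MVR is an inference, since the abstract is truncated at that point.

      # Numeric sketch of the first step described above: for each adverse outcome,
      # find the lowest ventilation rate (VR) at which its increase over the level at
      # a high reference VR stays within an acceptable threshold. Response curves and
      # thresholds are invented; taking the maximum over outcomes as the overall MVR
      # is an inference, since the abstract is truncated at that point.
      import numpy as np

      vr = np.linspace(2.0, 30.0, 281)              # candidate VRs, L/s per person
      rvr = 30.0                                    # high reference ventilation rate

      outcomes = {                                  # relative outcome level vs. VR
          "symptom prevalence": lambda v: 1.0 + 8.0 / v,
          "air quality dissatisfaction": lambda v: 1.0 + 15.0 / v**1.2,
      }
      thresholds = {"symptom prevalence": 0.5, "air quality dissatisfaction": 0.8}

      mvr = {}
      for name, f in outcomes.items():
          increase = f(vr) - f(rvr)                 # increase relative to level at RVR
          mvr[name] = vr[increase <= thresholds[name]].min()

      print(mvr)
      print("overall MVR =", round(max(mvr.values()), 1), "L/s per person")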

  2. Variation of densitometry on computed tomography in COPD--influence of different software tools.

    Directory of Open Access Journals (Sweden)

    Mark O Wielpütz

    Full Text Available Quantitative multidetector computed tomography (MDCT) as a potential biomarker is increasingly used for severity assessment of emphysema in chronic obstructive pulmonary disease (COPD). The aim of this study was to evaluate the user-independent measurement variability between five different fully-automatic densitometry software tools. MDCT and full-body plethysmography, including forced expiratory volume in 1 s and total lung capacity, were available for 49 patients with advanced COPD (age = 64±9 years, forced expiratory volume in 1 s = 31±6% predicted). Measurement variation regarding lung volume, emphysema volume, emphysema index, and mean lung density was evaluated for two scientific and three commercially available lung densitometry software tools designed to analyze MDCT from different scanner types. One scientific tool and one commercial tool failed to process most or all datasets, respectively, and were excluded. One scientific and another commercial tool analyzed 49 datasets, the remaining commercial tool 30. Lung volume, emphysema volume, emphysema index and mean lung density were significantly different amongst these three tools (p<0.001). Limits of agreement for lung volume were [-0.195, -0.052 l], [-0.305, -0.131 l], and [-0.123, -0.052 l], with correlation coefficients of r = 1.00 each. Limits of agreement for emphysema index were [-6.2, 2.9%], [-27.0, 16.9%], and [-25.5, 18.8%], with r = 0.79 to 0.98. Correlation of lung volume with total lung capacity was good to excellent (r = 0.77 to 0.91, p<0.001), but segmented lung volumes (6.7±1.3 to 6.8±1.3 l) were significantly lower than total lung capacity (7.7±1.7 l, p<0.001). Technical incompatibilities hindered evaluation of two of the five tools. The remaining three showed significant measurement variation for emphysema, hampering quantitative MDCT as a biomarker in COPD. Follow-up studies should currently use identical software, and standardization efforts should encompass software as
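
    The between-software variability above is expressed as Bland-Altman limits of agreement, i.e. the mean paired difference plus or minus 1.96 standard deviations of the differences; the sketch below computes them for a small set of invented paired emphysema-index readings.

      # Bland-Altman limits of agreement, the statistic used above for between-software
      # variability: mean paired difference +/- 1.96 standard deviations of the
      # differences. The paired emphysema-index readings are invented.
      import numpy as np

      tool_a = np.array([22.1, 35.4, 18.0, 41.2, 29.8])   # emphysema index, %
      tool_b = np.array([24.0, 38.9, 17.1, 45.0, 31.2])

      diff = tool_a - tool_b
      bias = diff.mean()
      spread = 1.96 * diff.std(ddof=1)                    # sample standard deviation
      print(f"bias = {bias:.1f}%, limits of agreement = "
            f"[{bias - spread:.1f}%, {bias + spread:.1f}%]")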

  3. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  4. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  5. CARDS: A blueprint and environment for domain-specific software reuse

    Science.gov (United States)

    Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine

    1992-01-01

    CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'

  6. Modernization of tank floor scanning system (TAFLOSS) software

    International Nuclear Information System (INIS)

    Mohd Fitri Abdul Rahman; Jaafar Abdullah; Susan Maria Sipaun

    2002-01-01

    Tank Floor Scanning System (TAFLOSS) is a portable nucleonic device based on the scattering and moderation phenomena of neutrons. TAFLOSS, which was developed by MINT, can precisely and non-destructively measure the gap and hydrogen content in the foundation of a gigantic industrial tank in a practical and cost-effective manner. Three different computer programs were used to record and analyse the measured data. A Disk Operating System (DOS) based program called MesTank 3.0 was developed to analyse the initial data. The system also used commercial software such as Table Curve 2D and SURFER for graphics purposes: Table Curve 2D was used to plot and evaluate curve fits, whereas SURFER was used to draw contours. Switching from one program to another for the different tasks of this system is not user friendly and is time consuming. Therefore, the main objective of the project is to develop new user-friendly software that combines the old and commercial software into a single package. The programming language used to develop the software is Microsoft Visual C++ ver. 6.0. The development of this software involved complex mathematical calculations, curve fitting and contour plotting. This paper describes the initial development of a computer programme for analysing the initial data and plotting exponential curve fits. (Author)

  7. Technical Support Document for Version 3.9.0 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, R. G.; Richman, Eric E.; Schultz, Ralph W.; Winiarski, David W.

    2011-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC are no longer included, but those sections remain in this document for reference purposes.

  8. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...
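
    The book's basic premise, that a calibrated recording chain plus software can replace dedicated hardware, rests on calculations like the one sketched below: an equivalent level derived from normalized samples and a calibration constant (here an invented 94 dB SPL for a full-scale sine).

      # The equivalent level of a calibrated digital recording: the calibration
      # constant below (94 dB SPL for a full-scale sine) is invented for illustration.
      import numpy as np

      def leq_db(samples, full_scale_spl_db):
          """Level of normalized samples (-1..1), given the SPL of a full-scale sine."""
          rms = np.sqrt(np.mean(np.square(samples)))
          full_scale_sine_rms = 1.0 / np.sqrt(2.0)
          return full_scale_spl_db + 20.0 * np.log10(rms / full_scale_sine_rms)

      fs = 48000
      t = np.arange(fs) / fs
      tone = 0.1 * np.sin(2 * np.pi * 1000.0 * t)             # tone 20 dB below full scale
      print(round(leq_db(tone, full_scale_spl_db=94.0), 1))   # -> 74.0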

  9. Software piracy: A study of causes, effects and preventive measures

    OpenAIRE

    Khadka, Ishwor

    2015-01-01

    Software piracy is a serious issue that has been affecting software companies for decades. According to the Business Software Alliance (BSA), the global software piracy rate in 2013 was 43 percent and the commercial value of unlicensed software installations was $62.7 billion, resulting in millions in lost revenue and lost jobs for software companies. The goal of this study was to better understand software piracy behaviours, how piracy happens, and how it affects individuals and software compani...

  10. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising areas of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable but also demanding in terms of skills. There is a lot of commercial GIS software that is well advertised and whose functionality is fairly well known, while open source software tends to be overlooked. This diploma work analyses the open source GIS software available on the Internet, in the scope of different projects interr...

  11. Speech to Text Software Evaluation Report

    CERN Document Server

    Martins Santo, Ana Luisa

    2017-01-01

    This document compares the out-of-the-box performance of three commercially available speech recognition software packages: Vocapia VoxSigma™, Google Cloud Speech, and Limecraft Transcriber. A set of evaluation criteria and test methods for speech recognition software is defined. The evaluation of these packages in noisy environments is also included for testing purposes. Recognition accuracy was compared across noisy environments and languages. Testing in an "ideal" non-noisy environment of a quiet room has also been performed for comparison.

  12. Assessing the Content and Quality of Commercially Available Reading Software Programs: Do They Have the Fundamental Structures to Promote the Development of Early Reading Skills in Children?

    Science.gov (United States)

    Grant, Amy; Wood, Eileen; Gottardo, Alexandra; Evans, Mary Ann; Phillips, Linda; Savage, Robert

    2012-01-01

    The current study developed a taxonomy of reading skills and compared this taxonomy with skills being trained in 30 commercially available software programs designed to teach emergent literacy or literacy-specific skills for children in preschool, kindergarten, and Grade 1. Outcomes suggest that, although some skills are being trained in a…

  13. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Sparkman, D R; Lagdon, R

    2004-01-01

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate their nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, public and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: (1) ensures the software processes developed to address nuclear safety in design, operation, construction and maintenance of its facilities are safe; (2) considers the larger system that uses the software and its impacts; (3) ensures that software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety related systems. It must also ensure that fail safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  14. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  16. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  17. An independent monitor unit calculation by commercial software as a part of a radiotherapy treatment planning system quality control

    International Nuclear Information System (INIS)

    Nechvil, K.; Mynarik, J.

    2014-01-01

    For the independent calculation of monitor units (MU), the commercial software RadCalc (Lifeline Software Inc., Tyler TX) was chosen from among several similar available programs. The program was configured and used to verify the doses calculated by the commercially available planning system Eclipse version 8.6.17 (Varian Medical Systems Inc., Palo Alto), which is used clinically for the creation of treatment plans. The results of each plan were compared with phantom dose measurements made with an ionization chamber at the same point at which the calculations were done (Eclipse, RadCalc) - in the isocentre. The TPS is configured with beam data (PDD and OAR). These beam data were exported and the same data were then imported into RadCalc, yielding consistent and independent data between the TPS and RadCalc. The reference conditions were set identically in RadCalc and the TPS, so that consistency between the TPS and RadCalc output factors was achieved (Collimator Scatter Factor: Sc, Phantom Scatter Factor: Sp). These output factors were also measured with the ionization chamber in a water phantom and compared with the TPS. Based on clinical dose-response data, ICRU recommends ensuring the ability of dosimetric systems to deliver doses with an accuracy of at least 5%. Many factors, such as the layout of anatomic structures, patient positioning, and factors related to the accelerator (dose calibration and mechanical parameters), cause random and systematic errors in dose delivery. Problems can also arise from the system databases and the related information transfer, and from the TPS, which contains, among other things, various dose calculation algorithms. (authors)
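
    In its simplest form, an independent check of this kind recomputes the monitor units for an open field at the isocentre from the calibration dose rate and the output factors mentioned above; the sketch below shows that schematic calculation with purely illustrative numbers, whereas a clinical check includes further factors (wedge and tray factors, off-axis ratios, inverse-square corrections, and so on).

      # Schematic independent monitor-unit check for a simple open field at the
      # isocentre, using the calibration dose rate and the output factors (Sc, Sp)
      # mentioned above plus a tissue-phantom ratio. All numbers are illustrative.

      def monitor_units(dose_cgy, dref_cgy_per_mu, sc, sp, tpr):
          return dose_cgy / (dref_cgy_per_mu * sc * sp * tpr)

      mu = monitor_units(dose_cgy=200.0,        # prescribed dose per fraction
                         dref_cgy_per_mu=1.0,   # calibration: 1 cGy/MU at reference
                         sc=1.012,              # collimator scatter factor
                         sp=1.005,              # phantom scatter factor
                         tpr=0.762)             # tissue-phantom ratio at depth
      print(round(mu, 1))                       # value to compare against the TPS MU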

  18. Effective dose and organ doses estimation taking tube current modulation into account with a commercial software package

    International Nuclear Information System (INIS)

    Lopez-Rendon, X.; Bosmans, H.; Zanca, F.; Oyen, R.

    2015-01-01

    To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)

  19. Effective dose and organ doses estimation taking tube current modulation into account with a commercial software package

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Rendon, X. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); Bosmans, H.; Zanca, F. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Oyen, R. [University Hospitals Leuven, Department of Radiology, Leuven (Belgium)

    2015-07-15

    To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)

  20. Aircraft Design Software

    Science.gov (United States)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and in working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configuration, subsonic transports, and supersonic fighters.

  1. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  2. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    Science.gov (United States)

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the data used. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness, the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit parameters and their standard
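
    The workflow described, fitting predefined sums of exponentials, selecting the best-supported function with the corrected Akaike information criterion, and integrating analytically, can be condensed into the following sketch; it omits the error model, Bayesian priors and automated starting-value search of the actual software, and the time-activity data are invented.

      # Condensed sketch of the described workflow: fit mono- and bi-exponential
      # models to time-activity data, keep the one better supported by the corrected
      # Akaike information criterion, and integrate it analytically. It omits the
      # error model, Bayesian priors and automated starting-value search of the
      # actual software; the data points are invented.
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.array([1.0, 4.0, 24.0, 48.0, 72.0, 96.0, 144.0])            # h
      a = np.array([0.295, 0.222, 0.096, 0.037, 0.014, 0.005, 0.001])    # activity fraction

      mono = lambda t, a1, l1: a1 * np.exp(-l1 * t)
      bi = lambda t, a1, l1, a2, l2: a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)

      def aicc(y, yhat, k):
          n = len(y)
          rss = np.sum((y - yhat) ** 2)
          return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

      fits = {}
      for name, f, p0 in [("mono", mono, [0.3, 0.05]), ("bi", bi, [0.2, 0.05, 0.1, 0.5])]:
          p, _ = curve_fit(f, t, a, p0=p0, maxfev=10000)
          fits[name] = (p, aicc(a, f(t, *p), len(p)))

      best = min(fits, key=lambda n: fits[n][1])
      p = fits[best][0]
      # Integral of sum(a_i * exp(-l_i * t)) from 0 to infinity is sum(a_i / l_i).
      tiac = sum(p[i] / p[i + 1] for i in range(0, len(p), 2))
      print(best, "model; time-integrated activity coefficient =", round(tiac, 2), "h")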

  3. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  4. Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum

    Science.gov (United States)

    2010-08-01

    developed products. The above definition was derived from these references: [IEEE-CS 2008] ISO/IEC 12207, IEEE Std 12207-2008, Systems and Software...Systems [CNSS 2009]. Software quality: capability of a software product to satisfy stated and implied needs when used under specified conditions [ISO ...Curriculum ISO International Organization for Standardization IT information technology KA knowledge area KU knowledge unit MBA Master of

  5. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up by standard software components in the same way as a hardware system is built up by standard hardware components. Such systems are often used in the control of NPPs, also in safety related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is, and what is particular with respect to reliability assessment of such software. Two very commonly used techniques in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of the software reliability in such systems are discussed. Finally some models for quantitative software reliability assessment applicable on configurable software systems are described. (author)
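
    Of the two techniques discussed, fault tree analysis lends itself to a compact quantitative illustration: with independent basic events, AND gates multiply probabilities and OR gates combine them as one minus the product of the complements. The tree structure and probabilities below are invented and are not taken from the report.

      # Minimal quantitative fault-tree evaluation with AND/OR gates, assuming
      # independent basic events. The tree structure and failure probabilities are
      # invented; they are not taken from the report.

      def p_and(*probs):                 # all inputs must fail
          out = 1.0
          for p in probs:
              out *= p
          return out

      def p_or(*probs):                  # at least one input fails
          none_fail = 1.0
          for p in probs:
              none_fail *= (1.0 - p)
          return 1.0 - none_fail

      # Basic events: fault in a standard component, configuration error, input-data fault.
      p_component, p_config, p_input = 1e-4, 5e-3, 2e-3

      # Top event: system failure = (component fault AND configuration error) OR input fault.
      p_top = p_or(p_and(p_component, p_config), p_input)
      print(f"top-event probability = {p_top:.2e}")   # ~2.0e-03, dominated by the input fault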

  6. Comparison of a commercial blood cross-matching kit to the standard laboratory method for establishing blood transfusion compatibility in dogs.

    Science.gov (United States)

    Guzman, Leo Roa; Streeter, Elizabeth; Malandra, Allison

    2016-01-01

    To evaluate the accuracy of a commercial blood transfusion cross-match kit when compared to the standard laboratory method for establishing blood transfusion compatibility. A prospective observational in vitro study performed from July 2009 to July 2013. Private referral veterinary center. Ten healthy dogs, 11 anemic dogs, and 24 previously transfused dogs. None. Forty-five dogs were enrolled in a prospective study in order to compare the standard blood transfusion cross-match technique to a commercial blood transfusion cross-matching kit. These dogs were divided into 3 different groups that included 10 healthy dogs (control group), 11 anemic dogs in need of a blood transfusion, and 24 sick dogs that were previously transfused. Thirty-five dogs diagnosed with anemia secondary to multiple disease processes were cross-matched using both techniques. All dogs cross-matched via the kit had a compatible major and minor result, whereas 16 dogs out of 45 (35%) had an incompatible cross-match result when the standard laboratory technique was performed. The average time to perform the commercial kit was 15 minutes and this was 3 times shorter than the manual cross-match laboratory technique that averaged 45-50 minutes to complete. While the gel-based cross-match kit is quicker and less technically demanding than standard laboratory cross-match procedures, microagglutination and low-grade hemolysis are difficult to identify by using the gel-based kits. This could result in transfusion reactions if the gel-based kits are used as the sole determinant of blood compatibility prior to transfusion. Based on our results, the standard manual cross-match technique remains the gold standard test to determine blood transfusion compatibility. © Veterinary Emergency and Critical Care Society 2016.

  7. Professional Issues In Software Engineering

    CERN Document Server

    Bott, Frank; Eaton, Jack; Rowland, Diane

    2000-01-01

    A comprehensive text covering all the issues that software engineers now have to take into account apart from the technical side of things. Includes information on the legal, professional and commercial context in which they work.

  8. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  9. Technical Support Document for Version 3.9.1 of the COMcheck Software

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan; Halverson, Mark A.; Lucas, Robert G.; Richman, Eric E.; Schultz, Robert W.; Winiarski, David W.

    2012-09-01

    COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single-family and multifamily buildings not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on Standard 90.1-1989. Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards. Beginning with COMcheck version 3.8.0, support for 90.1-1989, 90.1-1999, and the 1998 IECC is no longer included, and beginning with version 3.9.0, support for the 2000 and 2001 IECC is no longer included; those sections remain in this document for reference purposes.

  10. 7 CFR 51.2278 - U.S. Commercial.

    Science.gov (United States)

    2010-01-01

    7 CFR 51.2278 (2010-01-01), Agriculture Regulations of the Department of Agriculture, Agricultural Marketing Service (Standards, Inspections, Marketing...), United States Standards for Shelled English Walnuts (Juglans regia), Grades: § 51.2278 defines the "U.S. Commercial" grade.

  11. Health software: a new CEI Guide for software management in medical environment.

    Science.gov (United States)

    Giacomozzi, Claudia; Martelli, Francesco

    2016-01-01

    The increasing spread of software components in the healthcare context makes explanatory guides both relevant and necessary for interpreting laws and standards, and for supporting the safe management of software products in healthcare. In 2012 a working group was established for these purposes at the Italian Electrotechnical Committee (CEI), composed of experts from the Italian National Institute of Health (ISS), representatives of industry, and representatives of healthcare organizations. As a first outcome of the group's activity, Guide CEI 62-237 was published in February 2015. The Guide incorporates an innovative approach based on the proper contextualization of software products, whether medical devices or not, to the specific healthcare scenario, and addresses the risk management of IT systems. The Guide provides operators and manufacturers with interpretative support and many detailed examples to facilitate the proper contextualization and management of health software, in compliance with the related European and international regulations and standards.

  12. Simple method for the determination of rosiglitazone in human plasma using a commercially available internal standard.

    Science.gov (United States)

    Mamidi, Rao N V S; Benjamin, Biju; Ramesh, Mullangi; Srinivas, Nuggehally R

    2003-09-01

    To the best of our knowledge, bioanalytical methods to determine rosiglitazone in human plasma reported in the literature use internal standards that are not commercially available. Our purpose was to develop a simple method for the determination of rosiglitazone in plasma employing a commercially available internal standard (IS). After the addition of celecoxib (IS), plasma (0.25 mL) samples were extracted into ethyl acetate. The residue after evaporation of the organic layer was dissolved in 750 microL of mobile phase and 50 microL was injected onto the HPLC. The separation was achieved using a Hichrom KR 100, 250 x 4.6 mm C(18) column with a mobile phase composed of potassium dihydrogen phosphate buffer (0.01 M, pH 6.5):acetonitrile:methanol (40:50:10, v/v/v). The flow rate of the mobile phase was set at 1 mL/min. The column eluate was monitored by a fluorescence detector set at an excitation wavelength of 247 nm and an emission wavelength of 367 nm. Linear relationships (r(2) > 0.99) were observed between the peak area ratio of rosiglitazone to IS and rosiglitazone concentration across the range 5-1000 ng/mL. The intra-run precision (%RSD) and accuracy (%Dev) in the measurement of rosiglitazone were acceptable, and recovery from human plasma exceeded 80% for both rosiglitazone and IS. The lower limit of quantitation of the assay was 5 ng/mL. In summary, the methodology for rosiglitazone measurement in plasma was simple, sensitive and employed a commercially available IS. Copyright 2003 John Wiley & Sons, Ltd.
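
    As an illustration of the calibration step described above (the peak area ratio of analyte to internal standard regressed against nominal concentration), the sketch below fits a straight line and back-calculates an unknown sample. All numerical values are hypothetical and are not taken from the paper.

```python
# Illustrative sketch: least-squares calibration of peak area ratio
# (rosiglitazone / IS) against nominal concentration. Numbers are hypothetical.
import numpy as np

conc = np.array([5, 25, 100, 250, 500, 1000], dtype=float)   # ng/mL
ratio = np.array([0.021, 0.10, 0.41, 1.02, 2.05, 4.10])      # peak area ratios

slope, intercept = np.polyfit(conc, ratio, 1)
r2 = np.corrcoef(conc, ratio)[0, 1] ** 2
print(f"slope={slope:.4e}, intercept={intercept:.4e}, r^2={r2:.4f}")

# Back-calculate an unknown sample from its measured peak area ratio.
unknown_ratio = 0.75
unknown_conc = (unknown_ratio - intercept) / slope
print(f"Estimated concentration: {unknown_conc:.1f} ng/mL")
```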

  13. Molecular radiotherapy: The NUKFIT software for calculating the time-integrated activity coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Kletting, P.; Schimmel, S.; Luster, M. [Klinik für Nuklearmedizin, Universität Ulm, Ulm 89081 (Germany); Kestler, H. A. [Research Group Bioinformatics and Systems Biology, Institut für Neuroinformatik, Universität Ulm, Ulm 89081 (Germany); Hänscheid, H.; Fernández, M.; Lassmann, M. [Klinik für Nuklearmedizin, Universität Würzburg, Würzburg 97080 (Germany); Bröer, J. H.; Nosske, D. [Bundesamt für Strahlenschutz, Fachbereich Strahlenschutz und Gesundheit, Oberschleißheim 85764 (Germany); Glatting, G. [Medical Radiation Physics/Radiation Protection, Medical Faculty Mannheim, Heidelberg University, Mannheim 68167 (Germany)

    2013-10-15

    Purpose: Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for the use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. Methods: The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the used data. To estimate the values of the adjustable parameters an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. Results: To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit
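
    The core idea described above, fitting sums of exponentials to time-activity data and integrating the fitted function analytically, can be sketched in a few lines. The example below is a minimal illustration assuming a biexponential model and hypothetical data; it is not the NUKFIT implementation, which additionally handles error models, automatic starting values, and model selection via the corrected Akaike information criterion.

```python
# Minimal sketch of the core idea behind NUKFIT (not the actual software):
# fit a sum of exponentials to time-activity data and integrate it
# analytically to obtain the time-integrated activity coefficient.
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, l1, a2, l2):
    return a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)

# Hypothetical fraction-of-administered-activity data (t in hours).
t = np.array([1, 4, 24, 48, 96, 144], dtype=float)
a = np.array([0.80, 0.65, 0.35, 0.20, 0.07, 0.03])

popt, _ = curve_fit(biexp, t, a, p0=[0.5, 0.3, 0.5, 0.01], maxfev=10000)
a1, l1, a2, l2 = popt

# Analytic integral of sum(a_i * exp(-l_i * t)) from 0 to infinity.
tiac = a1 / l1 + a2 / l2   # time-integrated activity coefficient, in hours
print(f"Time-integrated activity coefficient: {tiac:.1f} h")
```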

  14. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    OpenAIRE

    Cevdet Kızıl; Ayşe Tansel Çetin; Ahmed Bulunmaz

    2014-01-01

    The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. The study uses the survey method for gathering information and running the research analysis. For this purpose, questionnaire forms were distributed to university students in person and via the internet. The paper includes significant research questions such as “Are accounting academicians informed and knowledgeable on new Turkish commerc...

  15. Evolving software reengineering technology for the emerging innovative-competitive era

    Science.gov (United States)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex

  16. Anticipatory Standards and the Commercialization of Nanotechnology

    International Nuclear Information System (INIS)

    Rashba, Edward; Gamota, Daniel

    2003-01-01

    Standardization will play an increasing role in creating a smooth transition from the laboratory to the marketplace as products based on nanotechnology are developed and move into broad use. Traditionally, standards have evolved out of a need to achieve interoperability among existing products, create order in markets, simplify production and ensure safety. This view does not account for the escalating trend in standardization, especially in emerging technology sectors, in which standards working groups anticipate the evolution of a technology and facilitate its rapid development and entry to the marketplace. It is important that the nanotechnology community views standards as a vital tool to promote progress along the nanotechnology value chain - from nanoscale materials that form the building blocks for components and devices to the integration of these devices into functional systems. This paper describes the need for and benefits derived from developing consensus standards in nanotechnology, and how standards are created. Anticipatory standards can nurture the growth of nanotechnology by drawing on the lessons learned from a standards effort that has revolutionized, and continues to revolutionize, the telecommunications industry. Also, a brief review is presented of current efforts in the US to create nanotechnology standards

  17. Lessons and challenges from software quality assessment

    African Journals Online (AJOL)

    DJFLEX

    Lessons and challenges from software quality assessment: the case of space systems software. Keywords: software, software quality, quality standard, characteristics, ... and communication, etc.

  18. Development of Radio Frequency Antenna Radiation Simulation Software

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Rozaimah Abd Rahim; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2014-01-01

    Antennas are widely used nationwide for radio frequency propagation, especially for communication systems. Radio frequency covers the electromagnetic spectrum from 10 kHz to 300 GHz and is non-ionizing. Exposure of human beings to this radiation carries a radiation hazard risk. This software is being developed using LabVIEW for radio frequency exposure calculation. In the first phase of development, the software is intended to calculate the possible maximum exposure for quick base station assessment using prediction methods. The software can also be used for educational purposes. Some results of this software are compared with the commercial IXUS software and the freeware NEC software. (author)
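
    For a sense of the kind of prediction such software performs, the sketch below estimates far-field power density from transmitter power, antenna gain and distance, and compares it with an exposure limit. The input values and the limit are illustrative assumptions, not figures from the paper.

```python
# Illustrative sketch of a maximum-exposure prediction: far-field power
# density S = P*G / (4*pi*d^2) compared against a reference limit.
# All input values are hypothetical.
import math

p_tx = 20.0        # transmitter power fed to the antenna, W
gain = 50.0        # numeric antenna gain (about 17 dBi)
distance = 10.0    # distance from the antenna, m
limit = 4.5        # example public exposure limit, W/m^2 (band dependent)

s = p_tx * gain / (4.0 * math.pi * distance ** 2)
print(f"Power density at {distance} m: {s:.3f} W/m^2 "
      f"({'below' if s < limit else 'above'} the {limit} W/m^2 limit)")
```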

  19. An evaluation of the impact of state Renewable Portfolio Standards (RPS) on retail, commercial, and industrial electricity prices

    Science.gov (United States)

    Puram, Rakesh

    The Renewable Portfolio Standard (RPS) has become a popular mechanism for states to promote renewable energy, and its popularity has spurred a potential bill within Congress for a nationwide Federal RPS. While the benefits of RPS have been touted by several groups, it also has detractors. Among the concerns is that RPS standards could raise electricity rates, given that renewable energy is costlier than traditional fossil fuels. The evidence on the impact of RPS on electricity prices is murky at best: complex models by NREL and USEIA rely on computer programs with many assumptions, which makes empirical studies difficult, and they predict only slight increases in electricity rates associated with RPS standards. Recent theoretical models and empirical studies have found price increases, but they often fail to comprehensively include several sets of variables, which could confound results. Utilizing a combination of past papers and studies to triangulate variables, this study aims to develop both a rigorous fixed-effects regression model and a theoretical framework to explain the results. The study uses state-level panel data from 2002 to 2008 to analyze the effect of RPS on residential, commercial, and industrial electricity prices, controlling for several factors including the amount of electricity generation from renewable and non-renewable sources, customer incentives for renewable energy, macroeconomic and demographic indicators, and fuel price mix. The study contrasts several regressions to illustrate important relationships and to show how the inclusion and exclusion of various variables affect the estimated effect on electricity rates. Regression results indicate that the presence of RPS within a state increases commercial and residential electricity rates, but has no discernible effect on the industrial electricity rate. Although RPS tends to increase electricity prices, the effect is small. The models also indicate that jointly all
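
    A minimal sketch of the kind of fixed-effects panel regression described above is shown below. It uses simulated state-by-year data and plain least squares with state and year dummies; the data, variable names and the estimated coefficient are entirely illustrative and do not come from the study.

```python
# Minimal sketch of a state/year fixed-effects regression of electricity
# price on an RPS indicator, using simulated data (not the study's dataset).
import numpy as np

rng = np.random.default_rng(0)
n_states, n_years = 50, 7
state = np.repeat(np.arange(n_states), n_years)
year = np.tile(np.arange(n_years), n_states)

# Hypothetical adoption pattern: some states adopt an RPS from year 3 onward.
rps = ((rng.random(n_states)[state] < 0.4) & (year >= 3)).astype(float)
price = 8.0 + 0.3 * rps + rng.normal(0, 0.5, size=state.size)  # cents/kWh

# Design matrix: intercept, RPS dummy, state dummies, year dummies (one dropped each).
X = np.column_stack([
    np.ones_like(price),
    rps,
    *(1.0 * (state == s) for s in range(1, n_states)),
    *(1.0 * (year == y) for y in range(1, n_years)),
])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(f"Estimated RPS effect on price: {beta[1]:.3f} cents/kWh")
```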

  20. Experiences on dynamic simulation software in chemical engineering education

    DEFF Research Database (Denmark)

    Komulainen, Tiina M.; Enemark-rasmussen, Rasmus; Sin, Gürkan

    2012-01-01

    Commercial process simulators are attracting increasing interest in chemical engineering education. In this paper, the use of the commercial dynamic simulation software D-SPICE® and K-Spice® for three different chemical engineering courses is described and discussed. The courses cover the following topics

  1. Track counting and thickness measurement of LR115 radon detectors using a commercial image scanner

    International Nuclear Information System (INIS)

    De Cicco, F.; Pugliese, M.; Roca, V.; Sabbarese, C.

    2014-01-01

    An original optical method for track counting and film thickness determination of etched LR115 radon detectors was developed. The method offers several advantages compared with standard techniques. In particular, it is non-destructive, very simple and rather inexpensive, since it uses a commercial scanner and free software. The complete analysis and the calibration procedure carried out for the determination of radon specific activity are reported. A comparison with the results of spark counting defines the accuracy and the precision of the new technique. (authors)
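
    The automated counting step of such an optical method can be sketched as a simple threshold-and-label operation on the scanned image. The example below is illustrative only, assuming a grayscale scan in which etched tracks appear as bright spots; the file name, threshold and minimum spot size are placeholders, not values from the paper.

```python
# Illustrative sketch of track counting on a scanned LR115 film, assuming a
# grayscale image where etched tracks appear as bright spots. The file name,
# threshold and size cut-off are placeholders.
import numpy as np
from PIL import Image
from scipy import ndimage

img = np.asarray(Image.open("lr115_scan.png").convert("L"), dtype=float)
binary = img > 200                            # threshold tuned to the scanner used
labels, n_candidates = ndimage.label(binary)  # connected components = candidate tracks

# Reject tiny specks (dust) by area before counting.
sizes = ndimage.sum(binary, labels, index=np.arange(1, n_candidates + 1))
n_tracks = int(np.count_nonzero(sizes >= 5))
print(f"Counted {n_tracks} tracks ({n_candidates} raw spots before size filtering)")
```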

  2. Building Energy Management Open Source Software

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Saifur [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States)

    2017-08-25

    Funded by the U.S. Department of Energy in November 2013, the Building Energy Management Open Source Software (BEMOSS) platform was engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. According to the Energy Information Administration (EIA), small (5,000 square feet or smaller) and medium-sized (between 5,001 and 50,000 square feet) commercial buildings constitute about 95% of all commercial buildings in the U.S. These buildings typically do not have Building Automation Systems (BAS) to monitor and control building operation. While commercial BAS solutions exist, including those from Siemens, Honeywell, Johnson Controls and many more, they are not cost effective in the context of small- and medium-sized commercial buildings, and typically work only with specific controller products from the same company. BEMOSS targets small- and medium-sized commercial buildings to address this gap.

  3. Open Source Software in Medium Size Organizations: Key Factors for Adoption

    Science.gov (United States)

    Solomon, Jerry T.

    2010-01-01

    For-profit organizations are constantly evaluating new technologies to gain competitive advantage. One such technology, application software, has changed significantly over the past 25 years with the introduction of Open Source Software (OSS). In contrast to commercial software that is developed by private companies and sold to organizations, OSS…

  4. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, which is a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware can easily adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" development methodology. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are also given in the conclusion.

  5. An evaluation and acceptance of COTS software for FPGA-based controllers in NPPS

    International Nuclear Information System (INIS)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom; Kim, Jang-Yeol; Choi, Jong Gyun

    2016-01-01

    Highlights: • All direct/indirect COTS SW should be dedicated. • FPGA synthesis tools are important for the safety of new digital I&Cs. • No standards/reports are yet available to deal with the indirect SW – FPGA synthesis tools. • This paper proposes a new evaluation/acceptance process and criteria for indirect SW. - Abstract: FPGA (Field-Programmable Gate Array) has received much attention from the nuclear industry as an alternative platform to PLC (Programmable Logic Controller)-based digital I&C (Instrumentation & Control). The software aspect of FPGA development encompasses several commercial tools, such as logic synthesis and P&R (Place & Route), which should first be dedicated in accordance with domestic standards based on EPRI NP-5652. Even though the state-of-the-art supplementary report EPRI TR-1025243 makes an effort, the dedication of indirect COTS (Commercial Off-The-Shelf) SW such as FPGA logic synthesis tools has still caused dispute. This paper proposes an acceptance process and evaluation criteria specific to COTS SW, not commercial-grade direct items. It specifically incorporates indirect COTS SW and also provides categorized evaluation criteria for acceptance. It also provides an explicit linkage between acceptance methods (verification and validation techniques) and evaluation criteria. We applied the evaluation and acceptance process to a commercial FPGA logic synthesis tool being used to develop a new FPGA-based digital I&C in Korea, and were able to confirm its applicability.

  6. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, "Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants." This RG endorses American National Standards...

  7. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects
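
    As a toy analogue of the automated checks described above, the sketch below walks a parsed program and flags constructs an authentication reviewer might want to inspect. It uses Python's standard ast module purely for illustration; ROSE itself is a C/C++ (and Fortran) source-to-source compiler infrastructure and works quite differently.

```python
# Toy analogue of the idea (not ROSE): walk a parsed program and flag
# constructs an authenticator might inspect, such as dynamic code execution
# or network access. The lists of "suspicious" items are arbitrary examples.
import ast

SUSPICIOUS_CALLS = {"eval", "exec", "compile", "__import__"}
SUSPICIOUS_MODULES = {"socket", "subprocess"}

def audit(source: str):
    findings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in SUSPICIOUS_CALLS:
                findings.append((node.lineno, f"call to {node.func.id}()"))
        elif isinstance(node, (ast.Import, ast.ImportFrom)):
            names = [a.name for a in node.names] if isinstance(node, ast.Import) \
                    else [node.module or ""]
            for name in names:
                if name.split(".")[0] in SUSPICIOUS_MODULES:
                    findings.append((node.lineno, f"import of {name}"))
    return findings

sample = "import socket\nresult = eval(user_input)\n"
for lineno, msg in audit(sample):
    print(f"line {lineno}: {msg}")
```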

  8. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities

    Science.gov (United States)

    Hebert, Phillip W., Sr.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Hughes, Mark S.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities, after thirty years of contractor control, resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which makes the software application layers transparent to the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis development and deployment.
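
    The translation-layer idea mentioned above can be illustrated with a small hardware-abstraction sketch: the application layer talks to an abstract acquisition interface, and facility-specific backends plug in underneath. The class names, channel names and values below are hypothetical and are not part of the actual DAS software.

```python
# Illustrative sketch of a translation layer: application code talks to an
# abstract acquisition interface, and facility-specific backends plug in
# underneath. All names and values are hypothetical.
from abc import ABC, abstractmethod

class DaqBackend(ABC):
    @abstractmethod
    def read_channel(self, name: str) -> float: ...

class SimulatedBackend(DaqBackend):
    """Stand-in for facility-specific hardware drivers."""
    def read_channel(self, name: str) -> float:
        return {"chamber_pressure": 101.3, "lox_flow": 42.0}.get(name, 0.0)

class DataSystem:
    """Application layer: unchanged regardless of which backend is plugged in."""
    def __init__(self, backend: DaqBackend):
        self.backend = backend
    def sample(self, channels):
        return {ch: self.backend.read_channel(ch) for ch in channels}

das = DataSystem(SimulatedBackend())
print(das.sample(["chamber_pressure", "lox_flow"]))
```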

  9. Commercial and Industrial Solid Waste Incineration Units (CISWI): New Source Performance Standards (NSPS) and Emission Guidelines (EG) for Existing Sources

    Science.gov (United States)

    Learn about the New Source Performance Standards (NSPS) for commercial and industrial solid waste incineration (CISWI) units including emission guidelines and compliance times for the rule. Read the rule history and summary, and find supporting documents

  10. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    International Nuclear Information System (INIS)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-01-01

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed
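
    The overlap indexes used in this comparison can be computed directly from binary masks of the automatic and reference contours. The sketch below shows the DICE coefficient, sensitivity and volume difference on toy masks; the arrays are illustrative, not data from the study.

```python
# Sketch of the overlap metrics used in the study, computed on two binary
# masks (automatic vs. manually corrected contour). Masks here are toy data.
import numpy as np

auto = np.zeros((100, 100), dtype=bool)
auto[20:60, 20:60] = True          # automatic contour
manual = np.zeros((100, 100), dtype=bool)
manual[25:65, 25:65] = True        # manually corrected (reference) contour

intersection = np.logical_and(auto, manual).sum()
dice = 2.0 * intersection / (auto.sum() + manual.sum())
sensitivity = intersection / manual.sum()     # fraction of the reference covered
volume_diff = abs(int(auto.sum()) - int(manual.sum()))

print(f"DICE={dice:.3f}, sensitivity={sensitivity:.3f}, voxel difference={volume_diff}")
```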

  11. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer.

    Science.gov (United States)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-09-18

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  12. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    Directory of Open Access Journals (Sweden)

    La Macchia Mariangela

    2012-09-01

    Full Text Available Abstract Purpose To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Methods and materials Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. Results The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). Conclusions From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  13. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of this development is to make experiment automation easier, thus significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required and supervisors have no trouble reviewing the scripts. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.

  14. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of this development is to make experiment automation easier, thus significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required and supervisors have no trouble reviewing the scripts. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.

  15. Software for Optimizing Quality Assurance of Other Software

    Science.gov (United States)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
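
    The optimization framing described above can be illustrated with a toy resource-allocation sketch: choose assurance activities under a fixed budget so as to maximize the expected risk reduction. The activity list, costs, risk-reduction values and the greedy heuristic below are illustrative assumptions; they are not taken from the actual tool.

```python
# Toy sketch of the optimization idea (not the actual tool): pick assurance
# activities under a fixed budget to maximize total risk reduction, using a
# simple greedy heuristic on risk reduction per unit cost.
activities = [
    # (name, cost in person-days, expected risk reduction in arbitrary units)
    ("code inspection", 10, 8.0),
    ("unit tests", 15, 9.0),
    ("design review", 5, 4.0),
    ("performance analysis", 8, 3.0),
    ("traceability matrix", 4, 2.5),
]
budget = 25

chosen, spent, reduced = [], 0, 0.0
for name, cost, gain in sorted(activities, key=lambda a: a[2] / a[1], reverse=True):
    if spent + cost <= budget:
        chosen.append(name)
        spent += cost
        reduced += gain

print(f"Selected: {chosen} (cost {spent}, risk reduction {reduced})")
```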

  16. Software criticality analysis of COTS/SOUP

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-09-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  17. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  18. ANSI/ASHRAE/IES Standard 90.1-2010 Performance Rating Method Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    Goel, Supriya [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rosenberg, Michael I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-05-01

    This document is intended to be a reference manual for the Appendix G Performance Rating Method (PRM) of ANSI/ASHRAE/IES Standard 90.1-2010 (Standard 90.1-2010). The PRM is used for rating the energy efficiency of commercial and high-rise residential buildings with designs that exceed the requirements of Standard 90.1. The procedures and processes described in this manual are designed to provide consistency and accuracy by filling in gaps and providing additional details needed by users of the PRM. It should be noted that this document was created independently of ASHRAE and SSPC 90.1 and is neither sanctioned nor approved by either of those entities. Potential users of this manual include energy modelers, software developers and implementers of "beyond code" energy programs. Energy modelers using ASHRAE Standard 90.1-2010 for beyond-code programs can use this document as a reference manual for interpreting requirements of the Performance Rating Method. Software developers developing tools for automated creation of the baseline model can use this reference manual as a guideline for developing the rules for the baseline model.

  19. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    Science.gov (United States)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing of an enterprise's internal software development activity is a common means of starting a new software business serving a vertical software market, combining knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business which makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to the development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  20. R&D to Market Success: BTO-Supported Technologies Commercialized from 2010-2015

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2017-04-01

    Technology commercialization plays an essential role in almost every facet of the U.S. economy. It spurs private sector funding that supports innovative breakthroughs, drives growth through increased productivity and product development, increases American competitiveness, and creates domestic jobs. The BTO Technology Commercialization report is an annual publication offering the latest information on successfully commercialized technologies resulting in part from BTO’s research partnerships. This report defines a “commercialized technology” as a process, technique, design, machine, tool, material, or software that was developed with funds provided at least in part by BTO, and that has resulted in domestic sales or is in use in the U.S. This definition also applies to open-source software products developed with support from BTO, all of which are currently distributed freely but are actively used for commercial purposes.

  1. Gammasphere software development. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  2. Standardization of software application development and governance

    OpenAIRE

    Labbe, Peter P.

    2015-01-01

    Approved for public release; distribution is unlimited. A number of Defense Department initiatives focus on how to engineer better systems that directly influence software architecture, including Open Architecture, Enterprise Architecture, and Joint Information Enterprise. Additionally, the Department of Defense (DOD) mandates moving applications to consolidated datacenters and cloud computing. When examined from an application development perspective, the DOD lacks a common approach for in...

  3. Software qualification in safety applications

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    2000-01-01

    The developers of safety-critical instrumentation and control systems must qualify the design of the components used, including the software in the embedded computer systems, in order to ensure that the component can be trusted to perform its safety function under the full range of operating conditions. There are well known ways to qualify analog systems using the facts that: (1) they are built from standard modules with known properties; (2) design documents are available and described in a well understood language; (3) the performance of the component is constrained by physics; and (4) physics models exist to predict the performance. These properties are not generally available for qualifying software, and one must fall back on extensive testing and qualification of the design process. Neither of these is completely satisfactory. The research reported here is exploring an alternative approach that is intended to permit qualification for an important subset of instrumentation software. The research goal is to determine if a combination of static analysis and limited testing can be used to qualify a class of simple, but practical, computer-based instrumentation components for safety application. These components are of roughly the complexity of a motion detector alarm controller. This goal is accomplished by identifying design constraints that enable meaningful analysis and testing. Once such design constraints are identified, digital systems can be designed to allow for analysis and testing, or existing systems may be tested for conformance to the design constraints as a first step in a qualification process. This will considerably reduce the cost and monetary risk involved in qualifying commercial components for safety-critical service

  4. Engineering high quality medical software

    CERN Document Server

    Coronato, Antonio

    2018-01-01

    This book focuses on high-confidence medical software in the growing field of e-health, telecare services and health technology. It covers the development of methodologies and engineering tasks together with standards and regulations for medical software.

  5. Verification and software validation for nuclear instrumentation; Verificacion y validacion de software para instrumentacion nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Gaytan G, E. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Salgado G, J. R. [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan No. 779, Col. Narvarte, 03020 Mexico D. F. (Mexico); De Andrade O, E. [Universidad Federal de Rio de Janeiro, Caixa Postal 68509, 21945-970 Rio de Janeiro (Brazil); Ramirez G, A., E-mail: elvira.gaytan@inin.gob.mx [Comision Federal de Electricidad, Gerencia de Centrales Nucleoelectricas, Alto Lucero, Veracruz (Mexico)

    2014-10-15

    This work presents a software verification and validation methodology to be applied to nuclear instruments with associated software. The methodology was developed under the auspices of the IAEA, through the regional projects RLA4022 (ARCAL XCIX) and RLA1011 (RLA CXXIII), led by Mexico. In the first project, three plans and three procedures were elaborated taking IEEE standards into consideration, and in the second project these documents were updated considering ISO and IEC standards. The developed methodology has been distributed to the participating Latin American countries in the ARCAL projects, and two related courses have been given with the participation of several countries and of participating Mexican institutions such as the Instituto Nacional de Investigaciones Nucleares (ININ), the Comision Federal de Electricidad (CFE) and the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). At ININ, owing to the need to work with software quality assurance in systems for the CFE nuclear power plant, a Software Quality Assurance Plan and five procedures were developed in 2004, qualifying ININ for software development for the CFE nuclear power plant. These first documents were developed taking IEEE standards and NRC regulatory guides as references, and were the first step in the development of the methodology. (Author)

  6. KAERI software safety guideline for developing safety-critical software in digital instrumentation and control system of nuclear power plant

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Kim, Jang Yeol; Eum, Heung Seop.

    1997-07-01

    Recently, safety planning for safety-critical software systems has been recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations. The requirements for software important to the safety of nuclear reactors are described in such positions and standards. Most of them describe mandatory requirements, what shall be done, for safety-critical software. For the developers of such software, however, there have been many controversial points between the licenser and the licensee on whether the work practices satisfy the regulatory requirements, and on how to justify the safety of a system developed with those work practices. We believe this is caused by the gap between the mandatory requirements (what) and the work practices (how). We have developed guidance to fill this gap, which can be useful for both licenser and licensee in conducting a justification of safety in the planning phase of developing software for nuclear reactor protection systems. (author). 67 refs., 13 tabs., 2 figs

  7. KAERI software safety guideline for developing safety-critical software in digital instrumentation and control system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Kim, Jang Yeol; Eum, Heung Seop

    1997-07-01

    Recently, safety planning for safety-critical software systems has been recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations. The requirements for software important to the safety of nuclear reactors are described in such positions and standards. Most of them describe mandatory requirements, what shall be done, for safety-critical software. For the developers of such software, however, there have been many controversial points between the licenser and the licensee on whether the work practices satisfy the regulatory requirements, and on how to justify the safety of a system developed with those work practices. We believe this is caused by the gap between the mandatory requirements (what) and the work practices (how). We have developed guidance to fill this gap, which can be useful for both licenser and licensee in conducting a justification of safety in the planning phase of developing software for nuclear reactor protection systems. (author). 67 refs., 13 tabs., 2 figs.

  8. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability
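
    The idea of interface-based testing can be illustrated with a short example: the test exercises an interface contract, so any implementation of that interface can be verified with the same test. The code below is an illustrative sketch with hypothetical names; it is not taken from the article.

```python
# Minimal sketch of interface-based testing: the test exercises an interface
# contract, so any implementation of the interface can be verified with the
# same test. All names are illustrative.
import unittest
from abc import ABC, abstractmethod

class Repository(ABC):
    """Interface under test."""
    @abstractmethod
    def put(self, key: str, value: str) -> None: ...
    @abstractmethod
    def get(self, key: str) -> str: ...

class InMemoryRepository(Repository):
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

class RepositoryContractTest(unittest.TestCase):
    make_repo = InMemoryRepository        # swap in other implementations here

    def test_round_trip(self):
        repo = self.make_repo()
        repo.put("answer", "42")
        self.assertEqual(repo.get("answer"), "42")

if __name__ == "__main__":
    unittest.main()
```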

  9. Software for Data Acquisition AMC Module with PCI Express Interface

    CERN Document Server

    Szachowalow, S; Makowski, D; Butkowski, L

    2010-01-01

    The Free Electron Laser in Hamburg (FLASH) and the X-Ray Free Electron Laser (XFEL) are linear accelerators that require a complex and accurate Low Level Radio Frequency (LLRF) control system. Currently working systems are based on the aged Versa Module Eurocard (VME) architecture. One of the alternatives to the VME bus is the Advanced Telecommunications Computing Architecture (ATCA) standard. The ATCA-based LLRF controller mainly consists of a few ATCA carrier boards and several Advanced Mezzanine Cards (AMC). AMC modules are available with a variety of functions, such as ADC, DAC, data storage, data links and even CPU cards. This paper focuses on the software that allows the user to collect and plot the data from the commercially available TAMC900 board.

  10. Robotic Software for the Thacher Observatory

    Science.gov (United States)

    Lawrence, George; Luebbers, Julien; Eastman, Jason D.; Johnson, John A.; Swift, Jonathan

    2018-06-01

    The Thacher Observatory—a research and educational facility located in Ojai, CA—uses a 0.7 meter telescope to conduct photometric research on a variety of targets including eclipsing binaries, exoplanet transits, and supernovae. Currently, observations are automated using commercial software. In order to expand the flexibility for specialized scientific observations and to increase the educational value of the facility on campus, we are adapting and implementing the custom observatory control software and queue scheduling developed for the Miniature Exoplanet Radial Velocity Array (MINERVA) to the Thacher Observatory. We present the design and implementation of this new software as well as its demonstrated functionality on the Thacher Observatory.

  11. Software And Systems Engineering Risk Management

    Science.gov (United States)

    2010-04-01

    Presentation excerpts: standards timeline including RSKM; 2004, COSO Enterprise RSKM Framework; 2006, ISO/IEC 16085, Risk Management Process; 2008, ISO/IEC 12207, Software Lifecycle Processes; 2009, ISO/IEC... Presenter: John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning..., Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software

  12. Technique of semiautomatic surface reconstruction of the visible Korean human data using commercial software.

    Science.gov (United States)

    Park, Jin Seo; Shin, Dong Sun; Chung, Min Suk; Hwang, Sung Bae; Chung, Jinoh

    2007-11-01

    This article describes the technique of semiautomatic surface reconstruction of anatomic structures using widely available commercial software. This technique would enable researchers to promptly and objectively perform surface reconstruction, creating three-dimensional anatomic images without any assistance from computer engineers. To develop the technique, we used data from the Visible Korean Human project, which produced digitalized photographic serial images of an entire cadaver. We selected 114 anatomic structures (skin [1], bones [32], knee joint structures [7], muscles [60], arteries [7], and nerves [7]) from the 976 anatomic images which were generated from the left lower limb of the cadaver. Using Adobe Photoshop, the selected anatomic structures in each serial image were outlined, creating a segmented image. The Photoshop files were then converted into Adobe Illustrator files to prepare isolated segmented images, so that the contours of the structure could be viewed independent of the surrounding anatomy. Using Alias Maya, these isolated segmented images were then stacked to construct a contour image. Gaps between the contour lines were filled with surfaces, and three-dimensional surface reconstruction could be visualized with Rhinoceros. Surface imperfections were then corrected to complete the three-dimensional images in Alias Maya. We believe that the three-dimensional anatomic images created by these methods will have widespread application in both medical education and research. 2007 Wiley-Liss, Inc

  13. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways. Here, we outline a range of review conditions and software solutions: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer includes text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise

  14. 7 CFR 51.1435 - U.S. Commercial Pieces.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial Pieces. 51.1435 Section 51.1435 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1435 U.S. Commercial Pieces. The...

  15. 7 CFR 51.1433 - U.S. Commercial Halves.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial Halves. 51.1433 Section 51.1433 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... STANDARDS) United States Standards for Grades of Shelled Pecans Grades § 51.1433 U.S. Commercial Halves. The...

  16. Sandia Software Guidelines, Volume 2. Documentation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standards for software documentation, this volume provides guidance in the selection of an adequate document set for a software project and example formats for many types of software documentation. A tutorial on life cycle documentation is also provided. Extended document thematic outlines and working examples of software documents are available on electronic media as an extension of this volume.

  17. Simple solution to the medical instrumentation software problem

    Science.gov (United States)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to Good Manufacturing Practices (GMP). Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of, and compliance with, a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionality, from embedded real-time systems to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  18. Improving Software Sustainability: Lessons Learned from Profiles in Science.

    Science.gov (United States)

    Gallagher, Marie E

    2013-01-01

    The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment

  19. Control/interlock/display system for EBT-P using commercially-available hardware and firmware

    International Nuclear Information System (INIS)

    Schmitt, R.J.

    1983-01-01

    For the EBT-P project, alternative commercially-available hardware, software and firmware have been employed for control, interlock and data display functions. This paper describes the criteria and rationale used to select that commercial equipment and discusses the important features of the equipment chosen, especially programmable controllers. Additional discussion is centered on interface problems which are encountered upon attempts to integrate equipment from several vendors. Some solutions to these problems are discussed. Details of software and hardware performance during tests are presented. The extent to which the EBT-P hardware and software configuration addresses and resolves various issues is discussed. Several areas have been uncovered in which relatively slight improvements/modifications of commercial programmable controller firmware would significantly improve the capability of this type of hardware in fusion control applications. These improvements are discussed in detail

  20. Cone-beam micro-CT system based on LabVIEW software.

    Science.gov (United States)

    Ionita, Ciprian N; Hoffmann, Keneth R; Bednarek, Daniel R; Chityala, Ravishankar; Rudin, Stephen

    2008-09-01

    Construction of a cone-beam computed tomography (CBCT) system for laboratory research usually requires integration of different software and hardware components. As a result, building and operating such a complex system require the expertise of researchers with significantly different backgrounds. Additionally, writing flexible code to control the hardware components of a CBCT system combined with designing a friendly graphical user interface (GUI) can be cumbersome and time-consuming. An intuitive and flexible program structure, as well as the program GUI for CBCT acquisition, is presented in this note. The program was developed in National Instruments' Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) graphical language and is designed to control a custom-built CBCT system, but it has also been used in a standard angiographic suite. The hardware components are commercially available to researchers and are in general provided with software drivers which are LabVIEW compatible. The program structure was designed as a sequential chain. Each step in the chain takes care of one or two hardware commands at a time; the execution of the sequence can be modified according to the CBCT system design. We have scanned and reconstructed over 200 specimens using this interface and present three examples which cover different areas of interest encountered in laboratory research. The resulting 3D data are rendered using a commercial workstation. The program described in this paper is available for use or improvement by other researchers.
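
    The note above describes the acquisition program as a sequential chain in which each step handles one or two hardware commands and the order of execution can be rearranged to match the CBCT system design. A minimal Python sketch of that pattern is shown below purely for illustration; the step names and hardware-facing calls are hypothetical placeholders, since the actual program is LabVIEW code built on vendor-supplied drivers.

```python
# Minimal sketch of a "sequential chain" acquisition loop (hypothetical
# hardware calls; the real system is LabVIEW code built on vendor drivers).

def rotate_gantry(state):
    # placeholder: command the rotation stage to the next angular position
    state["angle"] += state["angle_step"]

def expose_and_grab(state):
    # placeholder: trigger the x-ray source and read one projection frame
    state["projections"].append(("frame", state["angle"]))

def save_frame(state):
    # placeholder: write the latest projection and its angle to disk
    pass

# The execution sequence can be reordered to match a given CBCT system design.
CHAIN = [rotate_gantry, expose_and_grab, save_frame]

def acquire(n_projections, angle_step=1.0):
    state = {"angle": 0.0, "angle_step": angle_step, "projections": []}
    for _ in range(n_projections):
        for step in CHAIN:     # each step handles one or two hardware commands
            step(state)
    return state["projections"]

if __name__ == "__main__":
    print(len(acquire(360)))   # e.g. 360 projections over a full rotation
```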

  1. Bone histomorphometry using free and commonly available software.

    Science.gov (United States)

    Egan, Kevin P; Brennan, Tracy A; Pignolo, Robert J

    2012-12-01

    Histomorphometric analysis is a widely used technique to assess changes in tissue structure and function. Commercially available programs that measure histomorphometric parameters can be cost-prohibitive. In this study, we compared an inexpensive method of histomorphometry to a current proprietary software program. Image J and Adobe Photoshop® were used to measure static and kinetic bone histomorphometric parameters. Photomicrographs of Goldner's trichrome-stained femurs were used to generate black-and-white image masks, representing bone and non-bone tissue, respectively, in Adobe Photoshop®. The masks were used to quantify histomorphometric parameters (bone volume, tissue volume, osteoid volume, mineralizing surface and interlabel width) in Image J. The resultant values obtained using Image J and the proprietary software were compared, and the differences were found to be statistically non-significant. The wide-ranging use of histomorphometric analysis for assessing the basic morphology of tissue components makes it important to have affordable and accurate measurement options available for a diverse range of applications. Here we have developed and validated an approach to histomorphometry using commonly and freely available software that is comparable to a much more costly, commercially available software program. © 2012 Blackwell Publishing Limited.
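
    As a rough illustration of the mask-based measurement step described above, the sketch below computes a bone volume over tissue volume fraction (BV/TV) from a binary black-and-white mask using NumPy and Pillow. It is a generic stand-in rather than the authors' Image J/Photoshop workflow, and the file name and threshold are hypothetical.

```python
# Illustrative BV/TV calculation from a binary bone mask (white = bone).
# This is a generic NumPy stand-in for the mask-based measurements described
# above, not the authors' workflow; "bone_mask.png" and the threshold are
# hypothetical examples.
import numpy as np
from PIL import Image

def bone_volume_fraction(mask_path, threshold=128):
    mask = np.asarray(Image.open(mask_path).convert("L"))
    bone_pixels = np.count_nonzero(mask >= threshold)   # white pixels = bone
    tissue_pixels = mask.size                            # whole field = tissue
    return bone_pixels / tissue_pixels

if __name__ == "__main__":
    bv_tv = bone_volume_fraction("bone_mask.png")
    print(f"BV/TV = {bv_tv:.3f}")
```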

  2. Commercial counterboard for 10 ns software correlator for photon and fluorescence correlation spectroscopy

    Science.gov (United States)

    Molteni, Matteo; Ferri, Fabio

    2016-11-01

    A 10 ns time resolution, multi-tau software correlator, capable of computing simultaneous autocorrelation (A-A, B-B) and cross-correlation (A-B) functions at count rates up to ˜10 MHz, with no data loss, has been developed in LabVIEW and C++ by using the National Instruments timer/counter board (NI PCIe-6612) and a fast Personal Computer (PC) (Intel Core i7-4790 processor, 3.60 GHz). The correlator works by using two algorithms: for large lag times (τ ≳ 1 μs), a classical time-mode scheme, based on measuring the number of pulses per time interval, is used; whereas for τ ≲ 1 μs a photon-mode (PM) scheme is adopted and the correlation function is retrieved from the sequence of the photon arrival times. Single auto- and cross-correlation functions can be processed online in full real time up to count rates of ˜1.8 MHz and ˜1.2 MHz, respectively. Two autocorrelation (A-A, B-B) functions and a cross-correlation (A-B) function can be simultaneously processed in full real time only up to count rates of ˜750 kHz. At higher count rates, the online processing takes place in a delayed modality, but with no data loss. When tested with simulated correlation data and latex sphere solutions, the overall performance of the correlator appears to be comparable with that of commercial hardware correlators, but with several nontrivial advantages related to its flexibility, low cost, and easy adaptability to future developments of PC and data acquisition technology.
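
    For orientation, the sketch below shows the essence of the time-mode branch: a normalized autocorrelation g2(tau) = <n(t)n(t+tau)>/<n>^2 computed from binned photon counts with NumPy. It is only a simplified, linear-lag illustration of the algorithm class; the correlator described in the record is a LabVIEW/C++ multi-tau implementation with a separate photon-mode branch for sub-microsecond lags.

```python
# Simplified "time-mode" autocorrelation from binned photon counts:
# g2(tau) = <n(t) n(t+tau)> / <n(t)>^2.  A linear-lag illustration only;
# a true multi-tau correlator progressively doubles its bin width.
import numpy as np

def autocorrelation(counts, max_lag):
    counts = np.asarray(counts, dtype=float)
    mean_sq = counts.mean() ** 2
    g2 = np.empty(max_lag)
    for k in range(1, max_lag + 1):
        g2[k - 1] = np.mean(counts[:-k] * counts[k:]) / mean_sq
    return g2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Correlated toy signal: Poisson counts modulated by a slow intensity drift.
    intensity = 5.0 * (1.0 + 0.5 * np.sin(np.linspace(0, 20, 100000)))
    counts = rng.poisson(intensity)
    print(autocorrelation(counts, 10))
```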

  3. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  4. Development of a coppice planting machine to commercial standards

    Energy Technology Data Exchange (ETDEWEB)

    Turton, J.S.

    2000-07-01

    This report gives details of the development work carried out on the Turton Engineering Coppice Planting machine in order to commercially market it. The background to the machine, which plants single rows of cuttings from rods, is traced, and previous development work, design work, production of sub-assemblies and the assembly of modules, inspection and assembly, static trials, and commercial planting are examined. Further machine developments, proving trials, and recommendations for further work are discussed. Appendices address relationships applicable to vertical planting, the Turton short rotation cultivation machine rod format, estimated prices and charges, and a list of main suppliers. (UK)

  5. SEL's Software Process-Improvement Program

    Science.gov (United States)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 FDD projects, the results have guided the standards, management practices, technologies, and the training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) Understand baseline processes and product characteristics, (2) Assess improvements that have been incorporated into the development projects, (3) Package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement and feedback to projects within the FDD environment. The SEL supports the understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools and training.

  6. Development and validation of a HPLC method for standardization of herbal and commercial extracts of Myrcia uniflora

    Directory of Open Access Journals (Sweden)

    Andrea N. de L. Batista

    2011-06-01

    Full Text Available Myrcia uniflora Barb. Rodr., Myrtaceae, popularly known as "pedra-hume-caá" in Brazil, is sold as dry extracts in capsules or as tinctures for the treatment of diabetes mellitus. Previous phytochemical studies on this species described the occurrence of the flavonoids mearnsitrin and myricitrin. In the present study, the chromatographic profiles of M. uniflora leaves and commercial extracts were determined using HPLC-PAD. Myricitrin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid and reliable and can be successfully applied in industry for standardization of herbs and phytomedicines commercialised in Brazil as "pedra-hume-caá".

  7. Development and validation of a HPLC method for standardization of herbal and commercial extracts of Myrcia uniflora

    Directory of Open Access Journals (Sweden)

    Andrea N. de L. Batista

    2011-04-01

    Full Text Available Myrcia uniflora Barb. Rodr., Myrtaceae, popularly known as "pedra-hume-caá" in Brazil, is sold as dry extracts in capsules or as tinctures for the treatment of diabetes mellitus. Previous phytochemical studies on this species described the occurrence of the flavonoids mearnsitrin and myricitrin. In the present study, the chromatographic profiles of M. uniflora leaves and commercial extracts were determined using HPLC-PAD. Myricitrin was used as an external standard in the development and validation of the HPLC method. The proposed method is simple, rapid and reliable and can be successfully applied in industry for standardization of herbs and phytomedicines commercialised in Brazil as "pedra-hume-caá".

  8. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    Science.gov (United States)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must include a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) comprised of commercial-equivalent radiation-hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  9. BioContainers: an open-source and community-driven framework for software standardization

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Abstract Motivation BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation The software is freely available at github.com/BioContainers/. Contact yperez@ebi.ac.uk PMID:28379341

  10. BioContainers: an open-source and community-driven framework for software standardization.

    Science.gov (United States)

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.

  11. Software quality in 1997

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C. [Software Productivity Research, Inc., Burlington, MA (United States)

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  12. A tool to include gamma analysis software into a quality assurance program.

    Science.gov (United States)

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
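
    To make the DTA/DD combination concrete, the sketch below computes a global gamma passing rate for two 1D dose profiles in the textbook manner; it is a generic illustration, not the algorithm of any of the commercial or in-house packages compared in the study, and the 3%/3 mm criteria and test profiles are example values.

```python
# Generic global gamma index for two 1D dose profiles (textbook-style),
# not the algorithm of any specific commercial or in-house package.
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions, dta_mm=3.0, dd_percent=3.0,
                    low_dose_cutoff=0.1):
    ref = np.asarray(ref_dose, float)
    ev = np.asarray(eval_dose, float)
    x = np.asarray(positions, float)
    dd_abs = dd_percent / 100.0 * ref.max()      # global dose-difference criterion
    gammas = []
    for xi, di in zip(x, ref):
        if di < low_dose_cutoff * ref.max():     # skip very low-dose points
            continue
        dist2 = ((x - xi) / dta_mm) ** 2         # squared distance term
        dose2 = ((ev - di) / dd_abs) ** 2        # squared dose-difference term
        gammas.append(np.sqrt(np.min(dist2 + dose2)))
    gammas = np.array(gammas)
    return 100.0 * np.count_nonzero(gammas <= 1.0) / gammas.size

if __name__ == "__main__":
    x = np.arange(0.0, 100.0, 1.0)               # positions in mm
    ref = np.exp(-((x - 50.0) / 20.0) ** 2)
    ev = np.exp(-((x - 51.0) / 20.0) ** 2)       # profile shifted by 1 mm
    print(gamma_pass_rate(ref, ev, x, dta_mm=3.0, dd_percent=3.0))
```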

  13. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where the experience of building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used with the ESA PSS-05-0 standards. Our outcomes may, in general, be used by teams who need to build small satellites and, in particular, will be used when we build the on-board software applications for the SATEX-II.

  14. 7 CFR 51.2647 - U.S. Commercial.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial. 51.2647 Section 51.2647 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Standards for Grades for Sweet Cherries 1 Grades § 51.2647 U.S. Commercial. “U.S. Commercial” consists of...

  15. 7 CFR 51.1542 - U.S. Commercial.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial. 51.1542 Section 51.1542 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... Standards for Grades of Potatoes 1 Grades § 51.1542 U.S. Commercial. “U.S. Commercial” consists of potatoes...

  16. AR2, a novel automatic muscle artifact reduction software method for ictal EEG interpretation: Validation and comparison of performance with commercially available software [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Shennan Aibel Weiss

    2017-04-01

    Full Text Available Objective: To develop a novel software method (AR2) for reducing muscle contamination of ictal scalp electroencephalogram (EEG), and validate this method on the basis of its performance in comparison to a commercially available software method (AR1) in accurately depicting seizure-onset location. Methods: A blinded investigation used 23 EEG recordings of seizures from 8 patients. Each recording was uninterpretable with digital filtering because of muscle artifact and was processed using AR1 and AR2 and reviewed by 26 EEG specialists. EEG readers assessed seizure-onset time, lateralization, and region, and specified confidence for each determination. The two methods were validated on the basis of the number of readers able to render assignments, confidence, the intra-class correlation (ICC), and agreement with other clinical findings. Results: Among the 23 seizures, two-thirds of the readers were able to delineate seizure-onset time in 10 of 23 using AR1, and 15 of 23 using AR2 (p<0.01). Fewer readers could lateralize seizure-onset (p<0.05). The confidence measures of the assignments were low (probable-unlikely), but increased using AR2 (p<0.05). The ICC for identifying the time of seizure-onset was 0.15 (95% confidence interval (CI) 0.11-0.18) using AR1 and 0.26 (95% CI 0.21-0.30) using AR2. The EEG interpretations were often consistent with behavioral, neurophysiological, and neuro-radiological findings, with left-sided assignments correct in 95.9% (CI 85.7-98.9%, n=4) of cases using AR2, and 91.9% (CI 77.0-97.5%, n=4) of cases using AR1. Conclusions: EEG artifact reduction methods for localizing seizure-onset do not result in high rates of interpretability, reader confidence, and inter-reader agreement. However, the assignments by groups of readers are often congruent with other clinical data. Utilization of the AR2 software method may improve the validity of ictal EEG artifact reduction.

  17. Use of other industry standards to facilitate the procurement and dedication of commercial-grade items

    International Nuclear Information System (INIS)

    Beard, R.L.; Rosch, F.C.; Sanwarwalla, M.H.; Tjernlund, R.M.

    1994-01-01

    Nuclear utilities are embarking on innovative approaches for reducing costs in all aspects of engineering, operation, maintenance, and procurement to produce power cheaply and efficiently and remain competitive with other power producers. In the area of procurement, utilities are increasingly obtaining commercial-grade items for use in safety-related applications. This trend is occurring because of a lack of suppliers capable of or willing to meet 10 CFR 21 and 10 CFR 50 App. B requirements, and because of the absence of original equipment suppliers or the spiraling cost associated with procuring items as safety-related basic components from original suppliers. Utilities have been looking at ways to reduce procurement costs. One promising means to reduce costs is to utilize information provided in other nonnuclear industry standards regarding the specification, control, manufacture, and acceptance of the critical characteristics required of the item to perform its design function. A task force was instituted under the sponsorship of the Electric Power Research Institute to investigate the feasibility of using items manufactured to other industry standards in nuclear safety-related applications. This investigation looked at a broad spectrum of available industry standards pertaining to the design, function, manufacture, and testing of items and determined that some standards are more useful than others. This paper discusses the results of this investigation and how credit can be taken for the controls exercised for items manufactured to certain existing industry standards in order to minimize dedication costs

  18. Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control

    Science.gov (United States)

    Schumann, Johann; Mbaya, Timmy; Menghoel, Ole

    2011-01-01

    Modern aircraft, both piloted fly-by-wire commercial aircraft as well as UAVs, more and more depend on highly complex safety critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have happened due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We will focus on the approach to develop reliable and robust health models for the combined software and sensor systems.
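
    As a toy illustration of the kind of probabilistic reasoning described, the sketch below computes the posterior probability of a fault given two discrete observations by enumerating a three-node Bayesian network. The network structure and all probability values are invented for the example and are not taken from the paper.

```python
# Toy Bayesian-network diagnosis: P(fault | observations) by enumeration.
# Network: Fault -> SensorReadingBad, Fault -> WatchdogAlarm.
# All probabilities below are invented for illustration only.

P_FAULT = {True: 0.01, False: 0.99}
P_BAD_READING = {True: {True: 0.90, False: 0.10},    # P(bad reading | fault)
                 False: {True: 0.05, False: 0.95}}
P_ALARM = {True: {True: 0.80, False: 0.20},          # P(alarm | fault)
           False: {True: 0.02, False: 0.98}}

def posterior_fault(bad_reading, alarm):
    # Joint probability of (fault, observations) for each fault state,
    # then normalize to obtain the posterior over the fault variable.
    joint = {}
    for fault in (True, False):
        joint[fault] = (P_FAULT[fault]
                        * P_BAD_READING[fault][bad_reading]
                        * P_ALARM[fault][alarm])
    return joint[True] / sum(joint.values())

if __name__ == "__main__":
    # Both symptoms present: fault becomes far more likely than its 1% prior.
    print(f"P(fault | bad reading, alarm) = {posterior_fault(True, True):.3f}")
```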

  19. Cost Optimization Through Open Source Software

    Directory of Open Access Journals (Sweden)

    Mark VonFange

    2010-12-01

    Full Text Available The cost of information technology (IT) as a percentage of overall operating and capital expenditures is growing as companies modernize their operations and as IT becomes an increasingly indispensable part of company resources. The price tag associated with IT infrastructure is a heavy one, and, in today's economy, companies need to look for ways to reduce overhead while maintaining quality operations and staying current with technology. With its advancements in availability, usability, functionality, choice, and power, free/libre open source software (F/LOSS) provides a cost-effective means for the modern enterprise to streamline its operations. iXsystems wanted to quantify the benefits associated with the use of open source software at their company headquarters. This article is the outgrowth of our internal analysis of using open source software instead of commercial software in all aspects of company operations.

  20. Software quality assurance plan for GCS

    Science.gov (United States)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  1. Construction cost impact analysis of the U.S. Department of Energy mandatory performance standards for new federal commercial and multi-family, high-rise residential buildings

    International Nuclear Information System (INIS)

    Di Massa, F.V.; Hadley, D.L.; Halverson, M.A.

    1993-12-01

    In accordance with federal legislation, the U.S. Department of Energy (DOE) has conducted a project to demonstrate use of its Energy Conservation Voluntary Performance Standards for Commercial and Multi-Family High-Rise Residential Buildings; Mandatory for New Federal Buildings; Interim Rule (referred to in this report as DOE-1993). A key provision of the legislation requires DOE to develop commercial building energy standards that are cost-effective. During the demonstration project, DOE specifically addressed this issue by assessing the impacts of the standards on (1) construction costs, (2) builders (and especially small builders) of multi-family, high-rise buildings, and (3) the ability of low- to moderate-income persons to purchase or rent units in such buildings. This document reports on this project.

  2. Ubuntuism, commodification, and the software dialectic

    OpenAIRE

    Chege, Mike

    2008-01-01

    “Free as in speech, but not free as in beer,” is the refrain made famous by Richard Stallman, the standard-bearer of the free software movement. However, many free software advocates seem to be of the opinion that the purity of free software is somehow tainted by any preoccupation with money or profit. Inevitably, this has implications for the economic sustainability of free software, for without a source of income, how can free software hope to survive? The challenge of finding a way to ensu...

  3. Hooke: an open software platform for force spectroscopy.

    Science.gov (United States)

    Sandal, Massimo; Benedetti, Fabrizio; Brucale, Marco; Gomez-Casado, Alberto; Samorì, Bruno

    2009-06-01

    Hooke is an open source, extensible software intended for analysis of atomic force microscope (AFM)-based single molecule force spectroscopy (SMFS) data. We propose it as a platform on which published and new algorithms for SMFS analysis can be integrated in a standard, open fashion, as a general solution to the current lack of a standard software for SMFS data analysis. Specific features and support for file formats are coded as independent plugins. Any user can code new plugins, extending the software capabilities. Basic automated dataset filtering and semi-automatic analysis facilities are included. Software and documentation are available at (http://code.google.com/p/hooke). Hooke is a free software under the GNU Lesser General Public License.
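
    The plugin idea described here, with features and file-format support coded as independent, user-written plugins, can be sketched with a minimal registry pattern in Python. The sketch below is a generic illustration of that architecture, not Hooke's actual plugin API; the plugin names and analysis routines are made up.

```python
# Minimal plugin-registry sketch for an extensible analysis tool.
# This illustrates the architecture only; it is not Hooke's actual API.

PLUGINS = {}

def register(name):
    """Decorator that registers an analysis routine under a given name."""
    def wrap(func):
        PLUGINS[name] = func
        return func
    return wrap

@register("baseline")
def subtract_baseline(curve):
    # Subtract the mean so the flat part of the force curve sits near zero.
    offset = sum(curve) / len(curve)
    return [y - offset for y in curve]

@register("peak_count")
def count_peaks(curve, threshold=1.0):
    # Count local maxima above a threshold (e.g. unfolding events).
    return sum(1 for a, b, c in zip(curve, curve[1:], curve[2:])
               if b > a and b > c and b > threshold)

def run(name, curve, **kwargs):
    return PLUGINS[name](curve, **kwargs)

if __name__ == "__main__":
    force_curve = [0.1, 0.2, 2.5, 0.3, 0.1, 1.8, 0.2]
    print(run("baseline", force_curve))
    print(run("peak_count", force_curve, threshold=1.0))
```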

  4. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)
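
    The core idea of spectral deconvolution, fitting the whole measured spectrum as a non-negative combination of per-isotope detector responses rather than integrating single peaks, can be written as a small least-squares problem. The sketch below uses synthetic Gaussian response shapes as placeholders and is not the SDAT code itself.

```python
# Whole-spectrum deconvolution sketch: solve  measured ≈ R @ activities
# with non-negative activities, instead of analysing single peaks.
# The Gaussian "response" shapes are synthetic placeholders; this is not SDAT.
import numpy as np
from scipy.optimize import nnls

channels = np.arange(256)

def gaussian(center, width=4.0):
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Columns of R are unit-activity detector responses for two hypothetical isotopes.
R = np.column_stack([gaussian(80) + 0.3 * gaussian(150),
                     gaussian(200)])

true_activities = np.array([5.0, 2.0])
rng = np.random.default_rng(1)
measured = rng.poisson(R @ true_activities * 100) / 100.0   # noisy spectrum

estimated, residual = nnls(R, measured)   # non-negative least squares fit
print("estimated activities:", estimated)
```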

  5. An engineering context for software engineering

    OpenAIRE

    Riehle, Richard D.

    2008-01-01

    New engineering disciplines are emerging in the late Twentieth and early Twenty-first Century. One such emerging discipline is software engineering. The engineering community at large has long harbored a sense of skepticism about the validity of the term software engineering. During most of the fifty-plus years of software practice, that skepticism was probably justified. Professional education of software developers often fell short of the standard expected for conventional engineers; so...

  6. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    Science.gov (United States)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  7. Software components for medical image visualization and surgical planning

    Science.gov (United States)

    Starreveld, Yves P.; Gobbi, David G.; Finnis, Kirk; Peters, Terence M.

    2001-05-01

    Purpose: The development of new applications in medical image visualization and surgical planning requires the completion of many common tasks such as image reading and re-sampling, segmentation, volume rendering, and surface display. Intra-operative use requires an interface to a tracking system and image registration, and the application requires basic, easy-to-understand user interface components. Rapid changes in computer and end-application hardware, as well as in operating systems and network environments, make it desirable to have a hardware- and operating-system-independent collection of reusable software components that can be assembled rapidly to prototype new applications. Methods: Using the OpenGL based Visualization Toolkit as a base, we have developed a set of components that implement the above mentioned tasks. The components are written in both C++ and Python, but all are accessible from Python, a byte compiled scripting language. The components have been used on the Red Hat Linux, Silicon Graphics Iris, Microsoft Windows, and Apple OS X platforms. Rigorous object-oriented software design methods have been applied to ensure hardware independence and a standard application programming interface (API). There are components to acquire, display, and register images from MRI, MRA, CT, Computed Rotational Angiography (CRA), Digital Subtraction Angiography (DSA), 2D and 3D ultrasound, video and physiological recordings. Interfaces to various tracking systems for intra-operative use have also been implemented. Results: The described components have been implemented and tested. To date they have been used to create image manipulation and viewing tools, a deep brain functional atlas, a 3D ultrasound acquisition and display platform, a prototype minimally invasive robotic coronary artery bypass graft planning system, a tracked neuro-endoscope guidance system and a frame-based stereotaxy neurosurgery planning tool. The frame-based stereotaxy module has been

  8. Data-Acquisition Software for PSP/TSP Wind-Tunnel Cameras

    Science.gov (United States)

    Amer, Tahani R.; Goad, William K.

    2005-01-01

    Wing-Viewer is a computer program for acquisition and reduction of image data acquired by any of five different scientific-grade commercial electronic cameras used at Langley Research Center to observe wind-tunnel models coated with pressure- or temperature-sensitive paints (PSP/TSP). Wing-Viewer provides full automation of camera operation and acquisition of image data, and has limited data-preprocessing capability for quick viewing of the results of PSP/TSP test images. Wing-Viewer satisfies a requirement for a standard interface between all the cameras and a single personal computer. Written using Microsoft Visual C++ and the Microsoft Foundation Class Library as a framework, Wing-Viewer has the ability to communicate with the C/C++ software libraries that run on the controller circuit cards of all five cameras.

  9. Instrumentation Standard Architectures for Future High Availability Control Systems

    International Nuclear Information System (INIS)

    Larsen, R.S.

    2005-01-01

    Architectures for next-generation modular instrumentation standards should aim to meet a requirement of High Availability, or robustness against system failure. This is particularly important for experiments both large and small mounted on production accelerators and light sources. New standards should be based on architectures that (1) are modular in both hardware and software for ease in repair and upgrade; (2) include inherent redundancy at internal module, module assembly and system levels; (3) include modern high speed serial inter-module communications with robust noise-immune protocols; and (4) include highly intelligent diagnostics and board-management subsystems that can predict impending failure and invoke evasive strategies. The simple design principles lead to fail-soft systems that can be applied to any type of electronics system, from modular instruments to large power supplies to pulsed power modulators to entire accelerator systems. The existing standards in use are briefly reviewed and compared against a new commercial standard which suggests a powerful model for future laboratory standard developments. The past successes of undertaking such projects through inter-laboratory engineering-physics collaborations will be briefly summarized

  10. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Illinois Jurisdictions

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.; Friedrich, Michele

    2002-05-01

    ASHRAE Standard 90.1-1999 was developed in an effort to set minimum requirements for energy-efficient design and construction of new commercial buildings. This report assesses the benefits and costs of adopting this standard as the building energy code in Illinois. Energy and economic impacts are estimated using BLAST combined with a Life-Cycle Cost approach to assess corresponding economic costs and benefits.

  11. Criteria for software modularization

    Science.gov (United States)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.

  12. Evaluation procedure of software safety plan for digital I and C of KNGR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kwon, Ki Choon; Kim, Jang Yeol; Cheon, Se Woo

    2000-05-01

    The development, use, and regulation of computer systems in nuclear reactor instrumentation and control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean next generation reactor (KNGR) software safety verification and validation (SSVV) task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems, and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator a trail map between the codes-and-standards layer and the design-methodology-and-documents layer for software important to safety in nuclear power plants. Recently, safety planning for safety-critical software systems has been recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations. The requirements for software important to the safety of nuclear reactors are described in such positions and standards, for example, the new standard review plan (SRP), IEC 880 supplements, IEEE standard 1228-1994, IEEE standard 7-4.3.2-1993, and IAEA safety series No. 50-SG-D3 and D8. We present guidance for evaluating the safety plan of the software in the KNGR protection systems. The guideline consists of the regulatory requirements for software safety in chapter 2, the evaluation checklist for the software safety plan in chapter 3, and the evaluation results for the KNGR software safety plan in chapter 4

  13. Evaluation procedure of software safety plan for digital I and C of KNGR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kwon, Ki Choon; Kim, Jang Yeol; Cheon, Se Woo

    2000-05-01

    The development, use, and regulation of computer systems in nuclear reactor instrumentation and control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean next generation reactor (KNGR) software safety verification and validation (SSVV) task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems, and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator a trail map between the codes-and-standards layer and the design-methodology-and-documents layer for software important to safety in nuclear power plants. Recently, safety planning for safety-critical software systems has been recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations. The requirements for software important to the safety of nuclear reactors are described in such positions and standards, for example, the new standard review plan (SRP), IEC 880 supplements, IEEE standard 1228-1994, IEEE standard 7-4.3.2-1993, and IAEA safety series No. 50-SG-D3 and D8. We present guidance for evaluating the safety plan of the software in the KNGR protection systems. The guideline consists of the regulatory requirements for software safety in chapter 2, the evaluation checklist for the software safety plan in chapter 3, and the evaluation results for the KNGR software safety plan in chapter 4.

  14. Sandia software guidelines, Volume 4: Configuration management

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    This volume is one in a series of Sandia Software Guidelines for use in producing quality software within Sandia National Laboratories. This volume is based on the IEEE standard and guide for software configuration management. The basic concepts and detailed guidance on implementation of these concepts are discussed for several software project types. Example planning documents for both projects and organizations are included.

  15. SMI Compatible Simulation Scheduler Design for Reuse of Model Complying with SMP Standard

    Directory of Open Access Journals (Sweden)

    Cheol-Hea Koo

    2010-12-01

    Full Text Available Software reusability is one of the key factors that impact cost and schedule on a software development project. It is also crucial in satellite simulator development, since there are many commercial simulator models related to satellites and dynamics. If these models can be used on another simulator platform, a great deal of confidence and cost/schedule reduction would be achieved. Simulation model portability (SMP) is maintained by the European Space Agency, and many models compatible with the SMP/simulation model interface (SMI) are available. The Korea Aerospace Research Institute (KARI) is developing a hardware abstraction layer (HAL) supported satellite simulator to verify the on-board software of a satellite. For the above reasons, KARI wants to port these SMI-compatible models to the HAL-supported satellite simulator. To port these SMI-compatible models to the HAL-supported satellite simulator, a simulation scheduler is preliminarily designed according to the SMI standard.
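
    A fixed-step scheduler that drives reusable simulation models can be sketched as below. The two-method model interface used here is a simplified stand-in chosen for illustration; it is not the actual SMP/SMI interface specification, and the model classes are invented examples.

```python
# Simplified fixed-step scheduler for simulation models.  The two-method
# "model" interface used here is a stand-in for illustration; it is not the
# actual SMP/SMI interface specification.

class Model:
    def configure(self):
        """Called once before the run starts."""
    def update(self, sim_time, dt):
        """Called every scheduler step."""

class GyroModel(Model):
    def update(self, sim_time, dt):
        print(f"t={sim_time:.3f}s gyro sample")

class ThrusterModel(Model):
    def update(self, sim_time, dt):
        if sim_time >= 0.2:
            print(f"t={sim_time:.3f}s thruster firing")

class Scheduler:
    def __init__(self, step=0.1):
        self.step = step
        self.models = []
    def add(self, model):
        self.models.append(model)
    def run(self, duration):
        for m in self.models:
            m.configure()
        t = 0.0
        while t < duration:
            for m in self.models:          # call every registered model each step
                m.update(t, self.step)
            t += self.step

if __name__ == "__main__":
    sched = Scheduler(step=0.1)
    sched.add(GyroModel())
    sched.add(ThrusterModel())
    sched.run(0.5)
```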

  16. Cross-border software development of health information system: A case study on project between India and Pakistan based on open source software

    OpenAIRE

    Sabir, Uzma

    2017-01-01

    Global software development is a phenomenon that has been receiving considerable interest from researchers during the past two decades. Several challenges have been identified, and approaches to deal with these challenges have been developed. Typically, western companies outsource their projects to countries where costs are lower and skilled professionals are easily available. The majority of these projects are developed for commercial purposes. However, software development projects between India and Pakis...

  17. Techniques and tools for software qualification in KNICS

    International Nuclear Information System (INIS)

    Cha, Kyung H.; Lee, Yeong J.; Cheon, Se W.; Kim, Jang Y.; Lee, Jang S.; Kwon, Kee C.

    2004-01-01

    This paper describes techniques and tools for qualifying safety software in the Korea Nuclear Instrumentation and Control System (KNICS). Safety software is developed and applied for a Reactor Protection System (RPS), an Engineered Safety Features and Component Control System (ESF-CCS), and a safety Programmable Logic Controller (PLC) in the KNICS. Requirements and design specifications of the safety software are written in both natural language and formal specification languages. Statechart is used for the formal specification of the ESF-CCS and safety PLC software, while NuSCR is used for the formal specification of the RPS software. pSET (POSCON Software Engineering Tool), a software development tool, has been developed and utilized for IEC 61131-3 based PLC programming. The qualification of the safety software consists of software verification and validation (V and V) through the software life cycle, software safety analysis, software configuration management, software quality assurance, and COTS (commercial off-the-shelf) dedication. The criteria and requirements for qualifying the safety software have been established in accordance with the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14, IEEE Std. 7-4.3.2-1998, NUREG/CR-6463, IEEE Std. 1012-1998, and so on. Figure 1 summarizes the qualification techniques and tools for the safety software.

  18. Recent trends on Software Verification and Validation Testing

    International Nuclear Information System (INIS)

    Kim, Hyungtae; Jeong, Choongheui

    2013-01-01

    Verification and Validation (V and V) include the analysis, evaluation, review, inspection, assessment, and testing of products. Testing in particular is an important method to verify and validate software. Software V and V testing covers test planning through execution. IEEE Std. 1012 is a standard on software V and V. Recently, IEEE Std. 1012-2012 was published. This standard is a major revision of IEEE Std. 1012-2004, which defined only software V and V; it expands the scope of the V and V processes to include systems and hardware as well as software. This standard describes the scope of V and V testing according to integrity level. In addition, the independent V and V requirements related to software V and V testing in IEEE 7-4.3.2-2010 have been revised. This paper provides a recent trend of software V and V testing by reviewing IEEE Std. 1012-2012 and IEEE 7-4.3.2-2010. There are no major changes to software V and V testing activities and tasks in IEEE 1012-2012 compared with IEEE 1012-2004, but the positions on the responsibility to perform software V and V testing have changed. In addition, IEEE 7-4.3.2-2010 newly describes positions on the responsibility to perform software V and V testing. However, the positions of these two standards on V and V testing are different. For integrity levels 3 and 4, IEEE 1012-2012 basically requires that the V and V organization conduct all V and V testing tasks, such as test plan, test design, test case, and test procedure, except test execution. If V and V testing is conducted not by the V and V organization but by another organization, the results of that testing shall be analyzed by the V and V organization. For safety-related software, IEEE 7-4.3.2-2010 requires that test procedures and reports be independently verified by an alternate organization regardless of who writes the procedures and/or conducts the tests

  19. Static Checking of Interrupt-driven Software

    DEFF Research Database (Denmark)

    Brylow, Dennis; Damgaard, Niels; Palsberg, Jens

    2001-01-01

    at the assembly level. In this paper we present the design and implementation of a static checker for interrupt-driven Z86-based software with hard real-time requirements. For six commercial microcontrollers, our checker has produced upper bounds on interrupt latencies and stack sizes, as well as verified...

  20. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities - A General Overview

    Science.gov (United States)

    Hebert, Phillip W., Sr.; Hughes, Mark S.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Marshall, PeggL.; Duncan, Michael E.; Morris, Jon A.; Franzl, Richard W.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plum Brook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which makes the software application layers transparent to the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses the system's technical design, issues encountered, and the status of Stennis' development and deployment.
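
    The translation layer mentioned above, which keeps the application code independent of the particular data-acquisition hardware at each facility, is essentially a hardware-abstraction interface. The Python sketch below illustrates that pattern with invented class and channel names; it is not the Stennis DAS code.

```python
# Generic hardware-abstraction ("translation layer") sketch: the application
# talks to an abstract interface, and each facility supplies its own backend.
# Class and channel names are invented; this is not the Stennis DAS code.
from abc import ABC, abstractmethod

class DaqBackend(ABC):
    @abstractmethod
    def read_channel(self, name: str) -> float: ...

class SimulatedBackend(DaqBackend):
    """Stand-in backend used when no hardware is attached."""
    def read_channel(self, name: str) -> float:
        return {"chamber_pressure": 101.3, "lox_temp": 90.2}.get(name, 0.0)

class Application:
    """Facility-independent application layer: it only sees DaqBackend."""
    def __init__(self, backend: DaqBackend):
        self.backend = backend
    def log_scan(self, channels):
        return {ch: self.backend.read_channel(ch) for ch in channels}

if __name__ == "__main__":
    app = Application(SimulatedBackend())
    print(app.log_scan(["chamber_pressure", "lox_temp"]))
```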

  1. AN IMPROVED COCOMO SOFTWARE COST ESTIMATION MODEL

    African Journals Online (AJOL)

    DJFLEX

    developmental effort favourable to both software developers and customers, a standard effort multiplication factor(er) is introduced, to ... for recent changes in software engineering technology. The COCOMO .... application composition utilities.
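
    The record's abstract is truncated, but the underlying model is the well-known COCOMO relation, effort = a * KLOC^b scaled by a product of effort multipliers. The sketch below uses the published intermediate-COCOMO coefficients purely for illustration; the improved multiplication factor proposed in the paper is not reproduced here, and the example cost-driver values are assumptions.

```python
# Intermediate COCOMO effort estimate (Boehm, 1981):
#   effort (person-months) = a * KLOC**b * EAF
# where EAF is the product of the effort multipliers (cost drivers).
COEFFICIENTS = {          # (a, b) per project mode
    "organic": (3.2, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded": (2.8, 1.20),
}

def cocomo_effort(kloc: float, mode: str = "organic", multipliers: list[float] | None = None) -> float:
    a, b = COEFFICIENTS[mode]
    eaf = 1.0
    for m in (multipliers or []):
        eaf *= m
    return a * kloc ** b * eaf

if __name__ == "__main__":
    # 32 KLOC organic project with two cost drivers (e.g. high reliability 1.15,
    # limited tool support 1.10) -- values chosen only for illustration.
    print(f"{cocomo_effort(32, 'organic', [1.15, 1.10]):.1f} person-months")
```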

  2. SIMPLIFIED CHARGED PARTICLE BEAM TRANSPORT MODELING USING COMMONLY AVAILABLE COMMERCIAL SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    D. Douglas; K. Beard; J. Eldred; P. Evtushenko; A. Jenkins; W. Moore; L. Osborne; D. Sexton; C. Tennant

    2007-06-18

    Particle beam modeling in accelerators has been the focus of considerable effort since the 1950s. Many generations of tools have resulted from this process, each leveraging both prior experience and increases in computer power. However, continuing innovation in accelerator technology results in systems that are not well described by existing tools, so the software development process is on-going. We discuss a novel response to this situation, which was encountered when Jefferson Lab began operation of its energy-recovering linacs. These machines were not readily described with legacy software; therefore a model was built using Microsoft Excel. This interactive simulation can query data from the accelerator, use it to compute machine parameters, analyze difference orbit data, and evaluate beam properties. It can also derive new accelerator tunings and rapidly evaluate the impact of changes in machine configuration. As it is spreadsheet-based, it can be easily user-modified in response to changing requirements. Examples for the JLab IR Upgrade FEL are presented.
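
    The internals of the JLab spreadsheet are not given in the abstract, but the kind of linear-optics bookkeeping such a model performs can be sketched with ordinary transfer matrices. The beamline layout and element values below are illustrative assumptions, not JLab parameters.

```python
# Minimal linear beam-transport sketch in the spirit of the spreadsheet model
# described above. A particle's transverse state (x, x') is propagated through
# 2x2 transfer matrices for drifts and thin-lens quadrupoles.
import numpy as np

def drift(length_m: float) -> np.ndarray:
    return np.array([[1.0, length_m],
                     [0.0, 1.0]])

def thin_quad(focal_length_m: float) -> np.ndarray:
    # Focusing for f > 0, defocusing for f < 0 (thin-lens approximation).
    return np.array([[1.0, 0.0],
                     [-1.0 / focal_length_m, 1.0]])

# Example beamline: drift - focusing quad - drift (all values illustrative).
beamline = [drift(1.5), thin_quad(0.8), drift(1.5)]

# Total transfer matrix is the product of the elements in reverse order.
m_total = np.linalg.multi_dot(beamline[::-1])

x0 = np.array([1.0e-3, 0.5e-3])        # initial offset 1 mm, angle 0.5 mrad
print("total matrix:\n", m_total)
print("final (x, x'):", m_total @ x0)
```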

  3. SIMPLIFIED CHARGED PARTICLE BEAM TRANSPORT MODELING USING COMMONLY AVAILABLE COMMERCIAL SOFTWARE

    International Nuclear Information System (INIS)

    D. Douglas; K. Beard; J. Eldred; P. Evtushenko; A. Jenkins; W. Moore; L. Osborne; D. Sexton; C. Tennant

    2007-01-01

    Particle beam modeling in accelerators has been the focus of considerable effort since the 1950s. Many generations of tools have resulted from this process, each leveraging both prior experience and increases in computer power. However, continuing innovation in accelerator technology results in systems that are not well described by existing tools, so the software development process is on-going. We discuss a novel response to this situation, which was encountered when Jefferson Lab began operation of its energy-recovering linacs. These machines were not readily described with legacy software; therefore a model was built using Microsoft Excel. This interactive simulation can query data from the accelerator, use it to compute machine parameters, analyze difference orbit data, and evaluate beam properties. It can also derive new accelerator tunings and rapidly evaluate the impact of changes in machine configuration. As it is spreadsheet-based, it can be easily user-modified in response to changing requirements. Examples for the JLab IR Upgrade FEL are presented.

  4. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    Science.gov (United States)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

    The market is becoming increasingly competitive, many products and services depend on software, and software is one of the most important assets influencing organizations' businesses. In this context, companies must handle software, whether developed or acquired, carefully. One way to take full advantage of software in support of the business is to invest in the organization's software processes. This paper presents an approach to evaluating and improving the process assets of software organizations, based on internationally recognized standards and process models. The approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy comprising three layers (organizational layer, process execution layer, and external entity layer). The paper also reports on experience with its use and the results obtained.

  5. Evaluations of UltraiQ software for objective ultrasound image quality assessment using images from a commercial scanner.

    Science.gov (United States)

    Long, Zaiyang; Tradup, Donald J; Stekel, Scott F; Gorny, Krzysztof R; Hangiandreou, Nicholas J

    2018-03-01

    We evaluated a commercially available software package that uses B-mode images to semi-automatically measure quantitative metrics of ultrasound image quality, such as contrast response, depth of penetration (DOP), and spatial resolution (lateral, axial, and elevational). Since measurement of elevational resolution is not a part of the software package, we achieved it by acquiring phantom images with transducers tilted at 45 degrees relative to the phantom. Each measurement was assessed in terms of measurement stability, sensitivity, repeatability, and semi-automated measurement success rate. All assessments were performed on a GE Logiq E9 ultrasound system with linear (9L or 11L), curved (C1-5), and sector (S1-5) transducers, using a CIRS model 040GSE phantom. In stability tests, the measurements of contrast, DOP, and spatial resolution remained within a ±10% variation threshold in 90%, 100%, and 69% of cases, respectively. In sensitivity tests, contrast, DOP, and spatial resolution measurements followed the expected behavior in 100%, 100%, and 72% of cases, respectively. In repeatability testing, intra- and inter-individual coefficients of variations were equal to or less than 3.2%, 1.3%, and 4.4% for contrast, DOP, and spatial resolution (lateral and axial), respectively. The coefficients of variation corresponding to the elevational resolution test were all within 9.5%. Overall, in our assessment, the evaluated package performed well for objective and quantitative assessment of the above-mentioned image qualities under well-controlled acquisition conditions. We are finding it to be useful for various clinical ultrasound applications including performance comparison between scanners from different vendors. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
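
    The repeatability figures quoted above are coefficients of variation. As a reminder of how such a figure is obtained from repeated measurements, here is a minimal sketch; the readings are invented for illustration and are not data from the study.

```python
# Coefficient of variation (CV = standard deviation / mean), as used above to
# express repeatability of the image-quality measurements.
import statistics

def coefficient_of_variation(measurements: list[float]) -> float:
    mean = statistics.mean(measurements)
    return statistics.stdev(measurements) / mean

# Five repeated depth-of-penetration readings (cm) by one observer (illustrative).
dop_readings = [6.10, 6.15, 6.05, 6.12, 6.08]
print(f"intra-observer CV: {coefficient_of_variation(dop_readings):.2%}")
```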

  6. Report of Investigation Committee on Programs for Research and Development of Strategic Software for Advanced Computing; Kodo computing yo senryakuteki software no kenkyu kaihatsu program kento iinkai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-12-26

    The committee met on December 26, 2000, with 32 people in attendance. Discussion covered the results of surveys conducted for the development of strategic software for advanced computing and candidate projects for strategic software development. Eight subjects were taken up at the meeting: the interim report on the survey results, a semiconductor TCAD (technology computer-aided design) system, a nanodevice surface analysis system, a network distribution parallel processing platform (tentative name), a fatigue simulation system, a chemical reaction simulator, a protein structure analysis system, and a next-generation fluid analysis system. In this report the author arranges the discussion results, in his own way, into four categories: (1) a strategic software development system, (2) popularization method and maintenance system, (3) handling of the results, and (4) evaluation of the program for research and development. With regard to category (1), it is stated that software matures with the passage of time, that the software in question is a commercial program, and that in the development of a commercial software program the process from basic study up to the preparation of a prototype should be completely separated from the process of its completion. (NEDO)

  7. Using Software Architectures for Designing Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    In this paper, we outline an on-going project of designing distributed embedded systems for closed-loop process control. The project is a joint effort between software architecture researchers and developers from two companies that produce commercial embedded process control systems. The project...... has a strong emphasis on software architectural issues and terminology in order to envision, design and analyze design alternatives. We present two results. First, we outline how focusing on software architecture, architectural issues and qualities are beneficial in designing distributed, embedded......, systems. Second, we present two different architectures for closed-loop process control and discuss benefits and reliabilities....

  8. Automation of the software production process for multiple cryogenic control applications

    OpenAIRE

    Fluder, Czeslaw; Lefebvre, Victor; Pezzetti, Marco; Plutecki, Przemyslaw; Tovar-González, Antonio; Wolak, Tomasz

    2018-01-01

    The development of process control systems for the cryogenic infrastructure at CERN is based on an automatic software generation approach. The overall complexity of the systems, their frequent evolution as well as the extensive use of databases, repositories, commercial engineering software and CERN frameworks have led to further efforts towards improving the existing automation based software production methodology. A large number of control system upgrades were successfully performed for th...

  9. Development, analysis, and evaluation of a commercial software framework for the study of Extremely Low Probability of Rupture (xLPR) events at nuclear power plants.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinich, Donald A.; Helton, Jon Craig; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-12-01

    Sandia National Laboratories (SNL) participated in a Pilot Study to examine the process and requirements to create a software system to assess the extremely low probability of pipe rupture (xLPR) in nuclear power plants. This project was tasked to develop a prototype xLPR model leveraging existing fracture mechanics models and codes coupled with a commercial software framework to determine the framework, model, and architecture requirements appropriate for building a modular-based code. The xLPR pilot study was conducted to demonstrate the feasibility of the proposed developmental process and framework for a probabilistic code to address degradation mechanisms in piping system safety assessments. The pilot study includes a demonstration problem to assess the probability of rupture of DM pressurizer surge nozzle welds degraded by primary water stress-corrosion cracking (PWSCC). The pilot study was designed to define and develop the framework and model; then construct a prototype software system based on the proposed model. The second phase of the project will be a longer term program and code development effort focusing on the generic, primary piping integrity issues (xLPR code). The results and recommendations presented in this report will be used to help the U.S. Nuclear Regulatory Commission (NRC) define the requirements for the longer term program.
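
    The xLPR code itself is not described in enough detail here to reproduce, but the general shape of a probabilistic rupture assessment can be shown with a generic Monte Carlo sketch. The growth model, distributions, and thresholds below are invented for illustration and are not the PWSCC models used in xLPR.

```python
# Generic Monte Carlo sketch of a probabilistic integrity assessment: sample
# uncertain inputs and count the fraction of realizations in which a crack
# exceeds a critical size. All parameters are illustrative assumptions.
import random

def sampled_crack_depth(years: float) -> float:
    # Hypothetical growth model: lognormal initiation depth plus linear growth
    # with an uncertain rate.
    initial = random.lognormvariate(mu=-3.0, sigma=0.5)      # mm
    growth_rate = random.uniform(0.05, 0.25)                  # mm/year
    return initial + growth_rate * years

def probability_of_exceedance(critical_depth_mm: float, years: float, trials: int = 100_000) -> float:
    failures = sum(sampled_crack_depth(years) > critical_depth_mm for _ in range(trials))
    return failures / trials

if __name__ == "__main__":
    random.seed(42)
    print(probability_of_exceedance(critical_depth_mm=15.0, years=40))
```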

  10. SEER Data & Software

    Science.gov (United States)

    Options for accessing datasets for incidence, mortality, county populations, standard populations, expected survival, and SEER-linked and specialized data. Plus variable definitions, documentation for reporting and using datasets, statistical software (SEER*Stat), and observational research resources.

  11. Software Safety Risk in Legacy Safety-Critical Computer Systems

    Science.gov (United States)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective while others cover just the software in the system; NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: (a) a documented demonstration that a system complies with the specified safety requirements; (b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; (c) problems occur when trying to meet safety standards, and thus to make retrospective safety cases, in legacy safety-critical computer systems.

  12. Evaluation of low-cost commercial-off-the-shelf autopilot systems for SUAS operations

    Science.gov (United States)

    Brown, Calvin Thomas

    With the increase in unmanned aircraft system (UAS) operations, there is a need for a structured process to evaluate different commercially available systems, particularly autopilots. The Remotely Operated Aircraft Management, Interpretation, and Navigation from Ground or ROAMING scale was developed to meet this need. This scale is a modification of the widely accepted Handling Qualities Rating scale developed by George Cooper and Robert Harper Jr. The Cooper-Harper scale allows pilots to rate a vehicle's performance in completing some task. Similarly, the ROAMING scale allows UAS operators to evaluate the management and observability of UAS in completing some task. The standardized evaluative process consists of cost, size, weight, and power (SWAP) analysis, ease of implementation through procedural description of setup, ROAMING scale rating, a slightly modified NASA TLX rating, and comparison of manual operation to autonomous operation of the task. This standard for evaluation of autopilots and their software will lead to better understanding of the workload placed on UAS operators and indicate where improvements to design and operational procedures can be made. An assortment of low-cost commercial-off-the-shelf (COTS) autopilots was selected for use in the development of the evaluation, and the results of these tests demonstrate the commonalities and differences in these systems.

  13. Commercial Skills Test Information Management System (CSTIMS) final report and self-sustainability plan.

    Science.gov (United States)

    2014-03-01

    The Commercial Skills Test Information Management System (CSTIMS) was developed as a Web-based, software-as-a-service system to prevent and deter fraud perpetrated by third-party commercial drivers license (CDL) examiners in the portion of the CDL...

  14. Cracking Advanced Encryption Standard-A Review

    Directory of Open Access Journals (Sweden)

    Jashnil Kumar

    2017-07-01

    Full Text Available Password protection is a major security concern the world is facing today. While many publications discuss ways to protect passwords and data, how widely users around the world adhere to these rules is unknown. The novelty of this study is that it is the first review of software tools that can be used to crack the Advanced Encryption Standard. First, the study reviews the top 10 software tools available for cracking the Advanced Encryption Standard. An analysis of two of these tools was then performed to see how long each took to crack a password. The results give Advanced Encryption Standard researchers, network security researchers, and the general public helpful information on how to strengthen Advanced Encryption Standard implementations and choose passwords that are hard for the tools discussed above to crack.
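
    One practical defence the review points toward, strengthening passwords against such cracking tools, is key stretching: deriving the encryption key from the password with a deliberately slow key-derivation function and a random salt. The sketch below uses PBKDF2 from the Python standard library; the iteration count and example password are illustrative choices, not recommendations from the paper.

```python
# Stretch a password with PBKDF2 (standard library) and a random salt before
# using it as an AES key, so brute-force attempts become far more expensive.
import hashlib
import os

def derive_key(password: str, salt: bytes | None = None, iterations: int = 600_000) -> tuple[bytes, bytes]:
    salt = salt or os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations, dklen=32)
    return key, salt

key, salt = derive_key("correct horse battery staple")
print(len(key), "byte key; salt =", salt.hex())
```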

  15. 7 CFR 51.2832 - U.S. Commercial.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false U.S. Commercial. 51.2832 Section 51.2832 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing.... Commercial. U.S. Commercial consists of onions which meet the following requirements: (a) Basic requirements...

  16. Taking advantage of ground data systems attributes to achieve quality results in testing software

    Science.gov (United States)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  17. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  18. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

    Since its earliest years of R and D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide range of application areas of the software product, the huge amount of code and Category complexity, and the size and distributed nature of the Collaboration itself are all ingredients that involve and correlate a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes Category Domains which are still under active development and therefore require different treatment in terms of improvement of the development cycle, system testing, and user support. This paper describes some of the software processes as they are applied in GEANT4 for development, testing, and maintenance of the software.

  19. Commercial lumber, round timbers, and ties

    Science.gov (United States)

    David E. Kretschmann

    2010-01-01

    When sawn, a log yields round timber, ties, or lumber of varying quality. This chapter presents a general discussion of grading, standards, and specifications for these commercial products. In a broad sense, commercial lumber is any lumber that is bought or sold in the normal channels of commerce. Commercial lumber may be found in a variety of forms, species, and types...

  20. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data-processing and control, developed for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, using the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion of the proposed revision of the FASTBUS Standard Routines. (orig.)

  1. New AICPA standards aid accounting for the costs of internal-use software.

    Science.gov (United States)

    Luecke, R W; Meeting, D T; Klingshirn, R G

    1999-05-01

    Statement of Position (SOP) No. 98-1, "Accounting for the Costs of Computer Software Developed or Obtained for Internal Use," issued by the American Institute of Certified Public Accountants in March 1998, provides financial managers with guidelines regarding which costs involved in developing or obtaining internal-use software should be expensed and which should be capitalized. The SOP identifies three stages in the development of internal-use software: the preliminary project stage, the application development stage, and the postimplementation-operation stage. The SOP provides that all costs incurred during the preliminary project stage should be expensed as incurred. During the application development stage, costs associated with developing or obtaining the software should be capitalized, while costs associated with preparing data for use within the new system should be expensed. Costs incurred during the postimplementation-operation stage, typically associated with training and application maintenance, should be expensed.
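
    As a worked illustration of the stage-based treatment summarized above, the short sketch below classifies a few hypothetical cost items. The amounts, item names, and the simple rule for spotting data-preparation costs are assumptions for illustration, not guidance from the SOP.

```python
# Illustrative classification of internal-use software costs by SOP 98-1 stage,
# following the summary above (cost items and amounts are hypothetical).
COSTS = [
    ("feasibility study",        "preliminary project",          40_000),
    ("coding and configuration", "application development",     250_000),
    ("data conversion/cleanup",  "application development",      30_000),  # data preparation: expense
    ("user training",            "postimplementation-operation", 20_000),
]

def treatment(item: str, stage: str) -> str:
    if stage == "application development" and "data" not in item:
        return "capitalize"
    return "expense"

for item, stage, amount in COSTS:
    print(f"{item:<25} {stage:<30} ${amount:>8,}  -> {treatment(item, stage)}")
```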

  2. Verification and software validation for nuclear instrumentation

    International Nuclear Information System (INIS)

    Gaytan G, E.; Salgado G, J. R.; De Andrade O, E.; Ramirez G, A.

    2014-10-01

    This work presents a software verification and validation methodology to be applied to nuclear instruments with associated software. The methodology was developed under the auspices of the IAEA, through the regional projects RLA4022 (ARCAL XCIX) and RLA1011 (RLA CXXIII), led by Mexico. In the first project, three plans and three procedures were elaborated taking IEEE standards into consideration, and in the second project these documents were updated considering ISO and IEC standards. The methodology has been distributed to the Latin American countries participating in the ARCAL projects, and two related courses have been given with the participation of several countries and of Mexican institutions such as the Instituto Nacional de Investigaciones Nucleares (ININ), the Comision Federal de Electricidad (CFE), and the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). At ININ, owing to the need to work under software quality assurance for systems for the CFE nuclear power plant, a software quality assurance plan and five procedures were developed in 2004, qualifying ININ for software development for the CFE nuclear power plant. These first documents were developed taking IEEE standards and NRC regulatory guides as references, and they were the first step in the development of the methodology. (Author)

  3. CGNS Mid-Level Software Library and Users Guide

    Science.gov (United States)

    Poirier, Diane; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: - The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; - The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; - The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and - The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The CGNS Mid-level Library was designed to ease the implementation of CGNS by providing developers with a collection of handy I/O functions. Since knowledge of the ADF core is not required to use this library, it will greatly facilitate the task of interfacing with CGNS. There are currently 48 user callable functions that comprise the Mid-level library and are described in the Users Guide. The library is written in

  4. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of the defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see whether any could be implemented in their software assurance life cycle process.
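
    To make the idea of a product metric concrete, the sketch below computes one widely used example, defect density per thousand lines of code. The module names and numbers are invented; nothing here comes from the NASA work described above.

```python
# Defect density (defects per thousand lines of code), a common product metric.
modules = {
    # name: (defects found, lines of code)
    "telemetry": (12, 8_400),
    "command":   (5,  3_100),
    "display":   (21, 15_700),
}

for name, (defects, loc) in modules.items():
    density = defects / (loc / 1000)
    print(f"{name:<10} {density:5.2f} defects/KLOC")
```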

  5. Thermal comfort in commercial kitchens (RP-1469)

    DEFF Research Database (Denmark)

    Simone, Angela; Olesen, Bjarne W.; Stoops, John L.

    2013-01-01

    The indoor climate in commercial kitchens is often unsatisfactory, and working conditions can have a significant effect on employees’ comfort and productivity. The type of establishment (fast food, casual, etc.) and climatic zone can influence thermal conditions in the kitchens. Moreover, the size...... and arrangement of the kitchen zones, appliances, etc., further complicate an evaluation of the indoor thermal environment in commercial kitchens. In general, comfort criteria are stipulated in international standards (e.g., ASHRAE 55 or ISO EN 7730), but are these standardized methods applicable...... dissatisfied (PMV/PPD) index is not directly appropriate for all thermal conditions in commercial kitchens....

  6. Commercial Skills Test Information Management System final report and self-sustainability plan : [technology brief].

    Science.gov (United States)

    2014-04-01

    The Commercial Skills Test Information Management System (CSTIMS) was developed to address the fraudulent issuance of commercial drivers licenses (CDLs) across the United States. CSTIMS was developed as a Web-based, software-as-a-service system to...

  7. Software product quality measurement

    OpenAIRE

    Godliauskas, Eimantas

    2016-01-01

    This paper analyses Ruby product quality measures, suggesting three new measures for Ruby product quality measurement tool Rubocop to measure Ruby product quality characteristics defined in ISO 2502n standard series. This paper consists of four main chapters. The first chapter gives a brief view of software product quality and software product quality measurement. The second chapter analyses object oriented quality measures. The third chapter gives a brief view of the most popular Ruby qualit...

  8. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  9. Free and open source software for the manipulation of digital images.

    Science.gov (United States)

    Solomon, Robert W

    2009-06-01

    Free and open source software is a type of software that is nearly as powerful as commercial software but is freely downloadable. This software can do almost everything that the expensive programs can. GIMP (GNU Image Manipulation Program) is a free program comparable to Photoshop, and versions are available for Windows, Macintosh, and Linux platforms. This article briefly describes how GIMP can be installed and used to manipulate radiology images. It is no longer necessary to budget large amounts of money for high-quality software to achieve the goals of image processing and document creation because free and open source software is available for the user to download at will.

  10. Software defined radio (SDR) architecture for concurrent multi-satellite communications

    Science.gov (United States)

    Maheshwarappa, Mamatha R.

    SDRs have emerged as a viable approach for space communications over the last decade by delivering low-cost hardware and flexible software solutions. The flexibility introduced by the SDR concept not only allows the realisation of concurrent multiple standards on one platform, but also promises to ease the implementation of one communication standard on differing SDR platforms by signal porting. This technology would facilitate implementing reconfigurable nodes for parallel satellite reception in Mobile/Deployable Ground Segments and Distributed Satellite Systems (DSS) for amateur radio/university satellite operations. This work outlines the recent advances in embedded technologies that can enable new communication architectures for concurrent multi-satellite or satellite-to-ground missions where multi-link challenges are associated. This research proposes a novel concept to run advanced parallelised SDR back-end technologies in a Commercial-Off-The-Shelf (COTS) embedded system that can support multi-signal processing for multi-satellite scenarios simultaneously. The initial SDR implementation could support only one receiver chain due to system saturation. However, the design was optimised to facilitate multiple signals within the limited resources available on an embedded system at any given time. This was achieved by providing a VHDL solution to the existing Python and C/C++ programming languages along with parallelisation so as to accelerate performance whilst maintaining the flexibility. The improvement in the performance was validated at every stage through profiling. Various cases of concurrent multiple signals with different standards such as frequency (with Doppler effect) and symbol rates were simulated in order to validate the novel architecture proposed in this research. Also, the architecture allows the system to be reconfigurable by providing the opportunity to change the communication standards in soft real-time. The chosen COTS solution provides a

  11. A summary of available Rietveld and related software resources

    International Nuclear Information System (INIS)

    Cranswick, L.M.D

    1999-01-01

    Full text: There is a wide variety of Rietveld software available from both commercial and academic institutions. These resources will be reviewed along with the importance of validating Rietveld quantitative analysis results and gaining skills in complementary areas. This part of the workshop will mention a number of relevant programs and their main functionality: Rietveld software, Rietveld utilities, Rietveld friendly software and other powder diffraction utilities. Some of the main benefits of appreciating the availability of a wide range of software include: cross validation of results, data quality and diffractometer alignment, exploring new areas of research and analysis, solving new problems, nuances that occur as part of Rietveld analysis. Copyright (1999) Australian X-ray Analytical Association Inc

  12. Software Comparison for Renewable Energy Deployment in a Distribution Network

    Energy Technology Data Exchange (ETDEWEB)

    Gao, David Wenzhong [Alternative Power Innovations, LLC, Sharonville, OH (United States); Muljadi, Eduard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tian, Tian [National Renewable Energy Lab. (NREL), Golden, CO (United States); Miller, Mackay [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-02-22

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to assess their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, can simulate networks with fluctuating data values; they allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows time-series simulation to study variations in network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are command-line programs, which increases the time needed to become familiar with them.

  13. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the t

  14. Implementation of Simulator Functions with Stimulated Commercial MMI for Full Scope Simulators

    International Nuclear Information System (INIS)

    Shin, Yeong Cheol; Kang, Sung Kon; Park, Jun Mo; Kim, Jang Hwan

    2014-01-01

    In order to train and qualify the operators and to validate control room ensembles, including MMIs and operating procedures, the utility must acquire a full scope simulator faithful enough to meet the requirements of ANSI/ANS 3.5. For the Shin-Kori 3,4 nuclear power plant, a so-called stimulation approach has been adopted for developing the control room MMIs and control logic of the full scope simulator. In the stimulation approach, the actual plant (i.e., SKN 3,4) software and configuration data are used to implement the simulator. Modeling the MMI with the emulation method is very difficult and often infeasible for highly complex MMI software, not only because the development cost is prohibitively high, but also because faithfully modeling the look and feel of the reference MMI software, particularly the timing requirements associated with the interactions between operators and the system, is extremely difficult. However, there are challenges in the stimulation approach as well. It is difficult, or sometimes impossible, to add functions needed for simulation purposes, such as simulator control (i.e., Freeze/Run) and malfunctions, by modifying the actual plant MMI software containing Commercial Black-box Software (CBSW). These days DCS MMI software is highly likely to contain commercial software that is a black box to the simulator developer, because the supplier of the plant MMI software does not open the source code and its associated technology, in order to protect its business interests.

  15. Software engineering of a navigation and guidance system for commercial aircraft

    Science.gov (United States)

    Lachmann, S. G.; Mckinstry, R. G.

    1975-01-01

    The avionics experimental configuration of the considered system is briefly reviewed, taking into account the concept of an advanced air traffic management system, flight critical and noncritical functions, and display system characteristics. Cockpit displays and the navigation computer are examined. Attention is given to the functions performed in the navigation computer, major programs in the navigation computer, and questions of software development.

  16. Online Learning Software – Why Pay for It?

    OpenAIRE

    Jim FLOOD

    2007-01-01

    Numbers with pound signs in front and four noughts following them are quite usual for the basic price of e-learning software. In spite of the high cost of software and criticism of it, many organizations are still locking themselves into expensive contracts when there are freely available alternatives that can deliver most of the attributes of commercially available Learning Management Systems (LMS). Learning Management Systems were developed amid the dot com boom of the 90s and are typical o...

  17. Development of a specific geological mapping software under MAPGIS

    International Nuclear Information System (INIS)

    Zhang Wenkai

    2010-01-01

    The mapping software most often used in geological exploration is the MAPGIS system, and the related standards are based on it. The software offers flexible functions but has several shortcomings: many parameters to select, difficulty of mastering, different parameter choices for each user, and low efficiency. Therefore, a specialized geological mapping application was developed with VC++ on the MAPGIS platform. In accordance with the standards, toolbars are provided for strata, rock, geographic information, materials, and so on. Parameters are selected by pressing toolbar buttons, toolbar menus can be modified to select parameters for each working area, and legends can be sorted automatically, so mapping speed is greatly improved and parameters remain consistent. The software can convert between Gauss (Gauss-Krüger) coordinates and longitude-latitude coordinates and can draw points, frames by longitude-latitude, responsibility tables, plan diagrams, profiles, and so on. It also improves clipping, topology building, and node snapping. Application of the software shows that it greatly speeds up geological mapping and improves the standardization of the final maps. (authors)
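
    The coordinate conversion mentioned above, between Gauss (Gauss-Krüger) plane coordinates and longitude-latitude, can be illustrated outside MAPGIS with the pyproj library. The central meridian, false easting, and ellipsoid in this sketch are assumptions chosen only for illustration; a real map sheet would use its own projection parameters.

```python
# Convert a Gauss (Gauss-Krueger style transverse Mercator) coordinate to
# longitude-latitude with pyproj (unrelated to MAPGIS). Projection parameters
# below are illustrative assumptions, not those of any particular map sheet.
from pyproj import CRS, Transformer

gauss = CRS.from_proj4(
    "+proj=tmerc +lat_0=0 +lon_0=117 +k=1 +x_0=500000 +y_0=0 +ellps=krass +units=m"
)
wgs84 = CRS.from_epsg(4326)

to_latlon = Transformer.from_crs(gauss, wgs84, always_xy=True)
lon, lat = to_latlon.transform(500000.0, 4000000.0)   # easting, northing in metres
print(f"lon={lon:.6f}, lat={lat:.6f}")
```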

  18. Software agents for the dissemination of remote terrestrial sensing data

    Science.gov (United States)

    Toomey, Christopher N.; Simoudis, Evangelos; Johnson, Raymond W.; Mark, William S.

    1994-01-01

    Remote terrestrial sensing (RTS) data is constantly being collected from a variety of space-based and earth-based sensors. The collected data, and especially 'value-added' analyses of the data, are finding growing application for commercial, government, and scientific purposes. The scale of this data collection and analysis is truly enormous; e.g., by 1995, the amount of data available in just one sector, NASA space science, will reach 5 petabytes. Moreover, the amount of data, and the value of analyzing the data, are expected to increase dramatically as new satellites and sensors become available (e.g., NASA's Earth Observing System satellites). Lockheed and other companies are beginning to provide data and analysis commercially. A critical issue for the exploitation of collected data is the dissemination of data and value-added analyses to a diverse and widely distributed customer base. Customers must be able to use their computational environment (eventually the National Information Infrastructure) to obtain timely and complete information, without having to know the details of where the relevant data resides and how it is accessed. Customers must be able to routinely use standard, widely available (and, therefore, low cost) analyses, while also being able to readily create on demand highly customized analyses to make crucial decisions. The diversity of user needs creates a difficult software problem: how can users easily state their needs, while the computational environment assumes the responsibility of finding (or creating) relevant information, and then delivering the results in a form that users understand? A software agent is a self-contained, active software module that contains an explicit representation of its operational knowledge. This explicit representation allows agents to examine their own capabilities in order to modify their goals to meet changing needs and to take advantage of dynamic opportunities. In addition, the explicit representation

  19. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Full Text Available Water quality monitoring is prone to error in its recording or measuring process. Monitoring of river water quality aims not only to recognize water quality dynamics but also to evaluate the data in order to create river management and water pollution policy, so as to maintain human health and sanitation requirements and to preserve biodiversity. Evaluation of water quality monitoring needs to start by identifying the significant water quality parameters. This research aimed to identify the significant parameters by using two transformation/standardization methods on the water quality data: the river Water Quality Index, WQI (Indeks Kualitas Air Sungai, IKAs) transformation/standardization method and transformation/standardization to mean 0 and variance 1, so that the variability of the water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. Principal Component Analysis (PCA; Analisa Komponen Utama, AKU), with the help of Scilab software, was used to process the secondary data on water quality parameters of the Gadjah Wong river in 2004-2013. The Scilab result was cross-examined with the result from the Excel-based Biplot Add In software. The results showed that only 18 of the total 35 water quality parameters had acceptable data quality. The two transformation/standardization methods gave different types and numbers of significant parameters. With the mean 0, variance 1 standardization, the significant water quality parameters with respect to the mean concentration of each parameter were TDS, SO4, EC, TSS, NO3N, COD, BOD5, Grease Oil and NH3N. With the river WQI transformation/standardization, the significant water quality parameters showed the level of
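
    The mean-0, variance-1 pre-treatment followed by PCA that the paper compares can be sketched in a few lines. The study used Scilab and an Excel Biplot add-in; the sketch below uses numpy and random data purely to show the shape of the calculation, so the numbers mean nothing.

```python
# Mean-0 / variance-1 standardization followed by PCA via the eigendecomposition
# of the correlation matrix (data are random stand-ins for water-quality records).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 9))                 # 120 samples x 9 water-quality parameters

# Standardize each parameter to mean 0 and variance 1.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# PCA: eigendecomposition of the covariance of Z (i.e. the correlation matrix).
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("variance explained by first 3 components:", np.round(explained[:3], 3))

# Parameters with large absolute loadings on the leading components would be
# flagged as the "significant" ones in an analysis like the paper's.
print("loadings of PC1:", np.round(eigvecs[:, 0], 2))
```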

  20. Producing and supporting sharable software

    International Nuclear Information System (INIS)

    Johnstad, H.; Nicholls, J.

    1987-02-01

    A survey is reported that addressed the question of shareable software for the High Energy Physics community. Statistics are compiled for the responses of 54 people attending a conference on the subject of shareable software to a questionnaire which addressed the usefulness of shareable software, preference of programming language, and source management tools. The results are found to reflect a continued need for shareable software in the High Energy Physics community and that this effort be performed in coordination. A strong mandate is also claimed for large facilities to support the community with software and that these facilities should act as distribution points. Considerable interest is expressed in languages other than FORTRAN, and the desire for standards or rules in programming is expressed. A need is identified for source management tools

  1. Regulated software meets DevOps

    DEFF Research Database (Denmark)

    Laukkarinen, Teemu; Kuusinen, Kati; Mikkonen, Tommi

    2018-01-01

    Context: Regulatory authorities require proofs from critical systems manufacturers that the software in their products is developed in accordance to prescribed development practices before accepting the product to the markets. This is challenging when using DevOps, where continuous integration...... and deployment are the default practices, which are not a good match with the regulatory software development standards. Objective: We aim to bring DevOps and regulated software development closer to each other. First, we want to make it easier for developers to develop regulated software with tools...... and practices they are familiar with. Second, we want to allow regulatory authorities to build confidence on solutions provided by manufacturers by defining a mapping between DevOps and regulatory software development. Method: We performed a literature survey and created research suggestions using exploratory...

  2. Energy Code Compliance in a Detailed Commercial Building Sample: The Effects of Missing Data

    Energy Technology Data Exchange (ETDEWEB)

    Biyani, Rahul K.; Richman, Eric E.

    2003-09-30

    Most commercial buildings in the U.S. are required by state or local jurisdictions to meet energy standards. The enforcement of these standards is not well known, and building practice without them on a national scale is also little understood. To provide an understanding of these issues, a database has been developed at PNNL that includes detailed energy-related building characteristics of 162 commercial buildings from across the country. For this analysis, the COMcheck compliance software (developed at PNNL) was used to assess compliance with energy codes among these buildings. Data from the database for each building provided the program input, with percentage compliance with the ASHRAE/IESNA Standard 90.1-1999 energy code as the output. During the data input process it was discovered that some data essential for showing compliance of the building envelope was missing, and defaults had to be developed to provide complete compliance information. This need for defaults for some inputs raised the question of how missing data could affect documented compliance. To help answer this question, a data collection effort was completed to assess potential differences. Using the program Dodge View, as much of the missing envelope data as possible was collected from the building plans, and the database input was again run through COMcheck. The outputs of both compliance runs were compared to see whether the missing data would have adversely affected the results. Both runs provided a percentage compliance for each building in the envelope and lighting categories, showing by how large a percentage each building either met or fell short of the ASHRAE/IESNA Standard 90.1-1999 energy code. The results showed that 57.7% of the buildings met or exceeded envelope requirements with defaults and that 68% met or exceeded envelope requirements with the actual data. Also, 53.6% of the buildings met or surpassed the lighting requirements

  3. Traceability of Software Safety Requirements in Legacy Safety Critical Systems

    Science.gov (United States)

    Hill, Janice L.

    2007-01-01

    How can traceability of software safety requirements be created for legacy safety critical systems? Requirements in safety standards are imposed most times during contract negotiations. On the other hand, there are instances where safety standards are levied on legacy safety critical systems, some of which may be considered for reuse for new applications. Safety standards often specify that software development documentation include process-oriented and technical safety requirements, and also require that system and software safety analyses are performed supporting technical safety requirements implementation. So what can be done if the requisite documents for establishing and maintaining safety requirements traceability are not available?

  4. Quality Assurance Results for a Commercial Radiosurgery System: A Communication.

    Science.gov (United States)

    Ruschin, Mark; Lightstone, Alexander; Beachey, David; Wronski, Matt; Babic, Steven; Yeboah, Collins; Lee, Young; Soliman, Hany; Sahgal, Arjun

    2015-10-01

    The purpose of this communication is to inform the radiosurgery community of quality assurance (QA) results requiring attention in a commercial FDA-approved linac-based cone stereotactic radiosurgery (SRS) system. Standard published QA guidelines as per the American Association of Physicists in Medicine (AAPM) were followed during the SRS system's commissioning process, including end-to-end testing, cone concentricity testing, image transfer verification, and documentation. Several software and hardware deficiencies that were deemed risky were uncovered during the process, and QA processes were put in place to mitigate these risks during clinical practice. In particular, the present work focuses on daily cone concentricity testing and commissioning-related findings associated with the software. Cone concentricity/alignment is measured daily using both optical light field inspection and quantitative radiation field tests with the electronic portal imager. In 10 out of 36 clinical treatments, adjustments to the cone position had to be made to align the cone with the collimator axis to less than 0.5 mm, and on two occasions the pre-adjustment measured offset was 1.0 mm. Software-related errors discovered during commissioning included incorrect transfer of the isocentre in DICOM coordinates, improper handling of non-axial image sets, and complex handling of beam data, especially for multi-target treatments. QA processes were established to mitigate the occurrence of the software errors. With proper QA processes, the reported SRS system complies with tolerances set out in established guidelines. Discussions with the vendor are ongoing to address some of the hardware issues related to cone alignment. © The Author(s) 2014.

  5. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process use to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  6. 36 CFR 27.2 - Commercial and industrial activities.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Commercial and industrial... INTERIOR CAPE COD NATIONAL SEASHORE; ZONING STANDARDS § 27.2 Commercial and industrial activities. No commercial or industrial districts may be established within the Cape Cod National Seashore. ...

  7. National Software Reference Library (NSRL)

    Science.gov (United States)

    National Software Reference Library (NSRL) (PC database for purchase)   A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.

  8. Evaluation of the finite element software ABAQUS for biomechanical modelling of biphasic tissues.

    Science.gov (United States)

    Wu, J Z; Herzog, W; Epstein, M

    1998-02-01

    The biphasic cartilage model proposed by Mow et al. (1980) has proven successful in capturing the essential mechanical features of articular cartilage. In order to analyse the joint contact mechanics in real, anatomical joints, the cartilage model needs to be implemented into a suitable finite element code to approximate the irregular surface geometries of such joints. However, no systematic and extensive evaluation of the capacity of commercial software for modelling the contact mechanics with biphasic cartilage layers has been made. This research was aimed at evaluating the commercial finite element software ABAQUS for analysing biphasic soft tissues. The solutions obtained using ABAQUS were compared with those obtained using other finite element models and analytical solutions for three numerical tests: an unconfined indentation test, a test with the contact of a spherical cartilage surface with a rigid plate, and an axisymmetric joint contact test. It was concluded that the biphasic cartilage model can be implemented into the commercial finite element software ABAQUS to analyse practical joint contact problems with biphasic articular cartilage layers.

  9. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    Science.gov (United States)

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  10. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  11. Software Quality Certification: identifying the real obstacles

    Directory of Open Access Journals (Sweden)

    Megan Baker

    1996-05-01

    Full Text Available A case study of software certification reveals the real difficulty of certifying quality beyond superficial assessment - readers are invited to form their own conclusions. AS 3563 Software Quality Management System is the Australian version of ISO 9001, developed specifically for the software industry. For many Australian software houses, gaining certification with AS 3563 is a priority since certification has become a prerequisite to doing business with government departments and major corporations. However, the process of achieving registration with this standard is lengthy and resource intensive, and may have little impact on actual software quality. This case study recounts the experience of the consulting arm of one of Australia's accounting firms in its quest for certification. By using a number of specific management strategies this company was able to successfully implement AS 3563 in less than half the time usually taken to achieve certification - a feat for which its management should be congratulated. However, because the focus of the project was on gaining certification, few internal benefits have been realised despite the successful implementation of the standard.

  12. Windows Calorimeter Control (WinCal) program computer software configuration management plan

    International Nuclear Information System (INIS)

    1997-01-01

    This document describes the system configuration management activities performed in support of the Windows Calorimeter Control (WinCal) system, in accordance with Site procedures based on Institute of Electrical and Electronics Engineers (IEEE) Standard 828-1990, Standard for Software Configuration Management Plans (IEEE 1990), and IEEE Standard 1042-1987, Guide to Software Configuration Management (IEEE 1987)

  13. Thermal Environment evaluation in Commercial kitchens

    DEFF Research Database (Denmark)

    Simone, Angela; Olesen, Bjarne W.

    2012-01-01

    The indoor climate in commercial kitchens is often unsatisfactory and the working conditions can have a significant effect on employees’ comfort and productivity. The type (fast food, casual, etc.) and climatic zone can influence the thermal conditions in the kitchens. Moreover, size and arrangement of the kitchen zones, appliances, etc., complicate further an evaluation of the indoor thermal environment in kitchens. In general, comfort criteria are expressed in international standards such as ASHRAE 55 or ISO EN7730. But are these standardised methods applicable for such environments as commercial kitchens? There is therefore a need to study the indoor environment in commercial kitchens and to establish standardized methods and procedures for setting criteria that have to be met for the design and operation of kitchens. The present paper introduces a data collection protocol based...

  14. Advanced Data Format (ADF) Software Library and Users Guide

    Science.gov (United States)

    Smith, Matthew; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial. Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its 1/0 software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The ADF is a generic database manager with minimal intrinsic capability. It was written for the purpose of storing large numerical datasets in an efficient, platform independent manner. To be effective, it must be used in conjunction with external agreements on how the data will be organized within the ADF database such defined by the SIDS. There are currently 34 user callable functions that comprise the ADF

  15. Building quality into performance and safety assessment software

    International Nuclear Information System (INIS)

    Wojciechowski, L.C.

    2011-01-01

    Quality assurance is integrated throughout the development lifecycle for performance and safety assessment software. The software used in the performance and safety assessment of a Canadian deep geological repository (DGR) follows the CSA quality assurance standard CSA-N286.7 [1], Quality Assurance of Analytical, Scientific and Design Computer Programs for Nuclear Power Plants. Quality assurance activities in this standard include tasks such as verification and inspection; however, much more is involved in producing a quality software computer program. The types of errors found with different verification methods are described. The integrated quality process ensures that defects are found and corrected as early as possible. (author)

  16. Requirement Volatility, Standardization and Knowledge Integration in Software Projects: An Empirical Analysis on Outsourced IS Development Projects

    Directory of Open Access Journals (Sweden)

    Rajesri Govindaraju

    2015-08-01

    Full Text Available Information systems development (ISD) projects are highly complex, with different groups of people having to collaborate and exchange their knowledge. Considering the intensity of knowledge exchange that takes place in outsourced ISD projects, in this study a conceptual model was developed, aiming to examine the influence of four antecedents, i.e. standardization, requirement volatility, internal integration, and external integration, on two dependent variables, i.e. process performance and product performance. Data were collected from 46 software companies in four big cities in Indonesia. The collected data were examined to verify the proposed theoretical model using the partial least square structural equation modeling (PLS-SEM) technique. The results show that process performance is significantly influenced by internal integration and standardization, while product performance is significantly influenced by external integration and requirement volatility. This study contributes to a better understanding of how knowledge integration can be managed in outsourced ISD projects in view of increasing their success.

  17. Professional Ethics of Software Engineers: An Ethical Framework.

    Science.gov (United States)

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD), as an approach to software development. EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions, which the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end-product in and of itself. However, it can and should make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.

  18. Reflections on Courses for Software Language Engineering

    NARCIS (Netherlands)

    Bagge, A.H.; Lämmel, R.; Zaytsev, V.; Demuth, B.; Stikkolorum, D.

    2014-01-01

    Software Language Engineering (SLE) has emerged as a field in computer science research and software engineering, but it has yet to become entrenched as part of the standard curriculum at universities. Many places have a compiler construction (CC) course and a programming languages (PL) course, but

  19. Software architecture considerations for ion source control systems

    International Nuclear Information System (INIS)

    Sinclair, J.W.

    1997-09-01

    General characteristics of distributed control system software tools are examined from the perspective of ion source control system requirements. Emphasis is placed on strategies for building extensible, distributed systems in which the ion source element is one component of a larger system. Vsystem, a commercial software tool kit from Vista Control Systems was utilized extensively in the control system upgrade of the Holifield Radioactive Ion Beam Facility. Part of the control system is described and the characteristics of Vsystem are examined and compared with those of EPICS, the Experimental Physics and Industrial Control System

  20. The influence of software filtering in digital mammography image quality

    Science.gov (United States)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving an image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
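
    As an aside on the resolution metric used above, a presampled MTF can be estimated from a measured line spread function (LSF) via the discrete Fourier transform, as in the Python sketch below. The synthetic Gaussian LSF and the 0.1 mm pixel pitch are placeholder assumptions, not data from the study.

        import numpy as np

        def mtf_from_lsf(lsf, pixel_pitch_mm):
            """Estimate the MTF (normalised to 1 at zero frequency) from a line spread function."""
            lsf = np.asarray(lsf, dtype=float)
            lsf = lsf - lsf.min()           # remove baseline offset
            lsf = lsf / lsf.sum()           # normalise area to unity
            mtf = np.abs(np.fft.rfft(lsf))  # modulus of the Fourier transform of the LSF
            freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)  # spatial frequency in cycles/mm
            return freqs, mtf / mtf[0]

        # Example with a synthetic Gaussian LSF sampled at a 0.1 mm pixel pitch
        x = np.arange(-32, 32)
        lsf = np.exp(-0.5 * (x / 3.0) ** 2)
        freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.1)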

  1. The influence of software filtering in digital mammography image quality

    International Nuclear Information System (INIS)

    Michail, C; Spyropoulou, V; Valais, I; Panayiotakis, G; Kalyvas, N; Fountos, G; Kandarakis, I; Dimitropoulos, N

    2009-01-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving an image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  2. Safe Software for Space Applications: Building on the DO-178 Experience

    Science.gov (United States)

    Dorsey, Cheryl A.; Dorsey, Timothy A.

    2013-09-01

    DO-178, Software Considerations in Airborne Systems and Equipment Certification, is the well-known international standard dealing with the assurance of software used in airborne systems [1,2]. Insights into the DO-178 experiences, strengths and weaknesses can benefit the international space community. As DO-178 is an excellent standard for safe software development when used appropriately, this paper provides lessons learned and suggestions for using it effectively.

  3. Standardization Documents

    Science.gov (United States)

    2011-08-01

    Covers federal specifications, commercial item descriptions (CIDs), non-government standards (NGSs), guide specifications, and defense handbooks. A non-government standard is a national or international standardization document developed by a private sector association, organization, or technical society. Defense handbooks maintain lessons learned; examples include guidance for the application of a technology and lists of options.

  4. Preparation of high purity plutonium oxide for radiochemistry instrument calibration standards and working standards

    International Nuclear Information System (INIS)

    Wong, A.S.; Stalnaker, N.D.

    1997-04-01

    Due to the lack of suitable high level National Institute of Standards and Technology (NIST) traceable plutonium solution standards from the NIST or commercial vendors, the CST-8 Radiochemistry team at Los Alamos National Laboratory (LANL) has prepared instrument calibration standards and working standards from a well-characterized plutonium oxide. All the aliquoting steps were performed gravimetrically. When a 241 Am standardized solution obtained from a commercial vendor was compared to these calibration solutions, the results agreed to within 0.04% for the total alpha activity. The aliquots of the plutonium standard solutions and dilutions were sealed in glass ampules for long term storage

  5. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    OpenAIRE

    Dr. A. R. Pon Periyasamy; Mrs. A. Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products. The performance should be excellent, with no defects. Software quality metrics are a subset of software metrics that focus on the quality aspects of the product, process, and project. A software defect prediction model helps in the early detection of defects and contributes to t...

  6. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.
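
    As a hedged illustration of the "statistics as a function of time and pressure" idea mentioned above, the Python sketch below bins acoustic-emission hits by pressure and reports counts and mean amplitude per bin. The column layout of the hit list is an assumption; AEAA itself reads vendor-specific file formats and offers far richer filters and charts.

        import numpy as np

        def ae_stats_by_pressure(hit_pressures, hit_amplitudes, bin_edges):
            """
            hit_pressures  : 1D array, pressure at the time of each AE hit
            hit_amplitudes : 1D array, amplitude of each AE hit (e.g. dB)
            bin_edges      : 1D array of pressure bin edges
            Returns (counts, mean_amplitude) per pressure bin.
            """
            idx = np.digitize(hit_pressures, bin_edges) - 1
            n_bins = len(bin_edges) - 1
            counts = np.zeros(n_bins, dtype=int)
            mean_amp = np.full(n_bins, np.nan)
            for b in range(n_bins):
                mask = idx == b
                counts[b] = mask.sum()
                if counts[b]:
                    mean_amp[b] = hit_amplitudes[mask].mean()
            return counts, mean_amp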

  7. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) read the MLC file and the PDIP from the TPS; (ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolate correction factors from look-up tables; (iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file; and (v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
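
    The correction described in steps (ii)-(iv) can be illustrated with a short Python sketch: blend open-beam and MLC-transmission correction factors by the fraction of beam-on time each pixel is shielded, then multiply the predicted image. The function name, the linear blending rule, and the array inputs below are assumptions made for illustration; they are not the published C# implementation or its look-up tables.

        import numpy as np

        def correct_pdip(pdip, shielded_fraction, cf_open, cf_mlc):
            """
            pdip              : 2D array, TPS-predicted EPID image
            shielded_fraction : 2D array in [0, 1], fraction of beam-on time each
                                pixel is covered by MLC leaves (from the MLC file)
            cf_open, cf_mlc   : 2D arrays of correction factors interpolated from
                                look-up tables for open-beam and MLC-transmitted radiation
            """
            # Weight the two correction factors by how long each pixel was shielded
            # (a simple linear blend, assumed here for illustration).
            cf = (1.0 - shielded_fraction) * cf_open + shielded_fraction * cf_mlc
            return pdip * cf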

  8. The Effective Use of System and Software Architecture Standards for Software Technology Readiness Assessments

    Science.gov (United States)

    2011-05-01

    ... icons, mouse control and network paradigms; successfully directed engineering and quality process development at all levels of the enterprise ... TRL 9: actual system proven through successful mission operations. TRL 8: actual system complete and qualified through test and demonstration ... A software technology example: net-centricity, a typical new mission requirement; Network Centric Warfare (NCW) is a state-of-the-art war

  9. A comparison of software programs to determine curie content

    International Nuclear Information System (INIS)

    Hansen, C.J.; Miller, C.C.

    1995-01-01

    Commercial nuclear power plants have used various methods to determine the curie content of radwaste packages to comply with shipping and disposal regulations. Several computer software packages are available which can determine the curie content of a package based on the geometry of the package and its dose rate, given an assumed source spectrum. This paper will compare three of the more commonly used software packages. A brief review of the selection and use of software programs at Diablo Canyon Power Plant for radwaste and radioactive material shipments will be provided. These software packages are the PAKRAD program by Bechtel (which utilizes EPRI DOSCON data), RAMSHP by WMG, and MICROSHIELD by Grove Engineering. A comparison of the software packages in the calculation of curie content for a box of dry active waste and a cartridge filter will be presented. A summary of program limitations will also be provided
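
    For orientation, the simplest calculation behind such packages is a point-source, single-nuclide estimate that inverts the gamma dose-rate constant; the packages compared in the paper add package geometry, self-absorption and full nuclide spectra on top of this. The Python sketch below, including the approximate Cs-137 gamma constant, is an illustrative assumption and is not taken from any of the three codes.

        def point_source_activity_ci(dose_rate_mrem_h, distance_m, gamma_r_m2_per_h_ci=0.33):
            """
            Crude point-source estimate: A [Ci] = D * d^2 / Gamma
            dose_rate_mrem_h     : measured dose rate [mrem/h] at the given distance
            distance_m           : distance from the source [m]
            gamma_r_m2_per_h_ci  : gamma constant [R*m^2/(h*Ci)], roughly 0.33 for Cs-137;
                                   1 R is treated as ~1000 mrem for this rough estimate
            """
            dose_rate_r_h = dose_rate_mrem_h / 1000.0
            return dose_rate_r_h * distance_m ** 2 / gamma_r_m2_per_h_ci

        # e.g. 50 mrem/h measured at 1 m from a Cs-137 dominated cartridge filter
        activity_ci = point_source_activity_ci(50.0, 1.0)   # ~0.15 Ci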

  10. SpaceWire Driver Software for Special DSPs

    Science.gov (United States)

    Clark, Douglas; Lux, James; Nishimoto, Kouji; Lang, Minh

    2003-01-01

    A computer program provides a high-level C-language interface to electronics circuitry that controls a SpaceWire interface in a system based on a space-qualified version of the ADSP-21020 digital signal processor (DSP). SpaceWire is a spacecraft-oriented standard for packet-switching data-communication networks that comprise nodes connected through bidirectional digital serial links that utilize low-voltage differential signaling (LVDS). The software is tailored to the SMCS-332 application-specific integrated circuit (ASIC) (also available as the TSS901E), which provides three high-speed (150 Mbps) serial point-to-point links compliant with the proposed Institute of Electrical and Electronics Engineers (IEEE) Standard 1355.2 and equivalent European Space Agency (ESA) Standard ECSS-E-50-12. In the specific application of this software, the SpaceWire ASIC was combined with the DSP processor, memory, and control logic in a Multi-Chip Module DSP (MCM-DSP). The software is a collection of low-level driver routines that provide a simple message-passing application programming interface (API) for software running on the DSP. Routines are provided for interrupt-driven access to the two styles of interface provided by the SMCS: (1) the "word at a time" conventional host interface (HOCI); and (2) a higher performance "dual port memory" style interface (COMI).

  11. Open Source Software and the Intellectual Commons.

    Science.gov (United States)

    Dorman, David

    2002-01-01

    Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…

  12. Software for dimensioning of hydraulic ram; Software para dimensionamento de carneiro hidraulico

    Energy Technology Data Exchange (ETDEWEB)

    Borges Neto, Manuel Rangel [Centro Federal de Educacao Tecnologica de Petrolina, CE (Brazil); Borges, Grace Anne Pontes [Faculdade de Tecnologia de Sao Paulo (FATEC), Sao Paulo, SP (Brazil). Dept. de Processamento de Dados; Borges, Everton Pontes [Centro Federal de Educacao Tecnologica do Rio Grande do Norte, Natal, RN (Brazil). Curso Tecnologia em Automacao Industrial

    2004-07-01

    The search for new renewable energy sources sometimes takes us away from existing solutions and applications. The hydraulic ram is a piece of equipment developed in 1796, used for water pumping without electrical energy, and it can serve small rural producers in places where access to the conventional electricity grid is limited. The objective of this work is to introduce software developed to help in the dimensioning of a commercial hydraulic ram, requiring no previous knowledge of hydraulics, which can also be used as a didactic resource in technician and technologist courses in subjects such as hydraulics or irrigation. (author)
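
    A minimal version of the dimensioning calculation such a program performs is the classical ram-pump yield estimate: the delivered flow is the drive flow scaled by the ratio of supply fall to delivery lift and by the pump efficiency. The Python function below is an illustrative sketch under that simplified formula, not the software described in the paper; the 0.6 default efficiency is a typical assumption for commercial rams.

        def ram_delivered_flow(drive_flow_l_min, supply_fall_m, delivery_head_m, efficiency=0.6):
            """
            Simplified hydraulic ram yield estimate:
                q = Q * (h / H) * eta
            drive_flow_l_min : flow feeding the ram through the drive pipe [L/min]
            supply_fall_m    : vertical fall from source to ram [m]
            delivery_head_m  : vertical lift from ram to delivery tank [m]
            efficiency       : overall efficiency, typically 0.5-0.7 for commercial rams
            """
            return drive_flow_l_min * (supply_fall_m / delivery_head_m) * efficiency

        # e.g. 40 L/min of drive water falling 3 m, pumped to a tank 18 m above the ram
        q = ram_delivered_flow(40.0, 3.0, 18.0)   # ~4 L/min delivered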

  13. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve the safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in the software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase the plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  14. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software, to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve the safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in the software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase the plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  15. A Systematic Analysis of Functional Safety Certification Practices in Industrial Robot Software Development

    Directory of Open Access Journals (Sweden)

    Tong Xie

    2017-01-01

    Full Text Available For decades, industrial robotics has delivered on the promise of speed, efficiency and productivity. The last several years have seen a sharp resurgence in orders of industrial robots in China, and the areas addressed within industrial robotics have extended into safety-critical domains. However, safety standards have not yet been implemented widely in academia and engineering applications, particularly in robot software development. This paper presents a systematic analysis of functional safety certification practices in software development for the safety-critical software of industrial robots, to identify the safety certification practices used for the development of industrial robots in China and how these practices comply with safety standard requirements. Our review of Chinese academic papers shows that safety standards are barely used in the software development of industrial robots. The majority of the papers propose various solutions to achieve safety, but only about two thirds of the papers refer to non-standardized approaches that mainly address the system level rather than the software development level. In addition, our research shows that with the development of artificial intelligence, this emerging field is still on the quest for standardized and suitable approaches to developing safety-critical software.

  16. Chemical surveillance of commercial fast breeder reactors

    International Nuclear Information System (INIS)

    Stamm, H.H.; Stade, K.Ch.

    1988-01-01

    After BN-600 (USSR) and SUPERPHENIX (France) were started successfully, the international development of LMFBRs stands at the doorstep of commercial use. For commercial use of LMFBRs, cost reductions for construction and operation are highly desirable and necessary. Several nations developing breeder reactors have joined in a common effort in order to reach this aim by standardization and harmonization. On the basis of more than 20 years of operating experience of experimental reactors (EBR-II, FFTF, RAPSODIE, DFR, BR-5/BR-10, BOR-60, JOYO, KNK-II) and demonstration plants (PHENIX, PFR, BN-350), possibilities for standardization in chemical surveillance of commercial breeder reactors without any loss of availability, reliability and reactor safety will be discussed in the following chapters. Loop-type reactors will be considered as well as pool-type reactors, although all commercial plants under consideration so far (SUPERPHENIX II, BN-800, BN-1600, CFBR, SNR-2, EFR) include pool-type reactors only. Table 1 gives a comparison of the Na inventories of test reactors, prototype plants and commercial LMFBRs

  17. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  18. Testing of real-time-software

    International Nuclear Information System (INIS)

    Friesland, G.; Ovenhausen, H.

    1975-05-01

    The situation in the area of testing real-time software is unsatisfactory. During the first phase of the project PROMOTE (prozessorientiertes Modul- und Gesamttestsystem), an analysis of the current situation took place, the results of which are summarized in the following study of user interviews and of the relevant literature. 22 users (industry, software houses, hardware manufacturers, and institutes) were interviewed. Discussions were held about the reliability of real-time software, with special interest in error avoidance, testing, and debugging. The main aims of the literature analysis were the elaboration of standard terms, the comparison of existing test methods and systems, and the definition of boundaries to related areas. During the further steps of this project, means and techniques will be worked out to systematically test real-time software. (orig.) [de]

  19. Experiment to evaluate software safety

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-01-01

    The process of licensing nuclear power plants for operation consists of mandatory steps featuring detailed examination of the instrumentation and control system, including its software, by the safety authorities. The criticality of this software obliges the manufacturer to develop it in accordance with the IEC 880 standard 'Computer software in nuclear power plant safety systems', issued by the International Electrotechnical Commission. The evaluation approach, a two-stage assessment, is described in detail. In this context, the IPSN (Institute of Protection and Nuclear Safety), the technical support body of the safety authority, uses the MALPAS tool to analyse the quality of the programs. (R.P.). 4 refs.

  20. An Introduction to Flight Software Development: FSW Today, FSW 2010

    Science.gov (United States)

    Gouvela, John

    2004-01-01

    Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and new development projects including Cockpit Avionics Upgrade are applied to projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high quality software. It will propose what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, and modeling and simulation. Specific challenges that have been met include the introduction and integration of a Commercial Off-The-Shelf (COTS) Real Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self-healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object-oriented UML design, iterative development using independent components, as well as rapid prototyping. In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes. Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by

  1. Supervised Semi-Automated Data Analysis Software for Gas Chromatography / Differential Mobility Spectrometry (GC/DMS) Metabolomics Applications.

    Science.gov (United States)

    Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E

    2016-09-01

    Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection for a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatograph / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously-published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.
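
    As a hedged illustration of the kind of post-processing such a tool performs on GC/DMS data, the Python sketch below treats a run as a retention-time by compensation-voltage matrix, subtracts a per-channel median background, and reports local maxima above a threshold. The data layout and thresholding rule are assumptions of this sketch; the published software offers far richer supervised filtering, alignment and charting.

        import numpy as np

        def find_gcdms_peaks(matrix, threshold=5.0):
            """
            matrix : 2D array indexed as [retention_time, compensation_voltage]
            Returns a list of (rt_index, cv_index, intensity) for simple local maxima.
            """
            # Per-compensation-voltage-channel median background subtraction
            corrected = matrix - np.median(matrix, axis=0, keepdims=True)
            peaks = []
            for i in range(1, corrected.shape[0] - 1):
                for j in range(1, corrected.shape[1] - 1):
                    window = corrected[i - 1:i + 2, j - 1:j + 2]
                    if corrected[i, j] == window.max() and corrected[i, j] > threshold:
                        peaks.append((i, j, float(corrected[i, j])))
            return peaks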

  2. Property-Based Software Engineering Measurement

    Science.gov (United States)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It does not intend to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
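
    One property such a framework typically states for size measures is additivity over disjoint modules: the size of a system composed of two disjoint modules equals the sum of their sizes. The tiny Python sketch below checks that property for an element-count measure; representing modules as sets of program elements is an assumption made purely for illustration and is not the paper's formalism.

        def size(module):
            """A candidate size measure: count the module's elements (e.g. statements)."""
            return len(module)

        def is_additive(size_fn, m1, m2):
            """Check disjoint additivity: size(m1 U m2) == size(m1) + size(m2)."""
            assert not (m1 & m2), "modules must be disjoint for this property"
            return size_fn(m1 | m2) == size_fn(m1) + size_fn(m2)

        m1 = {"stmt1", "stmt2", "stmt3"}
        m2 = {"stmt4", "stmt5"}
        print(is_additive(size, m1, m2))   # True for a simple element count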

  3. Nasa-wide Standard Administrative Systems

    Science.gov (United States)

    Schneck, P.

    1984-01-01

    Factors to be considered in developing agency-wide standard administrative systems for NASA include uniformity of hardware and software; centralization vs. decentralization; risk exposure; and models for software development.

  4. Windows Calorimeter Control (WinCal) program computer software design description

    International Nuclear Information System (INIS)

    Pertzborn, N.F.

    1997-01-01

    The Windows Calorimeter Control (WinCal) Program System Design Description contains a discussion of the design details for the WinCal product. Information in this document will assist a developer in maintaining the WinCal system. The content of this document follows the guidance in WHC-CM-3-10, Software Engineering Standards, Standard for Software User Documentation

  5. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  6. Use of Commercially Available Software in an Attribute Measurement System

    International Nuclear Information System (INIS)

    MacArthur, Duncan W.; Bracken, David S.; Carrillo, Louis A.; Elmont, Timothy H.; Frame, Katherine C.; Hirsch, Karen L.

    2005-01-01

    A major issue in international safeguards of nuclear materials is the ability to verify that processes and materials in nuclear facilities are consistent with declaration without revealing sensitive information. An attribute measurement system (AMS) is a non-destructive assay (NDA) system that utilizes an information barrier to protect potentially sensitive information about the measurement item. A key component is the software utilized for operator interface, data collection, analysis, and attribute determination, as well as the operating system under which they are implemented. Historically, custom software has been used almost exclusively in transparency applications, and it is unavoidable that some amount of custom software is needed. The focus of this paper is to explore the extent to which commercially available software may be used and the relative merits.

  7. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 9

    Science.gov (United States)

    2006-09-01

    ... activities to ISO/IEC 15288 system life cycle and ISO/IEC 12207 software life cycle processes; Microsoft Security Development Lifecycle (SDL) [18, 19] ... International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) Standard 15026, System and Software Assurance, which adds security assurance ... Team Software Process (TSP-Secure) [21]. The CMM and ISO/IEC process models are defined at a higher level of abstraction than SDL and CLASP, which

  8. National software infrastructure for lattice gauge theory

    International Nuclear Information System (INIS)

    Brower, Richard C

    2005-01-01

    The current status of the SciDAC software infrastructure project for lattice gauge theory is summarized. This includes the design of a QCD application programmers interface (API) that allows existing and future codes to be run efficiently on Terascale hardware facilities and to be rapidly ported to new dedicated or commercial platforms. The critical components of the API have been implemented and are in use on the US QCDOC hardware at BNL and on both the switched and mesh architecture Pentium 4 clusters at Fermi National Accelerator Laboratory (FNAL) and Thomas Jefferson National Accelerator Facility (JLab). Future software infrastructure requirements and research directions are also discussed

  9. EPICS: A control system software co-development success story

    International Nuclear Information System (INIS)

    Knott, M.; Gurd, D.; Lewis, S.; Thuot, M.

    1993-01-01

    The Experimental Physics and Industrial Control Systems (EPICS) is the result of a software sharing and co-development effort of major importance now underway. The initial two participants, LANL and ANL, have now been joined by three other labs, and an earlier version of the software has been transferred to three commercial firms and is currently undergoing separate development. The reasons for EPICS's success may be useful to enumerate and explain and the desire and prospects for its continued development are certainly worth examining

  10. Software-Programmed Optical Networking with Integrated NFV Service Provisioning

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Wang, Xi; Basu, Shrutarshi

    2017-01-01

    We showcase demonstrations of “program & compile” styled optical networking as well as open platforms & standards based NFV service provisioning using a proof-of-concept implementation of the Software-Programmed Networking Operating System (SPN OS).

  11. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure, hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into Git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repositor...

  12. Natural language processing-based COTS software and related technologies survey.

    Energy Technology Data Exchange (ETDEWEB)

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  13. Software process improvement in CMS-are we different?

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2001-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise in our context means to evaluate and apply new technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards, while ensuring reproducibility and quality of results. The CMS process improvement effort is two-pronged. It aims at continuous improvement of the ways we do Object Oriented software, as well as continuous improvement in the efficiency of the working environment. In particular, the use and creation of de-facto software process standards within CMS has proven to be key to a successful software process improvement program. The authors describe the successful CMS implementation of a software process improvement strategy, which has followed ISO 15504 for the past three years. The authors give the current status of the most important process families formally established in CMS, and provide the guidelines followed both for tool development and methodology establishment

  14. Development of a visualized software for tokamak experiment data processing

    International Nuclear Information System (INIS)

    Cao Jianyong; Ding Xuantong; Luo Cuiwen

    2004-01-01

    With VBA programming in Microsoft Excel, the authors have developed post-processing software for experimental data in tokamaks. The standard-format data from the HL-1M and HL-2A tokamaks can be read, displayed in Excel, and transmitted directly into the MATLAB workspace for displaying pictures in MATLAB with the software. The authors have also developed data post-processing software in the MATLAB environment, which can read standard-format data, display pictures, provide a visual graphical user interface, and provide part of the advanced signal processing capability

  15. Using commercial software products for atmospheric remote sensing

    Science.gov (United States)

    Kristl, Joseph A.; Tibaudo, Cheryl; Tang, Kuilian; Schroeder, John W.

    2002-02-01

    The Ontar Corporation (www.Ontar.com) has developed several products for atmospheric remote sensing to calculate radiative transport, atmospheric transmission, and sensor performance in both the normal atmosphere and the atmosphere disturbed by battlefield conditions of smoke, dust, explosives and turbulence. These products include: PcModWin: Uses the USAF standard MODTRAN model to compute the atmospheric transmission and radiance at medium spectral resolution (2 cm-1) from the ultraviolet/visible into the infrared and microwave regions of the spectrum. It can be used for any geometry and atmospheric conditions such as aerosols, clouds and rain. PcLnWin: Uses the USAF standard FASCOD model to compute atmospheric transmission and emission at high (line-by-line) spectral resolution using the HITRAN 2000 database. It can be used over the same spectrum from the UV/visible into the infrared and microwave regions of the spectrum. HitranPC: Computes the absolute high (line-by-line) spectral resolution transmission spectrum of the atmosphere for different temperatures and pressures. HitranPC is a user-friendly program developed by the University of South Florida (USF) and uses the international standard molecular spectroscopic database, HITRAN. LidarPC: A computer program to calculate the laser radar/lidar equation for hard targets and atmospheric backscatter using manually input atmospheric parameters or HitranPC and BETASPEC - transmission and backscatter calculations of the atmosphere. Also developed by the University of South Florida (USF). PcEosael: A library of programs that mathematically describe aspects of electromagnetic propagation in battlefield environments. Its 25 modules are connected but can be exercised individually. Covers eight general categories of atmospheric effects, including gases, aerosols and laser propagation. Based on codes developed by the Army Research Lab. NVTherm: NVTherm models parallel scan, serial scan, and staring thermal imagers that operate
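
    The laser radar/lidar equation that LidarPC evaluates has, in its single-scatter form, a standard structure: received power falls off as 1/R^2, scales with the backscatter at range R, and is attenuated by the two-way transmission. The Python sketch below evaluates that textbook form for constant extinction and backscatter profiles; the constants and profiles are illustrative assumptions and are not taken from the Ontar or USF codes.

        import numpy as np

        def lidar_return(P0, A, ranges_m, beta, alpha, c=3.0e8, tau=1.0e-8, eta=1.0):
            """
            Single-scatter lidar equation:
                P(R) = P0 * eta * (c*tau/2) * (A / R^2) * beta(R) * exp(-2 * integral_0^R alpha dr)
            P0       : transmitted peak power [W]
            A        : receiver aperture area [m^2]
            ranges_m : array of ranges [m]
            beta     : volume backscatter coefficient profile [1/(m*sr)] (array or scalar)
            alpha    : extinction coefficient profile [1/m] (array or scalar)
            """
            ranges_m = np.asarray(ranges_m, dtype=float)
            alpha = np.broadcast_to(alpha, ranges_m.shape)
            beta = np.broadcast_to(beta, ranges_m.shape)
            # Two-way optical depth via cumulative trapezoidal integration of alpha over range
            tau2 = 2.0 * np.concatenate(
                ([0.0], np.cumsum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(ranges_m))))
            return P0 * eta * (c * tau / 2.0) * A / ranges_m**2 * beta * np.exp(-tau2)

        R = np.linspace(100.0, 5000.0, 50)
        P = lidar_return(P0=1e6, A=0.1, ranges_m=R, beta=1e-6, alpha=1e-4)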

  16. TOGAF usage in outsourcing of software development

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2013-12-01

    Full Text Available TOGAF is an Enterprise Architecture framework that provides a method for developing Enterprise Architecture called the Architecture Development Method (ADM). The purpose of this paper is to examine whether TOGAF ADM can be used for developing software application architecture. Because software application architecture is one of the disciplines in the application development life cycle, it is important to find out how the enterprise architecture development method can support application architecture development. Having an open standard that can be used in application architecture development could help in the outsourcing of software development. If ADM can be used for software application architecture development, then we can consider its usability in the outsourcing of software development.

  17. DAQ: Software Architecture for Data Acquisition in Sounding Rockets

    Science.gov (United States)

    Ahmad, Mohammad; Tran, Thanh; Nichols, Heidi; Bowles-Martinez, Jessica N.

    2011-01-01

    A multithreaded software application was developed by the Jet Propulsion Laboratory (JPL) to collect a set of correlated imagery, Inertial Measurement Unit (IMU) and GPS data for a Wallops Flight Facility (WFF) sounding rocket flight. The data set will be used to advance Terrain Relative Navigation (TRN) technology algorithms being researched at JPL. This paper describes the software architecture and the tests used to meet the timing and data rate requirements for the software used to collect the dataset. Also discussed are the challenges of using commercial off the shelf (COTS) flight hardware and open source software. This includes multiple Camera Link (C-link) based cameras, a Pentium-M based computer, and the Linux Fedora 11 operating system. Additionally, the paper discusses the history of the software architecture's usage in other JPL projects and its applicability for future missions, such as cubesats, UAVs, and research planes/balloons. Also discussed are the human aspects of the project, especially JPL's Phaeton program, and the results of the launch.

  18. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  19. WLS software for the Los Alamos geophysical instrumentation truck

    International Nuclear Information System (INIS)

    Ideker, C.D.; LaDelfe, C.M.

    1985-01-01

    Los Alamos National Laboratory's capabilities for special downhole geophysical well logging have increased steadily over the past few years. Software was developed originally for each individual tool as it became operational. With little or no standardization for tool software modules, software development became redundant, time consuming, and cost ineffective. With long-term use and the rapid evolution of well logging capacity in mind, Los Alamos and EG and G personnel decided to purchase a software system. The system was designed to offer: wide-range use and programming flexibility; standardized subroutines for tool module development; user-friendly operation which would reduce training time; operator error checking and alarm activation; maximum growth capacity for new tools as they are added to the inventory; and the ability to incorporate changes made to the computer operating system and hardware. The end result is a sophisticated and flexible software tool for transferring downhole geophysical measurement data to computer disk files. This paper outlines the need, design, development, and implementation of the WLS software for geophysical data acquisition. A demonstration and working examples are included in the presentation.

  20. On the Use of Safety Certification Practices in Autonomous Field Robot Software Development

    DEFF Research Database (Denmark)

    Mogensen, Johann Thor Ingibergsson; Schultz, Ulrik Pagh; Kuhrmann, Marco

    2015-01-01

    reactions or performance in malfunctioning systems, and influence industry regarding software development and project management. However, academia seemingly did not reach the same degree of utilisation of standards. This paper presents the findings from a systematic mapping study in which we study...... the state-of-the-art in developing safety-critical software for autonomous field robots. The purpose of the study is to identify practices used for the development of autonomous field robots and how these practices relate to available safety standards. Our findings from reviewing 49 papers show...... on the quest for suitable approaches to develop safety-critical software, awaiting appropriate standards for this support....

  1. Software configuration management plan for the Hanford site technical database

    International Nuclear Information System (INIS)

    GRAVES, N.J.

    1999-01-01

    The Hanford Site Technical Database (HSTD) is used as the repository/source for the technical requirements baseline and programmatic data input via the Hanford Site and major Hanford Project Systems Engineering (SE) activities. The Hanford Site SE effort has created an integrated technical baseline for the Hanford Site that supports SE processes at the Site and project levels, which is captured in the HSTD. The HSTD has been implemented in the Ascent Logic Corporation (ALC) Commercial Off-The-Shelf (COTS) package referred to as the Requirements Driven Design (RDD) software. This Software Configuration Management Plan (SCMP) provides a process and means to control and manage software upgrades to the HSTD system.

  2. Software Intensive Systems Cost and Schedule Estimation

    Science.gov (United States)

    2013-06-13

    of labor counted in or across each activity. The activity data in the SRDR is reported following the ISO/IEC 12207 processes for software development. Table 19, "ISO/IEC 12207 Development Activities", maps the activities in the SRDR data to ISO/IEC 12207 development activities such as system requirements analysis and system architectural design. [ISO 12207] ISO/IEC 12207, International Standard on Information Technology - Software Life Cycle Processes, International Organization for Standardization.

  3. An open-source software program for performing Bonferroni and related corrections for multiple comparisons

    Directory of Open Access Journals (Sweden)

    Kyle Lesack

    2011-01-01

    Full Text Available Increased type I error resulting from multiple statistical comparisons remains a common problem in the scientific literature. This may result in the reporting and promulgation of spurious findings. One approach to this problem is to correct groups of P-values for "family-wide significance" using a Bonferroni correction or the less conservative Bonferroni-Holm correction, or to correct for the "false discovery rate" with a Benjamini-Hochberg correction. Although several solutions are available for performing these corrections through commercially available software, there are no widely available, easy-to-use open source programs to perform these calculations. In this paper we present an open source program written in Python 3.2 that performs calculations for the standard Bonferroni, Bonferroni-Holm and Benjamini-Hochberg corrections.
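
    As a rough illustration of the three corrections named in this record (a sketch only, not the published Python 3.2 program; the function names and example p-values are invented), the adjustments can be written in a few lines of Python:

        def bonferroni(pvals):
            # Multiply each p-value by the number of tests, capped at 1.0.
            m = len(pvals)
            return [min(p * m, 1.0) for p in pvals]

        def holm(pvals):
            # Step-down Bonferroni-Holm adjustment.
            m = len(pvals)
            order = sorted(range(m), key=lambda i: pvals[i])
            adjusted, running_max = [0.0] * m, 0.0
            for rank, i in enumerate(order):
                running_max = max(running_max, (m - rank) * pvals[i])
                adjusted[i] = min(running_max, 1.0)
            return adjusted

        def benjamini_hochberg(pvals):
            # Step-up false discovery rate adjustment.
            m = len(pvals)
            order = sorted(range(m), key=lambda i: pvals[i], reverse=True)
            adjusted, running_min = [0.0] * m, 1.0
            for pos, i in enumerate(order):
                rank = m - pos                      # rank when p-values are sorted ascending
                running_min = min(running_min, pvals[i] * m / rank)
                adjusted[i] = running_min
            return adjusted

        print(holm([0.01, 0.04, 0.03, 0.005]))      # [0.03, 0.06, 0.06, 0.02]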

  4. Performance testing of 3D point cloud software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirements involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
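
    The metrics named above (loading time, CPU usage, resident memory akin to the working set) can be captured with a short, generic Python sketch such as the one below; it is not the SITEGI code, and the loader function and file path are placeholders that would have to be supplied.

        import time
        import psutil  # third-party process-metrics library

        def benchmark_load(load_function, path):
            """Time a point-cloud load and sample process CPU and resident memory."""
            proc = psutil.Process()
            proc.cpu_percent(interval=None)            # prime the CPU counter
            start = time.perf_counter()
            cloud = load_function(path)                # placeholder loader, e.g. a LAS/PCD reader
            elapsed = time.perf_counter() - start
            return {
                "load_seconds": elapsed,
                "cpu_percent": proc.cpu_percent(interval=None),
                "resident_mb": proc.memory_info().rss / 2**20,
                "points": len(cloud),
            }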

  5. Performance testing of 3D point cloud software

    Directory of Open Access Journals (Sweden)

    M. Varela-González

    2013-10-01

    Full Text Available LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirements involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.

  6. Software for computers in the safety systems of nuclear power stations

    International Nuclear Information System (INIS)

    1987-08-01

    This standard includes the safety actuation systems, the safety system support features and the protection systems. The standard provides requirements for each stage of software generation, including design, development, qualification and operation as well as the documentation for each stage of the software generation for the purpose of achieving highly reliable software. The principles applied in developing these requirements include: Best available practice; top-down design methods; modularity; verification of each phase; clear documentation; auditable documents and validation testing. (orig./HP)

  7. Deploying a Route Optimization EFB Application for Commercial Airline Operational Trials

    Science.gov (United States)

    Roscoe, David A.; Vivona, Robert A.; Woods, Sharon E.; Karr, David A.; Wing, David J.

    2016-01-01

    The Traffic Aware Planner (TAP), developed for NASA Langley Research Center to support the Traffic Aware Strategic Aircrew Requests (TASAR) project, is a flight-efficiency software application developed for an Electronic Flight Bag (EFB). Tested in two flight trials and planned for operational testing by two commercial airlines, TAP is a real-time trajectory optimization application that leverages connectivity with onboard avionics and broadband Internet sources to compute and recommend route modifications to flight crews to improve fuel and time performance. The application utilizes a wide range of data, including Automatic Dependent Surveillance Broadcast (ADS-B) traffic, Flight Management System (FMS) guidance and intent, on-board sensors, published winds and weather, and Special Use Airspace (SUA) schedules. This paper discusses the challenges of developing and deploying TAP to various EFB platforms, our solutions to some of these challenges, and lessons learned, to assist commercial software developers and hardware manufacturers in their efforts to implement and extend TAP functionality in their environments. EFB applications (such as TAP) typically access avionics data via an ARINC 834 Simple Text Avionics Protocol (STAP) server hosted by an Aircraft Interface Device (AID) or other installed hardware. While the protocol is standardized, the data sources, content, and transmission rates can vary from aircraft to aircraft. Additionally, the method of communicating with the AID may vary depending on EFB hardware and/or the availability of onboard networking services, such as Ethernet, WIFI, Bluetooth, or other mechanisms. EFBs with portable and installed components can be implemented using a variety of operating systems, and cockpits are increasingly incorporating tablet-based technologies, further expanding the number of platforms the application may need to support. Supporting multiple EFB platforms, AIDs, avionics datasets, and user interfaces presents a

  8. Guideline on evaluation and acceptance of commercial grade digital equipment for nuclear safety applications

    International Nuclear Information System (INIS)

    1996-10-01

    Nuclear power plants are increasingly upgrading their instrumentation and control (I ampersand C) systems with commercial digital equipment, which allows them to continue meeting safety and reliability requirements while controlling operating costs. However, the use of commercial software-based devices for safety related applications has raised new issues that impact design, procurement, and licensing activities. This guideline describes a consistent, comprehensive approach for the evaluation and acceptance of commercial digital equipment for nuclear safety systems

  9. Comparison of CT- and radiograph-based post-implant dosimetry for transperineal 125I prostate brachytherapy using single seeds and a commercial treatment-planning software

    International Nuclear Information System (INIS)

    Siebert, F.A.; Kohr, P.; Kovacs, G.

    2006-01-01

    Background and purpose: the objective of this investigation was a direct comparison of the dosimetry of CT-based and radiograph-based postplanning procedures for seed implants. Patients and methods: CT- and radiograph-based postplans were carried out for eight iodine-125 (125I) seed implant patients with a commercial treatment-planning system (TPS). To allow a direct comparison of the dosimetric indices (D90, V100, V400), the radiograph-based seed coordinates were transformed to the coordinate system of the CT postplan. Afterwards, the CT-based seed positions were replaced by the radiograph-based coordinates in the TPS and the dose distribution was recalculated. Results: the computations demonstrated that the radiograph-based dosimetric values for the prostate (Dp90, Vp100, and Vp400) were on average lower than the values of the CT postplan. Normalized to the CT postplan the following mean values were found: Dp90: 90.6% (standard deviation [SD]: 9.0%), Vp100: 86.1% (SD: 14.7%), and Vp400: 79.4% (SD: 14.4%). For three out of the eight patients the Dp90 decreased to 90% of the initial CT postplan values. The reason for this dosimetric difference appears to be an error in the reconstruction software used: the TPS algorithm assigned some sources to wrong coordinates, partly outside the prostate gland. Conclusion: the radiograph-based postplanning technique of the investigated TPS should only be used in combination with CT postplanning. Furthermore, comprehensive testing procedures for reconstruction algorithms are recommended to minimize calculation errors. (orig.)
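
    The key step of the comparison, mapping the radiograph-based seed coordinates into the CT postplan frame before recomputing the dose, amounts to applying a rigid transformation. A minimal Python/NumPy sketch of that step is shown below; it is an illustration only, not the TPS algorithm, and the transform used is a placeholder.

        import numpy as np

        def to_ct_frame(seeds_xyz, transform):
            """Map (N, 3) radiograph-based seed positions with a 4x4 homogeneous transform."""
            homogeneous = np.hstack([seeds_xyz, np.ones((len(seeds_xyz), 1))])
            return (homogeneous @ transform.T)[:, :3]

        # Placeholder transform: identity rotation with a 5 mm shift along z.
        T = np.eye(4)
        T[2, 3] = 5.0
        print(to_ct_frame(np.array([[10.0, 20.0, 30.0]]), T))   # [[10. 20. 35.]]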

  10. Software metrics to improve software quality in HEP

    International Nuclear Information System (INIS)

    Lancon, E.

    1996-01-01

    The ALEPH reconstruction program's maintainability has been evaluated with a CASE tool implementing an ISO standard methodology based on software metrics. It has been found that the overall quality of the program is good and has shown improvement over the past five years. Frequently modified routines exhibit lower quality; most bugs were located in routines with particularly low quality. Implementing quality criteria from the beginning could have avoided time losses due to bug corrections. (author)

  11. International Liability Issues for Software Quality

    National Research Council Canada - National Science Library

    Mead, Nancy

    2003-01-01

    This report focuses on international law related to cybercrime, international information security standards, and software liability issues as they relate to information security for critical infrastructure applications...

  12. Researches and commercialization of food irradiation technology in China

    International Nuclear Information System (INIS)

    Gao Meixu; Ha Yiming; Chen Hao; Liu Chunquan; Chen Xiulan

    2007-01-01

    The status of food irradiation research, standards and commercialization in China is described in this paper. The main research fields now include the degradation of chloramphenicol residues by irradiation, promoting the safety of meat products, frozen seafood and ready-to-eat products by irradiation, lowering the activity of allergenic proteins by irradiation, the identification of irradiated food, and irradiation as a phytosanitary treatment. The existing standards need to be revised, and new standards need to be established. The commercialization stages of food irradiation and the quality assurance system of irradiation companies are also analyzed. (authors)

  13. Validation of quality control tests of a multi leaf collimator using electronic portal image devices and commercial software; Validacion de unas pruebas de control de calidad del colimador multilamina utilizando dispositivos electronicos de imagen portal y una aplicacion comercial

    Energy Technology Data Exchange (ETDEWEB)

    Latorre-Musoll, A.; Jornet Sala, N.; Carrasco de Fez, P.; Edualdo Puell, T.; Ruiz Martinez, A.; Ribas Morales, M.

    2013-07-01

    We describe a daily quality control procedure for the multileaf collimator (MLC) based on electronic portal imaging devices and commercial software. We designed tests that compare portal images of a set of static and dynamic MLC configurations to a set of reference images using commercial portal dosimetry software. Reference images were acquired using the same set of MLC configurations after the calibration of the MLC. To assess the sensitivity for detecting MLC underperformance, we modified the MLC configurations by inserting a range of leaf position and speed errors. Distance measurements on portal images correlated with leaf position errors down to 0.1 mm in static MLC configurations. Dose differences between portal images correlated both with speed errors down to 0.5% of the nominal leaf velocities and with leaf position errors down to 0.1 mm in dynamic MLC configurations. The proposed quality control procedure can assess static and dynamic MLC configurations with high sensitivity and reliability. (Author)
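
    The two quantities the tests rely on, a pixel-wise dose difference between a portal image and its reference and the position of a leaf edge taken from a profile, can be sketched generically in Python/NumPy as below; this illustrates the idea only and is not the commercial portal dosimetry software's algorithm.

        import numpy as np

        def mean_dose_difference(measured, reference):
            """Mean relative dose difference (%) inside the irradiated region."""
            mask = reference > 0.1 * reference.max()
            return 100.0 * np.mean((measured[mask] - reference[mask]) / reference[mask])

        def leaf_edge_mm(profile, pixel_pitch_mm):
            """Sub-pixel position of the 50% crossing of a rising 1-D leaf-edge profile."""
            half = 0.5 * profile.max()
            idx = int(np.argmax(profile > half))     # first pixel above 50% (assumes profile starts low)
            frac = (half - profile[idx - 1]) / (profile[idx] - profile[idx - 1])
            return (idx - 1 + frac) * pixel_pitch_mm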

  14. Nested Cohort - R software package

    Science.gov (United States)

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.

  15. Petroleum software and the road ahead

    International Nuclear Information System (INIS)

    Heggelund, D.G.

    1996-01-01

    Regardless of what software vendors want to do, in the end, it is the user, through his/her choices of software products, who will decide what the software of the petroleum-engineering industry will look like. In this article, the author has looked at several items that will impact the future of petroleum-engineering software. Out of these, two will stand out: (1) the adoption of a single integrated dynamic reservoir model and (2) the move to a client/server architecture. However, the biggest challenge for both vendors and users will be to manage change. This will require users to participate more actively in the development of new technology and to be willing to pay for it, and it will require vendors to adopt standards more readily

  16. 10 CFR 431.402 - Preemption of State regulations for commercial HVAC & WH products.

    Science.gov (United States)

    2010-01-01

    Preemption of State regulations for commercial HVAC & WH products: Beginning on the effective date of such standard, an energy conservation standard set forth in this Part for a commercial HVAC & WH product supersedes any State or local...

  17. Software-Defined Radio for Wireless Local-Area Networks

    NARCIS (Netherlands)

    Schiphorst, Roelof

    2004-01-01

    New wireless communications standards do not replace old ones; instead, the number of standards keeps increasing, and by now an abundance of standards exists. Moreover, there is no reason to assume that this trend will ever stop. Therefore, the software-radio concept is emerging as a potential

  18. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  19. Net-VISA used as a complement to standard software at the CTBTO: initial operational experience with next-generation software.

    Science.gov (United States)

    Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.

    2017-12-01

    The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and following the inception of research and development efforts in 2009, the NET-VISA software, which follows a Bayesian modelling approach, has been developed to improve the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has consistently been shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, for an average of 85% overlap, while the inconsistency rate remains essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's performance in finding additional events which qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.

  20. Improving the Agency's Software Acquisition Capability

    Science.gov (United States)

    Hankinson, Allen

    2003-01-01

    External development of software has often led to unsatisfactory results and great frustration for the assurance community. Contracts frequently omit critical assurance processes or the right to oversee software development activities. At a time when NASA depends more and more on software to implement critical system functions, a combination of three factors exacerbates this problem: 1) the ever-increasing trend to acquire rather than develop software in-house, 2) the trend toward performance-based contracts, and 3) acquisition vehicles that only state software requirements while leaving development standards and assurance methodologies up to the contractor. We propose to identify specific methods and tools that NASA projects can use to mitigate the adverse effects of the three problems. Two broad classes of methods/tools will be explored. The first will be those that provide NASA projects with insight and oversight into contractors' activities. The second will be those that help projects objectively assess, and thus improve, their software acquisition capability. Of particular interest is the Software Engineering Institute's (SEI) Software Acquisition Capability Maturity Model (SA-CMM).

  1. Computer systems and software description for Standard-E+ Hydrogen Monitoring System (SHMS-E+)

    International Nuclear Information System (INIS)

    Tate, D.D.

    1997-01-01

    The primary function of the Standard-E+ Hydrogen Monitoring System (SHMS-E+) is to determine tank vapor space gas composition and gas release rate, and to detect gas release events. Characterization of the gas composition is needed for safety analyses. The lower flammability limit, as well as the peak burn temperature and pressure, are dependent upon the gas composition. If there is little or no knowledge about the gas composition, safety analyses utilize compositions that yield the worst case in a deflagration or detonation. Knowledge of the true composition could lead to reductions in the assumptions and therefore there may be a potential for a reduction in controls and work restrictions. Also, knowledge of the actual composition will be required information for the analysis that is needed to remove tanks from the Watch List. Similarly, the rate of generation and release of gases is required information for performing safety analyses, developing controls, designing equipment, and closing safety issues. This report outlines the computer system design layout description for the Standard-E+ Hydrogen Monitoring System

  2. Software tool for physics chart checks.

    Science.gov (United States)

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. During over 1 year of use the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.

  3. Methodical Approaches to Risk Management in a Regional Commercial Bank

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Altukhova

    2016-03-01

    Full Text Available The article presents the results of research on the methodological and information infrastructure of integrated risk management in a regional commercial bank. Within a study of the general development tendencies of the regional banking services market, the most significant risks for a regional bank are identified. The analysis is carried out on the basis of the stress-testing technique developed at the Plekhanov Russian University of Economics, which is based on dynamic economic and mathematical modeling with the application of information technologies. The resulting combination of methodological and instrumental tools allows a dynamic scenario analysis of the activity of a commercial bank to be carried out, in order to identify potential risks and to develop a financial management strategy that reduces the potential risks and mitigates the consequences of their realization. The tool allows the user, in the course of a computer test, to watch the predicted dynamics of the key indicators of the activity of a regional commercial bank as they change under the influence of the exogenous regulatory measures and bank management instruments applied to decrease risk, and at the same time to introduce adjustments into the prospective management strategy. As a result of the analysis, a universal management model of the main bank risks in a regional commercial bank is created within three alternative scenarios. A software product is developed that allows students to develop and acquire practical skills in banking; it may also help in developing the methodological support for the regulation of the organizational procedures of risk management in a regional commercial bank. The software product may be used in a system for improving professional skills, and also for obtaining the expected data in a risk management system in a regional commercial bank.

  4. Software life after in-service

    International Nuclear Information System (INIS)

    Tseng, M.; Eng, P.

    1993-01-01

    Software engineers and designers tend to conclude a software project at the in-service milestone of the software life cycle. But the reality is that the 'life after in-service' is significantly longer than other phases of the life cycle, typically 20 years or more depending on the maintainability of the hardware platform and the designed life of the plant. During this period, the software asset (as with other physical assets in the plant) continues to be upgraded to correct deficiencies, meet new requirements, cope with obsolescence of equipment and so on. The software life cycle ends with a migration of the software to a different platform. It is typical in a software development project to put a great deal of emphasis on design methodologies, techniques, tools, development environment, standard procedures, and project management to ensure a quality product is delivered on schedule and within budget. More often than not, a disproportionately small share of this emphasis is placed on the issues and needs of the in-service phase. Once the software is in-service, the designers move on to other projects, while the maintenance and support staff must manage the software. This paper examines the issues in three steps. First it presents a view of software from maintenance and support staff perspectives, including complexity of software, suitability of documentation, configuration management, training, difficulties and risks associated with making changes, and required skills and knowledge. Second, it identifies the concerns raised from these viewpoints, including costs of maintaining the software, ability to meet additional requirements, availability of support tools, length of time required to engineer and install changes, and a strategy for the migration of the software asset. Finally it discusses some approaches to deal with these concerns. (Author) 5 refs., fig

  5. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Gaycken, Goetz; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from SVN. We then proceeded with a migration of its code base from SVN to git. As the SVN repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository and the policy of onl...

  6. Software platform virtualization in chemistry research and university teaching.

    Science.gov (United States)

    Kind, Tobias; Leamy, Tim; Leary, Julie A; Fiehn, Oliver

    2009-11-16

    Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, LINUX or Mac OS X. Instead of installing software on different computers it is possible to install those applications on a single computer using Virtual Machine software. Software platform virtualization allows a single guest operating system to execute multiple other operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories. Virtual machines are commonly used for cheminformatics software development and testing. Benchmarking multiple chemistry software packages we have confirmed that the computational speed penalty for using virtual machines is low and around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open source software in hands-on computer teaching labs. Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and development of software for different operating systems. In order to obtain maximum performance the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs especially in small research labs. The use of virtual machines can prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and are easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics as well as the missing cheminformatics education at universities worldwide.

  7. Eprints Institutional Repository Software: A Review

    Directory of Open Access Journals (Sweden)

    Mike R. Beazley

    2011-01-01

    Full Text Available Setting up an institutional repository (IR) can be a daunting task. There are many software packages out there, some commercial, some open source, all of which offer different features and functionality. This article will provide some thoughts about one of these software packages: Eprints. Eprints was one of the first IR software packages to appear and has been available for 10 years. It is under continual development by its creators at the University of Southampton and the current version is v3.2.3. Eprints is open source, meaning that anyone can download and make use of the software for free and the software can be modified however the user likes. This presents clear advantages for institutions with smaller budgets and also for institutions that have programmers on staff. Eprints requires some additional software to run: Linux, Apache, MySQL, and Perl. This software is all open source and already present on the servers of many institutions. There is now a version of Eprints that will run on Windows servers as well, which will make the adoption of Eprints even easier for some. In brief, Eprints is an excellent choice for any institution looking to get an IR up and running quickly and easily. Installation is straightforward, as is the initial configuration. Once the IR is up and running, users may upload documents and provide the necessary metadata for the records by filling out a simple web form. Embargoes on published documents are handled elegantly by the software, and the software links to the SHERPA/RoMEO database so authors can easily verify their rights regarding IR submissions. Eprints has some drawbacks, which will be discussed later in the review, but on the whole it is easy to recommend to anyone looking to start an IR. However, it is less clear that an institution with an existing IR based on another software package should migrate to Eprints.

  8. Astronomy Student Activities Using Stellarium Software

    Science.gov (United States)

    Benge, Raymond D.; Tuttle, S. R.

    2012-01-01

    Planetarium programs can be used to provide a valuable learning experience for introductory astronomy students. Educational activities can be designed to utilize the capabilities of the software to display the sky, coordinates, motions in the sky, etc., in order to learn basic astronomical concepts. Most of the major textbook publishers offer an option of bundling planetarium software, and even laboratory activities using such software, with their textbooks. However, commercial planetarium software is often updated on a different schedule from the textbook revision and new edition schedule. The software updates also sometimes occur out of sync with college textbook adoption deadlines. Changes in software and activity curriculum often translate into increased costs for students and the college. To provide stability to the process, faculty at Tarrant County College have developed a set of laboratory exercises, entitled Distant Nature, using the free open source Stellarium software. Stellarium is a simple, yet powerful, program that is available in formats that run on a variety of operating systems (Windows, Apple, Linux). A web site was developed for the Distant Nature activities having a set version of Stellarium that students can download and install on their own computers. Also on the web site, students can access the instructions and worksheets associated with the various Stellarium-based activities. A variety of activities are available to support two semesters of introductory astronomy. The Distant Nature web site has been used for one year with Tarrant County College astronomy students and is now available for use by other institutions. The Distant Nature web site is http://www.stuttle1.com/DN_Astro/index.html .

  9. Unattended mode monitoring of passive neutron coincidence detector systems using a commercial data logger

    International Nuclear Information System (INIS)

    Smith, B.G.R.; Outram, J.D.; Storey, M.

    1991-01-01

    A commercial Data Logger for unattended passive neutron coincidence data acquisition is described. It consists of inexpensive commercial data logging equipment attached to neutron coincidence electronics and a software package for data review. The Data Logger permits both the flexible configuration of a passive neutron coincidence measurement system for unattended mode monitoring and the storage of the measured Totals and Reals count rates. An additional feature of the Data Logger is a custom software package providing for the complete analysis of the stored data and yielding an assay of each item passing through the measurement cavity. The analysis includes an input for different isotopic compositions, the calculation of the multiplication-corrected Reals rates, the inclusion of calibration functions, and the determination of 240Pu masses. The software package for data review displays the Totals and Reals count rates logged by the Data Logger as a function of time. In addition, the custom software provides input files to the data review package to display the multiplication-corrected Reals count rates and the measured 240Pu masses as a function of time. Information on the Data Logger is presented along with the monitoring mode specifications. The analysis functions implemented are described, as is the data review software. Results are presented for a specific application.
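
    As a sketch of the final analysis steps only (a linear calibration followed by isotopic scaling), the conversion from a multiplication-corrected Reals rate to a plutonium mass can be written as below; the calibration coefficient and isotopic fractions are invented placeholders, not values from this system.

        def pu240_effective_mass(reals_corrected, calib_coeff):
            """calib_coeff: corrected Reals rate (counts/s) per gram of 240Pu-effective."""
            return reals_corrected / calib_coeff

        def total_pu_mass(m240_eff, f238, f240, f242):
            """Scale 240Pu-effective mass to total Pu using the even-isotope weighting."""
            pu240_eff_fraction = 2.52 * f238 + f240 + 1.68 * f242
            return m240_eff / pu240_eff_fraction

        m240_eff = pu240_effective_mass(reals_corrected=125.0, calib_coeff=101.0)
        print(total_pu_mass(m240_eff, f238=0.012, f240=0.240, f242=0.020))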

  10. A Methodological Framework for Software Safety in Safety Critical Computer Systems

    OpenAIRE

    P. V. Srinivas Acharyulu; P. Seetharamaiah

    2012-01-01

    Software safety must deal with the principles of safety management, safety engineering and software engineering for developing safety-critical computer systems, with the target of making the system safe, risk-free and fail-safe, in addition to providing a clear differentiation for assessing and evaluating the risk using the principles of software risk management. Problem statement: Prevailing software quality models and standards do not adequately address software safety ...

  11. Design and Implementation of a Mobile Phone Locator Using Software Defined Radio

    National Research Council Canada - National Science Library

    Larsen, Ian P

    2007-01-01

    ...) signal using software defined radio and commodity computer hardware. Using software designed by the GNU free software project as a base, standard GSM packets were transmitted and received over the air, and their arrival times detected...

  12. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  13. Software for imaging phase-shift interference microscope

    Science.gov (United States)

    Malinovski, I.; França, R. S.; Couceiro, I. B.

    2018-03-01

    In recent years an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). By its principle of operation the instrument is an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colours as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different methodology of phase evaluation is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.
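
    For context, the "standard PSI method" that the instrument's own phase-evaluation methodology is contrasted with is typically the four-step algorithm; a minimal NumPy sketch of it (not the INMETRO software) is:

        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            """Wrapped phase from four interferograms shifted by 0, 90, 180 and 270 degrees."""
            return np.arctan2(i4 - i2, i1 - i3)

        # Phase maps from real frames would then be unwrapped, e.g. with np.unwrap along each axis.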

  14. Trends in Literacy Software Publication and Marketing: Multicultural Themes.

    Science.gov (United States)

    Balajthy, Ernest

    This article provides data and discussion of multicultural theme-related issues arising from analysis of a detailed database of commercial software products targeted to reading and literacy education. The database consisted of 1152 titles, representing the offerings of 104 publishers and distributors. Of the titles, 62 were identified as having…

  15. Runtime Performance Monitoring Tool for RTEMS System Software

    Science.gov (United States)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase and in in-orbit operation as well. Our implemented target agent is lightweight and has small overhead using the SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under research.

  16. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    Science.gov (United States)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  17. Software upgrade for the DIII-D neutral beam control systems

    International Nuclear Information System (INIS)

    Cummings, J.W.; Thurgood, P.A.

    1992-01-01

    This paper reports on the Neutral Beam Software Upgrade project which was launched in early 1990. The major goals were to upgrade the MAX IV operating system to the latest revision (K.1), to use standard MODCOMP software (as much as possible), and to develop a very user friendly, versatile system. Accomplishing these goals required new software to be developed and modifications to existing applications software to make it compatible with the latest operating system. The custom operating system modules that handled the message service and interrupt handling were replaced by the standard MODCOMP Inter Task Communication (ITC) and interrupt routines that are part of the MAX IV operating system. The message service provides the mechanism for doing shot task sequencing (task scheduling). The interrupt routines are used to connect external interrupts to the system.

  18. Selection of bioprocess simulation software for industrial applications.

    Science.gov (United States)

    Shanklin, T; Roper, K; Yegneswaran, P K; Marten, M R

    2001-02-20

    Two commercially available process-simulation software packages (Aspen Batch Plus v1.2, Aspen Technology, Inc., Cambridge, Massachusetts, and Intelligen SuperPro v3.0, INTELLIGEN, INC., Scotch Plains, New Jersey) are evaluated for use in modeling industrial biotechnology processes. The software is quantitatively evaluated by Kepner-Tregoe Decision Analysis (Kepner and Tregoe, 1981). This evaluation shows that Aspen Batch Plus v1.2 (ABP) and Intelligen SuperPro v3.0 (ISP) can successfully perform specific simulation tasks but do not provide a complete model of all phenomena occurring within a biotechnology process. The software is best suited to provide a format for process management, using material and energy balances to answer scheduling questions, explore equipment change-outs, and calculate cost data. The ability of simulation software to accurately predict unit operation scale-up and optimize bioprocesses is limited. To realistically evaluate the software, a vaccine manufacturing process under development at Merck & Company is simulated. Case studies from the vaccine process are presented as examples of how ABP and ISP can be used to shed light on real-world processing issues. Copyright 2001 John Wiley & Sons, Inc.
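
    Kepner-Tregoe Decision Analysis boils down to scoring each candidate against weighted criteria and comparing the weighted totals; the Python sketch below illustrates only that arithmetic, with invented criteria, weights and scores rather than the values used in the paper.

        # Weights (1-10) for the evaluation criteria and scores (1-10) per package.
        weights = {"scheduling": 9, "material_energy_balances": 8,
                   "cost_reporting": 6, "ease_of_use": 5}

        scores = {
            "Package A": {"scheduling": 8, "material_energy_balances": 7,
                          "cost_reporting": 6, "ease_of_use": 7},
            "Package B": {"scheduling": 7, "material_energy_balances": 8,
                          "cost_reporting": 7, "ease_of_use": 6},
        }

        for package, rating in scores.items():
            weighted_total = sum(weights[c] * rating[c] for c in weights)
            print(package, weighted_total)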

  19. Quality Parameters for Commercial Royal Jelly

    Directory of Open Access Journals (Sweden)

    Carmen Ioana Muresan

    2016-01-01

    Full Text Available Royal jelly has become a high-value commercial product and the standardization of this product is required to guarantee its quality on the market. The objective of the research activity was to determine the chemical composition of commercial samples of Royal Jelly in Romania in order to propose a standardization for this product. The physico-chemical composition of commercial Royal Jelly samples was analysed by determining quality parameters like carbohydrates, lipids, proteins, 10-hydroxy-2-decenoic acid (10-HDA) and mineral elements. Carbohydrate analysis showed values between 3.4 % and 5.87 % for fructose, 4.12 % and 7.05 % for glucose, while for sucrose the values ranged between 0.95 % and 2.56 % (determined by HPLC-RI). The lipid content ranged between 1.85 % and 6.32 % (determined by the Soxhlet method). The protein values extended from 13.10 % (RJ2) to 17.04 % (RJ10) (the total protein content was determined by the Kjeldahl method). The values for the major fatty acid in Royal Jelly, 10-HDA, ranged between 1.35 % (RJ8) and 2.03 % (RJ10) (determined by high-performance liquid chromatography). The concentration of minerals varied between 3188.70 mg/kg and 4023.39 mg/kg (the concentration of minerals was measured using flame atomic absorption spectrometry). Potassium, followed by magnesium, sodium and calcium, occurs in the highest concentrations. The commercial Royal Jelly samples analysed presented variable physico-chemical characteristics that correspond with the values given by international quality standard proposals for Royal Jelly.

  20. Energy and Energy Cost Savings Analysis of the IECC for Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jian; Athalye, Rahul A.; Hart, Philip R.; Rosenberg, Michael I.; Xie, YuLong; Goel, Supriya; Mendon, Vrushali V.; Liu, Bing

    2013-08-30

    The purpose of this analysis is to assess the relative energy and energy cost performance of commercial buildings designed to meet the requirements found in the commercial energy efficiency provisions of the International Energy Conservation Code (IECC). Section 304(b) of the Energy Conservation and Production Act (ECPA), as amended, requires the Secretary of Energy to make a determination each time a revised version of ASHRAE Standard 90.1 is published with respect to whether the revised standard would improve energy efficiency in commercial buildings. As many states have historically adopted the IECC for both residential and commercial buildings, PNNL has evaluated the impacts of the commercial provisions of the 2006, 2009, and 2012 editions of the IECC. PNNL also compared energy performance with corresponding editions of ANSI/ASHRAE/IES Standard 90.1 to help states and local jurisdictions make informed decisions regarding model code adoption.

  1. Challenges for emerging new electronics standards for physics

    International Nuclear Information System (INIS)

    Larsen, R.S.

    2012-01-01

    A unique effort is underway between industry and the international physics community to extend the Telecom industry's Advanced Telecommunications Computing Architecture (ATCA and MicroTCA) to meet future needs of the physics machine controls, instrumentation and detector communities. New standard extensions for physics are described which have been designed to deliver unprecedented performance and high subsystem availability for accelerator controls, instrumentation and data acquisition. Key technical features include an out-of-band imbedded standard Intelligent Platform Management Interface (IPMI) system to manage hot-swap module replacement and hardware-software fail-over. New software standards or guidelines are in development which will extend the reach of platform independent software standards to simplify design of low level drivers. Efforts to make the new standards broadly available in the marketplace through lab-industry collaboration are discussed. (author)

  2. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960's, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  3. Testing existing software for safety-related applications. Revision 7.1

    International Nuclear Information System (INIS)

    Scott, J.A.; Lawrence, J.D.

    1995-12-01

    The increasing use of commercial off-the-shelf (COTS) software products in digital safety-critical applications is raising concerns about the safety, reliability, and quality of these products. One of the factors involved in addressing these concerns is product testing. A tester's knowledge of the software product will vary, depending on the information available from the product vendor. In some cases, complete source listings, program structures, and other information from the software development may be available. In other cases, only the complete hardware/software package may exist, with the tester having no knowledge of the internal structure of the software. The type of testing that can be used will depend on the information available to the tester. This report describes six different types of testing, which differ in the information used to create the tests, the results that may be obtained, and the limitations of the test types. An Annex contains background information on types of faults encountered in testing, and a Glossary of pertinent terms is also included. This study is pertinent for safety-related software at reactors

  4. Designing the modern pump: engineering aspects of continuous subcutaneous insulin infusion software.

    Science.gov (United States)

    Welsh, John B; Vargas, Steven; Williams, Gary; Moberg, Sheldon

    2010-06-01

    Insulin delivery systems attracted the efforts of biological, mechanical, electrical, and software engineers well before they were commercially viable. The introduction of the first commercial insulin pump in 1983 represents an enduring milestone in the history of diabetes management. Since then, pumps have become much more than motorized syringes and have assumed a central role in diabetes management by housing data on insulin delivery and glucose readings, assisting in bolus estimation, and interfacing smoothly with humans and compatible devices. Ensuring the integrity of the embedded software that controls these devices is critical to patient safety and regulatory compliance. As pumps and related devices evolve, software engineers will face challenges and opportunities in designing pumps that are safe, reliable, and feature-rich. The pumps and related systems must also satisfy end users, healthcare providers, and regulatory authorities. In particular, pumps that are combined with glucose sensors and appropriate algorithms will provide the basis for increasingly safe and precise automated insulin delivery-essential steps to developing a fully closed-loop system.

  5. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  6. The Challenges of the "Software Support for Industrial Controls" Contract

    CERN Document Server

    Ninin, P

    2000-01-01

    ST division is currently specifying a 'Software Support for Industrial Controls' contract. The application of this contract and its success will require several changes in our habits for specifying, designing, and maintaining control systems. This paper summarizes some key concepts which should be respected in order to obtain maximum benefits from the future contract and to optimize the software activities in the division. The contract concerns the maintenance and development of the monitoring and control systems used for supervising CERN's technical infrastructure (electrical distribution, cooling water, air conditioning, safety, and access control). The systems concerned consist of computer and communication hardware and software, tailored to provide specific functionalities for the remote operation, command, and monitoring of equipment. All these systems use commercially available software and hardware such as SCADA, PLCs and associated drivers, controllers, fieldbuses, and networks. It is intended to cont...

  7. Management models in the NZ software industry

    Directory of Open Access Journals (Sweden)

    Holger Spill

    Full Text Available This research interviewed eight innovative New Zealand software companies to find out how they manage new product development. It looked at how management used standard techniques of software development to manage product uncertainty through the theoretical lens of the Cyclic Innovation Model. The study found that while there is considerable variation, the management of innovation was largely determined by the level of complexity. Organizations with complex innovative software products had a more iterative software development style, more flexible internal processes and swifter decision-making. Organizations with less complexity in their products tended to use more formal structured approaches. Overall complexity could be inferred with reference to four key factors within the development environment.

  8. Engineering nonlinearity characteristic compensation for commercial steam turbine control valve using linked MARS code and Matlab Simulink

    International Nuclear Information System (INIS)

    Halimi, B.; Suh, Kune Y.

    2012-01-01

    Highlights: ► A nonlinearity characteristic compensation is proposed for the steam turbine control valve. ► A steady state and transient analyzer is developed for the Ulchin Units 3 and 4 OPR1000 nuclear plants. ► The MARS code and Matlab Simulink are used to verify the compensation concept. ► The results show the concept can compensate for the nonlinearity characteristic very well. - Abstract: Steam turbine control valves play a pivotal role in regulating the output power of the turbine in a commercial power plant. They thus have to be operated linearly to be run by an automatic control system. Unfortunately, the control valve has inherently nonlinear characteristics: for a given valve position signal, the flow increases much more steeply near the closed end of the stem travel than near the open end. The steam flow should nonetheless be proportional to the final desired quantity, the output power of the turbine, to obtain linear operation. This paper presents the valve engineering linked analysis (VELA) for nonlinearity characteristic compensation of the steam turbine control valve by linking two existing commercial software packages. The Multi-dimensional Analysis of Reactor Safety (MARS) code and Matlab Simulink have been selected for VELA to develop a steady state and transient analyzer of Ulchin Units 3 and 4 powered by the Optimized Power Reactor 1000 MWe (OPR1000). MARS is capable of modeling a wide range of systems, from single pipes to full nuclear power plants. As a standard nuclear power plant thermal-hydraulic analysis tool, MARS simulates the primary and secondary sides of the nuclear power plant. To simulate the electric power flow part, Matlab Simulink is chosen as the standard analysis software; it provides an interactive environment for modeling, analyzing, and simulating a wide variety of engineering dynamic systems, including multimachine power systems. Based on the MARS code result, Matlab Simulink analyzes the power flow of the
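
    The compensation idea amounts to inverting the measured valve characteristic so that a linear flow (power) demand is translated into the stem position that actually delivers it; the sketch below illustrates this with an invented quick-opening characteristic, not OPR1000 data.

```python
import numpy as np

# Illustrative valve characteristic: flow fraction vs. stem position.
# A quick-opening valve passes most of its flow near the closed end of travel.
position = np.array([0.0, 0.1, 0.2, 0.4, 0.6, 0.8, 1.0])
flow     = np.array([0.0, 0.40, 0.60, 0.80, 0.90, 0.96, 1.0])

def compensated_position(demanded_flow):
    """Invert the characteristic so a linear flow demand maps to the stem
    position that actually delivers that flow."""
    return np.interp(demanded_flow, flow, position)

for demand in (0.25, 0.5, 0.75):
    print(f"demand {demand:.2f} -> stem position {compensated_position(demand):.3f}")
```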

  9. Standard Populations (Millions) for Age-Adjustment - SEER Population Datasets

    Science.gov (United States)

    Download files containing standard population data for use in statistical software. The files contain the same data distributed with SEER*Stat software. You can also view the standard populations, either 19 age groups or single ages.
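
    For readers unfamiliar with how these files are used, direct age adjustment weights each age-specific rate by the standard population's share in that age group; the sketch below uses invented counts and an invented three-group standard, not the SEER datasets themselves.

```python
# Direct age adjustment: weight each age-specific rate by the share of a
# standard population in that age group. All numbers below are made up.
age_groups = ["<50", "50-64", "65+"]
cases      = [20,    80,      300]      # observed events per group
population = [50000, 30000,   20000]    # person-years at risk per group
std_pop    = [60000, 25000,   15000]    # standard population (arbitrary units)

std_total = sum(std_pop)
adjusted_rate = sum((c / p) * (s / std_total)
                    for c, p, s in zip(cases, population, std_pop))

print(f"Age-adjusted rate: {adjusted_rate * 100000:.1f} per 100,000")
```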

  10. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors often arise because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.
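
    The classic textbook illustration of an abstract data type specified algebraically is the stack, whose axioms can be written down and checked directly; the sketch below is that standard example, not the authors' formalism.

```python
# A textbook algebraic specification of a stack, written as executable checks.
# Signature: new, push, pop, top, is_empty. The assertions are the axioms of
# the algebra; this is an illustration, not the paper's own specification.
def new():          return ()
def push(s, x):     return s + (x,)
def pop(s):         return s[:-1]
def top(s):         return s[-1]
def is_empty(s):    return s == ()

# Axioms (universally quantified over stacks s and values x):
s, x = push(new(), 1), 42
assert is_empty(new())                 # is_empty(new) = true
assert not is_empty(push(s, x))        # is_empty(push(s, x)) = false
assert pop(push(s, x)) == s            # pop(push(s, x)) = s
assert top(push(s, x)) == x            # top(push(s, x)) = x
print("all stack axioms hold for the sample values")
```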

  11. [Development of integrated support software for clinical nutrition].

    Science.gov (United States)

    Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús

    2015-09-01

    To develop an integrated computer software application for specialized nutritional support, integrated in the electronic clinical record, which automatically detects, at an early stage, those patients who are undernourished or at risk of developing undernourishment, identifying points of opportunity for improvement and evaluating the results. The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. This software makes it possible to conduct, in an automated way, a specific nutritional assessment of patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, following up and tracing the outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice approaches the established standard. This software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and treating the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
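
    Automated detection of patients at nutritional risk generally reduces to a small rule set evaluated over data already in the electronic record; the sketch below shows such a rule set with illustrative thresholds that are not the SEFH/SENPE criteria implemented by the software described above (the same caveat applies to the duplicate record that follows).

```python
def nutritional_risk_flag(bmi, weight_loss_pct_3m, reduced_intake_last_week):
    """Simplified automatic screening rule: flag a patient for a full
    nutritional assessment. Thresholds are illustrative only and are not the
    SEFH/SENPE criteria used by the software described above."""
    score = 0
    if bmi < 20.5:
        score += 1
    if weight_loss_pct_3m > 5.0:
        score += 1
    if reduced_intake_last_week:
        score += 1
    return score >= 1  # any positive answer triggers a full assessment

print(nutritional_risk_flag(bmi=19.8, weight_loss_pct_3m=2.0,
                            reduced_intake_last_week=False))
```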

  12. Development of integrated support software for clinical nutrition

    Directory of Open Access Journals (Sweden)

    Pedro Siquier Homar

    2015-09-01

    Full Text Available Objectives: To develop an integrated computer software application for specialized nutritional support, integrated in the electronic clinical record, which automatically detects, at an early stage, those patients who are undernourished or at risk of developing undernourishment, identifying points of opportunity for improvement and evaluating the results. Methods: The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. Results: This software makes it possible to conduct, in an automated way, a specific nutritional assessment of patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, following up and tracing the outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice approaches the established standard. Conclusions: This software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and treating the patient as the main customer.

  13. PACMAN: PRIMA astrometric instrument software

    Science.gov (United States)

    Abuter, Roberto; Sahlmann, Johannes; Pozna, Eszter

    2010-07-01

    The dual feed astrometric instrument software of PRIMA (PACMAN) that is currently being integrated at the VLTI will use two spatially modulated fringe sensor units and a laser metrology system to carry out differential astrometry. Its software and hardware comprise a distributed system involving many real-time computers and workstations operating in a synchronized manner. Its architecture has been designed to allow the construction of efficient and flexible calibration and observation procedures. In parallel, a novel scheme for integrating M-code (MATLAB/OCTAVE) with standard VLT (Very Large Telescope) control software applications had to be devised in order to support numerically intensive operations and to allow rapid adaptation to fast-varying strategies and algorithms. This paper presents the instrument software, including the current operational sequences for laboratory calibration and sky calibration. Finally, a detailed description of the algorithms and their implementation, in both M and C code, is given together with a comparative analysis of their performance and maintainability.
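
    One simple way to mix M-code with compiled control software is to run the interpreter in batch mode from the controlling application; the sketch below shows that general idea using GNU Octave's command-line interface (assumed to be installed and on the PATH) and is not how PACMAN actually integrates with the VLT common software.

```python
import subprocess

# Delegate a numerically intensive step to M-code by invoking GNU Octave in
# batch mode. This is only a sketch of the general idea of mixing M-code with
# compiled control software; the actual VLT/PACMAN integration mechanism
# differs, and 'octave' is assumed to be installed on the host.
m_code = "printf('%.6f\\n', mean([0.12 0.15 0.11 0.14]))"  # e.g. averaging phase samples
result = subprocess.run(["octave", "--quiet", "--eval", m_code],
                        capture_output=True, text=True, check=True)
phase_estimate = float(result.stdout.strip())
print(phase_estimate)
```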

  14. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading in line with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many notions about open source software, ranging from software that is free of charge to software without a license. Not all claims about open source software are true; it is therefore necessary to introduce the concept of open source software, from its history, its licenses and how to choose a license, to the considerations involved in selecting among the available open source software. Keywords: License, Open Source, HAKI

  15. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students to use the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  16. IAEA/NDS requirements related to database software

    International Nuclear Information System (INIS)

    Pronyaev, V.; Zerkin, V.

    2001-01-01

    Full text: The Nuclear Data Section of the IAEA disseminates data to NDS users through the Internet or on CD-ROMs and diskettes. The OSU Web server, on DEC Alpha with OpenVMS and Oracle/DEC DBMS, provides access via CGI scripts and FORTRAN retrieval programs to the main nuclear databases supported by the networks of Nuclear Reaction Data Centres and Nuclear Structure and Decay Data Centres (CINDA, EXFOR, ENDF, NSR, ENSDF). For Web access to data from other libraries and files, hyperlinks to files stored in ASCII text or other formats are used. Databases on CD-ROM are usually provided with a retrieval system. They are distributed in run-time mode and comply with all license requirements for the software used in their development. Although major development work is now done on PCs under MS-Windows and Linux, the NDS may not at present, owing to institutional constraints, use these platforms to organize Web access to the data. Starting at the end of 1999, the NDS, in co-operation with other data centers, began to work out a strategy for migrating the main network nuclear databases onto platforms other than DEC Alpha/OpenVMS/DBMS. Because the co-operating centers have their own preferences for hardware and software, the requirement to provide maximum platform independence for the nuclear databases is the most important and desirable feature. This requirement determined some standards for nuclear database software development. Taking into account the present state and future development, these standards can be formulated as follows: 1. All numerical data (experimental, evaluated, recommended values and their uncertainties) prepared for inclusion in an IAEA/NDS nuclear database should be submitted in the form of ASCII text files and will be kept at the NDS as a master file. 2. Databases with complex structure should be submitted in the form of files with standard SQL statements describing all their components. All extensions of standard SQL
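
    Requirements 1 and 2 above amount to keeping plain ASCII master files and rebuilding the database from standard SQL, so that any SQL-capable engine can host the data; the sketch below illustrates this with an invented record layout and table schema, not the actual IAEA/NDS formats.

```python
import sqlite3

# Sketch of the platform-independence idea: master data live in plain ASCII,
# and the database is rebuilt from standard SQL statements, so any SQL-capable
# engine can host it. Table and column names are invented for illustration and
# are not the actual IAEA/NDS schema.
ascii_records = [
    "26-FE-56  (N,G)  14.60  keV",
    "92-U-235  (N,F)  0.0253 eV",
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reactions (target TEXT, reaction TEXT, "
             "energy REAL, unit TEXT)")
for line in ascii_records:
    target, reaction, energy, unit = line.split()
    conn.execute("INSERT INTO reactions VALUES (?, ?, ?, ?)",
                 (target, reaction, float(energy), unit))

for row in conn.execute("SELECT target, energy, unit FROM reactions"):
    print(row)
```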

  17. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software.

    Science.gov (United States)

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians.

  18. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software

    Science.gov (United States)

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians. PMID:25996054

  19. Satellite Communications Using Commercial Protocols

    Science.gov (United States)

    Ivancic, William D.; Griner, James H.; Dimond, Robert; Frantz, Brian D.; Kachmar, Brian; Shell, Dan

    2000-01-01

    NASA Glenn Research Center has been working with industry, academia, and other government agencies in assessing commercial communications protocols for satellite and space-based applications. In addition, NASA Glenn has been developing and advocating new satellite-friendly modifications to existing communications protocol standards. This paper summarizes recent research into the applicability of various commercial standard protocols for use over satellite and space-based communications networks as well as expectations for future protocol development. It serves as a reference point from which the detailed work can be readily accessed. Areas that will be addressed include asynchronous-transfer-mode quality of service; completed and ongoing work of the Internet Engineering Task Force; data-link-layer protocol development for unidirectional link routing; and protocols for aeronautical applications, including mobile Internet protocol routing for wireless/mobile hosts and the aeronautical telecommunications network protocol.

  20. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments like the GN&C analyst's simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
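
    The real library is ANSI C, but the design goals it describes (consistent naming, parameter conventions, and uniform error handling) can be sketched in a few lines; the utilities below are illustrative only and do not reproduce the library's actual interface.

```python
import math

# Illustrative sketch of small, consistently named, documented math utilities
# of the kind a flight software math library collects. The real library is
# ANSI C; these names and conventions are invented for illustration.

def vec3_add(a, b):
    """Component-wise sum of two 3-vectors (tuples of floats)."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def vec3_magnitude(a):
    """Euclidean norm of a 3-vector."""
    return math.sqrt(a[0] ** 2 + a[1] ** 2 + a[2] ** 2)

def vec3_normalize(a, eps=1e-12):
    """Unit vector along a; raises ValueError for near-zero input so every
    utility reports errors in the same uniform way."""
    m = vec3_magnitude(a)
    if m < eps:
        raise ValueError("cannot normalize a near-zero vector")
    return (a[0] / m, a[1] / m, a[2] / m)

print(vec3_normalize(vec3_add((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))))
```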