WorldWideScience

Sample records for advanced software testing

  1. Trends in software testing

    CERN Document Server

    Mohanty, J; Balakrishnan, Arunkumar

    2017-01-01

    This book is focused on the advancements in the field of software testing and the innovative practices that the industry is adopting. Considering the widely varied nature of software testing, the book addresses contemporary aspects that are important for both academia and industry. There are dedicated chapters on seamless high-efficiency frameworks, automation on regression testing, software by search, and system evolution management. There are a host of mathematical models that are promising for software quality improvement by model-based testing. There are three chapters addressing this concern. Students and researchers in particular will find these chapters useful for their mathematical strength and rigor. Other topics covered include uncertainty in testing, software security testing, testing as a service, test technical debt (or test debt), disruption caused by digital advancement (social media, cloud computing, mobile application and data analytics), and challenges and benefits of outsourcing. The book w...

  2. Earth Observing System (EOS)/ Advanced Microwave Sounding Unit-A (AMSU-A): Special Test Equipment. Software Requirements

    Science.gov (United States)

    Schwantje, Robert

    1995-01-01

    This document defines the functional, performance, and interface requirements for the Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A) Special Test Equipment (STE) software used in the test and integration of the instruments.

  3. Software Testing

    Science.gov (United States)

    1977-11-15

    US Army Test and Evaluation Command final report on software testing. Excerpts: ... verification, the TECOM field activity should offer to provide this service using the CRWG and the TIWG as vehicles for coordination. ... services and controls the applications programs. Among its many functions are dispatching and scheduling of tasks; allocating and freeing of memory ...

  4. Software Testing Techniques and Strategies

    OpenAIRE

    Isha; Sunita Sangwan

    2014-01-01

    Software testing provides a means to reduce errors and cut maintenance and overall software costs. Numerous software development and testing methodologies, tools, and techniques have emerged over the last few decades promising to enhance software quality. This paper describes software testing, the need for software testing, and software testing goals and principles. It further describes different software testing techniques and strategies.

  5. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  6. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
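
    To make the mechanism concrete, here is a minimal sketch of the same idea in Python: walk a directory tree, find test configuration files that match a pattern, and run each configured test, distributing independent tests over CPU cores. It is not the actual dtest code; the ".dtest" suffix and the one-command-per-file configuration format are assumptions made for illustration.

        import glob
        import os
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        def find_test_configs(root, pattern="*.dtest"):
            """Recursively collect test configuration files matching a pattern."""
            return glob.glob(os.path.join(root, "**", pattern), recursive=True)

        def run_one(config_path):
            """Run the command named in a one-line config file; report pass/fail."""
            with open(config_path) as f:
                command = f.read().strip()  # e.g. "python unit_tests.py"
            result = subprocess.run(command, shell=True,
                                    cwd=os.path.dirname(config_path) or ".")
            return config_path, result.returncode == 0

        if __name__ == "__main__":
            configs = find_test_configs(".")
            # Distribute independent tests over the available CPU cores.
            with ProcessPoolExecutor() as pool:
                for path, ok in pool.map(run_one, configs):
                    print(("PASS" if ok else "FAIL"), path)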

  7. Evaluating software testing strategies

    Science.gov (United States)

    Selby, R. W., Jr.; Basili, V. R.; Page, J.; Mcgarry, F. E.

    1984-01-01

    The strategies of code reading, functional testing, and structural testing are compared in three aspects of software testing: fault detection effectiveness, fault detection cost, and classes of faults detected. The major results are the following: (1) Code readers detected more faults than did those using the other techniques, while functional testers detected more faults than did structural testers; (2) Code readers had a higher fault detection rate than did those using the other methods, while there was no difference between functional testers and structural testers; (3) Subjects testing the abstract data type detected the most faults and had the highest fault detection rate, while individuals testing the database maintainer found the fewest faults and spent the most effort testing; (4) Subjects of intermediate and junior expertise were not different in number or percentage of faults found, fault detection rate, or fault detection effort; (5) Subjects of advanced expertise found a greater number of faults than did the others, found a greater percentage of faults than did just those of junior expertise, and were not different from the others in either fault detection rate or effort; and (6) Code readers and functional testers both detected more omission faults and more control faults than did structural testers, while code readers detected more interface faults than did those using the other methods.

  8. Earth Observing System (EOS) Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A): EOS Software Test Report

    Science.gov (United States)

    1998-01-01

    This document describes the results of the formal qualification test (FQT)/ Demonstration conducted on September 10, and 14, 1998 for the EOS AMSU-A2 instrument. The purpose of the report is to relate the results of the functional performance and interface tests of the software. This is the final submittal of the EOS/AMSU-A Software Test report.

  9. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-07-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, many books have been published whose titles refer to it as an art, a role, or a process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving testing as a science, since the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. The paper examines these similarities and the characteristics of testing as a science.

  10. Software Testing as Science

    Directory of Open Access Journals (Sweden)

    Ingrid Gallesdic

    2013-06-01

    Full Text Available The most widespread opinion among people who have some connection with software testing is that this activity is an art. In fact, many books have been published whose titles refer to it as an art, a role, or a process. But because software complexity is increasing every year, this paper proposes a new approach, conceiving testing as a science, since the processes by which tests are applied follow the steps of the scientific method: inputs, processes, outputs. The paper examines these similarities and the characteristics of testing as a science.

  11. Design of a test system for the development of advanced video chips and software algorithms.

    Science.gov (United States)

    Falkinger, Marita; Kranzfelder, Michael; Wilhelm, Dirk; Stemp, Verena; Koepf, Susanne; Jakob, Judith; Hille, Andreas; Endress, Wolfgang; Feussner, Hubertus; Schneider, Armin

    2015-04-01

    Visual deterioration is a crucial point in minimally invasive surgery impeding surgical performance. Modern image processing technologies appear to be promising approaches for further image optimization by digital elimination of disturbing particles. To make them mature for clinical application, an experimental test environment for evaluation of possible image interferences would be most helpful. After a comprehensive review of the literature (MEDLINE, IEEE, Google Scholar), a test bed for generation of artificial surgical smoke and mist was evolved. Smoke was generated by a fog machine and mist produced by a nebulizer. The size of resulting droplets was measured microscopically and compared with biological smoke (electrocautery) and mist (ultrasound dissection) emerging during minimally invasive surgical procedures. The particles resulting from artificial generation are in the range of the size of biological droplets. For surgical smoke, the droplet dimension produced by the fog machine was 4.19 µm compared with 4.65 µm generated by electrocautery during a surgical procedure. The size of artificial mist produced by the nebulizer ranged between 45.38 and 48.04 µm compared with the range between 30.80 and 56.27 µm that was generated during minimally invasive ultrasonic dissection. A suitable test bed for artificial smoke and mist generation was developed revealing almost identical droplet characteristics as produced during minimally invasive surgical procedures. The possibility to generate image interferences comparable to those occurring during laparoscopy (electrocautery and ultrasound dissection) provides a basis for the future development of image processing technologies for clinical applications. © The Author(s) 2014.

  12. Test af Software

    DEFF Research Database (Denmark)

    This document constitutes the final report of the network collaboration "Testnet", which was carried out in the period 1 April 2006 to 31 December 2008. The network deals primarily with topics within testing of embedded and technical software, but a number of examples of problems and solutions connected with testing of...... administrative software are also included. The report is divided into the following 3 parts: Overview. Here we give a summary of the network's purpose, activities, and results. The state of the art of software testing is outlined. We mention that CISS and the network are taking new initiatives. The network. Purpose, participants, and topics treated at...

  13. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test management. This book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  14. TestingScientificSoftware.pdf

    OpenAIRE

    Dubey, Anshu

    2017-01-01

    Testing scientific software is critical for producing credible results and for code maintenance. The IDEAS scientific software productivity project aims toward increasing software productivity and sustainability, with participants from many projects that define the state of practice in software engineering in the HPC community. This tutorial distills the combined knowledge of IDEAS team members in the area of scientific software testing.

  15. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  16. Advanced communications technology satellite high burst rate link evaluation terminal power control and rain fade software test plan, version 1.0

    Science.gov (United States)

    Reinhart, Richard C.

    1993-01-01

    The Power Control and Rain Fade Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenters terminal to communicate with the ACTS for various experiments by government, university, and industry agencies. The Power Control and Rain Fade Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Power Control and Rain Fade Software automatically controls the LET uplink power to compensate for signal fades. Besides power augmentation, the C&PM Software system is also responsible for instrument control during HBR-LET experiments, control of the Intermediate Frequency Switch Matrix on board the ACTS to yield a desired path through the spacecraft payload, and data display. The Power Control and Rain Fade Software User's Guide, Version 1.0 outlines the commands and procedures to install and operate the Power Control and Rain Fade Software. The Power Control and Rain Fade Software Maintenance Manual, Version 1.0 is a programmer's guide to the Power Control and Rain Fade Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Power Control and Rain Fade Software, computer algorithms, format representations, and computer hardware configuration. The Power Control and Rain Fade Test Plan provides a step-by-step procedure to verify the operation of the software using a predetermined signal fade event. The Test Plan also provides a means to demonstrate the capability of the software.

  17. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability
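
    As a loose illustration of testing through an interface rather than through a graphical user interface, the following Python sketch exercises a component purely through the contract it implements, so that any future implementation can be checked against the same tests. It is not taken from the article; the AccountService interface and its methods are hypothetical.

        import unittest
        from abc import ABC, abstractmethod

        class AccountService(ABC):
            """Hypothetical interface (contract) that a component must implement."""
            @abstractmethod
            def deposit(self, amount: float) -> float: ...

        class InMemoryAccountService(AccountService):
            """One concrete implementation under test."""
            def __init__(self):
                self.balance = 0.0
            def deposit(self, amount):
                if amount <= 0:
                    raise ValueError("amount must be positive")
                self.balance += amount
                return self.balance

        class AccountServiceContractTest(unittest.TestCase):
            """Interface-level tests: any implementation of the contract must pass them."""
            def make_service(self) -> AccountService:
                return InMemoryAccountService()
            def test_deposit_accumulates(self):
                svc = self.make_service()
                svc.deposit(10.0)
                self.assertEqual(svc.deposit(5.0), 15.0)
            def test_rejects_non_positive_amounts(self):
                with self.assertRaises(ValueError):
                    self.make_service().deposit(0.0)

        if __name__ == "__main__":
            unittest.main()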

  18. Research on Software Security Testing

    OpenAIRE

    Gu Tian-yang; Shi Yin-sheng; Fang You-yuan

    2010-01-01

    Software security testing is an important means to ensure software security and trustiness. This paper first mainly discusses the definition and classification of software security testing, and investigates methods and tools of software security testing widely. Then it analyzes and concludes the advantages and disadvantages of various methods and the scope of application, presents a taxonomy of security testing tools. Finally, the paper points out future focus and development directions of so...

  19. Software Testing Requires Variability

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    Software variability is the ability of a software system or artefact to be changed, customized or configured for use in a particular context. Variability in software systems is important from a number of perspectives. Some perspectives rightly receive much attention due to their direct economic...... impact in software production. As is also apparent from the call for papers these perspectives focus on qualities such as reuse, adaptability, and maintainability....

  20. Fairness Testing: Testing Software for Discrimination

    OpenAIRE

    Galhotra, Sainyam; Brun, Yuriy; Meliou, Alexandra

    2017-01-01

    This paper defines software fairness and discrimination and develops a testing-based method for measuring if and how much software discriminates, focusing on causality in discriminatory behavior. Evidence of software discrimination has been found in modern software systems that recommend criminal sentences, grant access to financial products, and determine who is allowed to participate in promotions. Our approach, Themis, generates efficient test suites to measure discrimination. Given a sche...
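
    The measurement the record describes (causal discrimination: change only a protected attribute and see whether the decision changes) can be sketched as follows. This is an illustrative Python toy, not the Themis tool; the loan-decision rule, the attribute names, and the sampling ranges are all invented.

        import random

        def decide(applicant):
            """Stand-in for the software under test: a deliberately biased toy loan rule."""
            threshold = 40000 if applicant["gender"] == "M" else 50000
            return applicant["income"] > threshold and applicant["age"] >= 25

        def causal_discrimination_rate(trials=1000, seed=0):
            """Fraction of random inputs whose decision flips when only the
            protected attribute (gender) is changed, all else held fixed."""
            rng = random.Random(seed)
            flips = 0
            for _ in range(trials):
                applicant = {"income": rng.randint(10000, 100000),
                             "age": rng.randint(18, 80),
                             "gender": rng.choice(["F", "M"])}
                counterpart = dict(applicant,
                                   gender="F" if applicant["gender"] == "M" else "M")
                if decide(applicant) != decide(counterpart):
                    flips += 1
            return flips / trials

        if __name__ == "__main__":
            print("causal discrimination w.r.t. gender:", causal_discrimination_rate())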

  1. How to test bioinformatics software?

    Science.gov (United States)

    Kamali, Amir Hossein; Giannoulatou, Eleni; Chen, Tsong Yueh; Charleston, Michael A; McEwan, Alistair L; Ho, Joshua W K

    2015-09-01

    Bioinformatics is the application of computational, mathematical and statistical techniques to solve problems in biology and medicine. Bioinformatics programs developed for computational simulation and large-scale data analysis are widely used in almost all areas of biophysics. The appropriate choice of algorithms and correct implementation of these algorithms are critical for obtaining reliable computational results. Nonetheless, it is often very difficult to systematically test these programs as it is often hard to verify the correctness of the output, and to effectively generate failure-revealing test cases. Software testing is an important process of verification and validation of scientific software, but very few studies have directly dealt with the issues of bioinformatics software testing. In this work, we review important concepts and state-of-the-art methods in the field of software testing. We also discuss recent reports on adapting and implementing software testing methodologies in the bioinformatics field, with specific examples drawn from systems biology and genomic medicine.
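
    One widely used way to cope with the difficulty of verifying output correctness mentioned above, not specific to this article, is metamorphic testing: instead of checking exact outputs, check relations that must hold between outputs of related inputs. A minimal sketch, using an invented GC-content function as the program under test:

        import random

        def gc_content(seq):
            """Program under test (invented): fraction of G/C bases in a DNA sequence."""
            return (seq.count("G") + seq.count("C")) / len(seq)

        def reverse_complement(seq):
            return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

        def test_metamorphic_relations(trials=100, seed=1):
            """Without knowing the 'correct' GC content of a random sequence, we can
            still check relations that must hold between outputs of related inputs."""
            rng = random.Random(seed)
            for _ in range(trials):
                seq = "".join(rng.choice("ACGT") for _ in range(rng.randint(10, 200)))
                out = gc_content(seq)
                # Relation 1: base order does not matter.
                assert abs(out - gc_content(seq[::-1])) < 1e-12
                # Relation 2: complementing swaps G<->C and A<->T, so the GC fraction is unchanged.
                assert abs(out - gc_content(reverse_complement(seq))) < 1e-12
            print("metamorphic relations held for", trials, "random sequences")

        if __name__ == "__main__":
            test_metamorphic_relations()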

  2. HPC Software Stack Testing Framework

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-27

    The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC Software stack (Compilers, MPI, Numerical libraries and Applications) and to quickly discover hard failures, and as a by-product it will indirectly check the HPC infrastructure (network, PBS and licensing servers).
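
    A rough sketch of the kind of sanity check such a framework performs is shown below; it is not the actual hpcswtest code. It compiles and runs a trivial C program with each compiler in an assumed site configuration and reports hard failures.

        import os
        import subprocess
        import tempfile

        HELLO_C = "int main(void) { return 0; }\n"
        COMPILERS = ["gcc", "clang", "icc"]   # assumed site configuration

        def sanity_check(compiler):
            """Compile and run a trivial program; any failure counts as a hard failure."""
            with tempfile.TemporaryDirectory() as tmp:
                src = os.path.join(tmp, "hello.c")
                exe = os.path.join(tmp, "hello")
                with open(src, "w") as f:
                    f.write(HELLO_C)
                try:
                    subprocess.run([compiler, src, "-o", exe], check=True,
                                   capture_output=True, timeout=60)
                    subprocess.run([exe], check=True, timeout=60)
                    return True
                except (OSError, subprocess.SubprocessError):
                    return False

        if __name__ == "__main__":
            for cc in COMPILERS:
                print(cc, "OK" if sanity_check(cc) else "FAILED")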

  3. Testing for Software Safety

    Science.gov (United States)

    Chen, Ken; Lee, Yann-Hang; Wong, W. Eric; Xu, Dianxiang

    2007-01-01

    This research focuses on testing whether or not the hazardous conditions identified by design-level fault tree analysis will occur in the target implementation. Part 1: Integrate fault tree models into functional specifications so as to identify testable interactions between intended behaviors and hazardous conditions. Part 2: Develop a test generator that produces not only functional tests but also safety tests for a target implementation in a cost-effective way. Part 3: Develop a testing environment for executing generated functional and safety tests and evaluating test results against expected behaviors or hazardous conditions. It includes a test harness as well as an environment simulation of external events and conditions.

  4. Testing Object-Oriented Software

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard; Madsen, Ole Lehrmann; Skov, Stefan H.

    The report is a result of an activity within the project Centre for Object Technology (COT), case 2. In case 2 a number of pilot projects have been carried out to test the feasibility of using object technology within embedded software. Some of the pilot projects have resulted in prototypes that are currently being developed into production versions. To assure a high quality in the product it was decided to carry out an activity regarding issues in testing OO software. The purpose of this report is to discuss the issues of testing object-oriented software. It is often claimed that testing of OO......

  5. Comparing the effectiveness of software testing strategies

    Science.gov (United States)

    Basili, Victor R.; Selby, Richard W.

    1987-01-01

    This study compares the results of code reading, functional testing, and structural testing in three aspects of software testing: fault detection effectiveness, fault detection cost, and classes of faults detected. Thirty-two professional programmers and 42 advanced students applied the three techniques to four unit-sized programs in a fractional experimental design. The major results of this study are the following: (1) With the professional programmers, code reading detected more software faults and had a higher detection rate than did functional or structural testing, while functional testing detected more faults than did structural testing, but functional and structural testing were not different in fault detection rate. (2) In one advanced student subject group, code reading and functional testing were not different in faults found, but were superior to structural testing, while in the other advanced student subject group there was no difference among the techniques. (3) With the advanced student subjects, the three techniques were not different in fault detection rate. (4) Number of faults observed, fault detection rate, and total effort in detection depended on the type of software tested. (5) Code reading detected more interface faults than did the other methods. (6) Functional testing detected more control faults than did the other methods. (7) When asked to estimate the percentage of faults detected, code readers gave the most accurate estimates while functional testers gave the least accurate estimates. Appendix B includes the source code for the word.

  6. The School Advanced Ventilation Engineering Software (SAVES)

    Science.gov (United States)

    The School Advanced Ventilation Engineering Software (SAVES) package is a tool to help school designers assess the potential financial payback and indoor humidity control benefits of Energy Recovery Ventilation (ERV) systems for school applications.

  7. Advanced Modular Software Performance Monitoring

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The LHCb software is based on the Gaudi framework, on top of which are built several large and complex software applications. The LHCb experiment is now in the active phase of collecting and analyzing data and significant performance problems arise in the Gaudi based software beginning from High Level Trigger (HLT) programs and ending with data analysis frameworks (DaVinci). It’s not easy to find hot spots in the code - only special tools can help to understand where CPU or memory usage is not reasonable. There exist many performance analyzing tools, but the main problem is that they show reports in terms of class and function names and such information usually is not very useful - the majority of algorithm developers use the Gaudi framework abstractions and usually do not know about functions which lie at the lower level. We will show a new approach which adds to performance reports a higher abstraction level based on knowledge of framework architecture and run-time object properties. A set of profiling to...

  8. Advances in software science and technology

    CERN Document Server

    Hikita, Teruo; Kakuda, Hiroyasu

    1993-01-01

    Advances in Software Science and Technology, Volume 4 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 10 chapters, this volume begins with an overview of the historical survey of programming languages for vector/parallel computers in Japan and describes compiling methods for supercomputers in Japan. This text then explains the model of a Japanese software factory, which is presented by the logical configuration that has been satisfied by

  9. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  10. Software testing in roughness calculation

    International Nuclear Information System (INIS)

    Chen, Y L; Hsieh, P F; Fu, W E

    2005-01-01

    A test method to determine the function quality provided by the software for roughness measurement is presented in this study. The function quality of the software requirements should be part of and assessed through the entire life cycle of the software package. The specific function, or output accuracy, is crucial for the analysis of the experimental data. For scientific applications, however, commercial software is usually embedded with a specific instrument, which is used for measurement or analysis during the manufacturing process. In general, the error ratio caused by the software would be more apparent especially when dealing with relatively small quantities, like measurements in the nanometer-scale range. The model of 'using a data generator' proposed by NPL of the UK was applied in this study. An example of the roughness software is tested and analyzed by the above-mentioned process. After selecting the 'reference results', the 'reference data' was generated by a programmable 'data generator'. The filter function with a 0.8 mm long-cutoff value, defined in ISO 11562, was tested with 66 sinusoid datasets at different wavelengths. Test results from the commercial software and a CMS-written program were compared to the theoretical data calculated from ISO standards. As for the filter function in this software, the result showed a significant disagreement between the reference and test results. The short-cutoff feature for filtering at high frequencies does not function properly, while the long-cutoff feature has the maximum difference in the filtering ratio, which is more than 70% between the wavelengths of 300 μm and 500 μm. In conclusion, commercial software needs to be tested more extensively for its specific application, using appropriately designed reference datasets, to ensure its function quality.
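
    The 'data generator' approach described above can be sketched as follows: generate sinusoidal reference profiles of known wavelength, apply a Gaussian profile filter with a 0.8 mm cutoff (the transmission characteristic standardized in ISO 11562 is 50 % at the cutoff wavelength), and compare the simulated amplitude ratio against the closed-form value. The Python below (using numpy) is an illustrative reconstruction under those assumptions, not the NPL or CMS software.

        import numpy as np

        CUTOFF_MM = 0.8
        ALPHA = np.sqrt(np.log(2) / np.pi)   # constant of the ISO 11562 Gaussian filter

        def gaussian_weight(x_mm, cutoff=CUTOFF_MM):
            """Gaussian weighting function of the mean-line (low-pass) filter."""
            return np.exp(-np.pi * (x_mm / (ALPHA * cutoff)) ** 2) / (ALPHA * cutoff)

        def theoretical_transmission(wavelength_mm, cutoff=CUTOFF_MM):
            """Closed-form amplitude transmission of the mean line; 50 % at the cutoff."""
            return np.exp(-np.pi * (ALPHA * cutoff / wavelength_mm) ** 2)

        def simulated_transmission(wavelength_mm, cutoff=CUTOFF_MM, dx=0.0005, length_mm=8.0):
            """Filter a reference sinusoid by direct convolution and report the
            output/input amplitude ratio in the central, edge-free region."""
            x = np.arange(0.0, length_mm, dx)
            profile = np.sin(2 * np.pi * x / wavelength_mm)
            k = np.arange(-cutoff, cutoff + dx, dx)
            weights = gaussian_weight(k) * dx
            mean_line = np.convolve(profile, weights, mode="same")
            core = slice(len(x) // 4, 3 * len(x) // 4)
            return np.ptp(mean_line[core]) / np.ptp(profile[core])

        if __name__ == "__main__":
            for wl in (0.3, 0.5, 0.8, 2.5):   # wavelengths in mm (300 um, 500 um, ...)
                print(f"{wl} mm: theory {theoretical_transmission(wl):.3f}, "
                      f"simulated {simulated_transmission(wl):.3f}")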

  11. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  12. Learning software testing with Test Studio

    CERN Document Server

    Madi, Rawane

    2013-01-01

    Learning Software Testing with Test Studio is a practical, hands-on guide that will help you get started with Test Studio to design your automated solution and tests. All through the book, there are best practices and tips and tricks inside Test Studio which can be employed to improve your solution just like an experienced QA.If you are a beginner or a professional QA who is seeking a fast, clear, and direct to the point start in automated software testing inside Test Studio, this book is for you. You should be familiar with the .NET framework, mainly Visual Studio, C#, and SQL, as the book's

  13. Advances in software science and technology

    CERN Document Server

    Ohno, Yoshio; Kamimura, Tsutomu

    1991-01-01

    Advances in Software Science and Technology, Volume 2 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into four parts encompassing 12 chapters, this volume begins with an overview of categorical frameworks that are widely used to represent data types in computer science. This text then provides an algorithm for generating vertices of a smoothed polygonal line from the vertices of a digital curve or polygonal curve whose position contains a certain amount of error. O

  14. Advances in software science and technology

    CERN Document Server

    Kakuda, Hiroyasu; Ohno, Yoshio

    1992-01-01

    Advances in Software Science and Technology, Volume 3 provides information pertinent to the advancement of the science and technology of computer software. This book discusses the various applications for computer systems.Organized into two parts encompassing 11 chapters, this volume begins with an overview of the development of a system of writing tools called SUIKOU that analyzes a machine-readable Japanese document textually. This text then presents the conditioned attribute grammars (CAGs) and a system for evaluating them that can be applied to natural-language processing. Other chapters c

  15. Software testing - A way to improve software reliability

    Science.gov (United States)

    Mahindru, Andy

    1986-01-01

    Various software testing techniques are described. The techniques are classified as dynamic or static, structural or functional, and manual or automated. The objects tested include the elements designed during the development of the software, such as codes, data structures, and requirements. Testing techniques and procedures applicable to each phase of software development are examined; the development phases are: software requirements analysis, preliminary design, detailed design, coding, testing, and operation and maintenance. The characteristics of a future software engineering environment for software testing and validation are discussed.

  16. Creating and Testing Simulation Software

    Science.gov (United States)

    Heinich, Christina M.

    2013-01-01

    The goal of this project is to learn about the software development process, specifically the process to test and fix components of the software. The paper will cover the techniques of testing code, and the benefits of using one style of testing over another. It will also discuss the overall software design and development lifecycle, and how code testing plays an integral role in it. Coding is notorious for always needing to be debugged due to coding errors or faulty program design. Writing tests either before or during program creation that cover all aspects of the code provide a relatively easy way to locate and fix errors, which will in turn decrease the necessity to fix a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring the information from the models to the model Programmable Logic Controllers (PLCs), basic computers that are used for very simple tasks.

  17. Advanced Extravehicular Mobility Unit Informatics Software Design

    Science.gov (United States)

    Wright, Theodore

    2014-01-01

    This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.

  18. Development of design and analysis software for advanced nuclear system

    International Nuclear Information System (INIS)

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, which are necessary software tools and data libraries for advanced nuclear system design and analysis, was developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and related database management. The development of this software series was proposed as an exercise in the development of nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  19. Artificial intelligence and expert systems in-flight software testing

    Science.gov (United States)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.

  20. Test Software Functionality, but Test its Performance as Well

    OpenAIRE

    Jovica Đurković; Jelica Trninić; Vuk Vuković

    2011-01-01

    Software product testing has great importance in the detection of errors appearing in the course of software development and reflecting directly on software quality enhancement before its implementation in the working environment. Special priority in the software product testing phase is given to testing software performance. In contrast to functional testing, which should show if software is capable of carrying out planned functions without making errors, performance testing should show if t...

  1. Testing Scientific Software: A Systematic Literature Review.

    Science.gov (United States)

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.

  2. Software Framework for Advanced Power Plant Simulations

    Energy Technology Data Exchange (ETDEWEB)

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  3. Microcomputer Testing Software Teachers Can Use.

    Science.gov (United States)

    Hsu, Tse-chi; Nitko, Anthony J.

    1983-01-01

    Microcomputer software for computer-assisted classroom testing is reviewed. The teacher and classroom are emphasized in applying computer technology. The major issues are identification of appropriate classroom testing microcomputer applications; identification of available microcomputer testing software; techniques for software evaluation; and…

  4. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    Science.gov (United States)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond earth's orbit (BEO). The world's most powerful rocket, SLS, will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed here at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile test approach in testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The idea or basic premise behind the concept of agile software development and testing is that it is iterative and developed incrementally. Agile testing has an iterative development methodology where requirements and solutions evolve through collaboration between cross-functional teams. With testing and development done incrementally, this allows for increased features and enhanced value for releases. This value can be seen throughout the T&V team processes that are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage versus at a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient and project beneficial manner

  5. Software Testing An ISEB Intermediate Certificate

    CERN Document Server

    Hambling, Brian

    2009-01-01

    Covering testing fundamentals, reviews, testing and risk, test management and test analysis, this book helps newly qualified software testers to learn the skills and techniques to take them to the next level. Written by leading authors in the field, this is the only official textbook of the ISEB Intermediate Certificate in Software Testing.

  6. Software Testing Overview on Different Generalization Levels

    OpenAIRE

    Kuļešovs, Ivans; Arnicane, Vineta; Arnicans, Guntis; Borzovs, Juris

    2013-01-01

    Many different views on software testing co-exist, even within the borders of one organization. That is why we have decided to prepare a software testing overview at the meta-level, indicating the main influencers that make this difference. While gathering the details about meta-level elements, we have performed some structuring of elements from the lower levels of software testing, such as testing oracles and testing approaches, methods, and techniques. The overview preparation has resu...

  7. Empirical studies on exploratory software testing

    OpenAIRE

    Itkonen, Juha

    2011-01-01

    Exploratory software testing (ET) is a practically relevant approach to software testing that lacks scientific knowledge. In ET, the tester's work is not based on predesigned and documented test cases. Instead, testing is guided by a higher-level plan or mission, and the testing work involves parallel test design, test execution, and learning. One of the distinct characteristics of ET is that the tester designs the tests during ET and uses information gained to design new and better tests con...

  8. TTCN-3 for Distributed Testing Embedded Software

    NARCIS (Netherlands)

    Blom, Stefan; Deiß, Thomas; Ioustinova, Natalia; Kontio, Ari; van de Pol, Jan Cornelis; Rennoch, Axel; Sidorova, Natalia; Virbitskaite, I.; Voronkov, A.

    TTCN-3 is a standardized language for specifying and executing test suites that is particularly popular for testing embedded systems. Prior to testing embedded software in a target environment, the software is usually tested in the host environment. Executing in the host environment often affects

  9. Guide to advanced empirical software engineering

    National Research Council Canada - National Science Library

    Shull, Forrest; Singer, Janice; Sjøberg, Dag I. K

    2008-01-01

    Contents (excerpt): Section I, Research Methods and Techniques; Chapter 1, Software Engineering Data Collection for Field Studies...

  10. Data acquisition and test system software

    International Nuclear Information System (INIS)

    Bourgeois, N.A. Jr.

    1979-03-01

    Sandia Laboratories has been assigned the task by the Base and Installation Security Systems (BISS) Program Office to develop various aspects of perimeter security systems. One part of this effort involves the development of advanced signal processing techniques to reduce the false and nuisance alarms from sensor systems while improving the probability of intrusion detection. The need existed for both data acquisition hardware and software. Also, the hardware is used to implement and test the signal processing algorithms in real time. The hardware developed for this signal processing task is the Data Acquisition and Test System (DATS). The programs developed for use on DATS are described. The descriptions are taken directly from the documentation included within the source programs themselves

  11. Reliability Testing Strategy - Reliability in Software Engineering

    OpenAIRE

    Taylor-Sakyi, Kevin

    2016-01-01

    This paper presents the core principles of reliability in software engineering - outlining why reliability testing is critical and specifying the process of measuring reliability. The paper provides insight for both novices and experts in the software engineering field for assessing failure intensity as well as predicting failure of software systems. Measurements are conducted by utilizing information from an operational profile to further enhance a test plan and test cases, all of which this ...
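
    For orientation only (this is not from the paper), failure-intensity and reliability estimates of the kind described are often expressed with a simple software reliability growth model; the sketch below uses Musa's basic execution-time model with invented parameter values.

        import math

        def failure_intensity(tau, lam0=10.0, nu0=100.0):
            """Musa basic execution-time model with invented parameters:
            lam0 = initial failure intensity (failures per CPU hour),
            nu0  = total expected failures, tau = cumulative test execution time."""
            return lam0 * math.exp(-(lam0 / nu0) * tau)

        def reliability(mission_hours, intensity):
            """Probability of failure-free operation for a mission of the given length,
            assuming the current failure intensity stays constant during the mission."""
            return math.exp(-intensity * mission_hours)

        if __name__ == "__main__":
            for tau in (0, 10, 50, 100):   # CPU hours of operational-profile-driven testing
                lam = failure_intensity(tau)
                print(f"after {tau:>3} h of testing: intensity {lam:.3f}/h, "
                      f"R(8 h mission) = {reliability(8, lam):.3f}")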

  12. Software development of CBM well test

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z.; Li, B. [China Coal Research Institute, Xian (China)]

    2002-10-01

    In response to technological developments in CBM (coalbed methane) exploration, and to meet the need to analyse CBM test wells, open-style analysis software has been developed. The paper presents the design concept, technical requirements, and framework of the software development. The aim of integrating test-well design, data preparation, test-well analysis, and automatic report editing is realized. 3 figs., 2 tabs.

  13. Software unit testing in Ada environment

    Science.gov (United States)

    Warnock, Glenn

    1986-01-01

    A validation procedure for the Ada binding of the Graphical Kernel System (GKS) is being developed. PRIOR Data Sciences is also producing a version of the GKS written in Ada. These major software engineering projects will provide an opportunity to demonstrate a sound approach for software testing in an Ada environment. The GKS/Ada validation capability will be a collection of test programs and data, and test management guidelines. These products will be used to assess the correctness, completeness, and efficiency of any GKS/Ada implementation. The GKS/Ada developers will be able to obtain the validation software for their own use. It is anticipated that this validation software will eventually be taken over by an independent standards body to provide objective assessments of GKS/Ada implementations, using an approach similar to the validation testing currently applied to Ada compilers. In the meantime, if requested, this validation software will be used to assess GKS/Ada products. The second project, implementation of GKS using the Ada language, is a conventional software engineering task. It represents a large body of Ada code and has some interesting testing problems associated with automatic testing of graphics routines. Here the normal test practices, which include automated regression testing, independent quality assurance, test configuration management, and the application of software quality metrics, will be employed. The software testing methods emphasize quality enhancement and automated procedures. Ada makes some aspects of testing easier, and introduces some concerns. These issues are addressed.

  14. HALOE test and evaluation software

    Science.gov (United States)

    Edmonds, W.; Natarajan, S.

    1987-01-01

    Computer programming, system development and analysis efforts during this contract were carried out in support of the Halogen Occultation Experiment (HALOE) at NASA/Langley. Support in the major areas of data acquisition and monitoring, data reduction and system development are described along with a brief explanation of the HALOE project. Documented listings of major software are located in the appendix.

  15. Validation testing of safety-critical software

    International Nuclear Information System (INIS)

    Kim, Hang Bae; Han, Jae Bok

    1995-01-01

    A software engineering process has been developed for the design of safety-critical software for the Wolsung 2/3/4 project to satisfy the requirements of the regulatory body. As part of that process, this paper describes the detailed validation testing performed to ensure that the software, with its hardware, developed by the design group satisfies the requirements of the functional specification prepared by the independent functional group. To perform the tests, a test facility and test software were developed and the actual safety system computer was connected. Three kinds of test cases, i.e., functional tests, performance tests and self-check tests, were programmed and run to verify each functional specification. Test failures were fed back to the design group to revise the software, and test results were analyzed and documented in a report submitted to the regulatory body. The test methodology and procedure were very efficient and satisfactory for performing systematic and automatic testing. The test results were also acceptable and successfully verified that the software acts as specified in the program functional specification. This methodology can be applied to the validation of other safety-critical software. 2 figs., 2 tabs., 14 refs. (Author)

  16. Simulation Testing of Embedded Flight Software

    Science.gov (United States)

    Shahabuddin, Mohammad; Reinholtz, William

    2004-01-01

    Virtual Real Time (VRT) is a computer program for testing embedded flight software by computational simulation in a workstation, in contradistinction to testing it in its target central processing unit (CPU). The disadvantages of testing in the target CPU include the need for an expensive test bed, the necessity for testers and programmers to take turns using the test bed, and the lack of software tools for debugging in a real-time environment. By virtue of its architecture, most of the flight software of the type in question is amenable to development and testing on workstations, for which there is an abundance of commercially available debugging and analysis software tools. Unfortunately, the timing of a workstation differs from that of a target CPU in a test bed. VRT, in conjunction with closed-loop simulation software, provides a capability for executing embedded flight software on a workstation in a close-to-real-time environment. A scale factor is used to convert between execution time in VRT on a workstation and execution on a target CPU. VRT includes high-resolution operating- system timers that enable the synchronization of flight software with simulation software and ground software, all running on different workstations.

  17. AUTOSIM: An automated repetitive software testing tool

    Science.gov (United States)

    Dunham, J. R.; Mcbride, S. E.

    1985-01-01

    AUTOSIM is a software tool which automates the repetitive run testing of software. This tool executes programming tasks previously performed by a programmer with one year of programming experience. Use of the AUTOSIM tool requires a knowledge base containing information about known faults, code fixes, and the fault diagnosis-correction process. AUTOSIM can be considered as an expert system which replaces a low level of programming expertise. Reference information about the design and implementation of the AUTOSIM software test tool provides flowcharts to assist in maintaining the software code and a description of how to use the tool.

  18. Smells in software test code

    NARCIS (Netherlands)

    Garousi, Vahid; Küçük, Barış

    2018-01-01

    As a type of anti-pattern, test smells are defined as poorly designed tests and their presence may negatively affect the quality of test suites and production code. Test smells are the subject of active discussions among practitioners and researchers, and various guidelines to handle smells are

  19. Research and implementation of software automatic test

    Science.gov (United States)

    Li-hong, LIAN

    2017-06-01

    With the fast development of IT technology, software is increasingly complex and large. Development teams of hundreds of people, thousands of modules and interfaces, and users spread across geographies and systems are no longer unusual. All of this places higher requirements on software testing. Due to its low implementation cost and the effective inheritance and accumulation of test assets, automated software testing has gradually become one of the important means for IT enterprises to ensure software quality. This paper analyzes the advantages of automated testing and common misconceptions; identifies unsuitable application scenarios and the best time to introduce automation; focuses on assessing the feasibility of interface-level test automation; describes the functions and elements an interface automation testing tool should have; and provides a reference for selecting or custom-developing interface automation testing tools for large-scale projects.

  20. Program Helps Design Tests Of Developmental Software

    Science.gov (United States)

    Hops, Jonathan

    1994-01-01

    A computer program called "A Formal Test Representation Language and Tool for Functional Test Designs" (TRL) provides an automatic software tool and a formal language used to implement the category-partition method and produce specifications of test cases in the testing phase of software development. The category-partition method is useful in defining the inputs, outputs, and purpose of the test-design phase of development and combines the benefits of choosing normal cases and cases having error-exposing properties. Traceability is maintained quite easily by creating a test design for each objective in the test plan. The effort to transform test cases into procedures is simplified by use of the automatic software tool to create cases based on the test design. The method enables rapid elimination of undesired test cases from consideration and facilitates review of test designs by peer groups. Written in C language.
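
    The category-partition method summarized above can be illustrated with a short sketch: enumerate choices for each category of a function's input, prune infeasible combinations with constraints, and emit the surviving combinations as test frames. The Python below uses made-up categories for a hypothetical file-copy command and is not the TRL tool itself (which the record says is written in C).

        from itertools import product

        # Categories and their choices for a hypothetical "copy" command.
        CATEGORIES = {
            "source":      ["exists", "missing"],
            "destination": ["writable", "read_only"],
            "size":        ["empty", "small", "huge"],
        }

        def feasible(frame):
            """Constraint: if the source is missing its size is irrelevant,
            so keep only one representative choice for it."""
            if frame["source"] == "missing" and frame["size"] != "empty":
                return False
            return True

        def test_frames():
            """Cross all choices, then prune infeasible combinations."""
            names = list(CATEGORIES)
            for combo in product(*(CATEGORIES[n] for n in names)):
                frame = dict(zip(names, combo))
                if feasible(frame):
                    yield frame

        if __name__ == "__main__":
            for i, frame in enumerate(test_frames(), 1):
                print(f"frame {i}: {frame}")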

  1. Developing LHCb Grid Software: Experiences and Advances

    CERN Document Server

    Stokes-Rees, I; Cioffi, C; Tsaregorodtsev, A; Garonne, V; Graciani, R; Sanchez, M; Frank, M; Closier, J; Kuznetsov, G

    2004-01-01

    The LHCb grid software has been used for two Physics Data Challenges, the most recent of which will have produced 90 TB of data and required over 400 processor-years of computing power. This paper discusses the group's experience with developing Grid Services, interfacing to the LCG, running LHCb experiment software on the grid, and the integration of a number of new technologies into the LHCb grid software. Our experience and utilisation of the following core technologies will be discussed: OGSI, XML-RPC, grid services, LCG middle-ware, and instant messaging.

  2. Spinal Test Suites for Software Product Lines

    Directory of Open Access Journals (Sweden)

    Harsh Beohar

    2014-03-01

    Full Text Available A major challenge in testing software product lines is efficiency. In particular, testing a product line should take less effort than testing each and every product individually. We address this issue in the context of input-output conformance testing, which is a formal theory of model-based testing. We extend the notion of conformance testing on input-output featured transition systems with the novel concept of spinal test suites. We show how this concept dispenses with retesting the common behavior among different, but similar, products of a software product line.

  3. Gas characterization system software acceptance test report

    International Nuclear Information System (INIS)

    Vo, C.V.

    1996-01-01

    This document details the results of software acceptance testing of gas characterization systems. The gas characterization systems will be used to monitor the vapor spaces of waste tanks known to contain measurable concentrations of flammable gases

  4. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
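
    As an illustration of the kind of executable assertion the experiment describes, including cross-parameter checks that give collateral coverage of other parameters, here is a small hypothetical Python sketch; the control-law variables and limits are invented.

        def check_control_state(state):
            """Executable assertions over a control-law state; a violation points to a
            fault somewhere upstream, not only in the asserted parameter itself."""
            assert -30.0 <= state["pitch_cmd_deg"] <= 30.0, "pitch command out of range"
            assert 0.0 <= state["throttle"] <= 1.0, "throttle must be normalized"
            # Cross-parameter assertion: a strong climb demand needs some throttle.
            assert not (state["pitch_cmd_deg"] > 10.0 and state["throttle"] < 0.2), \
                "inconsistent pitch/throttle combination"

        def control_step(sensor_altitude_m, target_altitude_m):
            """Toy control law instrumented with assertions after each update."""
            error = target_altitude_m - sensor_altitude_m
            state = {"pitch_cmd_deg": max(-30.0, min(30.0, 0.05 * error)),
                     "throttle": max(0.0, min(1.0, 0.5 + 0.001 * error))}
            check_control_state(state)
            return state

        if __name__ == "__main__":
            print(control_step(sensor_altitude_m=1000.0, target_altitude_m=1200.0))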

  5. Advances in software science and technology

    CERN Document Server

    Kamimura, Tsutomu

    1994-01-01

    This serial is a translation of the original works within the Japan Society of Software Science and Technology. A key source of information for computer scientists in the U.S., the serial explores the major areas of research in software and technology in Japan. These volumes are intended to promote worldwide exchange of ideas among professionals.This volume includes original research contributions in such areas as Augmented Language Logic (ALL), distributed C language, Smalltalk 80, and TAMPOPO-an evolutionary learning machine based on the principles of Realtime Minimum Skyline Detection.

  6. Evolutionary testing of object-oriented software

    NARCIS (Netherlands)

    Silva, L.S.; van Someren, M.; Shin, D.

    2010-01-01

    It is estimated that 80% of software development cost is spent on detecting and fixing defects. To tackle this issue, a number of tools and testing techniques have been developed to improve the existing testing framework. Although techniques such as static analysis, random testing and evolutionary

  7. Implementation and Testing of VLBI Software Correlation at the USNO

    Science.gov (United States)

    Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken

    2010-01-01

    The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a dedicated VLBI processor based on dedicated hardware of ASIC design. The WACO is currently over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold but include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for use of software correlation at USNO with emphasis on the use of the DiFX software correlator.

  8. Simulation-based Testing of Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Olama, Mohammed M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-10

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the monitored system. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model-based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model-based testing environment; specifically, we show that a complete software stack - including operating system and application software - can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model-based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed in the Modelica language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.
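
    As a minimal sketch of the simulation-based idea described above (it does not reproduce the ADEVS/QEMU/FMU setup of the report), the Python fragment below exercises a hypothetical bang-bang cooling controller against a toy first-order thermal model in simulation time and asserts a comfort requirement over two simulated days; the controller logic, plant constants, and limits are all assumptions.

      # Simulation-based test of a control routine (illustrative sketch only).
      def controller(temp_c, cooling_on, setpoint=24.0, deadband=1.0):
          # Hypothetical bang-bang cooling logic under test.
          if temp_c > setpoint + deadband:
              return True
          if temp_c < setpoint - deadband:
              return False
          return cooling_on

      def simulate(hours=48, dt_s=60.0):
          temp, cooling = 27.0, False
          outdoor, leak_rate, cool_rate = 32.0, 0.0005, 0.006  # assumed plant constants
          trace = []
          for _ in range(int(hours * 3600 / dt_s)):
              cooling = controller(temp, cooling)
              # First-order thermal model standing in for the virtual building prototype.
              temp += dt_s * (leak_rate * (outdoor - temp) - (cool_rate if cooling else 0.0))
              trace.append(temp)
          return trace

      trace = simulate()
      assert all(20.0 <= t <= 28.0 for t in trace), "comfort requirement violated in simulation"
      print(f"final indoor temperature: {trace[-1]:.2f} C")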

  9. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  10. A new curimeter with advanced software capabilities

    International Nuclear Information System (INIS)

    Arista Romeo, Eduardo J.; Toledo Acosta, Rene B.; Dotres Lleras, Armando

    2001-01-01

    A new curimeter, model CD-N102, developed at CEADEN is described. Emphasis is made on the description of the hardware and the basic low-level software of the device. Attention is paid to metrological and quality assurance aspects of the problem, as this device is intended to complete the Nuclear Medicine Modules at different hospitals in the country. The characteristics obtained are mentioned. A full block schema is presented and key blocks are discussed in more detail. The use of software resources and methods for achieving the aforementioned characteristics without further increasing the complexity of the analog part of the device is also discussed, together with the trade-offs between the different characteristics involved and the final decisions taken at the design stage.

  11. Absorbing Software Testing into the Scrum Method

    Science.gov (United States)

    Tuomikoski, Janne; Tervonen, Ilkka

    In this paper we study how to absorb software testing into the Scrum method. We conducted the research as an action research during the years 2007-2008 with three iterations. The results showed that testing can and even should be absorbed into the Scrum method. The testing team was merged into the Scrum teams. The teams can now deliver better working software in a shorter time, because testing keeps track of the progress of the development. Team spirit is also higher, because the Scrum team members are committed to the same goal. The biggest change from the test manager's point of view was the organized Product Owner Team. The test manager no longer has a dedicated testing team, and in the future all testing tasks have to be assigned through the Product Backlog.

  12. Studying the Feasibility and Importance of Software Testing: An Analysis

    OpenAIRE

    Dr.S.S.Riaz Ahamed

    2009-01-01

    Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. Software testing is the process of testing the functionality and correctness of software by running it. Software testing is usually performed for one of two reasons: defect detection, and reliability estimation. The problem of applying software testing to defect detection is that software can only suggest the presence of flaws, not their absence (unless ...

  13. Studying the Importance and Feasibility of Software Testing

    OpenAIRE

    Anil Kumar Velupula; Dr.Ch GVN Prasad; Shanmukhananda Reddy Kurli; Shiva Kumar Kajjam

    2011-01-01

    Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. Software testing is the process of testing the functionality and correctness of software by running it. Software testing is usually performed for one of two reasons: defect detection, and reliability estimation. The problem of applying software testing to defect detection is that software can only suggest the presence of flaws, not their absence (unless ...

  14. Dynamic assertion testing of flight control software

    Science.gov (United States)

    Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.

    1985-01-01

    An experiment in using assertions to dynamically test fault tolerant flight software is described. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters.

  15. Using Fuzz Testing for Searching Software Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Bogdan Leonidovich Kozirsky

    2014-12-01

    Full Text Available This article deals with fuzz testing (fuzzing), a software testing and vulnerability searching technique based on providing programs with random input data and further analysis of their behavior. The basics of implementing a command-line argument fuzzer, an environment variable fuzzer, and a syscall fuzzer in any UNIX-like OS have been closely investigated.
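
    A minimal command-line argument fuzzer in the spirit described above (a sketch, not the article's code); the target binary name, argument sizes, and timeout are assumptions, and on POSIX a negative return code indicates death by signal (e.g. a segmentation fault).

      # Command-line argument fuzzer sketch (illustrative only).
      import random
      import subprocess

      TARGET = "./program_under_test"   # hypothetical binary under test

      def random_arg(max_len=256):
          # Null bytes are excluded because argv strings cannot contain them.
          return bytes(random.randint(1, 255) for _ in range(random.randint(0, max_len)))

      def fuzz_once():
          args = [random_arg() for _ in range(random.randint(0, 4))]
          try:
              proc = subprocess.run([TARGET, *args], capture_output=True, timeout=5)
          except subprocess.TimeoutExpired:
              return "hang", args
          if proc.returncode < 0:
              return f"crash (signal {-proc.returncode})", args
          return None, args

      if __name__ == "__main__":
          for i in range(1000):
              verdict, args = fuzz_once()
              if verdict:
                  print(f"run {i}: {verdict}: {args!r}")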

  16. Software testing using Visual Studio 2012

    CERN Document Server

    Subashni, S

    2013-01-01

    We will be setting up a sample test scenario, then we'll walk through the features available to deploy tests. This book is for developers and testers who want to get to grips with Visual Studio 2012 and Test Manager for all testing activities and managing tests and results in Team Foundation Server. It requires a minimal understanding of testing practices and the software development life cycle; also, some coding skills would help in customizing and updating the code generated from the web UI testing.

  17. Writing executable assertions to test flight software

    Science.gov (United States)

    Mahmood, A.; Andrews, D. M.; Mccluskey, E. J.

    1984-01-01

    An executable assertion is a logical statement about the variables of a block of code. If there is no error during execution, the assertion statement evaluates to a true value. Executable assertions can be used for dynamic testing of software: they can be employed for validation during the design phase, and for exception handling and error detection during the operation phase. The present investigation is concerned with the problem of writing executable assertions, taking into account the use of assertions for testing flight software. The digital flight control system and the flight control software are discussed. The considered system provides autopilot and flight director modes of operation for automatic and manual control of the aircraft during all phases of flight. Attention is given to techniques for writing and using assertions to test flight software, an experimental setup to test flight software, and language features to support efficient use of assertions.
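
    The following Python fragment illustrates the kind of executable assertion described above (it is not the paper's flight code); the variable names, limits, and the collateral rate check are assumptions chosen for the example.

      # Executable assertions for dynamic testing (illustrative sketch only).
      def assert_state(pitch_deg, altitude_ft, prev_altitude_ft, dt_s):
          # Range assertion on a controlled variable.
          assert -25.0 <= pitch_deg <= 25.0, f"pitch out of range: {pitch_deg}"
          # Rate-of-change assertion: bounds the climb/descent rate, giving
          # indirect (collateral) coverage of the altitude source and time step.
          rate_fpm = (altitude_ft - prev_altitude_ft) / dt_s * 60.0
          assert abs(rate_fpm) <= 6000.0, f"implausible climb rate: {rate_fpm:.0f} ft/min"

      # Hypothetical use inside a control-loop step:
      prev_alt = 10000.0
      for _ in range(3):
          pitch, alt, dt = 2.5, prev_alt + 50.0, 1.0
          assert_state(pitch, alt, prev_alt, dt)
          prev_alt = alt
      print("all assertions held")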

  18. Final Report for "Center for Technology for Advanced Scientific Component Software"

    Energy Technology Data Exchange (ETDEWEB)

    Svetlana Shasharina

    2010-12-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X work in the TASCS project is to provide an outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in the applications, and modifying the tools to be more usable.

  19. Final Report for 'Center for Technology for Advanced Scientific Component Software'

    International Nuclear Information System (INIS)

    Shasharina, Svetlana

    2010-01-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X work in the TASCS project is to provide an outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in the applications, and modifying the tools to be more usable.

  20. Open Source Testing Capability for Geospatial Software

    Science.gov (United States)

    Bermudez, L. E.

    2013-12-01

    Geospatial Software enables scientists to discover, access and process information for better understanding of the Earth. Hundreds, if not thousands, of geospatial software packages exist today. Many of these implement open standards. The OGC Implementation Statistics page [1] reports, for example, more than 450 software products that implement the OGC Web Map Service (WMS) 1.1.1 standard. Even though organizations voluntarily report their products as implementing the WMS standard, not all of these implementations can interoperate with each other. For example, a WMS client may not interact with all these WMS servers in the same functional way. Making the software work with other software, even when implementing the same standard, still remains a challenge, and the main reason is that not all implementations implement the standard correctly. The Open Geospatial Consortium (OGC) Compliance Program provides a testing infrastructure to test for the correct implementation of OGC standards in interfaces and encodings that enable communication between geospatial clients and servers. The OGC testing tool and the tests are all freely available, including the source code and access to the testing facility. The Test, Evaluation, And Measurement (TEAM) Engine is a test harness that executes test suites written using the OGC Compliance Testing Language (CTL) or the TestNG framework. TEAM Engine is available in Sourceforge. OGC hosts an official stable [2] deployment of TEAM Engine with the approved test suites. OGC also hosts a Beta TEAM Engine [3] with the tests in Beta and with new TEAM Engine functionality. Both deployments are freely available to everybody. The OGC testing infrastructure not only enables developers to test OGC standards, but it can be configured to test profiles of OGC standards and community-developed application agreements. These agreements can be any interface and encoding agreement, not only OGC based. The OGC Compliance Program is thus an important

  1. The NOvA software testing framework

    International Nuclear Information System (INIS)

    Tamsett, M; Group, C

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are greater than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform, visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total upwards of 14 individual streams are regularly tested amounting to over 70 individual software processes, producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software and thus ensure that data is available for physics analysis in a timely and robust manner. (paper)
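
    As a rough sketch of the wrap-and-report pattern described above (not NOvA's actual modules), the Python fragment below runs a chain of hypothetical processing tiers, records pass/fail and timing for each, and writes a JSON report that a web front end could render; the executable names, tier list, and report location are assumptions.

      # Test-wrapper sketch: run software tiers and report results (illustrative only).
      import json
      import subprocess
      import time

      def run_tier(name, command, input_file):
          start = time.time()
          proc = subprocess.run(command + [input_file], capture_output=True, text=True)
          return {"tier": name,
                  "ok": proc.returncode == 0,
                  "seconds": round(time.time() - start, 1),
                  "stderr_tail": proc.stderr[-500:]}

      def run_stream(stream_name, tiers, input_file):
          results = []
          for name, command in tiers:
              result = run_tier(name, command, input_file)
              results.append(result)
              if not result["ok"]:
                  break  # downstream tiers depend on this one
          with open(f"{stream_name}_report.json", "w") as fh:
              json.dump(results, fh, indent=2)  # consumed by a web front end
          return results

      # Hypothetical usage:
      # run_stream("near_detector_beam",
      #            [("reco", ["./reco_exe"]), ("calib", ["./calib_exe"])], "sample.root")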

  2. A Framework of the Use of Information in Software Testing

    Science.gov (United States)

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  3. Acceptance Test Plan for ANSYS Software

    International Nuclear Information System (INIS)

    CREA, B.A.

    2000-01-01

    This plan governs the acceptance testing of the ANSYS software (Full Mechanical Release 5.5) for use on Project Hanford Management Contract (PHMC) computer systems (either UNIX or Microsoft Windows/NT). There are two phases to the acceptance testing covered by this test plan: program execution in accordance with the guidance provided in the installation manuals; and ensuring that the results of the execution are consistent with the expected physical behavior of the system being modeled.

  4. Software Testing using Ruby on Rails framework

    OpenAIRE

    Jurglič, Matic

    2014-01-01

    In the world of modern web applications and open source technologies, test-driven development methodologies are rising in popularity in software development. The main advantages of writing tests are easier error discovery, a more effective development process, and consequently higher product quality. This thesis describes common testing techniques and focuses on their usage in the Ruby on Rails framework, which has a vibrant open source community with a culture that strongly emphasizes...

  5. Mars Science Laboratory Flight Software Internal Testing

    Science.gov (United States)

    Jones, Justin D.; Lam, Danny

    2011-01-01

    The Mars Science Laboratory (MSL) mission is sending the rover Curiosity to Mars and is therefore physically and technically complex. During my stay, I have assisted the MSL Flight Software (FSW) team in implementing functional test scripts to ensure that the FSW performs to the best of its abilities. There are a large number of FSW requirements that have been written up for implementation; however, I have only been assigned a few sections of these requirements. There are many stages within testing; one of the early stages is FSW Internal Testing (FIT). The FIT team can accomplish this with simulation software and the MSL Test Automation Kit (MTAK). MTAK has the ability to integrate with the Software Simulation Equipment (SSE) and the Mission Processing and Control System (MPCS) software, which makes it a powerful tool within the MSL FSW development process. The MSL team must ensure that the rover accomplishes all stages of the mission successfully. Due to the natural complexity of this project there is a strong emphasis on testing, as failure is not an option. The entire mission could be jeopardized if something is overlooked.

  6. Automation software for a materials testing laboratory

    Science.gov (United States)

    Mcgaw, Michael A.; Bonacuse, Peter J.

    1990-01-01

    The software environment in use at the NASA-Lewis Research Center's High Temperature Fatigue and Structures Laboratory is reviewed. This software environment is aimed at supporting the tasks involved in performing materials behavior research. The features and capabilities of the approach to specifying a materials test include static and dynamic control mode switching, enabling multimode test control; dynamic alteration of the control waveform based upon events occurring in the response variables; precise control over the nature of both command waveform generation and data acquisition; and the nesting of waveform/data acquisition strategies so that material history dependencies may be explored. To eliminate repetitive tasks in the conventional research process, a communications network software system is established which provides file interchange and remote console capabilities.

  7. Have the Software Testing a Future?

    Directory of Open Access Journals (Sweden)

    Juan A. Godoy

    2012-06-01

    Full Text Available Software testing is headed toward a dark future, with greater political isolation from management, less funding, and poorer overall quality. The hopes raised by software quality theory and the new test technologies of the 1990s have been usurped by development "tastes" focused on ideas such as "Agile", "Object Oriented", "Cloud", and $0.99 "Mobile" applications. The new languages and development methods are designed to allow developers to "throw" code faster, not to improve versioning, maintenance, testing, traceability, or auditing. The costs of maintenance and development will increase, budgets for testing will fall, and more projects will fail. The future of testing is dark. This article analyzes this situation.

  8. Formal Testing of Correspondence Carrying Software

    NARCIS (Netherlands)

    Bujorianu, M.C.; Bujorianu, L.M.; Maharaj, S.

    2008-01-01

    Nowadays formal software development is characterised by the use of a multitude of formal specification languages. Test case generation from formal specifications depends in general on a specific language, and, moreover, there are competing methods for each language. There is a need for a generic approach to

  9. Software for Testing Electroactive Structural Components

    Science.gov (United States)

    Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar

    2003-01-01

    A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.

  10. Software For Uniaxial Mechanical Testing Of Materials

    Science.gov (United States)

    Mcgaw, M. A.; Pech, D. K.

    1995-01-01

    Materials Testing Software system designed to simplify and automate both routine and not-so-routine materials-testing tasks encountered in laboratory. Supports plan/test/analyze cycle through collection of programs, each optimized to specific task. Gives precise control over nature of command waveforms and acquisition of data, including dynamically variable waveform types, sets of data-acquisition channels, and data rates. Differing command and data-acquisition rates required for exploring creep and fatigue material behavior easily accommodated. Written in Modula-2.

  11. Irreducible Tests for Space Mission Sequencing Software

    Science.gov (United States)

    Ferguson, Lisa

    2012-01-01

    As missions extend further into space, the modeling and simulation of their every action and instruction becomes critical. The greater the distance between Earth and the spacecraft, the smaller the window for communication becomes. Therefore, through modeling and simulating the planned operations, the most efficient sequence of commands can be sent to the spacecraft. The Space Mission Sequencing Software is being developed as the next generation of sequencing software to ensure the most efficient communication to interplanetary and deep space mission spacecraft. Aside from efficiency, the software also checks to make sure that communication during a specified time is even possible, meaning that there is not a planet or moon preventing reception of a signal from Earth or that two opposing commands are being given simultaneously. In this way, the software not only models the proposed instructions to the spacecraft, but also validates the commands. To ensure that all spacecraft communications are sequenced properly, a timeline is used to structure the data. The created timelines are immutable and once data is assigned to a timeline, it shall never be deleted nor renamed. This is to prevent the need for storing and filing the timelines for use by other programs. Several types of timelines can be created to accommodate different types of communications (activities, measurements, commands, states, events). Each of these timeline types requires specific parameters and all have options for additional parameters if needed. With so many combinations of parameters available, the robustness and stability of the software is a necessity. Therefore a baseline must be established to ensure the full functionality of the software, and it is here where the irreducible tests come into use.

  12. Test process for the safety-critical embedded software

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju; Lee, Jangsoo

    2004-01-01

    Digitalization of the nuclear Instrumentation and Control (I and C) system requires high reliability of not only hardware but also software. A Verification and Validation (V and V) process is recommended for software reliability, but a more quantitative method, such as software testing, is also necessary. Most of the software in the nuclear I and C system is safety-critical embedded software. Safety-critical embedded software is specified, verified and developed according to the V and V process. Hence two types of software testing techniques are necessary for the developed code. First, code-based software testing is required to examine the developed code. Second, after code-based software testing, software testing affected by hardware is required to reveal the interaction faults that may cause unexpected results. We call this testing of hardware's influence on software interaction testing. In the case of safety-critical embedded software, it is also important to consider the interaction between hardware and software. Even if no faults are detected when testing either hardware or software alone, combining these components may lead to unexpected results due to the interaction. In this paper, we propose a software test process that embraces test levels, test techniques, required test tasks and documents for safety-critical embedded software. We apply the proposed test process to safety-critical embedded software as a case study, and show the effectiveness of it. (author)

  13. Imaging Sensor Flight and Test Equipment Software

    Science.gov (United States)

    Freestone, Kathleen; Simeone, Louis; Robertson, Byran; Frankford, Maytha; Trice, David; Wallace, Kevin; Wilkerson, DeLisa

    2007-01-01

    The Lightning Imaging Sensor (LIS) is one of the components onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, and was designed to detect and locate lightning over the tropics. The LIS flight code was developed to run on a single onboard digital signal processor, and has operated the LIS instrument since 1997 when the TRMM satellite was launched. The software provides controller functions to the LIS Real-Time Event Processor (RTEP) and onboard heaters, collects the lightning event data from the RTEP, compresses and formats the data for downlink to the satellite, collects housekeeping data and formats the data for downlink to the satellite, provides command processing and interface to the spacecraft communications and data bus, and provides watchdog functions for error detection. The Special Test Equipment (STE) software was designed to operate specific test equipment used to support the LIS hardware through development, calibration, qualification, and integration with the TRMM spacecraft. The STE software provides the capability to control instrument activation, commanding (including both data formatting and user interfacing), data collection, decompression, and display and image simulation. The LIS STE code was developed for the DOS operating system in the C programming language. Because of the many unique data formats implemented by the flight instrument, the STE software was required to comprehend the same formats, and translate them for the test operator. The hardware interfaces to the LIS instrument using both commercial and custom computer boards, requiring that the STE code integrate this variety into a working system. In addition, the requirement to provide RTEP test capability dictated the need to provide simulations of background image data with short-duration lightning transients superimposed. This led to the development of unique code used to control the location, intensity, and variation above background for simulated lightning strikes

  14. Advances in Games Technology: Software, Models, and Intelligence

    Science.gov (United States)

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  15. Automation of Flight Software Regression Testing

    Science.gov (United States)

    Tashakkor, Scott B.

    2016-01-01

    NASA is developing the Space Launch System (SLS) to be a heavy lift launch vehicle supporting human and scientific exploration beyond Earth orbit. SLS will have a common core stage, an upper stage, and different permutations of boosters and fairings to perform various crewed or cargo missions. Marshall Space Flight Center (MSFC) is writing the Flight Software (FSW) that will operate the SLS launch vehicle. The FSW is developed in an incremental manner based on "Agile" software techniques. As the FSW is incrementally developed, testing the functionality of the code needs to be performed continually to ensure that the integrity of the software is maintained. Manually testing the functionality on an ever-growing set of requirements and features is not an efficient solution and therefore needs to be done automatically to ensure testing is comprehensive. To support test automation, a framework for a regression test harness has been developed and used on SLS FSW. The test harness provides a modular design approach that can compile or read in the required information specified by the developer of the test. The modularity provides independence between groups of tests and the ability to add and remove tests without disturbing others. This provides the SLS FSW team a time saving feature that is essential to meeting SLS Program technical and programmatic requirements. During development of SLS FSW, this technique has proved to be a useful tool to ensure all requirements have been tested, and that desired functionality is maintained, as changes occur. It also provides a mechanism for developers to check functionality of the code that they have developed. With this system, automation of regression testing is accomplished through a scheduling tool and/or commit hooks. Key advantages of this test harness capability include execution support for multiple independent test cases, the ability for developers to specify precisely what they are testing and how, the ability to add
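
    A minimal sketch of a modular regression harness in the spirit of the description above (not the SLS FSW harness); the directory layout and the convention that each tests/test_*.py module exposes a run() function returning True or False are assumptions.

      # Modular regression-test harness sketch (illustrative only).
      import importlib.util
      import pathlib

      def discover_tests(test_dir="tests"):
          return sorted(pathlib.Path(test_dir).glob("test_*.py"))

      def run_test(path):
          spec = importlib.util.spec_from_file_location(path.stem, path)
          module = importlib.util.module_from_spec(spec)
          spec.loader.exec_module(module)
          return bool(module.run())  # each test module decides what it checks and how

      def run_all():
          results = {p.stem: run_test(p) for p in discover_tests()}
          failed = [name for name, ok in results.items() if not ok]
          print(f"{len(results) - len(failed)}/{len(results)} regression tests passed")
          return not failed

      if __name__ == "__main__":
          # A commit hook or scheduler could invoke this entry point after each change.
          raise SystemExit(0 if run_all() else 1)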

  16. Advanced Superconducting Test Accelerator (ASTA)

    Data.gov (United States)

    Federal Laboratory Consortium — The Advanced Superconducting Test Accelerator (ASTA) facility will be based on upgrades to the existing NML pulsed SRF facility. ASTA is envisioned to contain 3 to 6...

  17. Advanced Spacesuit Informatics Software Design for Power, Avionics and Software Version 2.0

    Science.gov (United States)

    Wright, Theodore W.

    2016-01-01

    A description of the software design for the 2016 edition of the Informatics computer assembly of NASA's Advanced Extravehicular Mobility Unit (AEMU), also called the Advanced Spacesuit. The Informatics system is an optional part of the spacesuit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and warning information. It also provides an interface to the suit-mounted camera for recording still images, video, and audio field notes.

  18. Integration and Testing of LCS Software

    Science.gov (United States)

    Wang, John

    2014-01-01

    Kennedy Space Center is in the midst of developing a command and control system for the launch of the next generation manned space vehicle. The Space Launch System (SLS) will launch using the new Spaceport Command and Control System (SCCS). As a member of the Software Integration and Test (SWIT) Team, I wrote command scripts and bash scripts to assist in the integration and testing of the Launch Control System (LCS), which is a component of SCCS. The short-term and midterm tasks are for the most part completed. The long-term tasks, if time permits, will require a presentation and demonstration.

  19. ES 1010 software for testing CAMAC modules

    International Nuclear Information System (INIS)

    Ableev, V.G.; Basiladze, S.G.; Zaporozhets, S.A.; Piskunov, N.M.; Ryabtsov, V.D.; Sitnik, I.M.; Strokovskij, E.A.; Sharov, V.I.

    1977-01-01

    Test programs for digital and analog-digital CAMAC modules applied in physical experiments are described. Algorithms for testing, data acquisition, processing and data control were written in the FORTRAN-4 language. ASSEMBLER ES 1010 subroutines were used for data acquisition and CAMAC module control. This allowed one to take advantage of a high-level language for data processing and display, as well as to achieve an interface with the CAMAC hardware. The software enables one to considerably improve the adjustment of CAMAC modules and to obtain their operational characteristics.

  20. Optimizing infrastructure for software testing using virtualization

    International Nuclear Information System (INIS)

    Khalid, O.; Shaikh, A.; Copy, B.

    2012-01-01

    Virtualization technology and cloud computing have brought a paradigm shift in the way we utilize, deploy and manage computer resources. They allow fast deployment of multiple operating systems as containers on physical machines, which can be either discarded after use or check-pointed for later re-deployment. At the European Organization for Nuclear Research (CERN), we have been using virtualization technology to quickly set up virtual machines for our developers with pre-configured software to enable them to quickly test/deploy a new version of a software patch for a given application. This paper reports both on the techniques that have been used to set up a private cloud on commodity hardware and also presents the optimization techniques we used to remove deployment-specific performance bottlenecks. (authors)

  1. Markov chains for testing redundant software

    Science.gov (United States)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
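
    The fragment below sketches the first step described above, estimating transition probabilities of a Markov chain from an observed sequence of error states (the states and the observation sequence are made-up examples, not the paper's data).

      # Estimating Markov-chain transition probabilities from observed states.
      from collections import Counter, defaultdict

      # Example states: "agree" (all versions agree), "masked" (one version wrong,
      # voter masks it), "fail" (voted output wrong).
      observed = ["agree", "agree", "masked", "agree", "masked", "agree",
                  "agree", "agree", "masked", "masked", "agree", "agree"]

      counts = defaultdict(Counter)
      for current, nxt in zip(observed, observed[1:]):
          counts[current][nxt] += 1

      transition = {
          state: {nxt: n / sum(row.values()) for nxt, n in row.items()}
          for state, row in counts.items()
      }

      for state, row in transition.items():
          print(state, {k: round(v, 2) for k, v in row.items()})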

  2. Optimizing Infrastructure for Software Testing Using Virtualization

    CERN Document Server

    Khalid, Omer; Copy, Brice

    2011-01-01

    Virtualization technology and cloud computing have brought a paradigm shift in the way we utilize, deploy and manage computer resources. They allow fast deployment of multiple operating systems as containers on physical machines, which can be either discarded after use or check-pointed for later re-deployment. At the European Organization for Nuclear Research (CERN), we have been using virtualization technology to quickly set up virtual machines for our developers with preconfigured software to enable them to quickly test/deploy a new version of a software patch for a given application. This paper reports both on the techniques that have been used to set up a private cloud on commodity hardware and also presents the optimization techniques we used to remove deployment-specific performance bottlenecks.

  3. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
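
    As an example of the fine-grained "unit" testing advocated above (not code from any climate model), the fragment below checks a small numerical kernel against an analytic answer to a tight tolerance, something far cheaper than a full simulation plus regression diff; the kernel and tolerance are assumptions.

      # Fine-grained unit test of a numerical kernel (illustrative only).
      import math

      def centered_derivative(f, x, h=1e-5):
          # Second-order centered finite difference.
          return (f(x + h) - f(x - h)) / (2.0 * h)

      def test_centered_derivative_matches_analytic():
          # d/dx sin(x) = cos(x); check at several points.
          for x in [0.0, 0.3, 1.0, 2.5]:
              approx = centered_derivative(math.sin, x)
              assert abs(approx - math.cos(x)) < 1e-9, (x, approx)

      if __name__ == "__main__":
          test_centered_derivative_matches_analytic()
          print("unit test passed")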

  4. Advanced Transport Operating System (ATOPS) control display unit software description

    Science.gov (United States)

    Slominski, Christopher J.; Parks, Mark A.; Debure, Kelly R.; Heaphy, William J.

    1992-01-01

    The software created for the Control Display Units (CDUs), used for the Advanced Transport Operating Systems (ATOPS) project, on the Transport Systems Research Vehicle (TSRV) is described. Module descriptions are presented in a standardized format which contains module purpose, calling sequence, a detailed description, and global references. The global reference section includes subroutines, functions, and common variables referenced by a particular module. The CDUs, one for the pilot and one for the copilot, are used for flight management purposes. Operations performed with the CDU affect the aircraft's guidance, navigation, and display software.

  5. Lessons Learned in Software Testing A Context-Driven Approach

    CERN Document Server

    Kaner, Cem; Pettichord, Bret

    2008-01-01

    Decades of software testing experience condensed into the most important lessons learned.The world's leading software testing experts lend you their wisdom and years of experience to help you avoid the most common mistakes in testing software. Each lesson is an assertion related to software testing, followed by an explanation or example that shows you the how, when, and why of the testing lesson. More than just tips, tricks, and pitfalls to avoid, Lessons Learned in Software Testing speeds you through the critical testing phase of the software development project without the extensive trial an

  6. 15 CFR 995.27 - Format validation software testing.

    Science.gov (United States)

    2010-01-01

    ... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying... specification. These tests may be combined with testing of the conversion software. ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Format validation software testing...

  7. Advanced Vehicle Testing and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Garetson, Thomas [The Clarity Group, Incorporated, Chicago, IL (United States)

    2013-03-31

    The objective of the United States (U.S.) Department of Energy's (DOE's) Advanced Vehicle Testing and Evaluation (AVTE) project was to provide test and evaluation services for advanced technology vehicles, to establish a performance baseline, to determine vehicle reliability, and to evaluate vehicle operating costs in fleet operations. Vehicles tested include light and medium-duty vehicles in conventional, hybrid, and all-electric configurations using conventional and alternative fuels, including hydrogen in internal combustion engines. Vehicles were tested on closed tracks and chassis dynamometers, as well as operated on public roads, in fleet operations, and over prescribed routes. All testing was controlled by procedures developed specifically to support such testing.

  8. An architectural model for software testing lesson learned systems

    OpenAIRE

    Pazos Sierra, Juan; Andrade, Javier; Ares Casal, Juan M.; Martínez Rey, María Aurora; Rodríguez, Santiago; Romera, Julio; Suárez, Sonia

    2013-01-01

    Software testing is a key aspect of software reliability and quality assurance in a context where software development constantly has to overcome mammoth challenges in a continuously changing environment. One of the characteristics of software testing is that it has a large intellectual capital component and can thus benefit from the use of the experience gained from past projects. Software testing can, then, potentially benefit from solutions provided by the knowledge management discipline. ...

  9. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  10. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Damevski, Kostadin [Virginia State Univ., Petersburg, VA (United States)

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  11. Survey of advanced general-purpose software for robot manipulators

    International Nuclear Information System (INIS)

    Latombe, J.C.

    1983-01-01

    Computer-controlled sensor-based robots will become more and more common in industry. This paper attempts to survey the main trends of the development of advanced general-purpose software for robot manipulators. It is intended to make clear that robots are not only mechanical devices. They are truly programmable machines, and their programming, which occurs in an imperfectly modelled world, is somewhat different from conventional computer programming. (orig.)

  12. Pragmatic Software Testing Becoming an Effective and Efficient Test Professional

    CERN Document Server

    Black, Rex

    2011-01-01

    A hands-on guide to testing techniques that deliver reliable software and systems. Testing even a simple system can quickly turn into a potentially infinite task. Faced with tight costs and schedules, testers need to have a toolkit of practical techniques combined with hands-on experience and the right strategies in order to complete a successful project. World-renowned testing expert Rex Black provides you with the proven methods and concepts that test professionals must know. He presents you with the fundamental techniques for testing and clearly shows you how to select and apply successful st

  13. The proposal of a novel software testing framework

    OpenAIRE

    Ahmad, Munib; Bajaber, Fuad; Qureshi, M. Rizwan Jameel

    2014-01-01

    Software testing is normally used to check the validity of a program. The test oracle performs an important role in software testing. The focus of this research is to perform class-level testing by introducing a testing framework. A technique is developed to generate test oracles for specification-based software testing using the Vienna Development Method (VDM++) formal language. A three-stage translation process, of VDM++ specifications of container classes to C++ test oracle classes, is described in th...

  14. Injecting Errors for Testing Built-In Test Software

    Science.gov (United States)

    Gender, Thomas K.; Chow, James

    2010-01-01

    Two algorithms have been conceived to enable automated, thorough testing of built-in test (BIT) software. The first algorithm applies to BIT routines that define pass/fail criteria based on values of data read from such hardware devices as memories, input ports, or registers. This algorithm simulates effects of errors in a device under test by (1) intercepting data from the device and (2) performing AND operations between the data and the data mask specific to the device. This operation yields values not expected by the BIT routine. This algorithm entails very small, permanent instrumentation of the software under test (SUT) for performing the AND operations. The second algorithm applies to BIT programs that provide services to user application programs via commands or callable interfaces and requires a capability for test-driver software to read and write the memory used in execution of the SUT. This algorithm identifies all SUT code execution addresses where errors are to be injected, then temporarily replaces the code at those addresses with small test code sequences to inject latent severe errors, then determines whether, as desired, the SUT detects the errors and recovers.
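
    The fragment below sketches the first algorithm described above, intercepting a device read and AND-ing it with a device-specific mask so the BIT routine sees a value it does not expect; the register address, mask, and pass criterion are assumptions for the example.

      # Error injection by AND-masking an intercepted device read (illustrative only).
      FAULT_MASKS = {0x4000_0010: 0xFFFF_FF00}   # hypothetical status register and mask

      def _hardware_read(address):
          # Stand-in for the real device access; always "healthy" here.
          return 0xA5A5_00FF

      def read_register(address, inject_faults=False):
          raw = _hardware_read(address)
          if inject_faults and address in FAULT_MASKS:
              return raw & FAULT_MASKS[address]   # simulate stuck-low data bits
          return raw

      def bit_check_status():
          # Hypothetical pass criterion: low byte of the status register must be 0xFF.
          return read_register(0x4000_0010, inject_faults=True) & 0xFF == 0xFF

      print("BIT detected injected error" if not bit_check_status() else "BIT missed error")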

  15. Florida alternative NTCIP testing software (ANTS) for actuated signal controllers.

    Science.gov (United States)

    2009-01-01

    The scope of this research project included the development of a software tool to test devices for NTCIP compliance. The Florida Alternative NTCIP Testing Software (ANTS) was developed by the research team due to limitations found w...

  16. Tests of Event Filter Configuration Software

    CERN Multimedia

    Wickens, F.J.

    TDAQ - Tests of Event Filter configuration software Within Trigger/DAQ a major consideration is how well the performance of the system components scale in going from the small set-ups used for development work to the final system with many hundreds of processors and links. In the case of the Event Filter, which makes the final stage of on-line event selection, plus on-line calibrations and monitoring, more than a thousand processors are envisaged. These processors will be divided into sub-farms, most will be remote from the detector and some could even be at institutes far from CERN. As part of the on-line system it is important that the software in the sub-farms can be reconfigured rapidly as runs start and stop, and that the system be fault tolerant. The flow of data inside a sub-farm involves many processes, for distribution and collection of results in addition to those for event processing itself. Supervision code written in Java has been developed to manage the processes within a cluster, with XML f...

  17. Improving Single Event Effects Testing Through Software

    Science.gov (United States)

    Banker, M. W.

    2011-01-01

    Radiation encountered in space environments can be damaging to microelectronics and potentially cause spacecraft failure. Single event effects (SEE) are a type of radiation effect that occur when an ion strikes a device. Single event gate rupture (SEGR) is a type of SEE that can cause failure in power transistors. Unlike other SEE rates in which a constant linear energy transfer (LET) can be used, SEGR rates sometimes require a non-uniform LET to be used to be accurate. A recent analysis shows that SEGR rates are most easily calculated when the environment is described as a stopping rate per unit volume for each ion species. Stopping rates in silicon for pertinent ions were calculated using the Stopping and Range of Ions in Matter (SRIM) software and CREME-MC software. A reference table was generated and can be used by others to calculate SEGR rates for a candidate device. Additionally, lasers can be used to simulate SEEs, providing more control and information at lower cost than heavy ion testing. The electron/hole pair generation rate from a laser pulse in a semiconductor can be related to the LET of an ion. MATLAB was used to generate a plot to easily make this comparison.

  18. Software testing and global industry future paradigms

    CERN Document Server

    Casey, Valentine; Richardson, Ita

    2009-01-01

    Today software development has truly become a globally sourced commodity. This trend has been facilitated by the availability of highly skilled software professionals in low cost locations in Eastern Europe, Latin America and the Far East. Organisations

  19. A methodology for testing fault-tolerant software

    Science.gov (United States)

    Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.

    1985-01-01

    A methodology for testing fault-tolerant software is presented. There are problems associated with testing fault-tolerant software because many errors are masked or corrected by voters, limiters, or automatic channel synchronization. This methodology illustrates how the same strategies used for testing fault-tolerant hardware can be applied to testing fault-tolerant software. For example, one strategy used in testing fault-tolerant hardware is to disable the redundancy during testing. A similar testing strategy is proposed for software, namely, to move the major emphasis of testing earlier in the development cycle (before the redundancy is in place), thus reducing the possibility that undetected errors will be masked when limiters and voters are added.
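
    The contrast between testing through a voter and testing a single channel before the redundancy is in place can be made concrete with a short sketch (the channel function and 2-out-of-3 voter below are made-up examples, not the paper's system).

      # Error masking by a voter versus channel-level testing (illustrative only).
      def channel(x):
          # Single-version computation under test; deliberately buggy near zero.
          return 0.0 if abs(x) < 1e-3 else 2.0 * x

      def majority_vote(a, b, c):
          # 2-out-of-3 vote implemented as the median value.
          return sorted([a, b, c])[1]

      x = 5e-4
      # System-level test through the voter: two healthy channels mask the defect.
      assert majority_vote(channel(x), 2.0 * x, 2.0 * x) == 2.0 * x   # passes, bug hidden

      # Unit test of the bare channel exposes the same defect directly.
      try:
          assert channel(x) == 2.0 * x
      except AssertionError:
          print("channel-level test exposed the masked defect")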

  20. Advanced software tools for digital loose part monitoring systems

    International Nuclear Information System (INIS)

    Ding, Y.

    1996-01-01

    The paper describes two software modules as analysis tools for digital loose part monitoring systems. The first, the acoustic module, utilizes the multi-media features of modern personal computers to replay the digitally stored short-time bursts with sufficient length and in good quality. This is possible due to the so-called puzzle technique developed at ISTec. The second, the classification module, calculates advanced burst parameters and classifies the acoustic events into pre-defined classes with the help of an artificial multi-layer perceptron neural network trained with the back-propagation algorithm. (author). 7 refs, 7 figs

  1. A high order approach to flight software development and testing

    Science.gov (United States)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  2. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
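
    Two prediction-quality metrics commonly used for baseline energy models, CV(RMSE) and NMBE, can be computed as in the sketch below (the numbers are made-up examples, not test data, and the simple n denominator ignores the degrees-of-freedom corrections some protocols apply).

      # Prediction-quality metrics for a baseline model (illustrative only).
      import math

      def cv_rmse(actual, predicted):
          n = len(actual)
          rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
          return rmse / (sum(actual) / n)

      def nmbe(actual, predicted):
          n = len(actual)
          return sum(a - p for a, p in zip(actual, predicted)) / (n * (sum(actual) / n))

      actual_kwh    = [120.0, 135.0, 150.0, 160.0, 140.0, 125.0]
      predicted_kwh = [118.0, 140.0, 148.0, 155.0, 143.0, 128.0]

      print(f"CV(RMSE) = {cv_rmse(actual_kwh, predicted_kwh):.3%}")
      print(f"NMBE     = {nmbe(actual_kwh, predicted_kwh):.3%}")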

  3. Software engineers and nuclear engineers: teaming up to do testing

    International Nuclear Information System (INIS)

    Kelly, D.; Cote, N.; Shepard, T.

    2007-01-01

    The software engineering community has traditionally paid little attention to the specific needs of engineers and scientists who develop their own software. Recently there has been increased recognition that specific software engineering techniques need to be found for this group of developers. In this case study, a software engineering group teamed with a nuclear engineering group to develop a software testing strategy. This work examines the types of testing that proved to be useful and examines what each discipline brings to the table to improve the quality of the software product. (author)

  4. Integrated testing and verification system for research flight software

    Science.gov (United States)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  5. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

    It is necessary to assure the reliability of software in order to digitalize the RPS (Reactor Protection System). Since an RPS failure can cause fatal damage in accident cases, the system is classified as Safety Class 1E. We therefore propose an effective testing methodology to assure the reliability of the embedded software in the DPPS (Digital Plant Protection System). To test the embedded software in the DPPS effectively, our methodology consists of two steps. The first is a re-engineering step that extracts classes from the structured source program, and the second is a levels-of-testing step composed of unit testing, integration testing, and system testing. At each testing level we test the embedded software with test cases selected after a test item identification step. Using this testing methodology, the embedded software can be tested effectively while reducing cost and time

  6. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Science.gov (United States)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  7. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sidky, Hythem [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Colón, Yamil J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Helfferich, Julian [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Steinbuch Center for Computing, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany; Sikora, Benjamin J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Bezik, Cody [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Chu, Weiwei [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Giberti, Federico [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Guo, Ashley Z. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Jiang, Xikai [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Lequieu, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Li, Jiyuan [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Moller, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Quevillon, Michael J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Rahimi, Mohammad [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Ramezani-Dakhel, Hadi [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Department of Biochemistry and Molecular Biology, University of Chicago, Chicago, Illinois 60637, USA; Rathee, Vikramjit S. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Reid, Daniel R. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Sevgen, Emre [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Thapar, Vikram [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Webb, Michael A. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Whitmer, Jonathan K. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; de Pablo, Juan J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.

  8. ALICES: an advanced object-oriented software workshop for simulators

    International Nuclear Information System (INIS)

    Sayet, R.L.; Rouault, G.; Pieroux, D.; Houte, U. Van

    1999-01-01

    Reducing simulator development costs while improving model quality, user-friendliness, and teaching capabilities has been a major target in the simulation industry for many years. It has led to the development of specific software tools, which have been improved progressively following the new features and capabilities offered by the software industry. Unlike most of these software tools, ALICES (a French acronym for 'Interactive Software Workshop for the Design of Simulators') is not an upgrade of a previous generation of tools, such as putting a graphical front-end on a classical code generator, but a genuinely new development. Its design specification is based on previous experience with different tools as well as on new capabilities of software technology, mainly in object-oriented design. This allowed a real technological 'jump' in the simulation industry, beyond the constraints of some traditional approaches. The main objectives behind the development of ALICES were the following: (1) minimizing simulator development time and costs: a simulator development consists mainly in developing software, and one way to reduce costs is to facilitate reuse of existing software by developing standard components and by defining interface standards; (2) ensuring that the produced simulator can be maintained and updated at minimal cost: a simulator must evolve along with the simulated process, so the simulator must be updated periodically, and the cost of adequate maintenance is highly dependent on the quality of the software workshop; (3) covering the whole simulator development process: from the data package to the acceptance tests and on to maintenance and upgrade activities, with the whole development team even if it is dispatched at different working sites, and respecting the Quality Assurance rules and procedures (CORYS T.E.S.S. and TRACTEBEL are ISO-9001 certified). The development of ALICES was also done to comply with the following two main

  9. Technique for unit testing of safety software verification and validation

    International Nuclear Information System (INIS)

    Li Duo; Zhang Liangju; Feng Junting

    2008-01-01

    The key issue arising from the digitalization of the reactor protection system for nuclear power plants is how to carry out verification and validation (V and V) to demonstrate and confirm that the software performing reactor safety functions is safe and reliable. One of the most important processes for software V and V is unit testing, which verifies and validates the software coding against the concept design for consistency, correctness and completeness during software development. The paper presents a preliminary study of the technique for unit testing in safety software V and V, focusing on such aspects as how to confirm test completeness, how to establish a test platform, how to develop test cases, and how to carry out unit testing. The technique discussed here was successfully used in the unit testing of the safety software of a digital reactor protection system. (authors)

  10. Developing a TTCN-3 Test Harness for Legacy Software

    DEFF Research Database (Denmark)

    Okika, Joseph C.; Ravn, Anders Peter; Siddalingaiah, Lokesh

    2006-01-01

    We describe a prototype test harness for an embedded system which is the control software for a modern marine diesel engine. The operations of such control software require complete certification. We adopt the Testing and Test Control Notation (TTCN-3) to define test cases for this purpose. The main challenge in developing the test harness is to interface a generic test driver to the legacy software and provide a suitable interface for test engineers. The main contribution of this paper is a demonstration of a suitable design for such a test harness. It includes: a TTCN-3 test driver in C++, the legacy control software in C, a Graphical User Interface (GUI), and the connectors in Java. Our experience shows that it is feasible to use TTCN-3 in developing a test harness for legacy software for an embedded system, even when it involves different heterogeneous components.

  11. Developing a TTCN-3 Test Harness for Legacy Software

    DEFF Research Database (Denmark)

    Okika, Joseph C.; Ravn, Anders Peter; Siddalingaiah, Lokesh

    2006-01-01

    We describe a prototype test harness for an embedded system which is the control software for a modern marine diesel engine. The operations of such control software require complete certification. We adopt the Testing and Test Control Notation (TTCN-3) to define test cases for this purpose. The main challenge in developing the test harness is to interface a generic test driver to the legacy software and provide a suitable interface for test engineers. The main contribution of this paper is a demonstration of a suitable design for such a test harness. It includes: a TTCN-3 test driver in C++, the legacy control software in C, a Graphical User Interface (GUI), and the connectors in Java. Our experience shows that it is feasible to use TTCN-3 in developing a test harness for legacy software for an embedded system, even when it involves different heterogeneous components.

  12. Methods and characteristics of assembly language software testing

    International Nuclear Information System (INIS)

    Wang Lingfang

    2001-01-01

    Single-chip micro-controllers are widely used in control and testing products in industrial control and national defence embedded control systems. Faults in the source programs can make the whole system unreliable, or even cause fatal results. Software testing is therefore a necessary measure to reduce mistakes and to improve the quality of the software. The paper presents the development of software testing, introduces the distinctions between assembly language testing and the testing of high-level languages, and discusses in detail the essential flow and methods of software testing

  13. A Method of Partly Automated Testing of Software

    Science.gov (United States)

    Lowry, Mike; Visser, Willem; Washington, Rich; Artho, Cyrille; Goldberg, Allen; Havelund, Klaus; Pasareanu, Corina; Khurshid, Sarfraz; Rosu, Grigore

    2007-01-01

    A method of automated testing of software has been developed that provides an alternative to the conventional mostly manual approach for software testing. The method combines (1) automated generation of test cases on the basis of systematic exploration of the input domain of the software to be tested with (2) run-time analysis in which execution traces are monitored, verified against temporal-logic specifications, and analyzed by concurrency-error-detection algorithms. In this new method, the user only needs to provide the temporal logic specifications against which the software will be tested and the abstract description of the input domain.
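
    A minimal sketch of the run-time analysis half of this method: monitoring an execution trace against a simple temporal-logic-style response property ("every request is eventually followed by a response"). This is a generic illustration, not the tool described in the record.

        # Minimal trace monitor for the response property G(request -> F response).
        def check_response_property(trace, trigger="request", response="response"):
            pending = 0
            for event in trace:
                if event == trigger:
                    pending += 1
                elif event == response and pending > 0:
                    pending -= 1
            # The property holds on this finite trace if no request is left unanswered.
            return pending == 0

        good_trace = ["request", "work", "response", "request", "response"]
        bad_trace = ["request", "work", "request", "response"]
        print(check_response_property(good_trace))  # True
        print(check_response_property(bad_trace))   # False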

  14. Overview of software development at the parabolic dish test site

    Science.gov (United States)

    Miyazono, C. K.

    1985-01-01

    The development history of the data acquisition and data analysis software is discussed. The software development occurred between 1978 and 1984 in support of solar energy module testing at the Jet Propulsion Laboratory's Parabolic Dish Test Site, located within Edwards Test Station. The development went through incremental stages, starting with a simple single-user BASIC set of programs and progressing to the relatively complex multi-user FORTRAN system that was used until the termination of the project. Additional software in support of testing is discussed, including software for a meteorological subsystem and the Test Bed Concentrator Control Console interface. Conclusions and recommendations for further development are discussed.

  15. Software test attacks to break mobile and embedded devices

    CERN Document Server

    Hagar, Jon Duncan

    2013-01-01

    Address Errors before Users Find Them. Using a mix-and-match approach, Software Test Attacks to Break Mobile and Embedded Devices presents an attack basis for testing mobile and embedded systems. Designed for testers working in the ever-expanding world of "smart" devices driven by software, the book focuses on attack-based testing that can be used by individuals and teams. The numerous test attacks show you when a software product does not work (i.e., has bugs) and provide you with information about the software product under test. The book guides you step by step, starting with the basics.

  16. Testing Software Review: MicroCAT Version 3.0.

    Science.gov (United States)

    Stone, Clement A.

    1989-01-01

    MicroCAT version 3.0--an integrated test development, administration, and analysis system--is reviewed in this first article of a series on testing software. A framework for comparing testing software is presented. The strength of this package lies in the development, banking, and administration of items composed of text and graphics. (SLD)

  17. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set protocol), which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently

  18. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and a theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  19. Quantification of Safety-Critical Software Test Uncertainty

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Cho, Jaehyun; Lee, Seung Jun; Jung, Wondea

    2015-01-01

    The method conservatively assumes that the failure probability of a software for untested inputs is 1, and that the failure probability becomes 0 once all test cases have been passed. In reality, however, a chance of failure remains because of test uncertainty. Some studies have been carried out to identify the test attributes that affect test quality; Cao discussed testing effort, testing coverage, and the testing environment, and management of the test uncertainties has also been discussed in the literature. In this study, the test uncertainty has been considered in estimating the software failure probability, because the software testing process is inherently uncertain. A reliability estimate of software is very important for the probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on estimating the probability of a software failure while accounting for the uncertainty in software testing. In our study, a BBN has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation
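
    A numeric toy in the spirit of a two-node belief network, with all probabilities invented for illustration: the residual failure probability is conditioned on whether the test campaign adequately covered the input domain.

        # Toy two-node illustration of test-uncertainty quantification.
        # All numbers are assumptions for illustration only.
        p_test_adequate = 0.9            # belief that the tests covered the input domain
        p_fail_given_adequate = 1e-4     # residual failure probability if testing was adequate
        p_fail_given_inadequate = 1e-2   # failure probability if testing was not adequate

        p_fail = (p_test_adequate * p_fail_given_adequate
                  + (1 - p_test_adequate) * p_fail_given_inadequate)
        print(f"estimated software failure probability: {p_fail:.2e}")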

  20. Development and Testing of "Math Insight" Software

    Science.gov (United States)

    Zucker, Andrew A.

    2006-01-01

    Computers running appropriate software hold great promise for teaching and learning mathematics. To this end, SRI International developed an integrated, computer-based problem solving environment called "Math Insight" that included interactive tools, such as a spreadsheet and dynamic geometric sketches, and professionally produced videos used to…

  1. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    OpenAIRE

    Jump, David

    2014-01-01

    This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing...

  2. NEPTUNE: A new software platform for advanced nuclear thermal hydraulics

    International Nuclear Information System (INIS)

    Guelfi, A.; Boucker, M.; Herard, J.M.; Peturaud, P.; Bestion, D.; Boudier, P.; Hervieu, E.; Fillion, P.; Grandotto, M.

    2007-01-01

    The NEPTUNE project constitutes the thermal-hydraulic part of the long-term Electricite de France and Commissariat a l'Energie Atomique joint research and development program for the next generation of nuclear reactor simulation tools. This program is also financially supported by the Institut de Radioprotection et Surete Nucleaire and AREVA NP. The project aims at developing a new software platform for advanced two-phase flow thermal hydraulics covering the whole range of modeling scales and allowing easy multi-scale and multidisciplinary calculations. NEPTUNE is a fully integrated project that covers the following fields: software development, research in physical modeling and numerical methods, development of advanced instrumentation techniques, and performance of new experimental programs. The analysis of the industrial needs points out that three main simulation scales are involved. The system scale is dedicated to the overall description of the reactor. The component or subchannel scale allows three-dimensional computations of the main components of the reactors: cores, steam generators, condensers, and heat exchangers. The current generation of system and component codes has reached a very high level of maturity for industrial applications. The third scale, computational fluid dynamics (CFD) in open medium, allows one to go beyond the limits of the component scale for a finer description of the flows. This scale opens promising perspectives for industrial simulations, and the development and validation of the NEPTUNE CFD module have been a priority since the beginning of the project. It is based on advanced physical models (two-fluid or multi field model combined with interfacial area transport and two-phase turbulence) and modern numerical methods (fully unstructured finite volume solvers). For the system and component scales, prototype developments have also started, including new physical models and numerical methods. In addition to scale

  3. The scientific modeling assistant: An advanced software tool for scientific model building

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  4. Markov Chains For Testing Redundant Software

    Science.gov (United States)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    A preliminary design was developed for a validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. The approach takes into account the inertia of the controlled system, in the sense that it takes more than one failure of the control program to cause the controlled system to fail. The verification procedure consists of two steps: experimentation (numerical simulation) and computation, with a Markov model for each step.
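
    A sketch of the inertia idea under invented assumptions: the controlled system fails only if the control program fails on two consecutive control cycles, and the probability of that happening over a run is estimated by simulation. The failure probability, cycle count, and two-failure rule are illustrative, not taken from the experiment design.

        # Monte Carlo sketch: the controlled system fails only after two
        # consecutive control-program failures (all numbers are assumptions).
        import random

        def run_fails(p_fail=1e-3, cycles=1_000, needed=2, rng=None):
            rng = rng or random.Random()
            streak = 0
            for _ in range(cycles):
                if rng.random() < p_fail:
                    streak += 1
                    if streak >= needed:
                        return True
                else:
                    streak = 0
            return False

        trials = 5_000
        failures = sum(run_fails(rng=random.Random(i)) for i in range(trials))
        print(f"estimated probability of system failure per run: {failures / trials:.4g}")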

  5. FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data

    Directory of Open Access Journals (Sweden)

    Robert Oostenveld

    2011-01-01

    Full Text Available This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.

  6. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

    A number of software reliability models have been developed to estimate and to predict software reliability. However, there are no established standard models to quantify software reliability. Most models estimate the quality of software in reliability figures such as remaining faults, failure rate, or mean time to next failure at the testing phase, and they consider them ultimate indicators of software reliability. Experience shows that there is a large gap between predicted reliability during development and reliability measured during operation, which means that predicted reliability, or so-called test reliability, is not operational reliability. Customers prefer operational reliability to test reliability. In this study, we propose a method that predicts operational reliability rather than test reliability by introducing the testing environment factor that quantifies the changes in environments

  7. TMACS test procedure TP012: Panalarm software bridge

    International Nuclear Information System (INIS)

    Washburn, S.J.

    1994-01-01

    This Test Procedure addresses the testing of the functionality of the Tank Monitor and Control System (TMACS) Panalarm bridge software. The features to be tested are: Bridge Initialization Options; Bridge Communication; Bridge Performance; Testing Checksum Errors; and Testing Command Reject Errors. Only the first three could be tested; the last two have been deferred to a later date

  8. Recent trends on Software Verification and Validation Testing

    International Nuclear Information System (INIS)

    Kim, Hyungtae; Jeong, Choongheui

    2013-01-01

    Verification and Validation (V and V) include the analysis, evaluation, review, inspection, assessment, and testing of products. Testing in particular is an important method to verify and validate software; software V and V testing covers test planning through execution. IEEE Std. 1012 is a standard on software V and V. Recently, IEEE Std. 1012-2012 was published. This standard is a major revision of IEEE Std. 1012-2004, which addressed only software V and V: it expands the scope of the V and V processes to include systems and hardware as well as software, and it describes the scope of V and V testing according to integrity level. In addition, the independent V and V requirements related to software V and V testing in IEEE 7-4.3.2-2010 have been revised. This paper presents recent trends in software V and V testing through a review of IEEE Std. 1012-2012 and IEEE 7-4.3.2-2010. There are no major changes to software V and V testing activities and tasks in IEEE 1012-2012 compared with IEEE 1012-2004, but the positions on the responsibility for performing software V and V testing have changed, and IEEE 7-4.3.2-2010 newly describes the positions on responsibility for performing software V and V testing. However, the two standards take different positions on V and V testing. For integrity levels 3 and 4, IEEE 1012-2012 basically requires that the V and V organization conduct all V and V testing tasks, such as test plan, test design, test case, and test procedure, except test execution; if the testing is conducted by an organization other than the V and V organization, the results of that testing shall be analyzed by the V and V organization. For safety-related software, IEEE 7-4.3.2-2010 requires that test procedures and reports be independently verified by an alternate organization regardless of who writes the procedures and/or conducts the tests

  9. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injuries to people, equipment, or environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  10. Ground test accelerator control system software

    International Nuclear Information System (INIS)

    Burczyk, L.; Dalesio, R.; Dingler, R.; Hill, J.; Howell, J.A.; Kerstiens, D.; King, R.; Kozubal, A.; Little, C.; Martz, V.; Rothrock, R.; Sutton, J.

    1988-01-01

    This paper reports on the GTA control system that provides an environment in which the automation of a state-of-the-art accelerator can be developed. It makes use of commercially available computers, workstations, computer networks, industrial I/O equipment, and software. This system has built-in supervisory control (like most accelerator control systems), tools to support continuous control (like the process control industry), and sequential control for automatic start-up and fault recovery (like few other accelerator control systems). Several software tools support these levels of control: a real-time operating system (VxWorks) with a real-time kernel (VRTX), a configuration database, a sequencer, and a graphics editor. VxWorks supports multitasking, fast context-switching, and preemptive scheduling. VxWorks/VRTX is a network-based development environment specifically designed to work in partnership with the UNIX operating system. A data base provides the interface to the accelerator components. It consists of a run time library and a database configuration and editing tool. A sequencer initiates and controls the operation of all sequence programs (expressed as state programs). A graphics editor gives the user the ability to create color graphic displays showing the state of the machine in either text or graphics form

  11. Flight test of a resident backup software system

    Science.gov (United States)

    Deets, Dwain A.; Lock, Wilton P.; Megna, Vincent A.

    1987-01-01

    A new fault-tolerant system software concept employing the primary digital computers as host for the backup software portion has been implemented and flight tested in the F-8 digital fly-by-wire airplane. The system was implemented in such a way that essentially no transients occurred in transferring from primary to backup software. This was accomplished without a significant increase in the complexity of the backup software. The primary digital system was frame synchronized, which provided several advantages in implementing the resident backup software system. Since the time of the flight tests, two other flight vehicle programs have made a commitment to incorporate resident backup software similar in nature to the system described here.

  12. Software OT&E Guidelines. Volume 1. Software Test Manager’s Handbook

    Science.gov (United States)

    1981-02-01

    establishing good working relationship is critical). e) Work with the TEBC advanced planner to ensure early involvement of software data "thinking" (in the OT...

  13. Integrating software into PRA: a test-based approach.

    Science.gov (United States)

    Li, Bin; Li, Ming; Smidts, Carol

    2005-08-01

    Probabilistic risk assessment (PRA) is a methodology to assess the probability of failure or success of a system's operation. PRA has been proved to be a systematic, logical, and comprehensive technique for risk assessment. Software plays an increasing role in modern safety critical systems. A significant number of failures can be attributed to software failures. Unfortunately, current probabilistic risk assessment concentrates on representing the behavior of hardware systems, humans, and their contributions (to a limited extent) to risk but neglects the contributions of software due to a lack of understanding of software failure phenomena. It is thus imperative to consider and model the impact of software to reflect the risk in current and future systems. The objective of our research is to develop a methodology to account for the impact of software on system failure that can be used in the classical PRA analysis process. A test-based approach for integrating software into PRA is discussed in this article. This approach includes identification of software functions to be modeled in the PRA, modeling of the software contributions in the ESD, and fault tree. The approach also introduces the concepts of input tree and output tree and proposes a quantification strategy that uses a software safety testing technique. The method is applied to an example system, PACS.
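
    A toy quantification in the spirit of the approach: a software failure probability estimated from safety testing enters a small fault tree alongside hardware events. The tree structure and all numbers below are invented for illustration and are not taken from the article.

        # Toy fault-tree quantification with a software basic event.
        # Top event = (hardware channel fails) OR (software fails AND backup fails).
        p_hw = 1e-4          # hardware channel failure probability (assumed)
        p_sw = 5e-4          # software failure probability from safety testing (assumed)
        p_backup = 1e-2      # backup system failure probability (assumed)

        def p_or(a, b):      # union of two independent events
            return a + b - a * b

        p_top = p_or(p_hw, p_sw * p_backup)
        print(f"top event probability: {p_top:.3e}")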

  14. Using Automation to Improve the Flight Software Testing Process

    Science.gov (United States)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control, and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
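
    One of the automated steps described, comparing flight-software test output against high-fidelity simulation data within tolerances, might look roughly like the sketch below; the channel names, values, and tolerances are placeholders, not MAP data.

        # Sketch: automated pass/fail comparison of test telemetry vs. simulation output.
        def compare_runs(telemetry, simulation, tolerances):
            """Return a list of (channel, max_error, passed) tuples."""
            report = []
            for channel, tol in tolerances.items():
                errors = [abs(t - s) for t, s in zip(telemetry[channel], simulation[channel])]
                max_err = max(errors)
                report.append((channel, max_err, max_err <= tol))
            return report

        telemetry = {"wheel_speed_rpm": [100.0, 101.2, 99.8], "body_rate_dps": [0.010, 0.020, 0.015]}
        simulation = {"wheel_speed_rpm": [100.1, 101.0, 99.9], "body_rate_dps": [0.010, 0.021, 0.014]}
        tolerances = {"wheel_speed_rpm": 0.5, "body_rate_dps": 0.005}

        for channel, err, ok in compare_runs(telemetry, simulation, tolerances):
            print(f"{channel}: max error {err:.4g} -> {'PASS' if ok else 'FAIL'}")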

  15. Designing a Software Test Automation Framework

    Directory of Open Access Journals (Sweden)

    Sabina AMARICAI

    2014-01-01

    Full Text Available Testing is an art and science that should ultimately lead to lower cost businesses through increasing control and reducing risk. Testing specialists should thoroughly understand the system or application from both the technical and the business perspective, and then design, build and implement the minimum-cost, maximum-coverage validation framework. Test Automation is an important ingredient for testing large scale applications. In this paper we discuss several test automation frameworks, their advantages and disadvantages. We also propose a custom automation framework model that is suited for applications with very complex business requirements and numerous interfaces.
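
    A custom automation framework of the kind discussed typically separates test registration, execution, and reporting; the skeleton below is a generic sketch under those assumptions, not the framework proposed in the article.

        # Minimal test automation framework skeleton: register, run, report.
        class TestFramework:
            def __init__(self):
                self._tests = []

            def register(self, func):
                """Decorator that adds a test function to the suite."""
                self._tests.append(func)
                return func

            def run(self):
                results = []
                for test in self._tests:
                    try:
                        test()
                        results.append((test.__name__, "PASS", ""))
                    except Exception as exc:
                        results.append((test.__name__, "FAIL", repr(exc)))
                return results

        framework = TestFramework()

        @framework.register
        def test_login_form_accepts_valid_user():
            assert "user" in {"user": "secret"}     # placeholder business check

        for name, status, detail in framework.run():
            print(f"{name}: {status} {detail}")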

  16. Accuracy Test of Software Architecture Compliance Checking Tools : Test Instruction

    NARCIS (Netherlands)

    Prof.dr. S. Brinkkemper; Dr. Leo Pruijt; C. Köppe; J.M.E.M. van der Werf

    2015-01-01

    Author supplied: "Abstract Software Architecture Compliance Checking (SACC) is an approach to verify conformance of implemented program code to high-level models of architectural design. Static SACC focuses on the modular software architecture and on the existence of rule violating dependencies

  17. Prediction of software operational reliability using testing environment factor

    International Nuclear Information System (INIS)

    Jung, Hoan Sung

    1995-02-01

    Software reliability is especially important to customers these days. The need to quantify the software reliability of safety-critical systems has received very special attention, and reliability is rated as one of software's most important attributes. Since software is an intellectual product of human activity and since it is logically complex, failures are inevitable. No standard models have been established to prove the correctness and to estimate the reliability of software systems by analysis and/or testing. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability with the failure data collected during the test, assuming that the test environments well represent the operational profile. The user's interest is in the operational reliability rather than the test reliability, however, and experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment, a testing environment factor comprising the aging factor and the coverage factor is defined in this work to predict the ultimate operational reliability from the failure data, by incorporating test environments applied beyond the operational profile into the testing environment factor. Test reliability can also be estimated with this approach without any model change. The application results are close to the actual data. The approach used in this thesis is expected to be applicable to ultra high reliable software systems that are used in nuclear power plants, airplanes, and other safety-critical applications

  18. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer... Standard for Software Unit Testing'' with the clarifications and exceptions stated in Section C, ``Staff...

  19. Testing digital safety system software with a testability measure based on a software fault tree

    International Nuclear Information System (INIS)

    Sohn, Se Do; Hyun Seong, Poong

    2006-01-01

    Using predeveloped software, a digital safety system is designed that meets the quality standards of a safety system. To demonstrate the quality, the design process and operating history of the product are reviewed along with configuration management practices. The application software of the safety system is developed in accordance with the planned life cycle. Testing, which is a major phase that takes a significant amount of time in the overall life cycle, can be optimized if the testability of the software can be evaluated. The proposed testability measure of the software is based on the entropy of the importance of basic statements and the failure probability from a software fault tree. To calculate testability, a fault tree is used in the analysis of the source code. With a quantitative measure of testability, testing can be optimized. The proposed testability measure can also be used to demonstrate whether test cases based on uniform partitions, such as branch coverage criteria, result in homogeneous partitions, which are known to be more effective than random testing. In this paper, the testability measure is calculated for the modules of a nuclear power plant's safety software. Module testing with branch coverage criteria required fewer test cases when the module had higher testability. The result shows that the testability measure can be used to evaluate whether partitions have homogeneous characteristics
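
    The flavor of an entropy-based testability measure can be conveyed with a toy calculation: treat normalized statement-importance values (assumed here, not derived from a real fault tree) as a probability distribution and compute its entropy; a more even spread of importance yields higher entropy and, in this reading, higher testability.

        # Toy entropy calculation over assumed statement-importance values.
        import math

        def entropy(weights):
            total = sum(weights)
            probs = [w / total for w in weights if w > 0]
            return -sum(p * math.log2(p) for p in probs)

        module_a = [0.25, 0.25, 0.25, 0.25]   # importance spread evenly -> higher entropy
        module_b = [0.85, 0.05, 0.05, 0.05]   # importance concentrated -> lower entropy
        print(f"module A entropy: {entropy(module_a):.3f} bits")
        print(f"module B entropy: {entropy(module_b):.3f} bits")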

  20. Acceptance test report MICON software exhaust fan control

    International Nuclear Information System (INIS)

    Keck, R.D.

    1998-01-01

    This test procedure specifies instructions for acceptance testing of software for exhaust fan control under Project ESPT (Energy Savings Performance Contract). The software controls the operation of two emergency exhaust fans when there is a power failure. This report details the results of acceptance testing for the MICON software upgrades. One of the modifications is that only one of the emergency fans will operate at all times. If the operating fan shuts off or fails, the other fan will start and the operating fan will be stopped

  1. Stress testing of digital flight-control system software

    Science.gov (United States)

    Rajan, N.; Defeo, P. V.; Saito, J.

    1983-01-01

    A technique for dynamically testing digital flight-control system software on a module-by-module basis is described. Each test module is repetitively executed faster than real-time with an exhaustive input sequence. Outputs of the test module are compared with outputs generated by an alternate, simpler implementation for the same input data. Discrepancies between the two sets of output indicate the possible presence of a software error. The results of an implementation of this technique in the Digital Flight-Control System Software Verification Laboratory are discussed.
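
    The module-level technique described, running exhaustive inputs through both the flight implementation and a simpler alternate implementation and comparing outputs, is essentially back-to-back testing; the sketch below applies it to a stand-in saturation function rather than real flight-control code.

        # Back-to-back testing sketch: compare a module against a simpler reference
        # implementation over an exhaustive input range (stand-in functions shown).
        def flight_saturate(x):                 # module under test (stand-in)
            return max(-100, min(100, x))

        def reference_saturate(x):              # simpler alternate implementation
            if x > 100:
                return 100
            if x < -100:
                return -100
            return x

        discrepancies = [x for x in range(-1000, 1001)
                         if flight_saturate(x) != reference_saturate(x)]
        print("discrepancies found:", discrepancies[:10], "count =", len(discrepancies))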

  2. Field Testing Selected Micro Computer Software.

    Science.gov (United States)

    ECO Northwest, Ltd., Helena, MT.

    As part of a study to test the feasibility of expanding computer use within the Montana business community, computer systems were field tested in four Montana small businesses. The four businesses were a newspaper, an advertising agency, a sheep and cattle ranch, and a private investment company. The companies employed from 3 to 20 persons. Three…

  3. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  4. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6; 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams opportunities for tailoring and implementing the practices.

  5. Software Estimates Costs of Testing Rocket Engines

    Science.gov (United States)

    Smith, C. L.

    2003-01-01

    Simulation-Based Cost Model (SiCM), a discrete event simulation developed in Extend, simulates pertinent aspects of the testing of rocket propulsion test articles for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from the numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; the costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.
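
    The kind of output SiCM produces, cost distributions per phase and overall, can be mimicked at toy scale with a small Monte Carlo sketch; the phases, labor rates, durations, and distributions below are invented and do not reflect the actual Extend model.

        # Monte Carlo sketch of phase-cost estimation for a test project.
        import random
        import statistics

        phases = {                      # (mean duration in weeks, labor cost per week) - assumptions
            "engineering_design": (8, 20_000),
            "construction": (12, 35_000),
            "testing": (6, 50_000),
        }

        def one_iteration(rng):
            total = 0.0
            for mean_weeks, rate in phases.values():
                weeks = max(1.0, rng.gauss(mean_weeks, 0.2 * mean_weeks))  # uncertain duration
                total += weeks * rate
            return total

        rng = random.Random(42)
        costs = [one_iteration(rng) for _ in range(5_000)]
        print(f"mean cost: ${statistics.mean(costs):,.0f}, stdev: ${statistics.stdev(costs):,.0f}")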

  6. Simulation software support (S3) system a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulators) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory, and the interactive executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) that support unit and integrated testing will be explored

  7. Digital tape unit test facility software

    Science.gov (United States)

    Jackson, J. T.

    1971-01-01

    Two computer programs are described which are used for the collection and analysis of data from the digital tape unit test facility (DTUTF). The data are the recorded results of skew tests made on magnetic digital tapes which are used on computers as input/output media. The results of each tape test are keypunched onto an 80 column computer card. The format of the card is checked and the card image is stored on a master summary tape via the DTUTF card checking and tape updating system. The master summary tape containing the results of all the tape tests is then used for analysis as input to the DTUTF histogram generating system which produces a histogram of skew vs. date for selected data, followed by some statistical analysis of the data.

  8. Developing and Testing a Video Tutorial for Software Training

    NARCIS (Netherlands)

    van der Meij, Hans

    2014-01-01

    Purpose: Video tutorials for software training are rapidly becoming popular. A set of dedicated guidelines for the construction of such tutorials was recently advanced in Technical Communication (Van der Meij & Van der Meij, 2013). The present study set out to assess the cognitive and motivational

  9. A study on design and testing of software module of safety software

    International Nuclear Information System (INIS)

    Sohn, Se Do; Seong, Poong Hyun

    2000-01-01

    The design criteria for a software module were based on the complexity of the module and its cohesion. The ease of detecting a fault in the module can be an additional candidate for the module design criteria. Module test coverage criteria and test case generation are reviewed from the aspects of module testability and ease of fault detection. One of the methods is producing numerical results as output in addition to the logical outputs. With modules designed for high testability, test case generation and test coverage can be made more effective
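
    The idea of emitting numerical results alongside logical outputs, so that faults are easier to observe at the module boundary during testing, can be sketched as follows; the trip-decision example and its setpoint are invented for illustration.

        # Sketch: a module that returns a numeric margin in addition to its
        # logical output, making faults easier to detect in module tests.
        def trip_decision(measured_value, setpoint=100.0):
            margin = setpoint - measured_value      # numeric result aids fault detection
            return {"trip": margin <= 0.0, "margin": margin}

        print(trip_decision(95.0))    # {'trip': False, 'margin': 5.0}
        print(trip_decision(104.0))   # {'trip': True, 'margin': -4.0}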

  10. Software Testing During Post-Deployment Support of Weapon Systems

    National Research Council Canada - National Science Library

    Gimble, Thomas

    1994-01-01

    We are providing this final audit report for your information and use. The report discusses policies, procedures, and methodologies for software testing during post-deployment support of weapon systems...

  11. Creating a simulation model of software testing using Simulink package

    Directory of Open Access Journals (Sweden)

    V. M. Dubovoi

    2016-12-01

    Full Text Available Determining a model of the software testing process that allows prediction of both the whole process and its specific stages is a relevant problem for the IT industry. The article focuses on solving this problem. The aim of the article is to predict the time and improve the quality of software testing. The analysis of the software testing process shows that it can be treated as a branched cyclic technological process, because it is cyclical with decision-making on control operations. The investigation uses the authors' previous works and a software testing process method based on a Markov model. The proposed method enables prediction for each software module, which leads to better decision-making for each controlled suboperation of the process. A Simulink simulation model demonstrates the implementation and verification of the proposed technique. The results of the research have been practically implemented in the IT industry.
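
    Independent of the Simulink implementation, the branched cyclic test process can be illustrated with a small absorbing Markov chain: the expected number of steps until testing completes follows from the fundamental matrix. The states and transition probabilities below are assumptions for illustration only.

        # Absorbing Markov chain sketch: expected number of steps until testing completes.
        import numpy as np

        # Transient states: 0 = "testing", 1 = "rework"; absorbing state: "done".
        # Q holds transition probabilities among transient states (assumed values).
        Q = np.array([[0.0, 0.3],    # from testing: 30% to rework, remaining 70% to done
                      [0.8, 0.1]])   # from rework: 80% back to testing, 10% stay, 10% to done
        N = np.linalg.inv(np.eye(2) - Q)      # fundamental matrix
        expected_steps = N @ np.ones(2)
        print(f"expected steps starting from 'testing': {expected_steps[0]:.2f}")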

  12. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    Moon, Kwon-Ki; Kim, Do-Yeon; Chang, Hoon-Seon; Chang, Young-Woo; Yun, Jae-Hee; Park, Jee-Duck; Kim, Jae-Hack

    2006-01-01

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve those requirement goals, the safety critical software should be verified and tested according to related codes and standards through verification and validation (V and V) activities. The safety critical software testing is performed at various stages during the development of the software, and is generally classified as three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance cost, it is important that software testing is carried out at module level. Module testing for the nuclear safety critical software has rarely been performed by formal and proven testing tools because of its various constraints. LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings in substantial time and cost savings, and efficiency

  13. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

    For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability from the failure data collected during testing, assuming that the test environment represents the operational profile well. Experience shows that operational reliability is higher than test reliability; the user's interest, however, is in the operational reliability rather than the test reliability. Under the assumption that the difference in reliability results from the change of environment, testing environment factors comprising an aging factor and a coverage factor are defined in this study to predict the ultimate operational reliability from the failure data, by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results are close to the actual data.
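
    The record defines an aging factor and a coverage factor but not their exact functional form; the sketch below is only a schematic illustration under the assumption that the operational failure intensity is obtained by scaling the failure intensity observed in testing with two such factors.

```python
# Schematic sketch (assumed form, not the paper's model): predict an
# operational failure rate by scaling the rate observed during testing
# with a coverage factor and an aging factor, both taken here in (0, 1].

def operational_failure_rate(test_failures: int,
                             test_hours: float,
                             coverage_factor: float,
                             aging_factor: float) -> float:
    test_rate = test_failures / test_hours          # failures per hour in test
    return test_rate * coverage_factor * aging_factor

if __name__ == "__main__":
    # Hypothetical numbers: 12 failures in 400 test hours, factors 0.7 and 0.8.
    rate = operational_failure_rate(12, 400.0, coverage_factor=0.7, aging_factor=0.8)
    print(f"predicted operational failure rate: {rate:.4f} failures/hour")
    print(f"predicted operational MTBF: {1.0 / rate:.1f} hours")
```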

  14. Leveraging the wisdom of the crowd in software testing

    CERN Document Server

    Sharma, Mukesh

    2015-01-01

    Its scale, flexibility, cost effectiveness, and fast turnaround are just a few reasons why crowdsourced testing has received so much attention lately. While there are a few online resources that explain what crowdsourced testing is all about, there's been a need for a book that covers best practices, case studies, and the future of this technique.Filling this need, Leveraging the Wisdom of the Crowd in Software Testing shows you how to leverage the wisdom of the crowd in your software testing process. Its comprehensive coverage includes the history of crowdsourcing and crowdsourced testing, im

  15. A review on software testing approaches for cloud applications

    Directory of Open Access Journals (Sweden)

    Tamanna Siddiqui

    2016-09-01

    Full Text Available Cloud computing has emerged as the latest computing paradigm and touches several distinct research areas, such as software testing. Testing cloud applications has unique characteristics that call for newer testing techniques. Software testing helps to reduce the need for hardware and software resources and also provides an adaptable and valuable cloud platform. Testing within the cloud platform is easily manageable when based on new test models and criteria. A prioritization approach is used to build a better relationship between test cases; these test cases are clustered depending on priority level. Resources can be used properly by applying a load-balancing algorithm, and the cloud guarantees maximum usage of existing resources. Security, however, is identified as a primary problem in the cloud. At the present time, organizations are progressively becoming interested in deploying and making use of ready-prepared business applications with a short time to market. The lack of capital budgets for software planning and on-premise deployments, along with the swift progression of the cloud, explains the interest in on-demand, SaaS-based business applications. In this paper, different approaches that help to extend the cloud environment are discussed, together with a study of several well-known software testing approaches.

  16. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  17. Advancing in software product management: An incremental method engineering approach

    NARCIS (Netherlands)

    van de Weerd, I.

    2009-01-01

    Hardly any figures exist on the success of product software companies. What we do know is that a good Software Product Management (SPM) practice pays off. However, not many IT-professionals know how to implement SPM practices in their organization, which causes many companies to not have the proper

  18. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    Science.gov (United States)

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities, or to resolve implementation difficulties.

  19. IHE cross-enterprise document sharing for imaging: interoperability testing software

    Directory of Open Access Journals (Sweden)

    Renaud Bérubé

    2010-09-01

    Full Text Available Abstract Background With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. Results In this paper we describe software that is used to test systems that are involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for an easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities, or to resolve implementation difficulties.

  20. Software testing an ISTQB-BCS certified tester foundation guide

    CERN Document Server

    Hambling, Brian; Samaroo, Angelina; Thompson, Geoff; Williams, Peter; Hambling, Brian

    2015-01-01

    This practical guide provides insight into software testing, explaining the basics of the testing process and how to perform effective tests. It provides an overview of different techniques and how to apply them. It is the best-selling official textbook of the ISTQB-BCS Certified Tester Foundation Level.

  1. Domain-specific functional software testing: A progress report

    Science.gov (United States)

    Nonnenmann, Uwe

    1992-01-01

    Software Engineering is a knowledge intensive activity that involves defining, designing, developing, and maintaining software systems. In order to build effective systems to support Software Engineering activities, Artificial Intelligence techniques are needed. The application of Artificial Intelligence technology to Software Engineering is called Knowledge-based Software Engineering (KBSE). The goal of KBSE is to change the software life cycle such that software maintenance and evolution occur by modifying the specifications and then rederiving the implementation rather than by directly modifying the implementation. The use of domain knowledge in developing KBSE systems is crucial. Our work is mainly related to one area of KBSE that is called automatic specification acquisition. One example is the WATSON prototype on which our current work is based. WATSON is an automatic programming system for formalizing specifications for telephone switching software mainly restricted to POTS, i.e., plain old telephone service. Our current approach differentiates itself from other approaches in two antagonistic ways. On the one hand, we address a large and complex real-world problem instead of a 'toy domain' as in many research prototypes. On the other hand, to allow such scaling, we had to relax the ambitious goal of complete automatic programming, to the easier task of automatic testing.

  2. Mars Science Laboratory Flight Software Boot Robustness Testing Project Report

    Science.gov (United States)

    Roth, Brian

    2011-01-01

    On the surface of Mars, the Mars Science Laboratory will boot up its flight computers every morning, having charged the batteries through the night. This boot process is complicated, critical, and affected by numerous hardware states that can be difficult to test. The hardware test beds do not facilitate long runs of back-to-back unattended automated tests, and although the software simulation has provided the necessary functionality and fidelity for this boot testing, there has not been support for the full flexibility necessary for this task. Therefore, to perform this testing, a framework has been built around the software simulation that supports running automated tests loading a variety of starting configurations for software and hardware states. This implementation has been tested against the nominal cases to validate the methodology, and support for configuring off-nominal cases is ongoing. The implication of this testing is that the introduction of input configurations that have so far proved difficult to test may reveal boot scenarios worth higher fidelity investigation, and in other cases increase confidence in the robustness of the flight software boot process.
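
    The flight framework itself is not described in detail in this record; purely as an illustration, a harness of the kind it mentions, one that loads a set of starting hardware/software configurations into a simulator and runs unattended boot tests back to back, might be structured roughly as follows (all names and states are hypothetical).

```python
# Illustrative sketch of an automated boot-robustness harness (hypothetical API).
import itertools

HW_STATES = ["battery_low", "battery_nominal"]
SW_STATES = ["clean_fs", "dirty_fs"]

def run_boot_simulation(hw_state: str, sw_state: str) -> bool:
    """Stand-in for the real simulator call; returns True if boot succeeded."""
    # A real harness would configure and launch the software simulation here.
    return not (hw_state == "battery_low" and sw_state == "dirty_fs")

def main() -> None:
    results = []
    for hw, sw in itertools.product(HW_STATES, SW_STATES):
        ok = run_boot_simulation(hw, sw)
        results.append((hw, sw, ok))
        print(f"{hw:18s} {sw:10s} -> {'BOOT OK' if ok else 'BOOT FAILED'}")
    failures = [r for r in results if not r[2]]
    print(f"{len(failures)} of {len(results)} configurations need higher-fidelity follow-up")

if __name__ == "__main__":
    main()
```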

  3. Performance testing of 3D point cloud software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems are being used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirements involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data managing are available in the market, including open source suites; however, users often do not know how to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as commercial suites in working set and commit size tests.
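
    A minimal sketch of the kind of measurement such a methodology relies on, timing dataset loading and sampling process memory, is shown below; the load_point_cloud function and the input path are placeholders, and the psutil dependency is an assumption rather than something named in the record.

```python
# Sketch of a loading-time / memory measurement for point-cloud software.
# 'load_point_cloud' and the file path are hypothetical placeholders.
import time
import psutil  # assumed available: pip install psutil

def load_point_cloud(path: str) -> list:
    """Placeholder loader; a real benchmark would drive the suite under test."""
    with open(path, "rb") as f:
        return list(f.read(1_000_000))

def benchmark(path: str) -> None:
    proc = psutil.Process()
    t0 = time.perf_counter()
    cloud = load_point_cloud(path)
    elapsed = time.perf_counter() - t0
    rss_mb = proc.memory_info().rss / 2**20
    print(f"loaded {len(cloud)} items in {elapsed:.2f} s, resident memory {rss_mb:.1f} MiB")

if __name__ == "__main__":
    benchmark("sample_dataset.bin")   # hypothetical input file
```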

  4. Factors to Consider When Implementing Automated Software Testing

    Science.gov (United States)

    2016-11-10

    Therefore, many businesses are automating their software testing in order to save money and improve quality. When considering whether automation...is a viable option, businesses must take several factors into account. The purpose of this document is to illuminate these factors. Software...programming, e.g., Java or Visual Basic.  Subject Matter Experts (SME) with firm grasp of application being automated. 2. Additional costs for setup (e.g

  5. Small-scale fixed wing airplane software verification flight test

    Science.gov (United States)

    Miller, Natasha R.

    The increased demand for micro Unmanned Air Vehicles (UAV) driven by military requirements, commercial use, and academia is creating a need for the ability to quickly and accurately conduct low Reynolds Number aircraft design. There exist several open source software programs that are free or inexpensive that can be used for large scale aircraft design, but few software programs target the realm of low Reynolds Number flight. XFLR5 is an open source, free to download, software program that attempts to take into consideration viscous effects that occur at low Reynolds Number in airfoil design, 3D wing design, and 3D airplane design. An off the shelf, remote control airplane was used as a test bed to model in XFLR5 and then compared to flight test collected data. Flight test focused on the stability modes of the 3D plane, specifically the phugoid mode. Design and execution of the flight tests were accomplished for the RC airplane using methodology from full scale military airplane test procedures. Results from flight test were not conclusive in determining the accuracy of the XFLR5 software program. There were several sources of uncertainty that did not allow for a full analysis of the flight test results. An off the shelf drone autopilot was used as a data collection device for flight testing. The precision and accuracy of the autopilot is unknown. Potential future work should investigate flight test methods for small scale UAV flight.

  6. Software for Automated Testing of Mission-Control Displays

    Science.gov (United States)

    OHagan, Brian

    2004-01-01

    MCC Display Cert Tool is a set of software tools for automated testing of computer-terminal displays in spacecraft mission-control centers, including those of the space shuttle and the International Space Station. This software makes it possible to perform tests that are more thorough, take less time, and are less likely to lead to erroneous results, relative to tests performed manually. This software enables comparison of two sets of displays to report command and telemetry differences, generates test scripts for verifying telemetry and commands, and generates a documentary record containing display information, including version and corrective-maintenance data. At the time of reporting the information for this article, work was continuing to add a capability for validation of display parameters against a reconfiguration file.

  7. System Quality Management in Software Testing Laboratory that Chooses Accreditation

    Directory of Open Access Journals (Sweden)

    Yanet Brito R.

    2013-12-01

    Full Text Available The evaluation of software products will reach full maturity when it is executed under a scheme that provides third-party certification. For the validity of the certification, the independent laboratory must be accredited for that function, using internationally recognized standards. This brings a challenge for the Industrial Laboratory for Testing Software (LIPS), responsible for testing the products developed in the Cuban software industry: to define strategies that will permit it to offer services with a high level of quality. Therefore it is necessary to establish a system of quality management according to NC-ISO/IEC 17025:2006 to continuously improve the operational capacity and technical competence of the laboratory, with a view to future accreditation of the tests performed. This article discusses the process defined in the LIPS for the implementation of a quality management system, based on current standards and trends, as a necessary step towards accreditation of the tests performed.

  8. Parallel symbolic execution for automated real-world software testing

    OpenAIRE

    Bucur, Stefan; Ureche, Vlad; Zamfir, Cristian; Candea, George

    2011-01-01

    This paper introduces Cloud9, a platform for automated testing of real-world software. Our main contribution is the scalable parallelization of symbolic execution on clusters of commodity hardware, to help cope with path explosion. Cloud9 provides a systematic interface for writing "symbolic tests" that concisely specify entire families of inputs and behaviors to be tested, thus improving testing productivity. Cloud9 can handle not only single-threaded programs but also multi-threaded and dis...

  9. Using Knowledge Management to Revise Software-Testing Processes

    Science.gov (United States)

    Nogeste, Kersti; Walker, Derek H. T.

    2006-01-01

    Purpose: This paper aims to use a knowledge management (KM) approach to effectively revise a utility retailer's software testing process. This paper presents a case study of how the utility organisation's customer services IT production support group improved their test planning skills through applying the American Productivity and Quality Center…

  10. TEST (Toxicity Estimation Software Tool) Ver 4.1

    Science.gov (United States)

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  11. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  12. Taking advantage of ground data systems attributes to achieve quality results in testing software

    Science.gov (United States)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  13. Improved ant algorithms for software testing cases generation.

    Science.gov (United States)

    Yang, Shunkun; Man, Tianlong; Xu, Jiaqi

    2014-01-01

    Ant colony optimization (ACO) for software test case generation is a very popular topic in software testing engineering. However, traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and premature convergence. This paper introduces improved ACO variants for software test case generation: an improved local pheromone update strategy, an improved pheromone volatilization coefficient (IPVACO), and an improved global path pheromone update strategy (IGPACO). Finally, we put forward a comprehensive improved ant colony optimization (ACIACO), which combines all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain premature convergence, increase case coverage, and reduce the number of iterations.
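
    The specific improvements (IPVACO, IGPACO, ACIACO) are not spelled out in this record; the sketch below only illustrates the baseline idea of pheromone-guided test input selection toward branch coverage, with hypothetical program branches and a simple evaporation/deposit rule.

```python
# Simplified pheromone-guided test-input selection sketch (not the paper's ACIACO).
import random

BRANCHES = {"neg": lambda x: x < 0, "zero": lambda x: x == 0,
            "small": lambda x: 0 < x <= 10, "large": lambda x: x > 10}
CANDIDATES = list(range(-20, 21))
pheromone = {c: 1.0 for c in CANDIDATES}
RHO, Q = 0.1, 1.0            # evaporation rate and deposit constant (hypothetical)

def select() -> int:
    return random.choices(CANDIDATES, weights=[pheromone[c] for c in CANDIDATES])[0]

covered, suite = set(), []
for _ in range(200):
    x = select()
    newly = {name for name, pred in BRANCHES.items() if pred(x)} - covered
    for c in CANDIDATES:                       # pheromone evaporation
        pheromone[c] *= (1.0 - RHO)
    if newly:                                  # reward inputs that add coverage
        pheromone[x] += Q * len(newly)
        covered |= newly
        suite.append(x)
    if covered == set(BRANCHES):
        break

print("selected test inputs:", suite, "covered branches:", sorted(covered))
```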

  14. Subsystem software for TSTA [Tritium Systems Test Assembly

    International Nuclear Information System (INIS)

    Mann, L.W.; Claborn, G.W.; Nielson, C.W.

    1987-01-01

    The Subsystem Control Software at the Tritium System Test Assembly (TSTA) must control sophisticated chemical processes through the physical operation of valves, motor controllers, gas sampling devices, thermocouples, pressure transducers, and similar devices. Such control software has to be capable of passing stringent quality assurance (QA) criteria to provide for the safe handling of significant amounts of tritium on a routine basis. Since many of the chemical processes and physical components are experimental, the control software has to be flexible enough to allow for trial/error learning curve, but still protect the environment and personnel from exposure to unsafe levels of radiation. The software at TSTA is implemented in several levels as described in a preceding paper in these proceedings. This paper depends on information given in the preceding paper for understanding. The top level is the Subsystem Control level

  15. Advanced Expander Test Bed Program

    Science.gov (United States)

    1991-04-01

    Garbled program milestone listing; recoverable items include: Acceptance Tests; Design Methodology Review; Component Acceptance Tests with Spares; Preliminary Design Review; Engine Assembly and Acceptance Tests; Critical ... The design of the disk and the bearing of both primary and secondary turbines has been revised to accommodate brush seals for reduced leakage. Primary disk ...

  16. Advanced quality prediction model for software architectural knowledge sharing

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris; Tang, Antony; Xu, Lai

    In the field of software architecture, a paradigm shift is occurring from describing the outcome of architecting process to describing the Architectural Knowledge (AK) created and used during architecting. Many AK models have been defined to represent domain concepts and their relationships, and

  17. Potential Errors and Test Assessment in Software Product Line Engineering

    Directory of Open Access Journals (Sweden)

    Hartmut Lackner

    2015-04-01

    Full Text Available Software product lines (SPL) are a method for the development of variant-rich software systems. Compared to non-variable systems, testing SPLs is extensive due to the increasing number of possible products. Different approaches exist for testing SPLs, but there is little research on assessing the quality of these tests by means of error detection capability. Such test assessment is based on error injection into a correct version of the system under test. However, to our knowledge, potential errors in SPL engineering have never been systematically identified before. This article presents an overview of existing paradigms for specifying software product lines and the errors that can occur during the respective specification processes. For assessment of test quality, we apply mutation testing techniques to SPL engineering and implement the identified errors as mutation operators. This allows us to run existing tests against defective products for the purpose of test assessment. From the results, we draw conclusions about the error-proneness of the surveyed SPL design paradigms and how the quality of SPL tests can be improved.
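
    As a generic illustration of the mutation-based assessment the abstract describes (not the authors' SPL-specific operators), the sketch below injects a simple mutation into a product's feature-dependent behaviour and checks whether an existing test suite kills the mutant.

```python
# Generic mutation-testing sketch: mutate a feature-dependent function and
# check whether the test suite detects (kills) the mutant.

def price(base: float, discount_feature: bool) -> float:
    """Original product behaviour: optional discount feature."""
    return base * 0.9 if discount_feature else base

def price_mutant(base: float, discount_feature: bool) -> float:
    """Mutant: the feature condition is negated (a typical mutation operator)."""
    return base * 0.9 if not discount_feature else base

TESTS = [
    (100.0, True, 90.0),    # (base, feature enabled?, expected price)
    (100.0, False, 100.0),
]

def run_suite(impl) -> bool:
    return all(abs(impl(base, feat) - expected) < 1e-9
               for base, feat, expected in TESTS)

if __name__ == "__main__":
    assert run_suite(price), "suite must pass on the original"
    killed = not run_suite(price_mutant)
    print("mutant killed by the suite:", killed)   # True -> the tests are sensitive
```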

  18. When is Crowdsourcing Advantageous? The Case of Crowdsourced Software Testing

    DEFF Research Database (Denmark)

    Leicht, Niklas; Knop, Nicolas; Blohm, Ivo

    2016-01-01

    …we present two case studies from the domain of crowdsourced software testing. We systematically analyze two organizations that applied crowdtesting to test a mobile application. As both organizations tested the application via crowdtesting and their traditional in-house testing, we are able to relate the effectiveness of crowdtesting and the associated costs to the effectiveness and costs of in-house testing. We find that crowdtesting is comparable in terms of testing quality and costs, but provides large advantages in terms of speed, heterogeneity of testers, and user feedback as added value. We contribute …

  19. Michigan Department of Transportation statewide advanced traffic management system (ATMS) procurement evaluation - phase I : software procurement.

    Science.gov (United States)

    2009-04-01

    This project evaluates the process that was followed by MDOT and other stakeholders for the acquisition : of new Advanced Traffic Management System (ATMS) software aiming to integrate and facilitate the : management of various Intelligent Transportat...

  20. Automatic Generation of Basis Component Path Coverage for Software Architecture Testing

    OpenAIRE

    Lijun Lun; Shaoting Wang; Xin Chi; Hui Xu

    2017-01-01

    Architecture-centric development is one of the most promising methods for improving software quality, reducing software cost and raising software productivity. Software architecture research not only focuses on the design phase, but also covers every phase of the software life cycle. Because software architecture has characteristics different from traditional software, conventional testing methods do not apply directly to it. Basis path testing is a very simple and efficient white-box ...
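
    For orientation only (this is standard basis path arithmetic, not the authors' architecture-level algorithm), the number of basis paths of a connected, single-entry, single-exit flow graph equals its cyclomatic complexity V(G) = E - N + 2, as the small sketch below computes for a hypothetical component interaction graph.

```python
# Cyclomatic-complexity sketch for a hypothetical component interaction graph.
# V(G) = E - N + 2 gives the number of basis (linearly independent) paths
# for a connected, single-entry / single-exit graph.

GRAPH = {                      # hypothetical component connectors
    "entry":    ["parser", "cache"],
    "parser":   ["renderer"],
    "cache":    ["renderer"],
    "renderer": ["exit"],
    "exit":     [],
}

def cyclomatic_complexity(graph: dict) -> int:
    nodes = len(graph)
    edges = sum(len(targets) for targets in graph.values())
    return edges - nodes + 2

if __name__ == "__main__":
    v = cyclomatic_complexity(GRAPH)
    print("basis component paths to cover:", v)   # 5 edges - 5 nodes + 2 = 2
```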

  1. Building Software Development Capacity to Advance the State of Educational Technology

    Science.gov (United States)

    Luterbach, Kenneth J.

    2013-01-01

    Educational technologists may advance the state of the field by increasing capacity to develop software tools and instructional applications. Presently, few academic programs in educational technology require even a single computer programming course. Further, the educational technologists who develop software generally work independently or in…

  2. Advanced Control Test Operation (ACTO) facility

    International Nuclear Information System (INIS)

    Ball, S.J.

    1987-01-01

    The Advanced Control Test Operation (ACTO) project, sponsored by the US Department of Energy (DOE), is being developed to enable the latest modern technology, automation, and advanced control methods to be incorporated into nuclear power plants. The facility is proposed as a national multi-user center for advanced control development and testing to be completed in 1991. The facility will support a wide variety of reactor concepts, and will be used by researchers from Oak Ridge National Laboratory (ORNL), plus scientists and engineers from industry, other national laboratories, universities, and utilities. ACTO will also include telecommunication facilities for remote users

  3. Beautiful Testing Leading Professionals Reveal How They Improve Software

    CERN Document Server

    Goucher, Adam

    2009-01-01

    Successful software depends as much on scrupulous testing as it does on solid architecture or elegant code. But testing is not a routine process, it's a constant exploration of methods and an evolution of good ideas. Beautiful Testing offers 23 essays from 27 leading testers and developers that illustrate the qualities and techniques that make testing an art. Through personal anecdotes, you'll learn how each of these professionals developed beautiful ways of testing a wide range of products -- valuable knowledge that you can apply to your own projects. Here's a sample of what you'll find i

  4. Worlds apart: industrial and academic focus areas in software testing

    NARCIS (Netherlands)

    Garousi, V.; Felderer, Michael

    2017-01-01

    To determine how industry and academia approach software testing, researchers compared the titles of presentations from selected conferences in each of the two communities. The results shed light on the root cause of low industry-academia collaboration and led to suggestions on how to improve this

  5. Attributes Effecting Software Testing Estimation; Is Organizational Trust an Issue?

    Science.gov (United States)

    Hammoud, Wissam

    2013-01-01

    This quantitative correlational research explored the potential association between the levels of organizational trust and the software testing estimation. This was conducted by exploring the relationships between organizational trust, tester's expertise, organizational technology used, and the number of hours, number of testers, and time-coding…

  6. JET real-time project test-bench software structure

    Energy Technology Data Exchange (ETDEWEB)

    Cruz, N. (e-mail: nunocruz@lei.fis.uc.pt); Batista, A.J.N.; Alves, D.; Sousa, J.; Varandas, C.A.F. (Associacao EURATOM/IST, Centro de Fusao Nuclear, Av. Rovisco Pais 1, 1049-001 Lisbon, Portugal); Joffrin, E. (Association EURATOM-CEA, CEA-Cadarache, 13108 St. Paul-Lez Durance, France); Felton, R.; Farthing, J.W. (Association UKAEA/Euratom, Culham Science Centre, Abingdon, Oxfordshire, OX14 3DB, United Kingdom)

    2006-07-15

    A new test-bench for the JET real-time project was developed, capable of generating analogue and digital stimulus signals for the control systems under test using previously stored JET pulse data. This platform allows systems to be more thoroughly tested in a wide range of scenarios before going on-line on the JET machine, reducing development and maintenance times and improving system performance and reliability. This paper describes the real-time stimulus generator. Three layers of software were developed to completely control 32 analogue output channels and 32 ATM virtual circuits as a real-time signal generator system. (1) Signal processing on the Digital Signal Processor (DSP): the software directly accesses the programmable control logic, issuing all the necessary commands using a 64-bit control register and 8-channel rate change registers. Real-time data flow from local SDRAM to digital-to-analogue converter (DAC) channel circular buffers is also controlled by the DSP. Interrupt service routines (ISRs) were developed to control software variables as well as DMA data transfers. (2) Signal generation and operation as a Linux application: this layer controls the DSP in a client-server architecture. Its most important functions are: (i) access to the JET database via MDSplus, (ii) data transfer to the local DSP SDRAM, (iii) issuing commands to the DSP state machine hardware controller, (iv) checking DSP and hardware logic block status for errors, and (v) ATM link control. (3) Remote control operation: an HTTP server running CGI scripts receives the remote configuration and commands from the JET operations management software interface and passes them to the high-level Linux software.

  7. Fuzzy Cognitive Map for Software Testing Using Artificial Intelligence Techniques

    OpenAIRE

    Larkman, Deane; Mohammadian, Masoud; Balachandran, Bala; Jentzsch, Ric

    2010-01-01

    This paper discusses a framework to assist test managers to evaluate the use of AI techniques as a potential tool in software testing. Fuzzy Cognitive Maps (FCMs) are employed to evaluate the framework and make decision analysis easier. A what-if analysis is presented that explores the general application of the framework. Simulations are performed to show the effectiveness of the proposed method. The framework proposed is innovative and it assists managers in making e...
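
    The paper's actual concept map and weights are not reproduced in this record; a minimal sketch of how an FCM iteration works in general, with hypothetical concepts and causal weights, is shown below.

```python
# Minimal Fuzzy Cognitive Map iteration sketch (hypothetical concepts and weights).
import math

CONCEPTS = ["AI tool adoption", "test coverage", "defect detection", "testing cost"]
# W[i][j]: causal influence of concept i on concept j, in [-1, 1] (hypothetical).
W = [
    [0.0,  0.6,  0.4, -0.3],
    [0.0,  0.0,  0.7,  0.0],
    [0.0,  0.0,  0.0, -0.2],
    [0.0,  0.0,  0.0,  0.0],
]

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def step(state):
    # A_j(t+1) = f( A_j(t) + sum_i A_i(t) * W[i][j] )
    return [sigmoid(state[j] + sum(state[i] * W[i][j] for i in range(len(state))))
            for j in range(len(state))]

if __name__ == "__main__":
    state = [1.0, 0.5, 0.5, 0.5]        # "what if AI tool adoption is high?"
    for _ in range(10):                  # iterate until activations settle
        state = step(state)
    for name, value in zip(CONCEPTS, state):
        print(f"{name:18s} {value:.3f}")
```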

  8. SLAC FASTBUS Snoop Module: test results and support software

    International Nuclear Information System (INIS)

    Gustavson, D.B.; Walz, H.V.

    1985-09-01

    The development of a diagnostic module for FASTBUS has been completed. The Snoop Module is designed to reside on a Crate Segment and provide high-speed diagnostic monitoring and testing capabilities. Final hardware details and testing of production prototype modules are reported. Features of software under development for a stand-alone single Snoop diagnostic system and Multi-Snoop networks will be discussed. 3 refs., 2 figs

  9. Advancing Software Development for a Multiprocessor System-on-Chip

    Directory of Open Access Journals (Sweden)

    Stephen Bique

    2007-06-01

    Full Text Available A low-level language is the right tool to develop applications for some embedded systems. Notwithstanding, a high-level language provides a proper environment to develop the programming tools. The target device is a system-on-chip consisting of an array of processors with only local communication. Applications include typical streaming applications for digital signal processing. We describe the hardware model and stress the advantages of a flexible device. We introduce IDEA, a graphical integrated development environment for an array. A proper foundation for software development is a UML and standard programming abstractions in object-oriented languages.

  10. A rule-based software test data generator

    Science.gov (United States)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests showing that even the primitive rule-based test data generation prototype is significantly better than random data generation are performed. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
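
    The prototype's rule base for Ada procedures is not given in this record; the sketch below only contrasts purely random data generation with a tiny hypothetical rule base that emits boundary values for an integer parameter, which is the general flavour of the approach.

```python
# Sketch: random vs rule-based test data generation for an integer parameter
# declared to lie in [lo, hi]. The rule base here is hypothetical and tiny.
import random

def random_cases(lo: int, hi: int, n: int):
    return [random.randint(lo, hi) for _ in range(n)]

def rule_based_cases(lo: int, hi: int):
    # Rules: exercise boundaries, just outside the boundaries, zero, and a midpoint.
    candidates = [lo, lo + 1, hi - 1, hi, lo - 1, hi + 1, 0, (lo + hi) // 2]
    return sorted(set(candidates))

if __name__ == "__main__":
    lo, hi = 1, 100
    print("random    :", random_cases(lo, hi, 8))
    print("rule-based:", rule_based_cases(lo, hi))
```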

  11. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    Science.gov (United States)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exist with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in it is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult

  12. Developing Avionics Hardware and Software for Rocket Engine Testing

    Science.gov (United States)

    Aberg, Bryce Robert

    2014-01-01

    My summer was spent working as an intern at Kennedy Space Center in the Propulsion Avionics Branch of the NASA Engineering Directorate Avionics Division. The work that I was involved with was part of Rocket University's Project Neo, a small scale liquid rocket engine test bed. I began by learning about the layout of Neo in order to more fully understand what was required of me. I then developed software in LabView to gather and scale data from two flowmeters and integrated that code into the main control software. Next, I developed more LabView code to control an igniter circuit and integrated that into the main software, as well. Throughout the internship, I performed work that mechanics and technicians would do in order to maintain and assemble the engine.

  13. W-026 acceptance test plan plant control system software (submittal #216)

    Energy Technology Data Exchange (ETDEWEB)

    Watson, T.L., Fluor Daniel Hanford

    1997-02-14

    Acceptance Testing of the WRAP 1 Plant Control System software will be conducted throughout the construction of WRAP 1 with final testing on the glovebox software being completed in December 1996. The software tests will be broken out into five sections; one for each of the four Local Control Units and one for the supervisory software modules. The acceptance test report will contain completed copies of the software tests along with the applicable test log and completed Exception Test Reports.

  14. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies the responsibilities of ASC management and software project teams in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  15. Advanced Mechanical Testing of Sandwich Materials

    DEFF Research Database (Denmark)

    Hayman, Brian; Berggreen, Christian; Jenstrup, Claus

    2008-01-01

    An advanced digital optical system has been used to measure surface strains on sandwich face and core specimens tested in a project concerned with improved criteria for designing sandwich X-joints. The face sheet specimens were of glass reinforced polyester and were tested in tension. The core sp...

  16. System support software for TSTA [Tritium Systems Test Assembly

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-10-01

    The fact that Tritium Systems Test Assembly (TSTA) is an experimental facility makes it impossible and undesirable to try to forecast the exact software requirements. Thus the software had to be written in a manner that would allow modifications without compromising the safety requirements imposed by the handling of tritium. This suggested a multi-level approach to the software. In this approach (much like the ISO network model) each level is isolated from the levels below and above by cleanly defined interfaces. For example, the subsystem support level interfaces with the subsystem hardware through the software support level. Routines in the software support level provide operations like "OPEN VALVE" and "CLOSE VALVE" to the subsystem level. This isolates the subsystem level from the actual hardware. This is advantageous because changes can occur in any level without the need for propagating the change to any other level. The TSTA control system consists of the hardware level, the data conversion level, the operator interface level, and the subsystem process level. These levels are described.
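
    As a rough illustration of the layering described (the valve names and interfaces here are hypothetical, not TSTA's), the software support level can hide hardware details behind operations such as open_valve and close_valve that the subsystem level calls.

```python
# Layered-control sketch: the subsystem level sees only OPEN/CLOSE operations,
# while the support level owns the hardware details. Names are hypothetical.

class HardwareSupportLevel:
    """Software support level: translates logical operations into I/O writes."""
    def __init__(self):
        self._valve_register = {}            # stand-in for memory-mapped I/O

    def open_valve(self, valve_id: str) -> None:
        self._valve_register[valve_id] = 1   # real code would write to hardware
        print(f"[support] valve {valve_id} -> OPEN")

    def close_valve(self, valve_id: str) -> None:
        self._valve_register[valve_id] = 0
        print(f"[support] valve {valve_id} -> CLOSED")

class SubsystemControlLevel:
    """Subsystem level: implements process logic via logical operations only."""
    def __init__(self, support: HardwareSupportLevel):
        self.support = support

    def purge_line(self) -> None:
        self.support.open_valve("PV-101")
        self.support.close_valve("PV-101")

if __name__ == "__main__":
    SubsystemControlLevel(HardwareSupportLevel()).purge_line()
```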

  17. Performance testing of LiDAR exploitation software

    Science.gov (United States)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-04-01

    Mobile LiDAR systems have been used widely in recent years for many applications in the field of geoscience. One of the most important limitations of this technology is the large computational requirements involved in data processing. Several software solutions for data processing are available in the market, but users often do not know the methodologies needed to verify their performance accurately. In this work a methodology for LiDAR software performance testing is presented and six different suites are studied: QT Modeler, AutoCAD Civil 3D, Mars 7, Fledermaus, Carlson and TopoDOT (all of them in x64). The results show that QT Modeler, TopoDOT and AutoCAD Civil 3D allow the loading of large datasets, while Fledermaus, Mars 7 and Carlson do not achieve this level of performance. AutoCAD Civil 3D needs long loading times in comparison with the most powerful suites, such as QT Modeler and TopoDOT. The Carlson suite shows the poorest results among all the software under study: point clouds larger than 5 million points cannot be loaded, and loading time is very long in comparison with the other suites, even for the smaller datasets. AutoCAD Civil 3D, Carlson and TopoDOT use more threads than the other suites, such as QT Modeler, Mars 7 and Fledermaus.

  18. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices

  19. Operability test procedure for TRUSAF assayer software upgrade

    International Nuclear Information System (INIS)

    Cejka, C.C.

    1995-01-01

    This OTP is to be used to ensure the operability of the Transuranic Waste Assay System (TRUWAS). The system was upgraded and requires a retest to assure satisfactory operation. The upgrade consists of an AST 486 computer to replace the IBM-PC/XT, and a software upgrade (CNEUT). The software calculations are performed in the same manner as in the previous system (NEUT); however, the new software is written in C Assembly Language. CNEUT is easier to use and far more powerful than the previous program. The TRUWAS is used to verify the TRU content of waste packages sent for storage in the Transuranic Storage and Assay Facility (TRUSAF). The TRUSAF is part of Westinghouse Hanford's certification program for waste to be shipped to the Waste Isolation Pilot Plant (WIPP) in New Mexico. The Transuranic Waste Assayer uses a combination passive-active neutron interrogation system to determine the TRU content of 55-gallon waste drums. The system consists of a shielded assay chamber; Deuterium-Tritium neutron generator; Helium-3 proportional counters; drum handling system; electronics including preamplifier, amplifier, and discriminator for each of the counter packages; and an AST 486 computer/printer system for data acquisition and analysis. The system can detect down to TRU levels of 10 nCi/g in the waste matrix. The equipment to be tested is: the Assay Chamber Door; the Drum Turntable and Automatic Loading Platform; the Interlocks; the Assayer Software; and the IBM computer/printer software. The objective of the test is to verify that the system is operational with the AST 486 computer, that the software used in the new computer system correctly calculates TRU levels, and that the new computer system is capable of storing and retrieving data.

  20. Integration testing through reusing representative unit test cases for high-confidence medical software.

    Science.gov (United States)

    Shin, Youngsul; Choi, Yunja; Lee, Woo Jin

    2013-06-01

    As medical software is getting larger, more complex, and more connected with other devices, finding faults in integrated software modules gets more difficult and time consuming. Existing integration testing typically takes a black-box approach, which treats the target software as a black box and selects test cases without considering internal behavior of each software module. Though it could be cost-effective, this black-box approach cannot thoroughly test interaction behavior among integrated modules and might leave critical faults undetected, which should not happen in safety-critical systems such as medical software. This work anticipates that information on internal behavior is necessary even for integration testing to define thorough test cases for critical software and proposes a new integration testing method by reusing test cases used for unit testing. The goal is to provide a cost-effective method to detect subtle interaction faults at the integration testing phase by reusing the knowledge obtained from the unit testing phase. The suggested approach notes that the test cases for the unit testing include knowledge on internal behavior of each unit and extracts test cases for the integration testing from the test cases for the unit testing for given test criteria. The extracted representative test cases are connected with functions under test using the state domain, and a single test sequence to cover the test cases is produced. By means of reusing unit test cases, the tester has effective test cases to examine diverse execution paths and find interaction faults without analyzing complex modules. The produced test sequence can have test coverage as high as the unit testing coverage and its length is close to the length of optimal test sequences. Copyright © 2013 Elsevier Ltd. All rights reserved.
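
    The paper's state-domain construction is not detailed in this record; the sketch below only conveys the general idea of reusing representative unit-test inputs and chaining them into one integration test sequence for two hypothetical modules.

```python
# Sketch: reuse representative unit-test cases of two hypothetical modules to
# build one integration test sequence (not the paper's state-domain algorithm).

def sensor_to_mmhg(raw: int) -> float:          # module A under test
    return raw * 0.1

def alarm(pressure_mmhg: float) -> bool:        # module B under test
    return pressure_mmhg > 180.0

# Representative unit-test inputs kept from the unit-testing phase.
UNIT_CASES_A = [0, 900, 1900]          # raw sensor readings
UNIT_CASES_B = [90.0, 180.0, 190.0]    # pressures

def integration_sequence():
    # Chain A's outputs into B, keeping only A inputs whose outputs fall in
    # B's tested domain, so interaction behaviour is exercised deliberately.
    seq = []
    for raw in UNIT_CASES_A:
        p = sensor_to_mmhg(raw)
        if min(UNIT_CASES_B) <= p <= max(UNIT_CASES_B):
            seq.append((raw, p, alarm(p)))
    return seq

if __name__ == "__main__":
    for raw, pressure, alarmed in integration_sequence():
        print(f"raw={raw:5d} -> pressure={pressure:6.1f} mmHg -> alarm={alarmed}")
```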

  1. Performance testing of 3D point cloud software

    Directory of Open Access Journals (Sweden)

    M. Varela-González

    2013-10-01

    Full Text Available LiDAR systems are being used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirements involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data managing are available in the market, including open source suites; however, users often do not know how to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as commercial suites in working set and commit size tests.

  2. Analysis of software testing tools (Analýza software testing nástrojů)

    OpenAIRE

    Žabka, Michael

    2015-01-01

    This thesis deals with the design of a software testing tool comparison process and a follow-up demonstration of the designed comparison process on three selected software testing tools. The designed comparison process is set in the theoretical context of software testing. The main goals of the thesis are the design of a comparison process for test management software testing tools and a demonstration of the designed comparison process on three selected software testing tools. In the theoretical part of the thes...

  3. The Witch Navigator - A Low Cost GNSS Software Receiver for Advanced Processing Techniques

    Directory of Open Access Journals (Sweden)

    O. Jakubov

    2010-12-01

    Full Text Available The development of advanced GNSS signal processing algorithms such as multi-constellation, multi-frequency and multi-antenna navigation requires an easily reprogrammable software defined radio solution. Various receiver architectures for this purpose have been introduced. An RF front-end with FPGA universal correlators on an ExpressCard connected directly to a PC was selected and manufactured. Such a unique hardware combination provides GNSS researchers and engineers with a great convenience of writing the signal processing algorithms, including tracking, acquisition and positioning, in the Linux application programming interface, and enables them to reconfigure the RF front-end easily by the PC program. With more of these ExpressCards connected to the PC, the number of RF channels, correlators or antennas can be increased to further boost the computational power. This paper reveals the implementation aspects of the receiver, named the Witch Navigator, and gives the key test results.

  4. cit: hypothesis testing software for mediation analysis in genomic applications.

    Science.gov (United States)

    Millstein, Joshua; Chen, Gary K; Breton, Carrie V

    2016-08-01

    The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C ++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). joshua.millstein@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  6. Advanced Stirling Convertor Testing at GRC

    Science.gov (United States)

    Schifer, Nick; Oriti, Salvatore M.

    2013-01-01

    NASA Glenn Research Center (GRC) has been supporting development of the Advanced Stirling Radioisotope Generator (ASRG) since 2006. A key element of the ASRG project is providing life, reliability, and performance testing of the Advanced Stirling Convertor (ASC). The latest version of the ASC, deemed ASC-E3, is of a design identical to the forthcoming flight convertors. The first pair of ASC-E3 units was delivered in December 2012. GRC has begun the process of adding these units to the catalog of ongoing Stirling convertor operation. This process includes performance verification, which examines the data from various tests to validate the convertor's performance against the product specification.

  7. Software Considerations for Subscale Flight Testing of Experimental Control Laws

    Science.gov (United States)

    Murch, Austin M.; Cox, David E.; Cunningham, Kevin

    2009-01-01

    The NASA AirSTAR system has been designed to address the challenges associated with safe and efficient subscale flight testing of research control laws in adverse flight conditions. In this paper, software elements of this system are described, with an emphasis on components which allow for rapid prototyping and deployment of aircraft control laws. Through model-based design and automatic coding a common code-base is used for desktop analysis, piloted simulation and real-time flight control. The flight control system provides the ability to rapidly integrate and test multiple research control laws and to emulate component or sensor failures. Integrated integrity monitoring systems provide aircraft structural load protection, isolate the system from control algorithm failures, and monitor the health of telemetry streams. Finally, issues associated with software configuration management and code modularity are briefly discussed.

  8. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    Energy Technology Data Exchange (ETDEWEB)

    Chu, Tsong-Lun [Brookhaven National Lab. (BNL), Upton, NY (United States); Varuttamaseni, Athi [Brookhaven National Lab. (BNL), Upton, NY (United States); Baek, Joo-Seok [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-11-01

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
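
    As a rough illustration of why statistical testing for reliability claims demands very large numbers of tests, the standard zero-failure binomial bound can be sketched as follows. This is a minimal sketch only; the BNL approach summarized above involves considerably more, such as construction of an operational profile and treatment of test-oracle issues.

        def failure_prob_upper_bound(n_tests, confidence=0.95):
            # Upper confidence bound on the per-demand failure probability after
            # n_tests statistically representative tests with zero observed failures:
            # solve (1 - p)**n_tests = 1 - confidence for p.
            alpha = 1.0 - confidence
            return 1.0 - alpha ** (1.0 / n_tests)

        # About 60,000 failure-free demands support a claim of p < ~5e-5 at 95% confidence.
        print(failure_prob_upper_bound(60000))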

  9. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    International Nuclear Information System (INIS)

    Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok

    2016-01-01

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).

  10. ARTENOLIS: Automated Reproducibility and Testing Environment for Licensed Software

    OpenAIRE

    Heirendt, Laurent; Arreckx, Sylvain; Trefois, Christophe; Yarosz, Yohan; Vyas, Maharshi; Satagopam, Venkata P.; Schneider, Reinhard; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-01

    Motivation: Automatically testing changes to code is an essential feature of continuous integration. For open-source code, without licensed dependencies, a variety of continuous integration services exist. The COnstraint-Based Reconstruction and Analysis (COBRA) Toolbox is a suite of open-source code for computational modelling with dependencies on licensed software. A novel automated framework of continuous integration in a semi-licensed environment is required for the development of the COB...

  11. Parallel-Processing Test Bed For Simulation Software

    Science.gov (United States)

    Blech, Richard; Cole, Gary; Townsend, Scott

    1996-01-01

    Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-the-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).

  12. Finite test sets development method for test execution of safety critical software

    International Nuclear Information System (INIS)

    Shin, Sung Min; Kim, Hee Eun; Kang, Hyun Gook; Lee, Sung Jiun

    2014-01-01

    The V and V method has been utilized for this safety critical software, while SRGM has difficulties because of the lack of failure occurrence data in the development phase. For the safety critical software, however, failure data cannot be gathered after installation in a real plant when we consider the severe consequences. Therefore, to complement the V and V method, a test-based method needs to be developed. Some studies on test-based reliability quantification methods for safety critical software have been conducted in the nuclear field. These studies provide useful guidance on generating test sets. An important concept of the guidance is that the test sets represent 'trajectories' (a series of successive values for the input variables of a program that occur during the operation of the software over time) in the space of inputs to the software. Actually, the inputs to the software depend on the state of the plant at that time, and these inputs form a new internal state of the software by changing the values of some variables. In other words, the internal state of the software at a specific time depends on the history of past inputs. Here the internal state of the software which can be changed by past inputs is termed the Context of Software (CoS). In a certain CoS, a software failure occurs when a fault is triggered by some inputs. To cover the failure occurrence mechanism of the software, preceding research insists that the inputs should take a trajectory form. However, in this approach, there are two critical problems. One is the length of the trajectory input. The input trajectory should be long enough to cover the failure mechanism, but how long is long enough is not clear. What is worse, to cover some accident scenarios, one set of inputs should represent dozens of hours of successive values. The other problem is the number of tests needed. To satisfy a target reliability with a reasonable confidence level, a very large number of test sets is required. Development of this number of test sets is a herculean
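
    To make the trajectory idea concrete, the following sketch generates trajectory-style test inputs as a constrained random walk over plant variables. It is purely illustrative: the variable names, ranges and random-walk model are assumptions, not the method of the study above.

        import random

        def generate_trajectory(n_steps, variables, step_scale=0.05, seed=None):
            # variables: dict of name -> (low, high) physical range (hypothetical values).
            # Each scan cycle perturbs the previous plant state slightly, so the
            # resulting input sequence carries history, like a CoS-forming trajectory.
            rng = random.Random(seed)
            state = {name: rng.uniform(lo, hi) for name, (lo, hi) in variables.items()}
            trajectory = []
            for _ in range(n_steps):
                for name, (lo, hi) in variables.items():
                    state[name] += rng.gauss(0.0, step_scale * (hi - lo))
                    state[name] = min(max(state[name], lo), hi)
                trajectory.append(dict(state))
            return trajectory

        # One hour of 1-second scan cycles for two hypothetical plant variables;
        # accident scenarios spanning dozens of hours multiply this length accordingly.
        plant_vars = {"pressure_MPa": (0.1, 15.5), "temperature_C": (20.0, 350.0)}
        trajectory = generate_trajectory(n_steps=3600, variables=plant_vars, seed=1)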

  13. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities

    Science.gov (United States)

    Hebert, Phillip W., Sr.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Hughes, Mark S.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plum Brook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which makes the software application layers transparent to the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis development and deployment.

  14. LHCb - Automated Testing Infrastructure for the Software Framework Gaudi

    CERN Multimedia

    Clemencic, M

    2009-01-01

    An extensive test suite is the first step towards the delivery of robust software, but it is not always easy to implement, especially in projects with many developers. An easy-to-use and flexible infrastructure for writing and executing the tests reduces the work each developer has to do to instrument his packages with tests. At the same time, the infrastructure gives the same look and feel to the tests and allows automated execution of the test suite. For Gaudi, we decided to develop the testing infrastructure on top of the free tool QMTest, already used in the LCG Application Area for the routine tests run in the nightly build system. The high flexibility of QMTest allowed us to integrate it in the Gaudi package structure. A specialized test class and some utility functions have been developed to simplify the definition of a test for a Gaudi-based application. Thanks to the testing infrastructure described here, we managed to quickly extend the standard Gaudi test suite and add tests to the main LHCb appli...

  15. CONFU: Configuration Fuzzing Testing Framework for Software Vulnerability Detection.

    Science.gov (United States)

    Dai, Huning; Murphy, Christian; Kaiser, Gail

    2010-01-01

    Many software security vulnerabilities only reveal themselves under certain conditions, i.e., particular configurations and inputs together with a certain runtime environment. One approach to detecting these vulnerabilities is fuzz testing. However, typical fuzz testing makes no guarantees regarding the syntactic and semantic validity of the input, or of how much of the input space will be explored. To address these problems, we present a new testing methodology called Configuration Fuzzing. Configuration Fuzzing is a technique whereby the configuration of the running application is mutated at certain execution points, in order to check for vulnerabilities that only arise in certain conditions. As the application runs in the deployment environment, this testing technique continuously fuzzes the configuration and checks "security invariants" that, if violated, indicate a vulnerability. We discuss the approach and introduce a prototype framework called ConFu (CONfiguration FUzzing testing framework) for implementation. We also present the results of case studies that demonstrate the approach's feasibility and evaluate its performance.
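
    The core loop of configuration fuzzing is small enough to sketch. The mutation choices, configuration keys and invariant below are hypothetical examples, not ConFu's actual implementation.

        import random

        def fuzz_configuration(base_config, mutation_rate=0.3, rng=None):
            # Return a mutated copy of the configuration, flipping booleans and
            # injecting boundary integers and hostile strings with some probability.
            rng = rng or random.Random()
            mutated = dict(base_config)
            for key, value in base_config.items():
                if rng.random() >= mutation_rate:
                    continue
                if isinstance(value, bool):
                    mutated[key] = not value
                elif isinstance(value, int):
                    mutated[key] = rng.choice([0, -1, value + 1, 2**31 - 1])
                elif isinstance(value, str):
                    mutated[key] = value + rng.choice(["", "'; --", "A" * 1024])
            return mutated

        def security_invariant_holds(app_state):
            # Hypothetical invariant: a session must never be authenticated while
            # guest mode is enabled; a violation signals a potential vulnerability.
            return not (app_state.get("authenticated") and app_state.get("guest_mode"))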

  16. Unit Testing for the Application Control Language (ACL) Software

    Science.gov (United States)

    Heinich, Christina Marie

    2014-01-01

    In the software development process, code needs to be tested before it can be packaged for release, in order to make sure the program actually does what it is supposed to do, as well as to check how the program deals with errors and edge cases (such as negative or very large numbers). One of the major parts of the testing process is unit testing, where you test specific units of the code to make sure each individual part of the code works. This project is about unit testing many different components of the ACL software and fixing any errors encountered. To do this, mocks of other objects need to be created and every line of code needs to be exercised to make sure every case is accounted for. Mocks are important because they give direct control of the environment the unit lives in, instead of attempting to work with the entire program. This makes it easier to achieve the second goal of exercising every line of code.
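
    A minimal example of the mocking pattern described here, written with Python's standard unittest and unittest.mock rather than the (unspecified) language of the ACL software; the unit under test and the hardware client are invented for illustration.

        import unittest
        from unittest import mock

        def format_status(client):
            # Hypothetical unit under test: formats a status line from a hardware client.
            reading = client.read_temperature()      # external dependency to be mocked
            if reading is None:
                raise ValueError("no reading available")
            return f"temperature={reading:.1f}C"

        class FormatStatusTest(unittest.TestCase):
            def test_formats_reading(self):
                client = mock.Mock()
                client.read_temperature.return_value = 21.46
                self.assertEqual(format_status(client), "temperature=21.5C")
                client.read_temperature.assert_called_once()

            def test_missing_reading_raises(self):
                client = mock.Mock()
                client.read_temperature.return_value = None
                with self.assertRaises(ValueError):
                    format_status(client)

        if __name__ == "__main__":
            unittest.main()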

  17. Advanced Demonstration and Test Reactor Options Study

    Energy Technology Data Exchange (ETDEWEB)

    Petti, David Andrew [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hill, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Gehin, J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Heidet, F. [Argonne National Lab. (ANL), Argonne, IL (United States); Kinsey, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Grandy, Christopher [Argonne National Lab. (ANL), Argonne, IL (United States); Qualls, A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoffman, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Croson, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-01-01

    Global efforts to address climate change will require large-scale decarbonization of energy production in the United States and elsewhere. Nuclear power already provides 20% of electricity production in the United States (U.S.) and is increasing in countries undergoing rapid growth around the world. Because reliable, grid-stabilizing, low emission electricity generation, energy security, and energy resource diversity will be increasingly valued, nuclear power’s share of electricity production has a potential to grow. In addition, there are non electricity applications (e.g., process heat, desalination, hydrogen production) that could be better served by advanced nuclear systems. Thus, the timely development, demonstration, and commercialization of advanced nuclear reactors could diversify the nuclear technologies available and offer attractive technology options to expand the impact of nuclear energy for electricity generation and non-electricity missions. The purpose of this planning study is to provide transparent and defensible technology options for a test and/or demonstration reactor(s) to be built to support public policy, innovation and long term commercialization within the context of the Department of Energy’s (DOE’s) broader commitment to pursuing an “all of the above” clean energy strategy and associated time lines. This planning study includes identification of the key features and timing needed for advanced test or demonstration reactors to support research, development, and technology demonstration leading to the commercialization of power plants built upon these advanced reactor platforms. This planning study is consistent with the Congressional language contained within the fiscal year 2015 appropriation that directed the DOE to conduct a planning study to evaluate “advanced reactor technology options, capabilities, and requirements within the context of national needs and public policy to support innovation in nuclear energy

  18. Integrating Testing into Software Engineering Courses Supported by a Collaborative Learning Environment

    Science.gov (United States)

    Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.

    2014-01-01

    As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…

  19. Automatically generated acceptance test: A software reliability experiment

    Science.gov (United States)

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
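
    The idea of an acceptance test driven purely by empirical data on internal program states can be sketched generically. This is an illustration of the concept only, not the experiment's actual detector; the envelope-learning rule and tolerance are assumptions.

        def learn_state_envelope(passing_runs):
            # passing_runs: list of runs; each run is a list of {variable: value}
            # snapshots recorded from executions known to be correct.
            envelope = {}
            for run in passing_runs:
                for snapshot in run:
                    for var, value in snapshot.items():
                        lo, hi = envelope.get(var, (value, value))
                        envelope[var] = (min(lo, value), max(hi, value))
            return envelope

        def acceptance_test(run, envelope, tolerance=0.0):
            # Reject the run if any observed internal state leaves the learned envelope.
            for snapshot in run:
                for var, value in snapshot.items():
                    lo, hi = envelope.get(var, (float("-inf"), float("inf")))
                    if value < lo - tolerance or value > hi + tolerance:
                        return False
            return True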

  20. Test documentation for the GENII Software Version 1.485

    International Nuclear Information System (INIS)

    Rittmann, P.D.

    1994-01-01

    Version 1.485 of the GENII software was released by the PNL GENII custodian in December of 1990. At that time the WHC GENII custodian performed several tests to verify that the advertised revisions were indeed present and that these changes had not introduced errors in the calculations normally done by WHC. These tests were not documented at that time. The purpose of this document is to summarize suitable acceptance tests of GENII and compare them with a few hand calculations. The testing is not as thorough as that used by the PNL GENII Custodian, but is sufficient to establish that the GENII program appears to work correctly on WHC managed personal computers

  1. Analysis of Testing and Operational Software Reliability in SRGM based on NHPP

    OpenAIRE

    S. Thirumurugan; D. R. Prince Williams

    2007-01-01

    Software reliability is one of the key factors in the software development process. Software reliability is estimated using reliability models based on the Non-Homogeneous Poisson Process (NHPP). In most of the literature, software reliability is predicted only for the testing phase, which can lead to wrong decision making. In this paper, two software reliability settings, the testing phase and the operational phase, are studied in detail. Using the S-Shaped Software Reliability Growth Model (SRGM) and the Exponential SRGM,...
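
    For reference, the two NHPP mean value functions named in this record have simple closed forms, sketched below. The parameter values in the example are made up; a is the expected total number of faults and b a detection-rate parameter.

        import math

        def m_exponential(t, a, b):
            # Goel-Okumoto (exponential) SRGM mean value function.
            return a * (1.0 - math.exp(-b * t))

        def m_s_shaped(t, a, b):
            # Delayed S-shaped SRGM mean value function.
            return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

        def conditional_reliability(t, x, m, **params):
            # Probability of no failure in (t, t + x] given failure behaviour up to t.
            return math.exp(-(m(t + x, **params) - m(t, **params)))

        # Illustrative numbers only: 120 expected faults, detection rate 0.02 per unit time.
        print(conditional_reliability(t=100.0, x=10.0, m=m_exponential, a=120.0, b=0.02))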

  2. Advanced Instrumentation for Transient Reactor Testing

    Energy Technology Data Exchange (ETDEWEB)

    Corradini, Michael L.; Anderson, Mark; Imel, George; Blue, Tom; Roberts, Jeremy; Davis, Kurt

    2018-01-31

    Transient testing involves placing fuel or material into the core of specialized materials test reactors that are capable of simulating a range of design basis accidents, including reactivity insertion accidents, that require the reactor to produce short bursts of intense, high-power neutron flux and gamma radiation. Testing fuel behavior in a prototypic neutron environment under high-power, accident-simulation conditions is a key step in licensing nuclear fuels for use in existing and future nuclear power plants. Transient testing of nuclear fuels is needed to develop and prove the safety basis for advanced reactors and fuels. In addition, modern fuel development and design increasingly relies on modeling and simulation efforts that must be informed and validated using specially designed material performance separate effects studies. These studies will require experimental facilities that are able to support variable-scale, highly instrumented tests providing data that have appropriate spatial and temporal resolution. Finally, there are efforts now underway to develop advanced light water reactor (LWR) fuels with enhanced performance and accident tolerance. These advanced reactor designs will also require new fuel types. These new fuels need to be tested in a controlled environment in order to learn how they respond to accident conditions. For these applications, transient reactor testing is needed to help design fuels with improved performance. In order to maximize the value of transient testing, there is a need for in-situ, real-time transient imaging technology (e.g., a neutron detection and imaging system like the hodoscope) to see fuel motion during rapid transient excursions with a higher degree of spatial and temporal resolution and accuracy. There also exists a need for new small, compact local sensors and instrumentation that are capable of collecting data during transients (e.g., local displacements, temperatures, thermal conductivity, neutron flux, etc.).

  3. Development of remote control software for multiformat test signal generator

    Directory of Open Access Journals (Sweden)

    Gao Yang

    2017-01-01

    Full Text Available The multi-format test signal generator discussed in this paper is the video signal generator TG8000 produced by Tektronix. The paper introduces the remote control function for the signal generator: how to connect a computer to the instrument and how to control it remotely. The computer is connected to the instrument through the 10/100/1000 BASE-T port on the rear panel of the TG8000, and a program transmits SCPI (Standard Commands for Programmable Instruments) commands to control the TG8000. The application runs on the Windows operating system; the programming language is C#, the development environment is Microsoft Visual Studio 2010, and communication uses the TCP/IP protocol over sockets. The remote control method follows the TGSetup application developed by Tektronix. The paper gives a brief summary of the basic principle, describes in detail the development of the remote control software and how to use it, and finally discusses the advantages of this software compared with TGSetup.
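
    The SCPI-over-TCP/IP approach described above can be sketched in a few lines. The sketch below uses Python sockets rather than the paper's C# implementation, and the IP address and port number are placeholders; the TG8000 programmer manual defines the actual connection parameters and command set.

        import socket

        INSTRUMENT_IP = "192.168.1.100"   # placeholder address
        SCPI_PORT = 5000                  # placeholder port; check the instrument manual

        def send_scpi(command, expect_reply=False, timeout=2.0):
            # Send one SCPI command string, terminated by a newline, over a raw TCP socket.
            with socket.create_connection((INSTRUMENT_IP, SCPI_PORT), timeout=timeout) as sock:
                sock.sendall((command + "\n").encode("ascii"))
                if expect_reply:
                    return sock.recv(4096).decode("ascii").strip()
                return None

        # *IDN? is the standard IEEE-488.2 identification query supported by SCPI instruments.
        print(send_scpi("*IDN?", expect_reply=True))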

  4. Launch Control System Software Development System Automation Testing

    Science.gov (United States)

    Hwang, Andrew

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next generation manned rocket currently in development. This system requires high quality testing that will measure and test the capabilities of the system. For the past two years, the Exploration and Operations Division at Kennedy Space Center (KSC) has assigned a group including interns and full-time engineers to develop automated tests to save the project time and money. The team worked on automating the testing process for the SCCS GUI that would use streamed simulated data from the testing servers to produce data, plots, statuses, etc. to the GUI. The software used to develop automated tests included an automated testing framework and an automation library. The automated testing framework has a tabular-style syntax, which means the functionality of a line of code must have the appropriate number of tabs for the line to function as intended. The header section contains either paths to custom resources or the names of libraries being used. The automation library contains functionality to automate anything that appears on a desired screen with the use of image recognition software to detect and control GUI components. The data section contains any data values strictly created for the current testing file. The body section holds the tests that are being run. The function section can include any number of functions that may be used by the current testing file or any other file that resources it. The resources and body section are required for all test files; the data and function sections can be left empty if the data values and functions being used are from a resourced library or another file. To help equip the automation team with better tools, the Project Lead of the Automated Testing Team, Jason Kapusta, assigned the task to install and train an optical character recognition (OCR

  5. Future Transient Testing of Advanced Fuels

    Energy Technology Data Exchange (ETDEWEB)

    Jon Carmack

    2009-09-01

    The transient in-reactor fuels testing workshop was held on May 4–5, 2009 at Idaho National Laboratory. The purpose of this meeting was to provide a forum where technical experts in transient testing of nuclear fuels could meet directly with technical instrumentation experts and nuclear fuel modeling and simulation experts to discuss needed advancements in transient testing to support a basic understanding of nuclear fuel behavior under off-normal conditions. The workshop was attended by representatives from Commissariat à l'Énergie Atomique CEA, Japanese Atomic Energy Agency (JAEA), Department of Energy (DOE), AREVA, General Electric – Global Nuclear Fuels (GE-GNF), Westinghouse, Electric Power Research Institute (EPRI), universities, and several DOE national laboratories. Transient testing of fuels and materials generates information required for advanced fuels in future nuclear power plants. Future nuclear power plants will rely heavily on advanced computer modeling and simulation that describes fuel behavior under off-normal conditions. TREAT is an ideal facility for this testing because of its flexibility, proven operation and material condition. The opportunity exists to develop advanced instrumentation and data collection that can support modeling and simulation needs much better than was possible in the past. In order to take advantage of these opportunities, test programs must be carefully designed to yield basic information to support modeling before conducting integral performance tests. An early start of TREAT and operation at low power would provide significant dividends in training, development of instrumentation, and checkout of reactor systems. Early start of TREAT (2015) is needed to support the requirements of potential users of TREAT and include the testing of full length fuel irradiated in the FFTF reactor. The capabilities provided by TREAT are needed for the development of nuclear power and the following benefits will be realized by

  6. Future Transient Testing of Advanced Fuels

    International Nuclear Information System (INIS)

    Carmack, Jon

    2009-01-01

    The transient in-reactor fuels testing workshop was held on May 4-5, 2009 at Idaho National Laboratory. The purpose of this meeting was to provide a forum where technical experts in transient testing of nuclear fuels could meet directly with technical instrumentation experts and nuclear fuel modeling and simulation experts to discuss needed advancements in transient testing to support a basic understanding of nuclear fuel behavior under off-normal conditions. The workshop was attended by representatives from Commissariat energie Atomique CEA, Japanese Atomic Energy Agency (JAEA), Department of Energy (DOE), AREVA, General Electric - Global Nuclear Fuels (GE-GNF), Westinghouse, Electric Power Research Institute (EPRI), universities, and several DOE national laboratories. Transient testing of fuels and materials generates information required for advanced fuels in future nuclear power plants. Future nuclear power plants will rely heavily on advanced computer modeling and simulation that describes fuel behavior under off-normal conditions. TREAT is an ideal facility for this testing because of its flexibility, proven operation and material condition. The opportunity exists to develop advanced instrumentation and data collection that can support modeling and simulation needs much better than was possible in the past. In order to take advantage of these opportunities, test programs must be carefully designed to yield basic information to support modeling before conducting integral performance tests. An early start of TREAT and operation at low power would provide significant dividends in training, development of instrumentation, and checkout of reactor systems. Early start of TREAT (2015) is needed to support the requirements of potential users of TREAT and include the testing of full length fuel irradiated in the FFTF reactor. The capabilities provided by TREAT are needed for the development of nuclear power and the following benefits will be realized by the

  7. The predictive information obtained by testing multiple software versions

    Science.gov (United States)

    Lee, Larry D.

    1987-01-01

    Multiversion programming is a redundancy approach to developing highly reliable software. In applications of this method, two or more versions of a program are developed independently by different programmers and the versions are combined to form a redundant system. One variation of this approach consists of developing a set of n program versions and testing the versions to predict the failure probability of a particular program or a system formed from a subset of the programs. The precision that might be obtained, and also the effect of programmer variability if predictions are made over repetitions of the process of generating different program versions, are examined.
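
    A small sketch of the kind of calculation this record studies: run several independently developed versions against a common set of test inputs with an oracle, then estimate per-version failure probabilities and the coincident-failure rate of a majority-voting system built from them. The data layout and names are invented for illustration.

        def per_version_failure_rates(results):
            # results[version] is a list of booleans, True where that version
            # failed (disagreed with the oracle) on the corresponding test input.
            return {v: sum(fails) / len(fails) for v, fails in results.items()}

        def majority_failure_rate(results, panel):
            # Fraction of inputs on which a majority of the chosen versions fail
            # together; this captures coincident failures that an independence
            # assumption would miss.
            n_inputs = len(next(iter(results.values())))
            need = len(panel) // 2 + 1
            bad = sum(1 for i in range(n_inputs)
                      if sum(results[v][i] for v in panel) >= need)
            return bad / n_inputs

        # Toy data for three versions over five inputs.
        results = {"v1": [False, True, False, False, False],
                   "v2": [False, True, False, True, False],
                   "v3": [False, False, False, False, False]}
        print(per_version_failure_rates(results))
        print(majority_failure_rate(results, panel=["v1", "v2", "v3"]))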

  8. Software design for the Tritium System Test Assembly

    International Nuclear Information System (INIS)

    Claborn, G.W.; Heaphy, R.T.; Lewis, P.S.; Mann, L.W.; Nielson, C.W.

    1983-01-01

    The control system for the Tritium Systems Test Assembly (TSTA) must execute complicated algorithms for the control of several sophisticated subsystems. It must implement this control with requirements for easy modifiability, for high availability, and provide stringent protection for personnel and the environment. Software techniques used to deal with these requirements are described, including modularization based on the structure of the physical systems, a two-level hierarchy of concurrency, a dynamically modifiable man-machine interface, and a specification and documentation language based on a computerized form of structured flowcharts

  9. What We Know about Software Test Maturity and Test Process Improvement

    NARCIS (Netherlands)

    Garousi, Vahid; Felderer, Michael; Hacaloglu, Tuna

    2018-01-01

    In many companies, software testing practices and processes are far from mature and are usually conducted in an ad hoc fashion. Such immature practices lead to negative outcomes - for example, testing that doesn't detect all the defects or that incurs cost and schedule overruns. To conduct test

  10. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  11. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  12. Instrumentation to Enhance Advanced Test Reactor Irradiations

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Rempe; D. L. Knudson; K. G. Condie; J. E. Daw; S. C. Taylor

    2009-09-01

    The Department of Energy (DOE) designated the Advanced Test Reactor (ATR) as a National Scientific User Facility (NSUF) in April 2007 to support U.S. leadership in nuclear science and technology. By attracting new research users - universities, laboratories, and industry - the ATR will support basic and applied nuclear research and development, further advancing the nation's energy security needs. A key component of the ATR NSUF effort is to prove new in-pile instrumentation techniques that are capable of providing real-time measurements of key parameters during irradiation. To address this need, an assessment of instrumentation available and under-development at other test reactors has been completed. Based on this review, recommendations are made with respect to what instrumentation is needed at the ATR and a strategy has been developed for obtaining these sensors. Progress toward implementing this strategy is reported in this document. It is anticipated that this report will be updated on an annual basis.

  13. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the "Computer Software Quality Assurance Requirements". The purpose of this document is to report on the results of the software qualification

  14. Safety review on unit testing of safety system software of nuclear power plant

    International Nuclear Information System (INIS)

    Liu Le; Zhang Qi

    2013-01-01

    Software unit testing has an important place in the testing of safety system software of nuclear power plants, and in the wider scope of the verification and validation. It is a comprehensive, systematic process, and its documentation shall meet the related requirements. When reviewing software unit testing, attention should be paid to the coverage of software safety requirements, the coverage of software internal structure, and the independence of the work. (authors)

  15. Standardization of Tests for Advanced Composites

    OpenAIRE

    石川, 隆司; ISHIKAWA, Takashi; 野口, 義男; NOGUCHI, Yoshio; 濱口, 泰正; HAMAGUCHI, Yasumasa

    2003-01-01

    Advanced composites are essentially the only feasible materials for the construction of newly developed aerospace vehicle. However, the path to be followed for the validation, evaluation and certification of composite aircraft structures is quite different from that of traditional metallic aircraft structures, and the importance of a composites database is now well recognized. A key issue in constructing a fully descriptive composites database is to establish standard composite test methods, ...

  16. Mastering Kali Linux for advanced penetration testing

    CERN Document Server

    Beggs, Robert W

    2014-01-01

    This book provides an overview of the kill chain approach to penetration testing, and then focuses on using Kali Linux to provide examples of how this methodology is applied in the real world. After describing the underlying concepts, step-by-step examples are provided that use selected tools to demonstrate the techniques. If you are an IT professional or a security consultant who wants to maximize the success of your network testing using some of the advanced features of Kali Linux, then this book is for you. This book will teach you how to become an expert in the pre-engagement, management,

  17. Stroop test software. The Tastiva proposal (Software para pruebas Stroop. La propuesta de Tastiva)

    Directory of Open Access Journals (Sweden)

    María Claudia Scurtu

    2016-08-01

    Full Text Available There has been a great deal of research on emotional information processing within the field of clinical psychology. Many tests have been developed, and the emotional Stroop test is one of the most widely used. However, some versions of the Stroop test have methodological issues when used to study word-colour interferences, especially when the words are emotionally charged. We present a computer-assisted version of the emotional Stroop test called Tastiva, which is highly versatile, useful, and accessible, in addition to being easy to use and widely applicable. The Tastiva software and User Manual are available on the University of Seville website: http://grupo.us.es/recursos/Tastiva/index.htm. We also present a case study using neutral and sexual content words, in which the program calculates the word exposure time by analysing the behaviour of the respondent. One of its novel contributions is the graphic presentation of measures: response time, errors, and non-response to stimuli.

  18. Freight Advanced Traveler Information System (FRATIS) Dallas-Fort Worth : software architecture design and implementation options.

    Science.gov (United States)

    2013-05-01

    This document describes the Software Architecture Design and Implementation Options for the FRATIS system. The demonstration component of this task will serve to test the technical feasibility of the FRATIS prototype while also facilitating the colle...

  19. Model-based testing of software product lines

    OpenAIRE

    Metzger, Andreas; Pohl, Klaus; Reis, Sacha; Reuys, A

    2006-01-01

    peer-reviewed Due to the rising demand for individualised software products and software-intensive systems (e.g., mobile phone or automotive software), organizations are faced with the challenge of providing a diversity of software systems at low costs, in short time, and with high quality. Software product line engineering is the approach for tackling this challenge and has proven its effectiveness in numerous industrial success stories, including Siemens, ABB, Boeing, Hewlett-Packard, Phili...

  20. STEM - software test and evaluation methods. A study of failure dependency in diverse software

    International Nuclear Information System (INIS)

    Bishop, P.G.; Pullen, F.D.

    1989-02-01

    STEM is a collaborative software reliability project undertaken in partnership with Halden Reactor Project, UKAEA, and the Finnish Technical Research Centre. The objective of STEM is to evaluate a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report presents a study of the observed failure dependencies between faults in diversely produced software. (author)

  1. Advanced Demonstration and Test Reactor Options Study

    International Nuclear Information System (INIS)

    Petti, David Andrew; Hill, R.; Gehin, J.; Gougar, Hans David; Strydom, Gerhard; Heidet, F.; Kinsey, J.; Grandy, Christopher; Qualls, A.; Brown, Nicholas; Powers, J.; Hoffman, E.; Croson, D.

    2017-01-01

    Global efforts to address climate change will require large-scale decarbonization of energy production in the United States and elsewhere. Nuclear power already provides 20% of electricity production in the United States (U.S.) and is increasing in countries undergoing rapid growth around the world. Because reliable, grid-stabilizing, low emission electricity generation, energy security, and energy resource diversity will be increasingly valued, nuclear power's share of electricity production has a potential to grow. In addition, there are non electricity applications (e.g., process heat, desalination, hydrogen production) that could be better served by advanced nuclear systems. Thus, the timely development, demonstration, and commercialization of advanced nuclear reactors could diversify the nuclear technologies available and offer attractive technology options to expand the impact of nuclear energy for electricity generation and non-electricity missions. The purpose of this planning study is to provide transparent and defensible technology options for a test and/or demonstration reactor(s) to be built to support public policy, innovation and long term commercialization within the context of the Department of Energy's (DOE's) broader commitment to pursuing an 'all of the above' clean energy strategy and associated time lines. This planning study includes identification of the key features and timing needed for advanced test or demonstration reactors to support research, development, and technology demonstration leading to the commercialization of power plants built upon these advanced reactor platforms. This planning study is consistent with the Congressional language contained within the fiscal year 2015 appropriation that directed the DOE to conduct a planning study to evaluate 'advanced reactor technology options, capabilities, and requirements within the context of national needs and public policy to support innovation in nuclear

  2. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    International Nuclear Information System (INIS)

    Tsiflikas, Ilias; Biermann, Christina; Thomas, Christoph; Ketelsen, Dominik; Claussen, Claus D.; Heuschmid, Martin

    2012-01-01

    Objectives: The aim of this study was to evaluate time efficiency and diagnostic reproducibility of an advanced vessel analysis software for diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good respectively very good sensitivity values without significant differences between both reading methods. Furthermore, there was a very good correlation between both readers for both reading methods without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: Advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in assessment of carotid artery stenosis. Furthermore a significant time saving in comparison to standard image interpretation is achievable

  3. Irradiation Facilities at the Advanced Test Reactor

    International Nuclear Information System (INIS)

    S. Blaine Grover

    2005-01-01

    The Advanced Test Reactor (ATR) is the third generation and largest test reactor built in the Reactor Technology Complex (RTC) (formerly known as the Test Reactor Area), located at the Idaho National Laboratory (INL), to study the effects of intense neutron and gamma radiation on reactor materials and fuels. The RTC was established in the early 1950s with the development of the Materials Testing Reactor (MTR), which operated until 1970. The second major reactor was the Engineering Test Reactor (ETR), which operated from 1957 to 1981, and finally the ATR, which began operation in 1967 and will continue operation well into the future. These reactors have produced a significant portion of the world's data on materials response to reactor environments. The wide range of experiment facilities in the ATR and the unique ability to vary the neutron flux in different areas of the core allow numerous experiment conditions to co-exist during the same reactor operating cycle. Simple experiments may involve a non-instrumented capsule containing test specimens with no real-time monitoring or control capabilities. More sophisticated testing facilities include inert gas temperature control systems and pressurized water loops that have continuous chemistry, pressure, temperature, and flow control as well as numerous test specimen monitoring capabilities. There are also apparatus that allow for the simulation of reactor transients on test specimens

  4. Advanced Test Reactor National Scientific User Facility

    Energy Technology Data Exchange (ETDEWEB)

    Frances M. Marshall; Jeff Benson; Mary Catherine Thelen

    2011-08-01

    The Advanced Test Reactor (ATR), at the Idaho National Laboratory (INL), is a large test reactor for providing the capability for studying the effects of intense neutron and gamma radiation on reactor materials and fuels. The ATR is a pressurized, light-water, high flux test reactor with a maximum operating power of 250 MWth. The INL also has several hot cells and other laboratories in which irradiated material can be examined to study material irradiation effects. In 2007 the US Department of Energy (DOE) designated the ATR as a National Scientific User Facility (NSUF) to facilitate greater access to the ATR and the associated INL laboratories for material testing research by a broader user community. This paper highlights the ATR NSUF research program and the associated educational initiatives.

  5. The Testing Strategy for the Embedded Software implemented in I/O module of KNICS PLC

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Gyun; Park, Won Man; Lee, Dong Young [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2006-07-01

    The safety grade PLC (POSAFE) is being developed in the Korea Nuclear Instrumentation and Control System (KNICS) R and D project. The PLC is being designed to satisfy Safety Class 1E, Quality Class 1, and Seismic Category I. The embedded software implemented in I/O modules such as the pIAOS and pOAOS is being developed according to the safety critical software life cycle. The software developed according to this life cycle is tested for verification and validation by an independent software testing team. This paper describes the software testing strategy used to effectively find the faults that may exist in the software design and code.

  6. Distributed Sensor Network Software Development Testing through Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Sean M. [Univ. of New Mexico, Albuquerque, NM (United States)

    2003-12-01

    The distributed sensor network (DSN) presents a novel and highly complex computing platform with difficulties and opportunities that are just beginning to be explored. The potential of sensor networks extends from monitoring for threat reduction, to conducting instant and remote inventories, to ecological surveys. Developing and testing for robust and scalable applications is currently practiced almost exclusively in hardware. The Distributed Sensors Simulator (DSS) is an infrastructure that allows the user to debug and test software for DSNs independent of hardware constraints. The flexibility of DSS allows developers and researchers to investigate topological, phenomenological, networking, robustness and scaling issues, to explore arbitrary algorithms for distributed sensors, and to defeat those algorithms through simulated failure. The user specifies the topology, the environment, the application, and any number of arbitrary failures; DSS provides the virtual environmental embedding.

  7. Irradiation facilities at the advanced test reactor

    International Nuclear Information System (INIS)

    Grover, Blaine S.

    2006-01-01

    The Advanced Test Reactor (ATR) is the third generation and largest test reactor built in the Reactor Technology Complex (RTC - formerly known as the Test Reactor Area), located at the Idaho National Laboratory (INL), to study the effects of intense neutron and gamma radiation on reactor materials and fuels. The RTC was established in the early 1950's with the development of the Materials Testing Reactor (MTR), which operated until 1970. The second major reactor was the Engineering Test Reactor (ETR), which operated from 1957 to 1981, and finally the ATR, which began operation in 1967 and will continue operation well into the future. These reactors have produced a significant portion of the world's data on materials response to reactor environments. The wide range of experiment facilities in the ATR and the unique ability to vary the neutron flux in different areas of the core allow numerous experiment conditions to co-exist during the same reactor operating cycle. Simple experiments may involve a non-instrumented capsule containing test specimens with no real-time monitoring or control capabilities. More sophisticated testing facilities include inert gas temperature control systems and pressurized water loops that have continuous chemistry, pressure, temperature, and flow control as well as numerous test specimen monitoring capabilities. There are also apparatus that allow for the simulation of reactor transients on test specimens. The paper has the following contents: ATR description and capabilities; ATR operations, quality and safety requirements; Static capsule experiments; Lead experiments; Irradiation test vehicle; In-pile loop experiments; Gas test loop; Future testing; Support facilities at RTC; Conclusions. To summarize, the ATR has a long history in fuel and material irradiations, and will be fulfilling a critical role in the future fuel and material testing necessary to develop the next generation reactor systems and advanced fuel cycles. The

  8. Prediction of safety critical software operational reliability from test reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1999-01-01

    It has been a critical issue to predict safety critical software reliability in the nuclear engineering area. For many years, much research has focused on the quantification of software reliability, and many models have been developed to quantify software reliability. Most software reliability models estimate the reliability from the failure data collected during testing, assuming that the test environment represents the operational profile well. The user's interest, however, is in the operational reliability rather than the test reliability. Experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment from testing to operation, testing environment factors comprising the aging factor and the coverage factor are developed in this paper and used to predict the ultimate operational reliability from the failure data of the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig

  9. Finite test sets development method for test execution of safety critical software

    International Nuclear Information System (INIS)

    El-Bordany Ayman; Yun, Won Young

    2014-01-01

    It reads inputs, computes new states, and updates outputs for each scan cycle. The Korea Nuclear Instrumentation and Control System (KNICS) project has recently developed a fully digitalized Reactor Protection System (RPS) based on PLD. As a digital system, this RPS is equipped with dedicated software. The reliability of this software is crucial to NPP safety, where its malfunction may cause irreversible consequences and affect the whole system as a Common Cause Failure (CCF). To guarantee the reliability of the whole system, the reliability of this software needs to be quantified. There are three representative methods for software reliability quantification, namely the Verification and Validation (V and V) quality-based method, the Software Reliability Growth Model (SRGM), and the test-based method. An important concept of the guidance is that the test sets represent 'trajectories' (a series of successive values for the input variables of a program that occur during the operation of the software over time) in the space of inputs to the software. Actually, the inputs to the software depend on the state of the plant at that time, and these inputs form a new internal state of the software by changing the values of some variables. In other words, the internal state of the software at a specific time depends on the history of past inputs. Here the internal state of the software which can be changed by past inputs is termed the Context of Software (CoS). In a certain CoS, a software failure occurs when a fault is triggered by some inputs. To cover the failure occurrence mechanism of the software, preceding research insists that the inputs should take a trajectory form. However, in this approach, there are two critical problems. One is the length of the trajectory input. The input trajectory should be long enough to cover the failure mechanism, but how long is long enough is not clear. What is worse, to cover some accident scenarios, one set of inputs should represent dozens of hours of successive values

  10. Thermal modelling of Advanced LIGO test masses

    International Nuclear Information System (INIS)

    Wang, H; Dovale Álvarez, M; Mow-Lowry, C M; Freise, A; Blair, C; Brooks, A; Kasprzack, M F; Ramette, J; Meyers, P M; Kaufer, S; O’Reilly, B

    2017-01-01

    High-reflectivity fused silica mirrors are at the epicentre of today’s advanced gravitational wave detectors. In these detectors, the mirrors interact with high power laser beams. As a result of finite absorption in the high reflectivity coatings the mirrors suffer from a variety of thermal effects that impact on the detectors’ performance. We propose a model of the Advanced LIGO mirrors that introduces an empirical term to account for the radiative heat transfer between the mirror and its surroundings. The mechanical mode frequency is used as a probe for the overall temperature of the mirror. The thermal transient after power build-up in the optical cavities is used to refine and test the model. The model provides a coating absorption estimate of 1.5–2.0 ppm and estimates that 0.3 to 1.3 ppm of the circulating light is scattered onto the ring heater. (paper)

  11. Software for Preprocessing Data from Rocket-Engine Tests

    Science.gov (United States)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
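    As a rough illustration of the kind of conversion EUGEN performs, the sketch below applies per-channel calibration coefficients to raw voltages; the polynomial form, channel names, and values are assumptions, since the actual SSC calibration file format is not described in the record.

```python
# Minimal sketch of converting raw sensor voltages to engineering units with
# per-channel calibration coefficients. The polynomial form and channel names
# are assumptions for illustration; the actual SSC file formats differ.

calibration = {
    "chamber_pressure": [0.0, 250.0],        # EU = 0.0 + 250.0 * volts (linear)
    "fuel_temperature": [-50.0, 20.0, 0.1],  # EU = -50 + 20*V + 0.1*V^2
}

def to_engineering_units(channel: str, volts: float) -> float:
    """Evaluate the calibration polynomial for one channel."""
    coeffs = calibration[channel]
    return sum(c * volts ** power for power, c in enumerate(coeffs))

print(to_engineering_units("chamber_pressure", 3.2))   # 800.0
print(to_engineering_units("fuel_temperature", 4.0))   # 31.6
```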

  12. Software for Displaying High-Frequency Test Data

    Science.gov (United States)

    Elmore, Jason L.

    2003-01-01

    An easy-to-use, intuitive computer program was written to satisfy a need of test operators and data requestors to quickly view and manipulate high-frequency test data recorded at the East and West Test Areas at Marshall Space Flight Center. By enabling rapid analysis, this program makes it possible to reduce times between test runs, thereby potentially reducing the overall cost of test operations. The program can be used to perform quick frequency analysis, using multiple fast-Fourier-transform windowing and amplitude options. The program can generate amplitude-versus-time plots with full zoom capabilities, frequency-component plots at specified time intervals, and waterfall plots (plots of spectral intensity versus frequency at successive small time intervals, showing the changing frequency components over time). There are options for printing of the plots and saving plot data as text files that can be imported into other application programs. The program can perform all of the aforementioned plotting and plot-data-handling functions on a relatively inexpensive computer; other software that performs the same functions requires computers with large amounts of power and memory.
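    The windowed amplitude-spectrum computation at the heart of such a tool can be sketched as follows; the sampling rate, window choice, and test signal are made up for illustration.

```python
# Minimal sketch of the kind of windowed FFT amplitude analysis described above.
# The sampling rate and test signal are made up for illustration.
import numpy as np

fs = 10_000.0                       # samples per second (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
signal = 2.0 * np.sin(2 * np.pi * 1_200 * t) + 0.5 * np.sin(2 * np.pi * 3_400 * t)

window = np.hanning(len(signal))                  # one of several window options
spectrum = np.fft.rfft(signal * window)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
# Scale to an approximate single-sided amplitude, compensating for the window gain.
amplitude = 2.0 * np.abs(spectrum) / np.sum(window)

peak = freqs[np.argmax(amplitude)]
print(f"dominant component near {peak:.0f} Hz")   # ~1200 Hz
```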

  13. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  14. Forensics of Software Copyright Infringement Crimes: The Modern POSAR Test Juxtaposed With the Dated AFC Test

    Directory of Open Access Journals (Sweden)

    Vinod Polpaya Bhattathiripad

    2014-09-01

    Full Text Available This paper presents a new development in the forensics of software copyright through a juxtaposed comparison between the proven AFC test and the recent POSAR test, the two forensic procedures for establishing software copyright infringement cases. First, the paper separately overviews the 3-stage, linear sequential AFC test and then the 5-phase, cyclic POSAR test (as AFC’s logical extension). The paper then compares the processes involved in each of the 5 phases of the POSAR test with the processes involved in the 3 stages of the AFC test, for the benefit of forensic practitioners and researchers. Finally, the paper discusses some common areas where both tests will need careful handling when implementing them in judiciaries across the world.

  15. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    Science.gov (United States)

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  16. Advanced integrated battery testing and simulation

    Science.gov (United States)

    Liaw, Bor Yann; Bethune, Keith P.; Yang, Xiao Guang

    The recent rapid expansion in the use of portable electronics, computers, personal data assistants, cellular phones, power tools, and even electric and hybrid vehicles creates a strong demand for deployment of battery technologies at an unprecedented rate. To facilitate such development, integrated battery testing and simulation (IBTS) using computer modeling is an effective tool for improving our capability for rapid prototyping of battery technology and for facilitating concurrent product development. In this paper, we present a state-of-the-art approach to using IBTS for improvements in battery cell design, operation optimization, and even charge control for advanced batteries.

  17. Autonomous Cryogenics Loading Operations Simulation Software: Knowledgebase Autonomous Test Engineer

    Science.gov (United States)

    Wehner, Walter S., Jr.

    2013-01-01

    Working on the ACLO (Autonomous Cryogenics Loading Operations) project, I have had the opportunity to add functionality to the physics simulation software known as KATE (Knowledgebase Autonomous Test Engineer), to create a new application allowing WYSIWYG (what-you-see-is-what-you-get) creation of KATE schematic files, and to begin a preliminary design and implementation of a new subsystem that will provide vision services on the IHM (Integrated Health Management) bus. The functionality I added to KATE over the past few months includes a dynamic visual representation of the fluid height in a pipe, based on the number of gallons of fluid in the pipe, and an implementation of the IHM bus connection within KATE. I also fixed a broken feature in the system called the Browser Display, implemented many bug fixes, and made changes to the GUI (Graphical User Interface).

  18. The advanced test reactor strategic evaluation program

    International Nuclear Information System (INIS)

    Buescher, B.J.

    1989-01-01

    Since the Chernobyl accident, the safety of test reactors and irradiation facilities has been critically evaluated from the public's point of view. A systematic evaluation of all safety, environmental, and operational issues must be made in an integrated manner to prioritize actions to maximize benefits while minimizing costs. Such a proactive program has been initiated at the Advanced Test Reactor (ATR). This program, called the Strategic Evaluation Program (STEP), is being conducted for the ATR to provide integrated safety and operational reviews of the reactor against the standards applied to licensed commercial power reactors. It takes into consideration the lessons learned by the US Nuclear Regulatory Commission (NRC) in its Systematic Evaluation Program (SEP) and the follow-on effort known as the Integrated Safety Assessment Program (ISAP). The SEP was initiated by the NRC to review the designs of older operating nuclear power plants to confirm and document their safety. The ATR STEP objectives are discussed

  19. Center for Technology for Advanced Scientific Component Software (TASCS)

    Energy Technology Data Exchange (ETDEWEB)

    Kostadin, Damevski [Virginia State Univ., Petersburg, VA (United States)

    2015-01-25

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  20. Center for Technology for Advanced Scientific Component Software (TASCS) Consolidated Progress Report July 2006 - March 2009

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, D E; McInnes, L C; Govindaraju, M; Bramley, R; Epperly, T; Kohl, J A; Nieplocha, J; Armstrong, R; Shasharina, S; Sussman, A L; Sottile, M; Damevski, K

    2009-04-14

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  1. Introduction to Lean Canvas Transformation Models and Metrics in Software Testing

    Directory of Open Access Journals (Sweden)

    Nidagundi Padmaraj

    2016-05-01

    Full Text Available Software now plays a key role in all fields, from simple to cutting-edge technologies, and most technology devices run on software. Software development verification and validation have become very important for producing high-quality software that meets business stakeholder requirements. Different software development methodologies have given a new dimension to software testing. In traditional waterfall development, software testing comes toward the end and begins with resource planning; a test plan is designed and test criteria are defined for acceptance testing. In this process most of the test plan is well documented, which makes it time-consuming. For modern software development methodologies such as agile, where long test processes and heavy documentation are not followed strictly because of the short iterations of development and testing, lean canvas transformation models can be a solution. This paper explores the possibilities of adopting lean transformation models and metrics in the software test plan to simplify the test process and to make further use of these test metrics on the canvas.

  2. A software program for the head impulse testing device (HITD).

    Science.gov (United States)

    Bohler, A; Mandala, M; Ramat, S

    2010-01-01

    The vestibulo-ocular reflex (VOR) uses head angular acceleration information transduced by the semicircular canals in the inner ear in order to drive eye movements that compensate for head rotations, and thus stabilize the visual scene on the retina. Peripheral and central vestibular pathologies may impair the function of the VOR, so that compensation becomes incomplete, making clear vision during head movement impossible. Powerful adaptive mechanisms quickly allow the central nervous system to use residual vestibular information or information provided through other senses to supplement the deficient VOR. Such recovery makes clinical diagnosis difficult with classical testing techniques, yet the head impulse test can reveal vestibular deficits even in adapted patients. A compensatory saccade at the end of the head movement is the clinical sign of a vestibular deficit, and may be spotted by the experienced clinician. Here we describe the rationale and the software program driving a new computerized technique for reliably assessing vestibular function at different head angular accelerations, based on evaluating the patient's ability to read a character on the screen while the head is being rotated.

  3. Automated Testing Techniques for Event-Driven and Dynamically Typed Software Applications

    DEFF Research Database (Denmark)

    Adamsen, Christoffer Quist

    2018-01-01

    Software testing is the process of executing a software application on a set of inputs, and determining if the application behaves as intended on these inputs. This thesis focuses on testing techniques for event-driven and dynamically typed applications. This is an important class of software, wh...

  4. Advances in software development for intelligent interfaces for alarm and emergency management consoles

    International Nuclear Information System (INIS)

    Moseley, M.R.; Olson, C.E.

    1986-01-01

    Recent advances in technology allow features like voice synthesis, voice and speech recognition, image understanding, and intelligent data base management to be incorporated in computer driven alarm and emergency management information systems. New software development environments make it possible to do rapid prototyping of custom applications. Three examples using these technologies are discussed. (1) Maximum use is made of high-speed graphics and voice synthesis to implement a state-of-the-art alarm processing and display system with features that make the operator-machine interface efficient and accurate. Although very functional, this system is not portable or flexible; the software would have to be substantially rewritten for other applications. (2) An application generator which has the capability of ''building'' a specific alarm processing and display application in a matter of a few hours, using the site definition developed in the security planning phase to produce the custom application. This package is based on a standardized choice of hardware, within which it is capable of building a system to order, automatically constructing graphics, data tables, alarm prioritization rules, and interfaces to peripherals. (3) A software tool, the User Interface Management System (UIMS), is described which permits rapid prototyping of human-machine interfaces for a variety of applications including emergency management, alarm display and process information display. The object-oriented software of the UIMS achieves rapid prototyping of a new interface by standardizing to a class library of software objects instead of hardware objects

  5. The development of test software for the inadequate core cooling monitoring system

    International Nuclear Information System (INIS)

    Lee, Soon Sung.

    1996-06-01

    The test software, including the ICCMS simulator, which is necessary for dynamic testing of the ICCMS software in a PWR, is developed. The developed dynamic test software consists of the module test simulator, the integration test simulator, and the test result analyser. The simulator was programmed in C according to the same algorithm requirements as the FORTRAN version of the ICCMS software, and also serves the Factory Acceptance Test (FAT). The simulator can also be used as a training tool for reactor operators and as a system development tool for performance improvement. (author). 4 tabs., 8 figs., 11 refs

  6. Development of a test rig and its application for validation and reliability testing of safety-critical software

    International Nuclear Information System (INIS)

    Thai, N.D.; McDonald, A.M.

    1995-01-01

    This paper describes a versatile test rig developed by AECL for functional testing of safety-critical software used in the process trip computers of the Wolsong CANDU stations. The description covers the hardware and software aspects of the test rig, the test language and its interpreter, and other major testing software utilities such as the test oracle, sampler and profiler. The paper also discusses the application of the rig in the final stages of testing of the process trip computer software, namely validation and reliability tests. It shows how random test cases are generated, test scripts prepared and automatically run on the test rig. The versatility of the rig is further demonstrated in other types of testing such as sub-system tests, verification of the test oracle, testing of newly developed test scripts, self-test and calibration. (author). 5 tabs., 10 figs
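    The random test generation and oracle comparison mentioned above can be pictured with a minimal sketch; the trip logic, input ranges, and oracle below are hypothetical stand-ins rather than the Wolsong process trip computer behaviour.

```python
# Minimal sketch of random testing against an oracle, in the spirit of the
# validation and reliability tests described above. The trip logic, input
# ranges, and oracle here are hypothetical stand-ins.
import random

def trip_computer(pressure: float, temperature: float) -> bool:
    """Implementation under test (stand-in)."""
    return pressure > 120.0 or temperature > 310.0

def oracle(pressure: float, temperature: float) -> bool:
    """Independently derived expected behaviour (stand-in)."""
    return not (pressure <= 120.0 and temperature <= 310.0)

random.seed(42)
failures = 0
for _ in range(10_000):                       # randomly generated test cases
    p = random.uniform(0.0, 200.0)
    t = random.uniform(250.0, 400.0)
    if trip_computer(p, t) != oracle(p, t):   # compare against the oracle
        failures += 1

print(f"{failures} discrepancies in 10000 random cases")
```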

  7. Software test and validation of wireless sensor nodes used in nuclear power plant

    International Nuclear Information System (INIS)

    Deng Changjian; Chen Dongyi; Zhang Heng

    2015-01-01

    Software test and validation of wireless sensor nodes is one of the key approaches to improving or guaranteeing the reliability of wireless network applications in nuclear power plants (NPPs). First, to validate the software tests, some concepts are defined quantitatively, for example software robustness, software reliability, and software security. Then the development tools and simulators of discrete-event-driven operating systems are compared, in order to present a software test approach for robustness, reliability, and security based on input-output functions. Some simple preliminary test results are given to show that different development software can obtain almost the same measurement and communication results, although software for special applications may differ from that for normal applications. (author)

  8. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  9. Advanced burner test reactor preconceptual design report.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y. I.; Finck, P. J.; Grandy, C.; Cahalan, J.; Deitrich, L.; Dunn, F.; Fallin, D.; Farmer, M.; Fanning, T.; Kim, T.; Krajtl, L.; Lomperski, S.; Moisseytsev, A.; Momozaki, Y.; Sienicki, J.; Park, Y.; Tang, Y.; Reed, C.; Tzanos, C; Wiedmeyer, S.; Yang, W.; Chikazawa, Y.; JAEA

    2008-12-16

    The goals of the Global Nuclear Energy Partnership (GNEP) are to expand the use of nuclear energy to meet increasing global energy demand, to address nuclear waste management concerns and to promote non-proliferation. Implementation of the GNEP requires development and demonstration of three major technologies: (1) Light water reactor (LWR) spent fuel separations technologies that will recover transuranics to be recycled for fuel but not separate plutonium from other transuranics, thereby providing proliferation-resistance; (2) Advanced Burner Reactors (ABRs) based on a fast spectrum that transmute the recycled transuranics to produce energy while also reducing the long term radiotoxicity and decay heat loading in the repository; and (3) Fast reactor fuel recycling technologies to recover and refabricate the transuranics for repeated recycling in the fast reactor system. The primary mission of the ABR Program is to demonstrate the transmutation of transuranics recovered from the LWR spent fuel, and hence the benefits of the fuel cycle closure to nuclear waste management. The transmutation, or burning of the transuranics is accomplished by fissioning and this is most effectively done in a fast spectrum. In the thermal spectrum of commercial LWRs, some transuranics capture neutrons and become even heavier transuranics rather than being fissioned. Even with repeated recycling, only about 30% can be transmuted, which is an intrinsic limitation of all thermal spectrum reactors. Only in a fast spectrum can all transuranics be effectively fissioned to eliminate their long-term radiotoxicity and decay heat. The Advanced Burner Test Reactor (ABTR) is the first step in demonstrating the transmutation technologies. It directly supports development of a prototype full-scale Advanced Burner Reactor, which would be followed by commercial deployment of ABRs. The primary objectives of the ABTR are: (1) To demonstrate reactor-based transmutation of transuranics as part of an

  10. A Web-Based Learning System for Software Test Professionals

    Science.gov (United States)

    Wang, Minhong; Jia, Haiyang; Sugumaran, V.; Ran, Weijia; Liao, Jian

    2011-01-01

    Fierce competition, globalization, and technology innovation have forced software companies to search for new ways to improve competitive advantage. Web-based learning is increasingly being used by software companies as an emergent approach for enhancing the skills of knowledge workers. However, the current practice of Web-based learning is…

  11. Testing tool for software concerning nuclear power plant safety

    International Nuclear Information System (INIS)

    Boulc'h, J.; Le Meur, M.; Collart, J.M.; Segalard, J.; Uberschlag, J.

    1984-11-01

    In the present case, the software to be analyzed is written entirely in assembler language. This paper presents the study and realization of a tool to analyze software that plays an important role in nuclear reactor protection and safeguard: principles of the tool design, working principle, realization and evolution of the dynamic analysis tool [fr

  12. Some Observations on Object Oriented Software Testing Effort ...

    African Journals Online (AJOL)

    Estimation and measurement are an important field of software engineering. There are several factors in the real world that can be measured directly, but few if any software engineering attributes or tasks are so simple that they can be measured directly. Due to the development of the OO paradigm, it has also become cumbersome to ...

  13. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
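    A minimal sketch of the whole-building baseline modeling idea described above is given below: regress baseline-period energy use on outdoor temperature, then estimate savings as the weather-adjusted baseline minus metered use. The quadratic model form and all data are made up for illustration.

```python
# Minimal sketch of a whole-building baseline model of the kind evaluated above:
# regress pre-period energy use on outdoor temperature, then estimate savings as
# (predicted baseline - metered use) in the post period. All numbers are made up.
import numpy as np

# Pre-retrofit ("baseline") period: daily mean outdoor temperature and energy use.
temp_pre = np.array([5.0, 8.0, 12.0, 18.0, 24.0, 30.0])
kwh_pre = np.array([980.0, 920.0, 860.0, 800.0, 880.0, 960.0])

# Simple quadratic-in-temperature baseline (a stand-in for change-point models).
X = np.column_stack([np.ones_like(temp_pre), temp_pre, temp_pre ** 2])
coeffs, *_ = np.linalg.lstsq(X, kwh_pre, rcond=None)

def baseline_kwh(temp):
    return coeffs[0] + coeffs[1] * temp + coeffs[2] * temp ** 2

# Post-retrofit period: compare metered use against the weather-adjusted baseline.
temp_post = np.array([6.0, 15.0, 28.0])
kwh_post = np.array([890.0, 730.0, 850.0])
savings = baseline_kwh(temp_post) - kwh_post
print(f"estimated savings: {savings.sum():.0f} kWh over {len(kwh_post)} days")
```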

  14. Investigating Advances in the Acquisition of Secure Systems Based on Open Architecture, Open Source Software, and Software Product Lines

    Science.gov (United States)

    2012-01-27

    software system acquisition within the DoD, whether focusing on SPLs (Bergey & Jones, 2010; Guertin & Clements, 2010), or on how to improve software system...Engineering (SEKE2011), Miami, FL. Bergey, J., & Jones, L. (2010). Exploring acquisition strategies for adopting a software product line approach. In

  15. Advanced test reactor testing experience-past, present and future

    International Nuclear Information System (INIS)

    Marshall, Frances M.

    2006-01-01

    The Advanced Test Reactor (ATR), at the Idaho National Laboratory (INL), is one of the world's premier test reactors for providing the capability for studying the effects of intense neutron and gamma radiation on reactor materials and fuels. The physical configuration of the ATR, a 4-leaf clover shape, allows the reactor to be operated at different power levels in the corner 'lobes' to allow for different testing conditions for multiple simultaneous experiments. The combination of high flux (maximum thermal neutron fluxes of 1E15 neutrons per square centimeter per second and maximum fast [E>1.0 MeV] neutron fluxes of 5E14 neutrons per square centimeter per second) and large test volumes (up to 122 cm long and 12.7 cm diameter) provide unique testing opportunities. The current experiments in the ATR are for a variety of test sponsors - US government, foreign governments, private researchers, and commercial companies needing neutron irradiation services. There are three basic types of test configurations in the ATR. The simplest configuration is the sealed static capsule, which places the capsule in direct contact with the primary coolant. The next level of experiment complexity is an instrumented lead experiment, which allows for active control of experiment conditions during the irradiation. The most complex experiment is the pressurized water loop, in which the test sample can be subjected to the exact environment of a pressurized water reactor. For future research, some ATR modifications and enhancements are currently planned. This paper provides more details on some of the ATR capabilities, key design features, experiments, and future plans

  16. Balancing technical and regulatory concerns related to testing and control of performance assessment software

    International Nuclear Information System (INIS)

    Seitz, R.R.; Matthews, S.D.; Kostelnik, K.M.

    1990-01-01

    What activities are required to assure that a performance assessment (PA) computer code operates as it is intended? Answers to this question will vary depending on the individual's area of expertise. Different perspectives on testing and control of PA software are discussed based on interpretations of the testing and control process associated with the different parties involved. This discussion leads into the presentation of a general approach to software testing and control that addresses regulatory requirements. Finally, the need for balance between regulatory and scientific concerns is illustrated through lessons learned in previous implementations of software testing and control programs. Configuration control and software testing are required to provide assurance that a computer code performs as intended. Configuration control provides traceability and reproducibility of results produced with PA software and provides a system to assure that users have access to the current version of the software. Software testing is conducted to assure that the computer code has been written properly, solution techniques have been properly implemented, and the software is capable of representing the behavior of the specific system to be modeled. Comprehensive software testing includes software analysis, verification testing, benchmark testing, and site-specific calibration/validation testing

  17. Software Sub-system in Loading Automatic Test System for the Measurement of Power Line Filters

    Directory of Open Access Journals (Sweden)

    Yu Bo

    2017-01-01

    Full Text Available A loading automatic test system for the measurement of power line filters is in urgent demand, so the software sub-system of the whole test system was proposed. Methods: the test system was structured on a virtual instrument framework consisting of a lower and an upper computer, and a top-down design approach was adopted for the system and its modules, according to the measurement principle of the test system. Results: the software sub-system, including the human-machine interface, data analysis and processing software, expert system, communication software, and control software in the lower computer, was designed and integrated into the entire test system. Conclusion: this sub-system provides a friendly software platform for the whole test system, with strong functionality, high performance, and low cost. It not only raises the test efficiency of EMI filter measurement but also introduces some innovations.

  18. The Utility of Free Software for Gravity and Magnetic Advanced Data Processing

    Science.gov (United States)

    Grandis, Hendra; Dahrin, Darharta

    2017-04-01

    The lack of computational tools, i.e. software, often hinders the proper teaching and application of geophysical data processing in academic institutions in Indonesia. Although there are academic licensing options for commercial software, such options are still well beyond the financial capability of some academic institutions. Academic community members (both lecturers and students) are supposed to be creative and resourceful to overcome such a situation. Therefore, capability for writing computer programs or codes is a necessity. However, there are also many computer programs and even software packages that are freely available on the internet. Generally, the utility of the freely distributed software is limited to demonstration or to visualizing and exchanging data. The paper discusses the utility of Geosoft’s Oasis Montaj Viewer along with USGS GX programs that are available for free. Useful gravity and magnetic advanced data processing (i.e. gradient calculation, spectral analysis etc.) can be performed “correctly” without any approximation that sometimes leads to dubious results and interpretation.

  19. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  20. Software Unit Testing during the Development of Digital Reactor Protection System of HTR-PM

    International Nuclear Information System (INIS)

    Guo Chao; Xiong Huasheng; Li Duo; Zhou Shuqiao; Li Jianghai

    2014-01-01

    The Reactor Protection System (RPS) of the High Temperature Gas-Cooled Reactor - Pebble-bed Module (HTR-PM) is the first digital RPS designed and to be operated in a Nuclear Power Plant (NPP) in China, and its development process has received a lot of attention around the world. As a 1E-level safety system, the RPS has to be designed and developed following a series of nuclear laws and technical disciplines, including software verification and validation (software V&V). The software V&V process demonstrates whether all stages of the software development are performed correctly, completely, accurately, and consistently, and whether the results of each stage are testable. Software testing is one of the most significant and time-consuming efforts during software V&V. In this paper, we give a comprehensive introduction to software unit testing during the development of the RPS for HTR-PM. We first introduce the objectives of the testing for our project in terms of static testing, black-box testing, and white-box testing. The testing techniques, including static testing and dynamic testing, are then explained, and the testing strategy we employed is introduced. We then present the principles of the three coverage criteria we used: statement coverage, branch coverage, and modified condition/decision coverage. For 1E-level safety software, test coverage is mandatorily required to reach 100%. We then discuss the details of safety software testing during software development in HTR-PM, including the organization, methods and tools, testing stages, and testing report. The test results and experiences are shared, and finally we draw conclusions for the unit testing process. This paper can contribute to improving the process of unit testing and software development for other digital instrumentation and control systems in NPPs. (author)
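    The three coverage criteria named in the record can be illustrated with a small, hypothetical two-condition decision (not the HTR-PM RPS logic); the comments note what statement, branch, and MC/DC coverage each require.

```python
# Illustrative sketch of the coverage criteria named above, using a made-up
# two-condition trip decision (not the HTR-PM RPS logic).

def should_trip(high_pressure: bool, sensor_valid: bool) -> bool:
    if high_pressure and sensor_valid:
        return True
    return False

# Statement coverage: every statement executes at least once.
#   e.g. (True, True) and (False, True) already execute both return statements.
# Branch coverage: every decision takes both its true and false outcomes.
#   The same two cases exercise both branches of the `if`.
# MC/DC: each condition must be shown to independently affect the decision.
#   Holding the other condition fixed, toggling one condition flips the outcome:
mcdc_cases = [
    ((True, True), True),    # baseline: both conditions true -> trip
    ((False, True), False),  # toggling high_pressure alone flips the decision
    ((True, False), False),  # toggling sensor_valid alone flips the decision
]
for (hp, sv), expected in mcdc_cases:
    assert should_trip(hp, sv) == expected
```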

  1. Advances in software development for intelligent interfaces for alarm and emergency management consoles

    International Nuclear Information System (INIS)

    Moseley, M.R.; Olson, C.E.

    1986-01-01

    Recent advances in technology allow features like voice synthesis, voice and speech recognition, image understanding, and intelligent data base management to be incorporated in computer driven alarm and emergency management information systems. New software development environments make it possible to do rapid prototyping of custom applications. Three examples using these technologies are discussed. 1) Maximum use is made of high-speed graphics and voice synthesis to implement a state-of-the-art alarm processing and display system with features that make the operator-machine interface efficient and accurate. 2) An application generator which has the capability of 'building' a specific alarm processing and display application in a matter of a few hours, using the site definition developed in the security planning phase to produce the custom application. 3) A software tool is described which permits rapid prototyping of human-machine interfaces for a variety of applications including emergency management, alarm display and process information display

  2. Roles for software technologies in advancing research and theory in educational psychology.

    Science.gov (United States)

    Hadwin, Allyson F; Winne, Philip H; Nesbit, John C

    2005-03-01

    While reviews abound on theoretical topics in educational psychology, it is rare that we examine our field's instrumentation development, and what effects this has on educational psychology's evolution. To repair this gap, this paper investigates and reveals the implications of software technologies for researching and theorizing about core issues in educational psychology. From a set of approximately 1,500 articles published between 1999 and 2004, we sampled illustrative studies and organized them into four broad themes: (a) innovative ways to operationalize variables, (b) the changing nature of instructional interventions, (c) new fields of research in educational psychology, and (d) new constructs to be examined. In each area, we identify novel uses of these technologies and suggest how they may advance, and, in some instances, reshape theory and methodology. Overall, we demonstrate that software technologies hold significant potential to elaborate research in the field.

  3. Advanced Transport Operating System (ATOPS) color displays software description: MicroVAX system

    Science.gov (United States)

    Slominski, Christopher J.; Plyler, Valerie E.; Dickson, Richard W.

    1992-01-01

    This document describes the software created for the Display MicroVAX computer used for the Advanced Transport Operating Systems (ATOPS) project on the Transport Systems Research Vehicle (TSRV). The software delivery of February 27, 1991, known as the 'baseline display system', is the one described in this document. Throughout this publication, module descriptions are presented in a standardized format which contains module purpose, calling sequence, detailed description, and global references. The global references section includes subroutines, functions, and common variables referenced by a particular module. The system described supports the Research Flight Deck (RFD) of the TSRV. The RFD contains eight Cathode Ray Tubes (CRTs) which depict a Primary Flight Display, Navigation Display, System Warning Display, Takeoff Performance Monitoring System Display, and Engine Display.

  4. Advanced Transport Operating System (ATOPS) color displays software description microprocessor system

    Science.gov (United States)

    Slominski, Christopher J.; Plyler, Valerie E.; Dickson, Richard W.

    1992-01-01

    This document describes the software created for the Sperry Microprocessor Color Display System used for the Advanced Transport Operating Systems (ATOPS) project on the Transport Systems Research Vehicle (TSRV). The software delivery known as the 'baseline display system', is the one described in this document. Throughout this publication, module descriptions are presented in a standardized format which contains module purpose, calling sequence, detailed description, and global references. The global reference section includes procedures and common variables referenced by a particular module. The system described supports the Research Flight Deck (RFD) of the TSRV. The RFD contains eight cathode ray tubes (CRTs) which depict a Primary Flight Display, Navigation Display, System Warning Display, Takeoff Performance Monitoring System Display, and Engine Display.

  5. Vacuum system for Advanced Test Accelerator

    International Nuclear Information System (INIS)

    Denhoy, B.S.

    1981-01-01

    The Advanced Test Accelerator (ATA) is a pulsed linear electron beam accelerator designed to study charged particle beam propagation. ATA is designed to produce a 10,000 amp, 50 MeV, 70 ns electron beam. The electron beam acceleration is accomplished in ferrite loaded cells. Each cell is capable of maintaining a 70 ns 250 kV voltage pulse across a 1 inch gap. The electron beam is contained in a 5 inch diameter, 300 foot long tube. Cryopumps, turbomolecular pumps, and mechanical pumps are used to maintain a base pressure of 2 x 10^-6 torr in the beam tube. The accelerator will be installed in an underground tunnel. Due to the radiation environment in the tunnel, the controlling and monitoring of the vacuum equipment, pressures and temperatures will be done from the control room through a computer interface. This paper describes the vacuum system design, the type of vacuum pumps specified, the reasons behind the selection of the pumps and the techniques used for computer interfacing

  6. Vacuum system for Advanced Test Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Denhoy, B.S.

    1981-09-03

    The Advanced Test Accelerator (ATA) is a pulsed linear electron beam accelerator designed to study charged particle beam propagation. ATA is designed to produce a 10,000 amp, 50 MeV, 70 ns electron beam. The electron beam acceleration is accomplished in ferrite loaded cells. Each cell is capable of maintaining a 70 ns 250 kV voltage pulse across a 1 inch gap. The electron beam is contained in a 5 inch diameter, 300 foot long tube. Cryopumps, turbomolecular pumps, and mechanical pumps are used to maintain a base pressure of 2 x 10^-6 torr in the beam tube. The accelerator will be installed in an underground tunnel. Due to the radiation environment in the tunnel, the controlling and monitoring of the vacuum equipment, pressures and temperatures will be done from the control room through a computer interface. This paper describes the vacuum system design, the type of vacuum pumps specified, the reasons behind the selection of the pumps and the techniques used for computer interfacing.

  7. A Method to Select Test Input Cases for Safety-critical Software

    International Nuclear Information System (INIS)

    Kim, Heeeun; Kang, Hyungook; Son, Hanseong

    2013-01-01

    This paper proposes a new testing methodology for effective and realistic quantification of RPS software failure probability. Software failure probability quantification is an important factor in digital system safety assessment. In this study, the method for software test case generation is briefly described. The test cases generated by this method reflect the characteristics of safety-critical software and past inputs. Furthermore, the number of test cases can be reduced while still permitting an exhaustive test. Aspects of the software can also be reflected in the failure data, so the final failure data can include failures of the software itself as well as external influences. Software reliability is generally accepted as the key factor in software quality since it quantifies software failures, which can make a powerful system inoperative. In the KNITS (Korea Nuclear Instrumentation and Control Systems) project, the software for the fully digitalized reactor protection system (RPS) was developed under a strict procedure including unit testing and coverage measurement. Black-box testing is one type of verification and validation (V and V), in which given input values are entered and the resulting output values are compared against the expected output values. Programmable logic controllers (PLCs) were used in implementing critical systems, and function block diagram (FBD) is a commonly used implementation language for PLCs

  8. Development of a smart-antenna test-bed, demonstrating software defined digital beamforming

    NARCIS (Netherlands)

    Kluwer, T.; Slump, Cornelis H.; Schiphorst, Roelof; Hoeksema, F.W.

    2001-01-01

    This paper describes a smart-antenna test-bed consisting of 'commercial off-the-shelf' (COTS) hardware and software defined radio components. The use of software radio components enables a flexible platform to implement and test mobile communication systems as a real-world system. The test-bed is

  9. Controlatron Neutron Tube Test Suite Software Manual - Operation Manual (V2.2)

    CERN Document Server

    Noel, W P; Hertrich, R J; Martinez, M L; Wallace, D L

    2002-01-01

    The Controlatron Software Suite is a custom-built application to perform automated testing of Controlatron neutron tubes. The software package was designed to allow users to design tests and to run a series of test suites on a tube. The data is output to ASCII files of a pre-defined format for data analysis and viewing with the Controlatron Data Viewer Application. This manual discusses the operation of the Controlatron Test Suite Software and includes a brief discussion of state machine theory, as a state machine is the functional basis of the software.

  10. CLAIRE, an event-driven simulation tool for testing software

    International Nuclear Information System (INIS)

    Raguideau, J.; Schoen, D.; Henry, J.Y.; Boulc'h, J.

    1994-06-01

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify safety properties by modelling the specifications and to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.)
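    Although the record does not describe CLAIRE's internals, the general shape of an event-driven simulation of equipment and software interfaces can be sketched as below; the event names, handlers, and timing are purely illustrative.

```python
# Minimal sketch of an event-driven simulation loop of the kind such a tool
# relies on: events are processed in time order and may schedule new events
# that mimic equipment or interface behaviour. Entirely illustrative.
import heapq

def run_simulation(initial_events, handlers, horizon):
    """Process (time, name, payload) events in time order until the horizon."""
    queue = list(initial_events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, name, payload = heapq.heappop(queue)
        if time > horizon:
            break
        log.append((time, name))
        for new_event in handlers.get(name, lambda t, p: [])(time, payload):
            heapq.heappush(queue, new_event)
    return log

# Hypothetical handlers: a sensor reading above a limit triggers an alarm 0.1 s later.
handlers = {
    "sensor_reading": lambda t, value: [(t + 0.1, "alarm", value)] if value > 100 else [],
    "alarm": lambda t, value: [],
}
trace = run_simulation([(0.0, "sensor_reading", 95), (1.0, "sensor_reading", 120)],
                       handlers, horizon=5.0)
print(trace)   # recorded event trace available for off-line analysis
```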

  11. ADVANCED OXIDATION: OXALATE DECOMPOSITION TESTING WITH OZONE

    Energy Technology Data Exchange (ETDEWEB)

    Ketusky, E.; Subramanian, K.

    2012-02-29

    At the Savannah River Site (SRS), oxalic acid is currently considered the preferred agent for chemically cleaning the large underground Liquid Radioactive Waste Tanks. It is applied only in the final stages of emptying a tank when generally less than 5,000 kg of waste solids remain, and slurrying based removal methods are no-longer effective. The use of oxalic acid is preferred because of its combined dissolution and chelating properties, as well as the fact that corrosion to the carbon steel tank walls can be controlled. Although oxalic acid is the preferred agent, there are significant potential downstream impacts. Impacts include: (1) Degraded evaporator operation; (2) Resultant oxalate precipitates taking away critically needed operating volume; and (3) Eventual creation of significant volumes of additional feed to salt processing. As an alternative to dealing with the downstream impacts, oxalate decomposition using variations of ozone based Advanced Oxidation Process (AOP) were investigated. In general AOPs use ozone or peroxide and a catalyst to create hydroxyl radicals. Hydroxyl radicals have among the highest oxidation potentials, and are commonly used to decompose organics. Although oxalate is considered among the most difficult organic to decompose, the ability of hydroxyl radicals to decompose oxalate is considered to be well demonstrated. In addition, as AOPs are considered to be 'green' their use enables any net chemical additions to the waste to be minimized. In order to test the ability to decompose the oxalate and determine the decomposition rates, a test rig was designed, where 10 vol% ozone would be educted into a spent oxalic acid decomposition loop, with the loop maintained at 70 C and recirculated at 40L/min. Each of the spent oxalic acid streams would be created from three oxalic acid strikes of an F-area simulant (i.e., Purex = high Fe/Al concentration) and H-area simulant (i.e., H area modified Purex = high Al/Fe concentration

  12. Advanced Oxidation: Oxalate Decomposition Testing With Ozone

    International Nuclear Information System (INIS)

    Ketusky, E.; Subramanian, K.

    2012-01-01

    At the Savannah River Site (SRS), oxalic acid is currently considered the preferred agent for chemically cleaning the large underground Liquid Radioactive Waste Tanks. It is applied only in the final stages of emptying a tank when generally less than 5,000 kg of waste solids remain, and slurrying based removal methods are no-longer effective. The use of oxalic acid is preferred because of its combined dissolution and chelating properties, as well as the fact that corrosion to the carbon steel tank walls can be controlled. Although oxalic acid is the preferred agent, there are significant potential downstream impacts. Impacts include: (1) Degraded evaporator operation; (2) Resultant oxalate precipitates taking away critically needed operating volume; and (3) Eventual creation of significant volumes of additional feed to salt processing. As an alternative to dealing with the downstream impacts, oxalate decomposition using variations of ozone based Advanced Oxidation Process (AOP) were investigated. In general AOPs use ozone or peroxide and a catalyst to create hydroxyl radicals. Hydroxyl radicals have among the highest oxidation potentials, and are commonly used to decompose organics. Although oxalate is considered among the most difficult organic to decompose, the ability of hydroxyl radicals to decompose oxalate is considered to be well demonstrated. In addition, as AOPs are considered to be 'green' their use enables any net chemical additions to the waste to be minimized. In order to test the ability to decompose the oxalate and determine the decomposition rates, a test rig was designed, where 10 vol% ozone would be educted into a spent oxalic acid decomposition loop, with the loop maintained at 70 C and recirculated at 40L/min. Each of the spent oxalic acid streams would be created from three oxalic acid strikes of an F-area simulant (i.e., Purex = high Fe/Al concentration) and H-area simulant (i.e., H area modified Purex = high Al/Fe concentration) after nearing

  13. Reporting of Software Product-Testing Stirs Debate

    Science.gov (United States)

    Trotter, Andrew

    2006-01-01

    With the results of a forthcoming federal study of educational software still under wraps, questions are arising about how it has been conducted--particularly the government's decision not to disclose individual performance results for the 15 computerized curriculum packages being studied. The companies involved will receive results for their own…

  14. Using Fuzzy Logic Techniques for Assertion-Based Software Testing Metrics.

    Science.gov (United States)

    Alakeel, Ali M

    2015-01-01

    Software testing is a very labor-intensive and costly task. Therefore, many software testing techniques to automate the process of software testing have been reported in the literature. Assertion-Based automated software testing has been shown to be effective in detecting program faults as compared to traditional black-box and white-box software testing methods. However, applying this approach in the presence of large numbers of assertions may be very costly. Therefore, software developers need assistance when deciding whether to apply Assertion-Based testing in order to get the benefits of this approach at an acceptable cost. In this paper, we present an Assertion-Based testing metrics technique that is based on fuzzy logic. The main goal of the proposed technique is to enhance the performance of Assertion-Based software testing in the presence of large numbers of assertions. To evaluate the proposed technique, an experimental study was performed in which the proposed technique is applied to programs with assertions. The results of this experiment show that the effectiveness and performance of Assertion-Based software testing have improved when applying the proposed testing metrics technique.
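    The core idea of Assertion-Based testing, namely that embedded assertions turn violations of internal properties into observable test failures, can be shown with a small sketch; the function and its assertions are illustrative and not taken from the cited study.

```python
# Minimal sketch of Assertion-Based testing: assertions embedded in the program
# make violations of internal properties observable during test execution.
# The function and its invariants are illustrative, not from the cited study.

def scale_readings(readings, factor):
    assert factor > 0, "precondition: scale factor must be positive"
    scaled = [r * factor for r in readings]
    # Postcondition-style assertion: scaling preserves the ordering of readings.
    assert sorted(scaled) == [r * factor for r in sorted(readings)], \
        "postcondition: ordering must be preserved"
    return scaled

# During testing, inputs that violate an assertion surface as AssertionError,
# pointing directly at the violated property rather than a downstream symptom.
try:
    scale_readings([3.0, 1.0, 2.0], -1.0)
except AssertionError as err:
    print(f"assertion caught a faulty usage: {err}")
```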

  15. Investigation of Classification and Design Requirements for Digital Software for Advanced Research Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Park, Gee Young; Jung, H. S.; Ryu, J. S.; Park, C

    2005-06-15

    As digital technology develops rapidly, it is being applied to various industrial instrumentation and control (I and C) fields. In nuclear power plants, I and C systems are also being replaced with digital systems in place of their previously installed analog counterparts. In the research reactor HANARO, I and C systems, notably the reactor protection system, were constructed with analog technology. In parallel with the current trend toward digital technology, it is desirable that all I and C systems in an advanced research reactor, including the safety-critical and non-safety systems, be implemented as computer-based systems. Digital systems have many attractive features compared to existing analog systems: a digital system offers superior functional performance and is more flexible than an analog system. Any benefit gained from newly developed digital technology can be easily incorporated into an existing digital system, so performance improvements of a computer-based system can be implemented conveniently and promptly. Moreover, the high integration capability of electronic circuits reduces the number of electronic components needed to construct the processing device and simplifies the electronic board, which means that hardware failures themselves are unlikely to occur in the electronic device apart from some electrical problems. Balanced against the facts mentioned above are the roles and related issues of the software loaded on the digitally integrated hardware. Defects introduced in the course of software development might cause severe damage to the computer system and plant systems, and therefore comprehensive and deep consideration must be given to the development of software in the design of I and C systems for use in an advanced research reactor. The work investigates the domestic and international standards on the classifications of digital

  16. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    Science.gov (United States)

    Tamayo, Tak Chai

    1987-01-01

    The quality of software is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and the launch schedule for the space station. The defense of management decisions can be greatly strengthened by combining engineering judgment with statistical analysis. Unlike hardware, software exhibits no wearout and has costly redundancies, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on the failure history during testing are derived. Criteria for terminating testing based on reliability objectives, and methods to estimate the expected number of fixes required, are also presented.
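    The record does not give the statistical model itself, but the general approach of fitting a reliability-growth model to failures observed during testing and deriving a stopping criterion can be sketched as follows, here using a common Goel-Okumoto mean-value function as a stand-in; all data are made up.

```python
# Hedged sketch of the general idea: fit a reliability-growth mean-value function
# to cumulative failures observed during testing and estimate residual failures.
# A Goel-Okumoto form m(t) = a * (1 - exp(-b t)) is used here for illustration;
# it is a common NHPP model, not necessarily the model developed in this work.
import numpy as np
from scipy.optimize import curve_fit

def mean_failures(t, a, b):
    return a * (1.0 - np.exp(-b * t))

weeks = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)      # test time (made up)
cum_failures = np.array([9, 16, 21, 25, 27, 29, 30, 31], dtype=float)

(a_hat, b_hat), _ = curve_fit(mean_failures, weeks, cum_failures, p0=(40.0, 0.3))
remaining = a_hat - mean_failures(weeks[-1], a_hat, b_hat)
print(f"expected total faults ~{a_hat:.1f}, expected residual faults ~{remaining:.1f}")
# A stopping rule might require the expected residual faults to fall below a target.
```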

  17. PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.

    Science.gov (United States)

    Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang

    2017-07-26

    Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of a Software Guard Extensions (SGX)-based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and a minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with the state-of-the-art homomorphic encryption scheme, as well as the plaintext implementation. The experimental results demonstrated a significant performance gain over the homomorphic encryption methods and a small computational overhead in comparison to the plaintext implementation. The proposed PRESAGE framework provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple crypto protocols.

  18. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
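
    The published tool is not reproduced here; as a minimal illustration of the cumulant idea behind SOFI (only second order, whereas the tool supports high-order cross-cumulants), the sketch below computes a per-pixel second-order auto-cumulant from a simulated stack of blinking-emitter images. All simulation parameters are invented.

        import numpy as np

        # Hedged sketch (not the published tool): the core SOFI idea at order 2 is to
        # replace each pixel's mean intensity by a cumulant of its intensity
        # fluctuations over time, which sharpens the effective PSF.

        def sofi2(stack, lag=1):
            """Second-order auto-cumulant per pixel for an image stack of shape (T, H, W)."""
            fluct = stack - stack.mean(axis=0, keepdims=True)      # remove the mean
            if lag == 0:
                return (fluct ** 2).mean(axis=0)                    # plain variance
            return (fluct[:-lag] * fluct[lag:]).mean(axis=0)        # time-lagged correlation

        # Tiny synthetic example: one blinking emitter on a 16x16 grid plus noise.
        rng = np.random.default_rng(0)
        T, H, W = 500, 16, 16
        psf = np.exp(-((np.arange(H)[:, None] - 8) ** 2 + (np.arange(W)[None, :] - 8) ** 2) / 4.0)
        on = rng.random(T) < 0.3                                    # stochastic blinking
        stack = on[:, None, None] * psf + 0.05 * rng.standard_normal((T, H, W))
        print(sofi2(stack).max(), sofi2(stack, lag=0).shape)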

  19. Software Process Improvement: Continuous Integration and Testing for Web Application Development

    OpenAIRE

    MUHONEN, MATIAS

    2009-01-01

    Software testing is not the only approach to improving software quality, but it is perhaps one of the most important ones. This is because testing gives us the confidence that the software will work as it should in its intended environment. In this thesis the author introduces a Software Process Improvement (SPI) technique for conducting automated web application security testing. The continuous integration process provides rapid and automatic feedback on the security of the web applications under d...

  20. WRAP module 1 data management system software test report

    International Nuclear Information System (INIS)

    Weidert, J.R.

    1997-01-01

    This document summarizes the test result information for the Data Management System (DMS). Appendix A contains test result information for all Functional Test cases, and Appendix B contains the results for all the Performance Test cases.

  1. The software testing of PPS for shin Ulchin nuclear power plant units 1 and 2

    International Nuclear Information System (INIS)

    Kang, Dong Pa; Park, Cheol Lak; Cho, Chang Hui; Sohn, Se Do; Baek, Seung Min

    2012-01-01

    The testing of software (S/W) is the process of analyzing a software item to detect the differences between existing and required conditions and to evaluate the features of the software item. This paper introduces the S/W testing of the Plant Protection System (PPS), a safety system that actuates the Reactor Trip (RT) and the Engineered Safety Features (ESF), for Shin Ulchin Nuclear Power Plant Units 1 and 2 (SUN 1 and 2).

  2. An evaluation of software testing metrics for NASA's mission control center

    Science.gov (United States)

    Stark, George E.; Durst, Robert C.; Pelnik, Tammy M.

    1991-01-01

    Software metrics are used to evaluate the software development process and the quality of the resulting product. Five metrics were used during the testing phase of the Shuttle Mission Control Center Upgrade at the NASA Johnson Space Center. All but one metric provided useful information. Based on the experience, it is recommended that metrics be used during the test phase of software development and additional candidate metrics are proposed for further study.

  3. Perfil Operacional – Estratégia Essencial ao Teste de Software

    OpenAIRE

    Antonio Mendes Silva Filho

    2013-01-01

    Software testing is one of the activities of the software system development process that aims to execute a program in a systematic manner with the objective of finding faults. Minimizing the occurrence of faults in software is essential and, as a consequence, increases software reliability. There is no better example of freedom and creativity than a child. In this sense, this article presents a software engineering practice of considering the operational profile of a system of ...

  4. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    Science.gov (United States)

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…
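
    As a concrete, self-contained illustration of two of the approaches named above (not taken from the dissertation), the sketch below generates tests satisfying predicate coverage for a toy decision and then checks whether they kill a hypothetical logic mutant in which one AND operator is replaced by OR.

        from itertools import product

        # Hedged illustration of logic coverage plus logic mutation on a toy predicate.
        original = lambda a, b, c: (a and b) or c        # predicate under test
        mutant   = lambda a, b, c: (a or b) or c         # hypothetical operator mutant

        # Predicate coverage: pick one assignment making the predicate true, one false.
        assignments = list(product([False, True], repeat=3))
        true_case  = next(t for t in assignments if original(*t))
        false_case = next(t for t in assignments if not original(*t))
        tests = [true_case, false_case]

        # Mutation analysis: the mutant is "killed" if some test distinguishes it.
        killed = any(original(*t) != mutant(*t) for t in tests)
        # Here predicate coverage alone happens not to kill the mutant (it needs a != b
        # with c false), which is the kind of gap that stronger logic criteria address.
        print(tests, "mutant killed:", killed)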

  5. Racial Differences in Mathematics Test Scores for Advanced Mathematics Students

    Science.gov (United States)

    Minor, Elizabeth Covay

    2016-01-01

    Research on achievement gaps has found that achievement gaps are larger for students who take advanced mathematics courses compared to students who do not. Focusing on the advanced mathematics student achievement gap, this study found that African American advanced mathematics students have significantly lower test scores and are less likely to be…

  6. The Design of Software for Three-Phase Induction Motor Test System

    Science.gov (United States)

    Haixiang, Xu; Fengqi, Wu; Jiai, Xue

    2017-11-01

    The design and development of the control system software is important for three-phase induction motor test equipment and requires complete familiarity with the test process and the control procedure of the test equipment. In this paper, the software is developed in the VB language according to the national standard (GB/T1032-2005) on the three-phase induction motor test method. The control system software, the data analysis software, and the implementation of the motor test system are described individually; the system has the advantages of high automation and high accuracy.

  7. Determination of the number of software tests using probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, H. K.; Seong, T. Y.; Lee, K. Y.

    2000-01-01

    The broader usage of digital equipment in nuclear power plants gives rise to software safety problems. Field tests should be performed before the software is used in critical applications because it is well known that software shows a non-linear response when it is applied to different target systems in different environments. In the case of safety-critical applications, the test results usually contain zero failures, and a satisfactory number of tests is hard to determine. In this paper, we suggest a method to determine the number of failure-free software tests using probabilistic safety assessment. From the result of the probabilistic safety assessment of the total system, the required software unavailability is calculated and the number of tests is determined.
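
    The abstract does not give the exact statistical argument, but a standard zero-failure test-sizing calculation along these lines uses the fact that n consecutive failure-free demands bound the per-demand failure probability p at confidence C when (1-p)^n <= 1-C, i.e. n >= ln(1-C)/ln(1-p). The sketch below applies this with an invented unavailability target standing in for the value that would come from the probabilistic safety assessment.

        import math

        # Hedged sketch: number of consecutive failure-free tests needed to claim, at
        # confidence `confidence`, that the per-demand failure probability is below
        # `p_target`. The target value below is invented for illustration.

        def zero_failure_tests(p_target, confidence):
            return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_target))

        p_target = 1.0e-3      # required software unavailability per demand (assumed)
        confidence = 0.95
        print(zero_failure_tests(p_target, confidence))   # about 2995 failure-free tests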

  8. Path generation algorithm for UML graphic modeling of aerospace test software

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers rely on their own work experience and on communication with software development personnel to complete the description of the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to gaps. Using the high-reliability model-based testing (MBT) tools developed by our company, a one-time modeling effort can automatically generate test case documents, which is efficient and accurate. Accurately describing the process with a UML model depends on generating the paths through the model, but existing path generation algorithms are either too simple, unable to combine branch paths with paths containing loops, or too cumbersome, producing overly complicated path arrangements that are meaningless and superfluous for aerospace software testing. Drawing on our aerospace engineering experience, we have developed a tailored path generation algorithm for UML graphical descriptions of aerospace software.
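
    The paper's algorithm is not reproduced in the abstract; the sketch below only illustrates the underlying problem it addresses: enumerating test paths through a model graph that contains both branches and a loop, with node revisits bounded so that loop iterations stay finite. The graph and the bound are invented.

        # Hedged sketch: enumerate test paths through a control-flow-like graph with
        # branches and a loop, revisiting any node at most `max_visits` times so that
        # loops contribute a bounded number of iterations to each generated path.

        graph = {                             # invented model: entry -> ... -> exit
            "entry": ["check"],
            "check": ["loop_body", "exit"],   # branch: iterate or leave
            "loop_body": ["check"],           # back edge forming the loop
            "exit": [],
        }

        def paths(node, target, max_visits=2, seen=None, prefix=None):
            seen = dict(seen or {})
            seen[node] = seen.get(node, 0) + 1
            prefix = (prefix or []) + [node]
            if node == target:
                yield prefix
                return
            for nxt in graph[node]:
                if seen.get(nxt, 0) < max_visits:
                    yield from paths(nxt, target, max_visits, seen, prefix)

        for p in paths("entry", "exit"):
            print(" -> ".join(p))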

  9. Advanced fighter technology integration (AFTI)/F-16 Automated Maneuvering Attack System final flight test results

    Science.gov (United States)

    Dowden, Donald J.; Bessette, Denis E.

    1987-01-01

    The AFTI F-16 Automated Maneuvering Attack System has undergone developmental and demonstration flight testing over a total of 347.3 flying hours in 237 sorties. The emphasis of this phase of the flight test program was on the development of automated guidance and control systems for air-to-air and air-to-ground weapons delivery, using a digital flight control system, dual avionics multiplex buses, an advanced FLIR sensor with laser ranger, integrated flight/fire-control software, advanced cockpit display and controls, and modified core Multinational Stage Improvement Program avionics.

  10. Advance reservation access control using software-defined networking and tokens

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Joaquin; Jung, Eun-Sung; Kettimuthu, Rajkumar; Rao, Nageswara S. V.; Foster, Ian T.; Clark, Russ; Owen, Henry

    2018-02-01

    Advance reservation systems allow users to reserve dedicated bandwidth connection resources from advanced high-speed networks. A common use case for such systems is data transfers in distributed science environments in which a user wants exclusive access to the reservation. However, current advance network reservation methods cannot ensure exclusive access of a network reservation to the specific flow for which the user made the reservation. We present here a novel network architecture that addresses this limitation and ensures that a reservation is used only by the intended flow. We achieve this by leveraging software-defined networking (SDN) and token-based authorization. We use SDN to orchestrate and automate the reservation of networking resources, end-to-end and across multiple administrative domains, and tokens to create a strong binding between the user or application that requested the reservation and the flows provisioned by SDN. We conducted experiments on the ESNet 100G SDN testbed, and demonstrated that our system effectively protects authorized flows from competing traffic in the network. (C) 2017 Elsevier B.V. All rights reserved.
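
    The paper's token format is not given in the abstract; purely as a rough illustration of the idea of binding a provisioned flow to the reservation that requested it, the sketch below issues an HMAC token over a reservation identifier and the flow's 5-tuple and verifies it before the flow is admitted. The secret handling, field names, and values are invented.

        import hashlib
        import hmac

        # Hedged sketch: bind a network flow to an advance reservation with an HMAC token.
        SECRET = b"controller-shared-secret"   # invented; a real system would manage keys properly

        def issue_token(reservation_id, five_tuple):
            msg = (reservation_id + "|" + "|".join(map(str, five_tuple))).encode()
            return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

        def admit_flow(reservation_id, five_tuple, token):
            expected = issue_token(reservation_id, five_tuple)
            return hmac.compare_digest(expected, token)   # only the intended flow matches

        flow = ("10.0.0.1", "10.0.0.2", 5001, 5002, "tcp")
        tok = issue_token("resv-42", flow)
        print(admit_flow("resv-42", flow, tok))                      # True: authorized flow
        print(admit_flow("resv-42", ("10.0.0.9", *flow[1:]), tok))   # False: competing traffic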

  11. Software verification and validation methodology for advanced digital reactor protection system using diverse dual processors to prevent common mode failure

    International Nuclear Information System (INIS)

    Son, Ki Chang; Shin, Hyun Kook; Lee, Nam Hoon; Baek, Seung Min; Kim, Hang Bae

    2001-01-01

    The Advanced Digital Reactor Protection System (ADRPS) with diverse dual processors is being developed by KOPEC's National Research Lab for ADRPS development. One of the ADRPS goals is to develop a digital Plant Protection System (PPS) free of Common Mode Failure (CMF). To prevent CMF, the principle of diversity is applied to both the hardware design and the software design. For hardware diversity, two different types of CPUs are used for the Bistable Processor and the Local Coincidence Logic Processor. VME-based Single Board Computers (SBCs) are used as the CPU hardware platforms. The QNX Operating System (OS) and the VxWorks OS are used for software diversity. Rigorous Software Verification and Validation (V and V) is also required to prevent CMF. In this paper, the software V and V methodology for the ADRPS is described; it is intended to enhance the reliability and assure the high quality of the ADRPS software.

  12. Software Test Description (STD) for the Globally Relocatable Navy Tide/Atmospheric Modeling System (PCTides)

    National Research Council Canada - National Science Library

    Posey, Pamela

    2002-01-01

    The purpose of this Software Test Description (STD) is to establish formal test cases to be used by personnel tasked with the installation and verification of the Globally Relocatable Navy Tide/Atmospheric Modeling System (PCTides...

  13. Features of the Test Automation Software-Hardware Data Protection Tools

    Directory of Open Access Journals (Sweden)

    Tatiana Mikhailovna Borisova

    2013-06-01

    The author discusses the various types of testing from the standpoint of automating the process, the advantages and disadvantages of automated testing, and the ways and particular features of applying it to software-hardware data protection tools.

  14. Application of a path sensitizing method on automated generation of test specifications for control software

    International Nuclear Information System (INIS)

    Morimoto, Yuuichi; Fukuda, Mitsuko

    1995-01-01

    An automated generation method for test specifications has been developed for sequential control software in plant control equipment. Sequential control software can be represented as sequential circuits, and the control software implemented in control equipment is designed from these circuit diagrams. In logic tests of VLSIs, path sensitizing methods are widely used to generate test specifications. However, such a method generates test specifications for a single point in time only and cannot be directly applied to sequential control software. The basic idea of the proposed method is as follows. The specifications of each logic operator in the diagrams are defined in the software design process. Therefore, the test specifications of each operator in the control software can be determined from these specifications, and the validity of the software can be judged by inspecting all of the operators in the logic circuit diagrams. Candidates for sensitized paths, on which the test data for each operator propagate, can be generated by the path sensitizing method. To confirm the feasibility of the method, it was experimentally applied to control software in digital control equipment. The program could generate test specifications exactly, and the feasibility of the method was confirmed. (orig.) (3 refs., 7 figs.)
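
    The abstract only names the path sensitizing idea; as a self-contained illustration (not the paper's method), the sketch below finds by brute force an input vector that sensitizes a stuck-at fault on an internal line of a small combinational circuit to the output, i.e. a vector for which the good and faulty circuits disagree. The circuit and the fault are invented.

        from itertools import product

        # Hedged sketch: brute-force "path sensitization" for a toy logic circuit.
        # Output = (a AND b) OR (NOT c). We target the fault "node n1 stuck-at-0",
        # where n1 = a AND b; a test vector must make good and faulty outputs differ.

        def circuit(a, b, c, stuck_n1=None):
            n1 = (a and b) if stuck_n1 is None else stuck_n1   # optionally inject the fault
            n2 = not c
            return n1 or n2

        tests = [v for v in product([0, 1], repeat=3)
                 if circuit(*v) != circuit(*v, stuck_n1=0)]     # vectors that expose the fault
        print(tests)   # [(1, 1, 1)]: a=b=1 activates the fault, c=1 propagates it to the output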

  15. Command and Data Handling Flight Software test framework: A Radiation Belt Storm Probes practice

    Science.gov (United States)

    Hill, T. A.; Reid, W. M.; Wortman, K. A.

    During the Radiation Belt Storm Probes (RBSP) mission, a test framework was developed by the Embedded Applications Group in the Space Department at the Johns Hopkins Applied Physics Laboratory (APL). The test framework is implemented for verification of the Command and Data Handling (C&DH) Flight Software. The RBSP C&DH Flight Software consists of applications developed for use with Goddard Space Flight Center's core Flight Executive (cFE) architecture. The test framework's initial concept originated with tests developed for verification of the Autonomy rules that execute with the Autonomy Engine application of the RBSP C&DH Flight Software. The test framework was adopted and expanded for system and requirements verification of the RBSP C&DH Flight Software. During the evolution of the RBSP C&DH Flight Software test framework design, a set of script conventions and a script library were developed. The script conventions and library eased integration of system and requirements verification tests into a comprehensive automated test suite. The comprehensive test suite is currently being used to verify releases of the RBSP C&DH Flight Software. In addition to providing the details and benefits of the test framework, the discussion will include several lessons learned throughout the verification process of the RBSP C&DH Flight Software. Our next mission, Solar Probe Plus (SPP), will use the cFE architecture for the C&DH Flight Software. SPP also plans to use the same ground system as RBSP. Many of the RBSP C&DH Flight Software applications are reusable on the SPP mission; therefore, there is potential for test design and test framework reuse for system and requirements verification.

  16. Development and Testing of Automatically Generated ACS Flight Software for the MAP Spacecraft

    Science.gov (United States)

    ODonnell, James R., Jr.; McComas, David C.; Andrews, Stephen F.

    1998-01-01

    By integrating the attitude determination and control system (ACS) analysis and design, flight software development, and flight software testing processes, it is possible to improve the overall spacecraft development cycle, as well as allow for more thorough software testing. One of the ways to achieve this integration is to use code-generation tools to automatically generate components of the ACS flight software directly from a high-fidelity (HiFi) simulation. In the development of the Microwave Anisotropy Probe (MAP) spacecraft, currently underway at the NASA Goddard Space Flight Center, approximately 1/3 of the ACS flight software was automatically generated. In this paper, we will examine each phase of the ACS subsystem and flight software design life cycle: analysis, design, and testing. In the analysis phase, we scoped how much software we would automatically generate and created the initial interface. The design phase included parallel development of the HiFi simulation and the hand-coded flight software components. Everything came together in the test phase, in which the flight software was tested, using results from the HiFi simulation as one of the bases of comparison for testing. Because parts of the spacecraft HiFi simulation were converted into flight software, more care needed to be put into its development and configuration control to support both the HiFi simulation and flight software. The components of the HiFi simulation from which code was generated needed to be designed based on the fact that they would become flight software. This process involved such considerations as protecting against mathematical exceptions, using acceptable module and parameter naming conventions, and using an input/output interface compatible with the rest of the flight software. Maintaining good configuration control was an issue for the HiFi simulation and the flight software, and a way to track the two systems was devised. Finally, an integrated test approach was

  17. Naturalistic Usability Testing of Inpatient Medication Reconciliation Software.

    Science.gov (United States)

    Lesselroth, Blake; Adams, Kathleen; Tallett, Stephanie; Ong, Lindsay; Bliss, Susan; Ragland, Scott; Tran, Hanna; Church, Victoria

    2017-01-01

    Medication history errors are common at admission, but can be mitigated through the implementation of medication reconciliation (MR). We designed multi-media software to assist clinicians with collection of an admission history. This manuscript describes a naturalistic usability study conducted on the hospital wards. Our goals were to 1) estimate the impact of our workflow upon departmental productivity and 2) determine the ability of our software to detect discrepancies. We furnished clinical pharmacists with our application on a tablet PC and asked them to collect a bedside history. We used 1) time-motion analysis to estimate cycle-time and 2) chart reviews to estimate error detection rates. Our intervention detected an average of 7.7 discrepancies per admission (11.7 per pharmacy-shift). A panel rated 67% of these discrepancies as 'high' or 'very high' risk. The cycle-time per admission was slightly longer than usual care processes (20.5 min vs. 17.9 min), but included a bedside interview. In general, pharmacists agreed that the technology improved the completeness and accuracy of a medication history. However, workflow leveling strategies are important to implementing a durable process. In conclusion, a pharmacist-mediated, patient-centered technology holds promise for improving the quality of MR and overall clinical performance.

  18. Intelligent software system for the advanced control room of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Soon Heung; Choi, Seong Soo; Park, Jin Kyun; Heo, Gyung Young [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of); Kim, Han Gon [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    The intelligent software system for nuclear power plants (NPPs) has been conceptually designed in this study. Its design goals are to operate NPPs in an improved manner and to support operators' cognitive tasks. It consists of six major modules: the 'Information Processing', 'Alarm Processing', 'Procedure Tracking', 'Performance Diagnosis', and 'Event Diagnosis' modules for operators, and the 'Malfunction Diagnosis' module for maintenance personnel. Most of the modules have been developed for several years and the others are under development. After the completion of development, they will be combined into one system that will form a main part of advanced control rooms in NPPs. 5 refs., 4 figs. (Author)

  19. Proton Testing of Advanced Stellar Compass Digital Processing Unit

    DEFF Research Database (Denmark)

    Thuesen, Gøsta; Denver, Troelz; Jørgensen, Finn E

    1999-01-01

    The Advanced Stellar Compass Digital Processing Unit was radiation tested with 300 MeV protons at the Proton Irradiation Facility (PIF), Paul Scherrer Institute, Switzerland.

  20. Junction temperature estimation for an advanced active power cycling test

    DEFF Research Database (Denmark)

    Choi, Uimin; Blaabjerg, Frede; Jørgensen, S.

    2015-01-01

    In this paper, a junction temperature estimation method using on-state VCE for an advanced active power cycling test is proposed. The concept of the advanced power cycling test is explained first. Afterwards, the junction temperature estimation method using on-state VCE and current is presented. Further, the method to improve the accuracy...

  1. Software development for the evaluation of the ergonomic compatibility on the selection of advanced manufacturing technology.

    Science.gov (United States)

    Maldonado-Macías, A; Reyes, R; Guillen, L; García, J

    2012-01-01

    Advanced Manufacturing Technology (AMT) is one of the most relevant resources that companies have for achieving competitiveness and best performance. The selection of AMT is a complex problem that involves a significant amount of information and uncertainty when multiple aspects must be taken into consideration. Current models for the selection of AMT largely lack the Human Factors and Ergonomics perspective, which could lead to a more complete and reliable decision. This paper presents the development of software that enhances the application of an Ergonomic Compatibility Evaluation Model supporting decision-making processes that take into consideration the ergonomic attributes of designs. Ergonomic Compatibility is a construct used in this model; it is mainly based on the concept of human-artifact compatibility in human-compatible systems. Also, an Axiomatic Design approach using the Information Axiom was developed in a fuzzy environment to obtain the Ergonomic Incompatibility Content. The extension of this axiom to the evaluation of ergonomic compatibility requirements was the theoretical framework of this research. An incremental methodology of four stages was used to design and develop the software, which makes it possible to compare AMT alternatives through the evaluation of Ergonomic Compatibility Attributes.

  2. Software for Estimating Costs of Testing Rocket Engines

    Science.gov (United States)

    Hines, Merlon M.

    2004-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
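
    The CEM's actual coefficients are not public in this abstract; purely to illustrate the kind of relationship described (labor hours modeled as a low-order polynomial in thrust level), the sketch below fits a third-order polynomial to invented historical data and uses it to predict labor time for a new test.

        import numpy as np

        # Hedged sketch: model total labor hours as a cubic polynomial of engine thrust,
        # mirroring the abstract's description. All data points are invented.
        thrust_klbf = np.array([5, 25, 60, 100, 250, 500])       # historical test articles
        labor_hours = np.array([300, 450, 700, 1000, 2200, 5200])

        coeffs = np.polyfit(thrust_klbf, labor_hours, deg=3)      # third-order fit
        predict = np.poly1d(coeffs)

        new_thrust = 150.0
        print(f"estimated labor hours for a {new_thrust:.0f} klbf test: {predict(new_thrust):.0f}")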

  3. Advanced Laboratory Setup for Testing Offshore Foundations

    DEFF Research Database (Denmark)

    Nielsen, Søren Dam; Ibsen, Lars Bo; Nielsen, Benjaminn Nordahl

    2016-01-01

    This paper describes a test setup for testing small-scale offshore foundations under realistic conditions of high pore-water pressure and high impact loads. The actuator used for loading has enough capacity to apply sufficient force and displacement to achieve both drained and undrained failure modes for small-scale offshore foundations. Results from trial tests on two small-scale bucket foundations, subjected to transient or cyclic loading, are presented. Tests showed that cavitation limits the undrained bearing capacity. Hence, a high pore-water pressure is important for simulating offshore...

  4. Advance features in the SPAN and SPAN/XRF gamma ray and X ray spectrum analysis software

    International Nuclear Information System (INIS)

    Wang Liyu

    1998-01-01

    This paper describes the advanced techniques (integral peak background, experimental peak shape, and complex peak shape) that have been used successfully in the software packages SPAN and SPAN/XRF to process gamma ray and X ray spectra from HPGe and Si(Li) detectors. The main features of SPAN and SPAN/XRF are also described. The software runs on a PC and has convenient graphical capabilities and a powerful user interface. (author)

  5. System testing software deployments using Docker and Kubernetes in gitlab CI: EOS + CTA use case

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    `CTA` needs to be seamlessly integrated with `EOS`, which has become the de facto disk storage system at CERN. `CTA` and `EOS` integration requires parallel development of features in both software packages, which need to be **synchronized and systematically tested** on a specific distributed development infrastructure for each commit in the code base. This presentation describes the full gitlab continuous integration workflow that builds, tests, deploys, and runs system tests of the full software stack in docker containers on our specific kubernetes infrastructure.

  6. Predicting Software Test Effort in Iterative Development Using a Dynamic Bayesian Network

    OpenAIRE

    Torkar, Richard; Awan, Nasir Majeed; Alvi, Adnan Khadem; Afzal, Wasif

    2010-01-01

    Projects following iterative software development methodologies must still be managed in a way that maximizes quality and minimizes costs. However, there are indications that predicting test effort in iterative development is challenging and currently there seem to be no models for test effort prediction. This paper introduces and validates a dynamic Bayesian network for predicting test effort in iterative software development. The proposed model is validated by the use of data from two indu...

  7. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    Science.gov (United States)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. This is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  8. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities - A General Overview

    Science.gov (United States)

    Hebert, Phillip W., Sr.; Hughes, Mark S.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Marshall, PeggL.; Duncan, Michael E.; Morris, Jon A.; Franzl, Richard W.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for a non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer which provides the transparency of the software application layers to underlying hardware regardless of test facility location and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis' development and deployment.

  9. Estimating Rates of Fault Insertion and Test Effectiveness in Software Systems

    Science.gov (United States)

    Nikora, A.; Munson, J.

    1998-01-01

    In developing a software system, we would like to estimate the total number of faults inserted into a software system, the residual fault content of that system at any given time, and the efficacy of the testing activity in executing the code containing the newly inserted faults.

  10. Mining Software Repositories to Study Co-Evolution of Production & Test Code

    NARCIS (Netherlands)

    Zaidman, A.E.; Van Rompaey, B.; Demeyer, S.; Van Deursen, A.

    2008-01-01

    Preprint of paper published in: ICST 2008 - Proceedings of the International Conference on Software Testing, Verification, and Validation, 2008; doi:10.1109/ICST.2008.47 Engineering software systems is a multidisciplinary activity, whereby a number of artifacts must be created — and maintained —

  11. Test Concept for Advanced Oxidation Techniques

    DEFF Research Database (Denmark)

    Bennedsen, Lars Rønn; Søgaard, Erik Gydesen; Mortensen, Lars

    ... the assessor ends up with 3 or 4 applicable techniques. Instead of selecting a full-scale technique solely based on information collected during a literature search, it is best practice to supplement the remediation screening phase with laboratory and in situ pilot treatability tests. As well as establishing the applicability of the proposed technique, the treatability tests also provide essential site-specific design parameters required for the full-scale system, namely: oxidant demand, delivery method, kinetics, etc. Drawing on field studies and laboratory data, this poster will discuss the importance of conducting screening laboratory and pilot tests prior to the onset of full-scale treatment of a contaminated site with a given technology. For this purpose, Ramboll has developed a mobile test unit in co-operation with universities and technology suppliers. The unit includes equipment for both standard and more...

  12. Using Microcomputer Software to Score Placement Tests--An Example from the University of California, Irvine.

    Science.gov (United States)

    Shoemaker, Judith S.; And Others

    This article describes the placement testing program, the use of microcomputer software for scoring and analyzing test results, and the integration of the computerized test results into a comprehensive microcomputer-based student information system at the University of California, Irvine (UCI). UCI offers placement tests in five academic fields:…

  13. Advanced Testing Method for Ground Thermal Conductivity

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiaobing [ORNL; Clemenzi, Rick [Geothermal Design Center Inc.; Liu, Su [University of Tennessee (UT)

    2017-04-01

    A new method is developed that can quickly and more accurately determine the effective ground thermal conductivity (GTC) based on thermal response test (TRT) results. Ground thermal conductivity is an important parameter for sizing ground heat exchangers (GHEXs) used by geothermal heat pump systems. The conventional GTC test method usually requires a TRT for 48 hours with a very stable electric power supply throughout the entire test. In contrast, the new method reduces the required test time by 40%–60% or more, and it can determine GTC even with an unstable or intermittent power supply. Consequently, it can significantly reduce the cost of GTC testing and increase its use, which will enable optimal design of geothermal heat pump systems. Further, this new method provides more information about the thermal properties of the GHEX and the ground than previous techniques. It can verify the installation quality of GHEXs and has the potential, if developed, to characterize the heterogeneous thermal properties of the ground formation surrounding the GHEXs.

  14. Acceptance test procedure MICON software exhaust fan control modifications; TOPICAL

    International Nuclear Information System (INIS)

    SILVAN, G.R.

    1999-01-01

    This acceptance test verifies the MICON program changes for the new automatic transfer switch ATS-2 alarms, the Closed Loop Cooling isolator status, the CB-3 position alarm, and the alarms for the new emergency fan damper backup air compressor.

  15. MAC mini acceptance test procedure, software Version 3.0

    International Nuclear Information System (INIS)

    Russell, V.K.

    1994-01-01

    The K Basins Materials Accounting (MAC) program had some major improvements made to it to organize the main tables by Location, Canister, and Material. This ATP describes how the code was to be tested to verify its correctness.

  16. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper presents a dynamic-model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test document. The method allows test requirements to be captured in dynamic models so that test requirement tracing can be generated easily; it automatically produces standardized test requirements and test documentation, addresses inconsistency and incompleteness in document content, and improves efficiency.

  17. Software Manages Documentation in a Large Test Facility

    Science.gov (United States)

    Gurneck, Joseph M.

    2001-01-01

    The 3MCS computer program assists an instrumentation engineer in performing the three essential functions of design, documentation, and configuration management of measurement and control systems in a large test facility. Services provided by 3MCS are acceptance of input from multiple engineers and technicians working at multiple locations; standardization of drawings; automated cross-referencing; identification of errors; listing of components and resources; downloading of test settings; and provision of information to customers.

  18. Testing of Advanced Conformal Ablative TPS

    Science.gov (United States)

    Gasch, Matthew; Agrawal, Parul; Beck, Robin

    2013-01-01

    In support of the CA250 project, this paper details the results of a test campaign that was conducted at the Ames Arcjet Facility, wherein several novel low-density thermal protection system (TPS) materials were evaluated in an entry-like environment. The motivation for these tests was to investigate whether novel conformal ablative TPS materials can perform under a high heat flux and shear environment as a viable alternative to rigid ablators like PICA or Avcoat for missions like MSL and beyond. A conformable TPS over a rigid aeroshell has the potential to solve a number of challenges faced by traditional rigid TPS materials (such as the tiled Phenolic Impregnated Carbon Ablator (PICA) system on MSL and the honeycomb-based Avcoat on the Orion Multi-Purpose Crew Vehicle (MPCV)). The compliant (high strain to failure) nature of the conformable ablative materials will allow better integration of the TPS with the underlying aeroshell structure and enable monolithic-like configurations and larger segments to be used in fabrication. A novel SPRITE architecture, developed by researchers at NASA Ames, was used for arcjet testing. This small probe-like configuration with a 45 degree sphere-cone enabled us to test the materials in a combination of high heat flux, pressure, and shear environments. The heat fluxes near the nose were in the range of 500-1000 W/sq cm, whereas in the flank section of the test article the magnitudes were about 50% of the nose values, in the 250-500 W/sq cm range. There were two candidate conformable materials under consideration for this test series. Both test materials are low density (0.28 g/cu cm), similar to Phenolic Impregnated Carbon Ablator (PICA) or Silicone Impregnated Refractory Ceramic Ablator (SIRCA), and are composed of a flexible carbon substrate (carbon felt) infiltrated with an ablative resin system: phenolic (Conformal-PICA) or silicone (Conformal-SICA). The test demonstrated a successful performance of both the conformable ablators for heat flux conditions between 50

  19. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Science.gov (United States)

    Yin, Peili; Wang, Jianhua; Lu, Chunxia

    2017-08-01

    Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  20. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  1. EMERIS: an advanced information system for a materials testing reactor

    International Nuclear Information System (INIS)

    Adorjan, F.; Buerger, L.; Lux, I.; Mesko, L.; Szabo, K.; Vegh, J.; Ivanov, V.V.; Mozhaev, A.A.; Yakovlev, V.V.

    1990-06-01

    The basic features of the Materials Testing Reactor of IAE, Moscow (MR) Information System (EMERIS) are outlined. The purpose of the system is to support reactor and experimental test loop operators with a flexible, fully computerized and user-friendly tool for the acquisition, analysis, archiving and presentation of data obtained during operation of the experimental facility. High availability of EMERIS services is ensured by redundant hardware and software components and by an automatic configuration procedure. A novel software feature of the system is the automatic Disturbance Analysis package, which aims to discover the primary causes of irregularities occurring in the technological process. (author) 2 refs.; 2 figs

  2. Void fraction instrument software, Version 1.2, Acceptance test report

    International Nuclear Information System (INIS)

    Gimera, M.

    1995-01-01

    This document provides the acceptance test report for the void fraction instrument software, Version 1.2. The void fraction instrument will collect data that will be used to calculate the quantity of gas trapped in waste tanks.

  3. Design and implementation of a software tool intended for simulation and test of real time codes

    International Nuclear Information System (INIS)

    Le Louarn, C.

    1986-09-01

    The objective of real-time software testing is to reveal processing errors and unsatisfied functional requirements or timing constraints in a code. From the perspective of the safety analysis of nuclear power plant equipment, testing should be carried out independently of the physical process (which is generally not available), and because casual hardware failures must be considered. We propose here a simulation and test tool, implemented entirely in software, with extensive interactive possibilities for testing assembly code running on a microprocessor. The OST (outil d'aide a la simulation et au Test de logiciels temps reel) simulates code execution and the behaviour of the hardware or software environment. Test execution is closely monitored and much useful information is automatically saved. After presenting methods and tools dedicated to real-time software, the present thesis work details the OST system. We show the internal mechanisms and objects of the system, particularly 'events' (which describe evolutions of the system under test) and 'mnemonics' (which describe the variables). Then, we detail the interactive means available to the user for constructing the test data and the environment of the tested software. Finally, a prototype implementation is presented along with the results of the tests carried out. This demonstrates the many advantages of using an automatic tool over a manual investigation. As a conclusion, further developments necessary to complete the final tool are reviewed [fr]

  4. Testing of the assisting software for radiologists analysing head CT images: lessons learned.

    Science.gov (United States)

    Martynov, Petr; Mitropolskii, Nikolai; Kukkola, Katri; Gretsch, Monika; Koivisto, Vesa-Matti; Lindgren, Ilkka; Saunavaara, Jani; Reponen, Jarmo; Mäkynen, Anssi

    2017-12-11

    Assessing a plan for user testing and evaluation of the assisting software developed for radiologists. The test plan was assessed in experimental testing, where users performed reporting on head computed tomography studies with the aid of the software developed. The user testing included usability tests, questionnaires, and interviews. In addition, search relevance was assessed on the basis of user opinions. The testing demonstrated weaknesses in the initial plan and enabled improvements. Results showed that the software has an acceptable usability level, but some minor fixes are needed before larger-scale pilot testing. The research also showed that it is possible even for radiologists with under a year's experience to perform reporting of non-obvious cases when assisted by the software developed. Due to the small number of test users, it was impossible to assess effects on diagnosis quality. The results of the tests performed showed that the test plan designed is useful, and answers to the key research questions should be forthcoming after testing with more radiologists. The preliminary testing revealed opportunities to improve the test plan and flow, thereby illustrating that arranging preliminary test sessions prior to any complex scenarios is beneficial.

  5. Software Quality and Testing: What DoD Can Learn from Commercial Practices

    Science.gov (United States)

    1992-08-31

    [Figure 1. Software Quality Control -- diagram residue showing Development, Testing, and Process Improvement, with detection of defects and correction of processes.] ... only when users understand the manual procedure the tool will automate, and the benefit of automating it. With regard to software testing in DoD, we can ... testing - the process of exercising or evaluating a system or system components by manual or automated means to verify that it satisfies specified

  6. Acceptance test report MICON software exhaust fan control modifications; TOPICAL

    International Nuclear Information System (INIS)

    SILVAN, G.R.

    1999-01-01

    This report documents the results of acceptance test HNF-4108, which verifies the MICON program changes for the new automatic transfer switch ATS-2 alarms, the Closed Loop Cooling isolator status, the CB-3 position alarm, the alarms for the new emergency fan damper backup air compressor, and the generator sequencer logic.

  7. The microcomputer scientific software series 4: testing prediction accuracy.

    Science.gov (United States)

    H. Michael Rauscher

    1986-01-01

    A computer program, ATEST, is described in this combination user's guide / programmer's manual. ATEST provides users with an efficient and convenient tool to test the accuracy of predictors. As input, ATEST requires observed-predicted data pairs. The output reports the two components of accuracy: bias and precision.
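
    ATEST itself is not available here; as a minimal illustration of the two accuracy components it reports, the snippet below computes bias (mean error) and precision (spread of the errors) from observed-predicted pairs. The numbers are invented.

        import statistics

        # Hedged sketch of the accuracy decomposition described above (not ATEST itself).
        pairs = [(10.2, 9.8), (7.5, 8.1), (12.0, 11.4), (9.3, 9.9)]   # (observed, predicted)
        errors = [pred - obs for obs, pred in pairs]

        bias = statistics.mean(errors)            # systematic over/under-prediction
        precision = statistics.stdev(errors)      # scatter of the errors around the bias
        print(f"bias = {bias:+.2f}, precision = {precision:.2f}")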

  8. Simulation and video software development for soil consolidation testing

    NARCIS (Netherlands)

    Karim, Usama F.A.

    2003-01-01

    The development techniques and file structures of CTM, a novel multi-media (computer simulation and video) package on consolidation and laboratory consolidation testing, are presented in this paper. A courseware tool called Authorware proved to be versatile for building the package and the paper

  9. Association tests and software for copy number variant data

    Directory of Open Access Journals (Sweden)

    Plagnol Vincent

    2009-01-01

    Recent studies have suggested that copy number variation (CNV) significantly contributes to genetic predisposition to several common disorders. These findings, combined with the imperfect tagging of CNVs by single nucleotide polymorphisms (SNPs), have motivated the development of association studies directly targeting CNVs. Several assays, including comparative genomic hybridisation arrays, SNP genotyping arrays, or DNA quantification through real-time polymerase chain reaction analysis, allow direct assessment of CNV status in cohorts sufficiently large to provide adequate statistical power for association studies. When analysing data provided by these assays, association tests for CNV data are not fundamentally different from SNP-based association tests. The main difference arises when the quality of the CNV assay is not sufficient to convert unequivocally the raw measurement into discrete calls -- a common issue, given the technological limitations of current CNV assays. When this is the case, association tests are more appropriately based on the raw continuous measurement provided by the CNV assay, instead of potentially inaccurate discrete calls, thus motivating the development of new statistical methods. Here, the programs available for CNV association testing for case-control or family data are reviewed, using either discrete calls or raw continuous data.
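
    None of the reviewed programs is reproduced here; purely to illustrate the point about testing the raw continuous measurement rather than discrete copy-number calls, the sketch below compares raw CNV intensities between cases and controls with a Welch t-statistic and a normal-approximation p-value. The data are invented and the approximation is only adequate for a sketch.

        import math
        import statistics

        # Hedged sketch: association test on raw (continuous) CNV intensity, cases vs controls.
        cases    = [1.8, 2.1, 1.6, 1.9, 1.5, 1.7, 2.0, 1.6]   # invented raw intensities
        controls = [2.2, 2.0, 2.3, 1.9, 2.1, 2.4, 2.2, 2.0]

        def welch_t(x, y):
            vx, vy = statistics.variance(x), statistics.variance(y)
            se = math.sqrt(vx / len(x) + vy / len(y))
            return (statistics.mean(x) - statistics.mean(y)) / se

        t = welch_t(cases, controls)
        # Two-sided p-value via the normal approximation (fine for a sketch, not real studies).
        p = math.erfc(abs(t) / math.sqrt(2.0))
        print(f"t = {t:.2f}, approximate p = {p:.4f}")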

  10. Impacto na realização de testes de software

    OpenAIRE

    Trigo, Cláudia Filipa Ferreira

    2017-01-01

    The main reason that led the master's student to adopt this project is the fact that she works as a software tester in her professional activity; she therefore decided to deepen her knowledge of this topic by investigating the impact of performing software testing within a company. The main objective is to acquire the knowledge necessary for the master's student to become a reference in software testing and quality control; to that end, she intends to carry out...

  11. Implementation and flight tests for the Digital Integrated Automatic Landing System (DIALS). Part 1: Flight software equations, flight test description and selected flight test data

    Science.gov (United States)

    Hueschen, R. M.

    1986-01-01

    Five flight tests of the Digital Automated Landing System (DIALS) were conducted on the Advanced Transport Operating Systems (ATOPS) Transportation Research Vehicle (TSRV) -- a modified Boeing 737 aircraft for advanced controls and displays research. These flight tests were conducted at NASA's Wallops Flight Center using the microwave landing system (MLS) installation on runway 22. This report describes the flight software equations of the DIALS, which was designed using modern control theory direct-digital design methods and employed a constant gain Kalman filter. Selected flight test performance data is presented for localizer (runway centerline) capture and track at various intercept angles, for glideslope capture and track of 3, 4.5, and 5 degree glideslopes, for the decrab maneuver, and for the flare maneuver. Data is also presented to illustrate the system performance in the presence of cross, gust, and shear winds. The mean and standard deviation of the peak position errors for localizer capture were, respectively, 24 feet and 26 feet. For mild wind conditions, glideslope and localizer tracking position errors did not exceed, respectively, 5 and 20 feet. For gusty wind conditions (8 to 10 knots), these errors were, respectively, 10 and 30 feet. Ten hands-off automatic landings were performed. The standard deviation of the touchdown position and velocity errors from the mean values were, respectively, 244 feet and 0.7 feet/sec.

  12. Flexible test automation a software framework for easily developing measurement applications

    CERN Document Server

    Arpaia, Pasquale; De Matteis, Ernesto

    2014-01-01

    In laboratory management of an industrial test division, a test laboratory, or a research center, one of the main activities is producing suitable software for automatic benches that satisfies a given set of requirements. This activity is particularly costly and burdensome when test requirements are variable over time. If the batches of objects have small size and frequent occurrence, the activity of measurement automation becomes predominant with respect to the test execution. Flexible Test Automation shows the development of a software framework as a useful solution to meet this need. The framework supports the user in producing measurement applications for a wide range of requirements with low effort and development time.

  13. Testing existing software for safety-related applications. Revision 7.1

    International Nuclear Information System (INIS)

    Scott, J.A.; Lawrence, J.D.

    1995-12-01

    The increasing use of commercial off-the-shelf (COTS) software products in digital safety-critical applications is raising concerns about the safety, reliability, and quality of these products. One of the factors involved in addressing these concerns is product testing. A tester's knowledge of the software product will vary, depending on the information available from the product vendor. In some cases, complete source listings, program structures, and other information from the software development may be available. In other cases, only the complete hardware/software package may exist, with the tester having no knowledge of the internal structure of the software. The type of testing that can be used will depend on the information available to the tester. This report describes six different types of testing, which differ in the information used to create the tests, the results that may be obtained, and the limitations of the test types. An Annex contains background information on types of faults encountered in testing, and a Glossary of pertinent terms is also included. This study is pertinent for safety-related software at reactors

  14. Testing existing software for safety-related applications. Revision 7.1

    Energy Technology Data Exchange (ETDEWEB)

    Scott, J.A.; Lawrence, J.D.

    1995-12-01

    The increasing use of commercial off-the-shelf (COTS) software products in digital safety-critical applications is raising concerns about the safety, reliability, and quality of these products. One of the factors involved in addressing these concerns is product testing. A tester's knowledge of the software product will vary, depending on the information available from the product vendor. In some cases, complete source listings, program structures, and other information from the software development may be available. In other cases, only the complete hardware/software package may exist, with the tester having no knowledge of the internal structure of the software. The type of testing that can be used will depend on the information available to the tester. This report describes six different types of testing, which differ in the information used to create the tests, the results that may be obtained, and the limitations of the test types. An Annex contains background information on types of faults encountered in testing, and a Glossary of pertinent terms is also included. This study is pertinent for safety-related software at reactors.

  15. MATE: Modern Software Technology for Flight Test Automation and Orchestration, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The development of advanced technologies for flight testing, measurement, and data acquisition are critical to effectively meeting the future goals and challenges...

  16. CATS, continuous automated testing of seismological, hydroacoustic, and infrasound (SHI) processing software.

    Science.gov (United States)

    Brouwer, Albert; Brown, David; Tomuta, Elena

    2017-04-01

    To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.

  17. Advances in Significance Testing for Cluster Detection

    Science.gov (United States)

    Coleman, Deidra Andrea

    Over the past two decades, much attention has been given to data-driven project goals such as the Human Genome Project and the development of syndromic surveillance systems. A major component of these types of projects is analyzing the abundance of data. Detecting clusters within the data can be beneficial as it can lead to the identification of specified sequences of DNA nucleotides that are related to important biological functions or the locations of epidemics such as disease outbreaks or bioterrorism attacks. Cluster detection techniques require efficient and accurate hypothesis testing procedures. In this dissertation, we improve upon the hypothesis testing procedures for cluster detection by enhancing distributional theory and providing an alternative method for spatial cluster detection using syndromic surveillance data. In Chapter 2, we provide an efficient method to compute the exact distribution of the number and coverage of h-clumps of a collection of words. This method involves defining a Markov chain using a minimal deterministic automaton to reduce the number of states needed for computation. We allow words of the collection to contain other words of the collection, making the method more general. We use our method to compute the distributions of the number and coverage of h-clumps in the Chi motif of H. influenzae. In Chapter 3, we provide an efficient algorithm to compute the exact distribution of multiple window discrete scan statistics for higher-order, multi-state Markovian sequences. This algorithm involves defining a Markov chain to efficiently keep track of probabilities needed to compute p-values of the statistic. We use our algorithm to identify cases where the available approximation does not perform well. We also use our algorithm to detect unusual clusters of made free throw shots by National Basketball Association players during the 2009-2010 regular season. In Chapter 4, we give a procedure to detect outbreaks using syndromic

  18. Testing of Safety-Critical Software Embedded in an Artificial Heart

    Science.gov (United States)

    Cha, Sungdeok; Jeong, Sehun; Yoo, Junbeom; Kim, Young-Gab

    Software is being used more frequently to control medical devices such as artificial hearts or robotic surgery systems. While many of the software safety issues in such systems are similar to those in other safety-critical systems (e.g., nuclear power plants), domain-specific properties may warrant the development of customized techniques to demonstrate the fitness of the system for patients. In this paper, we report the results of a preliminary analysis done on software controlling a Hybrid Ventricular Assist Device (H-VAD) developed by the Korea Artificial Organ Centre (KAOC). It is a state-of-the-art artificial heart which has completed the animal testing phase. We performed software testing in in-vitro experiments and animal experiments. An abnormal behaviour, never detected during extensive in-vitro analysis and animal testing, was found.

  19. System Software Testing of Laser Tracker Leica AT401

    Directory of Open Access Journals (Sweden)

    Filip Dvořáček

    2014-12-01

    Full Text Available The article introduces a group of instruments called laser trackers and specifically focuses on one of them - Leica AT401. At the Research Institute of Geodesy, Topography and Cartography the instrument has been tested both in laboratory and outdoor conditions. Several significant errors in the instrument's system software have been found, mostly owing to the creation of a user-programmed controlling application called ATControl. The errors relate to the selection, computation, and evaluation of the refractive index of air. Finally, notes on the new measurement modes of Leica AT40x are given and the range of distance measurement is discussed.

  20. Field testing mobile digital storytelling software in rural Kenya

    CSIR Research Space (South Africa)

    Reitmaier, T

    2010-09-01

    Full Text Available Our existing relationships with the Adiedo community allowed us to focus all our time and energy on field testing our prototype. We spent a total of seven days in-situ and recruited as a research assistant and translator a young man who had... completed secondary school a few years earlier. He was fluent in English and Dholuo, the mother tongue of the Luo. The relationship with the research assistant became very important to our work, as he became essential to introducing the prototype...

  1. MBA acceptance test procedures, software Version 1.4

    International Nuclear Information System (INIS)

    Mullaney, J.E.; Russell, V.K.

    1994-01-01

    The Mass Balance Program (MBA) is an adjunct to the Materials Accounting database system, Version 3.4. MBA was written to equip the personnel performing K-Basin encapsulation tasks with a conservative estimate of accumulated sludge during the processing of canisters into and out of the chute. The K Basins Materials Balance program received some minor improvements so that it feeds back the chute processing status to the operator more clearly. This ATP describes how the code was to be tested to verify its correctness.

  2. 77 FR 50720 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Test Documentation for Digital Computer Software... practices for test documentation for software and computer systems as described in the Institute of Electrical and Electronics Engineers (IEEE) Standard 829-2008, ``IEEE Standard for Software and System Test...

  3. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Directory of Open Access Journals (Sweden)

    Shunkun Yang

    2014-01-01

    Full Text Available Testing and verification of the interfaces between software components are particularly important due to the large number of complex interactions, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and controlling software testing inputs. This paper presents the real-time extended interface automata (RTEIA), which add a clearer and more detailed description of temporal information through the use of time words. We also establish an input interface automaton for every input in order to handle input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of the RTEIA and the test case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system.

  4. Real-time extended interface automata for software testing cases generation.

    Science.gov (United States)

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interfaces between software components are particularly important due to the large number of complex interactions, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and controlling software testing inputs. This paper presents the real-time extended interface automata (RTEIA), which add a clearer and more detailed description of temporal information through the use of time words. We also establish an input interface automaton for every input in order to handle input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of the RTEIA and the test case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system.
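
    As a loose sketch of the general idea rather than the RTEIA algorithm itself (the automaton below is a hypothetical braking-system fragment), timed test sequences can be generated by enumerating paths through an interface automaton whose edges carry an input symbol and a time bound:

```python
from collections import deque
from typing import Dict, List, Tuple

# Transition table: state -> list of (input_symbol, time_bound_seconds, next_state).
# Hypothetical braking-system fragment, for illustration only.
AUTOMATON: Dict[str, List[Tuple[str, float, str]]] = {
    "idle": [("press_pedal", 0.1, "braking")],
    "braking": [("release_pedal", 0.2, "idle"), ("wheel_lock", 0.05, "abs_active")],
    "abs_active": [("release_pedal", 0.2, "idle")],
}


def generate_test_sequences(start: str, max_depth: int) -> List[List[Tuple[str, float]]]:
    """Breadth-first enumeration of timed input sequences up to max_depth steps."""
    sequences: List[List[Tuple[str, float]]] = []
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if path:
            sequences.append(path)
        if len(path) >= max_depth:
            continue
        for symbol, bound, nxt in AUTOMATON.get(state, []):
            queue.append((nxt, path + [(symbol, bound)]))
    return sequences


if __name__ == "__main__":
    for seq in generate_test_sequences("idle", max_depth=3):
        print(seq)
```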

  5. VEGAS2: Software for More Flexible Gene-Based Testing.

    Science.gov (United States)

    Mishra, Aniket; Macgregor, Stuart

    2015-02-01

    Gene-based tests such as versatile gene-based association study (VEGAS) are commonly used following per-single nucleotide polymorphism (SNP) GWAS (genome-wide association studies) analysis. Two limitations of VEGAS were that the HapMap2 reference set was used to model the correlation between SNPs and only autosomal genes were considered. HapMap2 has now been superseded by the 1,000 Genomes reference set, and whereas early GWASs frequently ignored the X chromosome, it is now commonly included. Here we have developed VEGAS2, an extension that uses 1,000 Genomes data to model SNP correlations across the autosomes and chromosome X. VEGAS2 allows greater flexibility when defining gene boundaries. VEGAS2 offers both a user-friendly, web-based front end and a command line Linux version. The online version of VEGAS2 can be accessed through https://vegas2.qimrberghofer.edu.au/. The command line version can be downloaded from https://vegas2.qimrberghofer.edu.au/zVEGAS2offline.tgz. The command line version is developed in Perl, R and shell scripting languages; source code is available for further development.

  6. A Description of the Software Element of the NASA EME Flight Tests

    Science.gov (United States)

    Koppen, Sandra V.

    1996-01-01

    In support of NASA's Fly-By-Light/Power-By-Wire (FBL/PBW) program, a series of flight tests were conducted by NASA Langley Research Center in February, 1995. The NASA Boeing 757 was flown past known RF transmitters to measure both external and internal radiated fields. The aircraft was instrumented with strategically located sensors for acquiring data on shielding effectiveness and internal coupling. The data are intended to support computational and statistical modeling codes used to predict internal field levels of an electromagnetic environment (EME) on aircraft. The software was an integral part of the flight tests, as well as the data reduction process. The software, which provided flight test instrument control, data acquisition, and a user interface, executes on a Hewlett Packard (HP) 300 series workstation and uses HP VEE test development software and the C programming language. Software tools were developed for data processing and analysis, and to provide a database organized by frequency bands, test runs, and sensors. This paper describes the data acquisition system on board the aircraft and concentrates on the software portion. Hardware and software interfaces are illustrated and discussed. Particular attention is given to data acquisition and data format. The data reduction process is discussed in detail to provide insight into the characteristics, quality, and limitations of the data. An analysis of obstacles encountered during the data reduction process is presented.

  7. [Confirming the Utility of RAISUS Antifungal Susceptibility Testing by New-Software].

    Science.gov (United States)

    Ono, Tomoko; Suematsu, Hiroyuki; Sawamura, Haruki; Yamagishi, Yuka; Mikamo, Hiroshige

    2017-08-15

    Clinical and Laboratory Standards Institute (CLSI) methods for susceptibility testing of yeasts are used in Japan. On the other hand, these methods have some disadvantages: 1) readings at 24 and 48 h; 2) use of an unclear scale, approximately 50% inhibition, to determine MICs; and 3) the need to account for trailing growth and paradoxical effects. These make it difficult to test the susceptibility of yeasts. The old RAISUS software, Ver. 6.0 series, resolved problems 1) and 2) but did not resolve problem 3). Recently, the new RAISUS software, Ver. 7.0 series, resolved problem 3). We confirmed whether using the new software settled all of these issues. Eighty-four Candida isolates from Aichi Medical University were used in this study. We compared the MICs obtained by using RAISUS antifungal susceptibility testing of yeasts, RSMY1, with those obtained by using ASTY. The concordance rates (within ± four-fold of the MICs) between the MICs obtained by using ASTY and RSMY1 with the new software were more than 90%, except for miconazole (MCZ). The rate for MCZ was low, but the MICs obtained by using CLSI methods and the Yeast-like Fungus DP 'EIKEN' method, E-DP, were equivalent to those of RSMY1 using the new software. The frequency of skip effects on RSMY1 using the new software markedly decreased relative to RSMY1 using the old software. In cases showing trailing growth, the new RAISUS software made it possible to choose the correct MICs and to flag trailing growth on the result screen. The new RAISUS software enhances usability and the accuracy of MICs. Using an automatic instrument to determine MICs is useful for obtaining objective results easily.

  8. Accuracy Test of Software Architecture Compliance Checking Tools – Test Instruction

    NARCIS (Netherlands)

    Pruijt, Leo; van der Werf, J.M.E.M.; Brinkkemper, Sjaak

    2015-01-01

    Software Architecture Compliance Checking (SACC) is an approach to verify conformance of implemented program code to high-level models of architectural design. Static SACC focuses on the modular software architecture and on the existence of rule violating dependencies between modules. Accurate tool

  9. Pre-Flight Testing and Performance of a Ka-Band Software Defined Radio

    Science.gov (United States)

    Downey, Joseph A.; Reinhart, Richard C.; Kacpura, Thomas

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed a space-qualified, reprogrammable, Ka-band Software Defined Radio (SDR) to be utilized as part of an on-orbit, reconfigurable testbed. The testbed will operate on the truss of the International Space Station beginning in late 2012. Three unique SDRs comprise the testbed, and each radio is compliant with the Space Telecommunications Radio System (STRS) Architecture Standard. The testbed provides NASA, industry, other Government agencies, and academic partners the opportunity to develop communications, navigation, and networking applications in the laboratory and space environment, while at the same time advancing SDR technology, reducing risk, and enabling future mission capability. Designed and built by Harris Corporation, the Ka-band SDR is NASA's first space-qualified Ka-band SDR transceiver. The Harris SDR will also mark the first NASA user of the Ka-band capabilities of the Tracking and Data Relay Satellite System (TDRSS) for on-orbit operations. This paper describes the testbed's Ka-band System, including the SDR, travelling wave tube amplifier (TWTA), and antenna system. The reconfigurable aspects of the system enabled by SDR technology are discussed and the Ka-band system performance is presented as measured during extensive pre-flight testing.

  10. Hawaiian Electric Advanced Inverter Test Plan - Result Summary

    Energy Technology Data Exchange (ETDEWEB)

    Hoke, Anderson; Nelson, Austin; Prabakar, Kumaraguru; Nagarajan, Adarsh

    2016-10-14

    This presentation is intended to share the results of lab testing of five PV inverters with the Hawaiian Electric Companies and other stakeholders and interested parties. The tests included baseline testing of advanced inverter grid support functions, as well as distribution circuit-level tests to examine the impact of the PV inverters on simulated distribution feeders using power hardware-in-the-loop (PHIL) techniques.

  11. Advanced Beamline Design for Fermilab's Advanced Superconducting Test Accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Prokop, Christopher [Northern Illinois Univ., DeKalb, IL (United States)]

    2014-01-01

    The Advanced Superconducting Test Accelerator (ASTA) at Fermilab is a new electron accelerator currently in the commissioning stage. In addition to testing superconducting accelerating cavities for future accelerators, it is foreseen to support a variety of Advanced Accelerator R&D (AARD) experiments. Producing the required electron bunches with the expected flexibility is challenging. The goal of this dissertation is to explore via numerical simulations new accelerator beamlines that can enable the advanced manipulation of electron bunches. The work especially includes the design of a low-energy bunch compressor and a study of transverse-to-longitudinal phase space exchangers.

  12. Integrated testing and verification system for research flight software design document

    Science.gov (United States)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
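
    As a hedged, toy-scale illustration of what static detection of data flow anomalies means in general (not the MUST tooling itself, and ignoring concurrency), the sketch below flags the classic use-before-definition anomaly in a straight-line program representation:

```python
from typing import List, Set, Tuple

# Each statement is (defined_vars, used_vars) for a straight-line program.
Statement = Tuple[Set[str], Set[str]]


def find_use_before_def(statements: List[Statement]) -> List[Tuple[int, str]]:
    """Flag the classic data-flow anomaly: a variable read before any assignment."""
    defined: Set[str] = set()
    anomalies: List[Tuple[int, str]] = []
    for index, (defs, uses) in enumerate(statements):
        for var in uses:
            if var not in defined:
                anomalies.append((index, var))
        defined |= defs
    return anomalies


if __name__ == "__main__":
    program: List[Statement] = [
        ({"x"}, set()),        # x := 0
        (set(), {"x", "y"}),   # print(x + y)  <- y is used before definition
        ({"y"}, set()),        # y := 1
    ]
    print(find_use_before_def(program))  # [(1, 'y')]
```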

  13. Software architecture for the ORNL large-coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data-acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation data bases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring, and disposing data and control parameters for access from the data retrieval software. This paper describes the software architecture and the functionality incorporated into the various layers of the data system

  14. Software architecture for the ORNL large coil test facility data system

    International Nuclear Information System (INIS)

    Blair, E.T.; Baylor, L.R.

    1986-01-01

    The VAX-based data acquisition system for the International Fusion Superconducting Magnet Test Facility (IFSMTF) at Oak Ridge National Laboratory (ORNL) is a second-generation system that evolved from a PDP-11/60-based system used during the initial phase of facility testing. The VAX-based software represents a layered implementation that provides integrated access to all of the data sources within the system, decoupling end-user data retrieval from various front-end data sources through a combination of software architecture and instrumentation data bases. Independent VAX processes manage the various front-end data sources, each being responsible for controlling, monitoring, acquiring and disposing data and control parameters for access from the data retrieval software. This paper describes the software architecture and the functionality incorporated into the various layers of the data system

  15. The SETI Interpreter Program (SIP). a Software Package for the SETI Field Tests

    Science.gov (United States)

    Olsen, E. T.; Lokshin, A.

    1983-01-01

    The SETI (Search for Extraterrestrial Intelligence) Interpreter Program (SIP) is an interactive software package designed to allow flexible off-line processing of the SETI field test data on a PDP 11/44 computer. The user can write and immediately execute complex analysis programs using the compact SIP command language. The software utilized by the SETI Interpreter Program consists of FORTRAN-coded modules that are sequentially installed and executed.

  16. Safety prediction for basic components of safety-critical software based on static testing

    International Nuclear Information System (INIS)

    Son, H.S.; Seong, P.H.

    2000-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  17. Western aeronautical test range real-time graphics software package MAGIC

    Science.gov (United States)

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers and scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  18. Safety prediction for basic components of safety critical software based on static testing

    International Nuclear Information System (INIS)

    Son, H.S.; Seong, P.H.

    2001-01-01

    The purpose of this work is to develop a safety prediction method, with which we can predict the risk of software components based on static testing results at the early development stage. The predictive model combines the major factor with the quality factor for the components, both of which are calculated based on the measures proposed in this work. The application to a safety-critical software system demonstrates the feasibility of the safety prediction method. (authors)

  19. Evaluation of advanced driver assistance system with the VEHIL test facility: experiences and future developments at TNO automotive

    NARCIS (Netherlands)

    Kusters, L.J.J.; Gietelink, O.J.; Hoof, J.F.A.M. van; Lemmen, P.P.M.

    2004-01-01

    This paper presents the working principle, functionality, and experience gained during the first operational period of the VEHIL laboratory, dedicated to the development and testing of advanced driver assistance systems. The position of VEHIL and its PC-based, purely software-based variant PRESCAN is illustrated

  20. A software prototype development of human system interfaces for human factors engineering validation tests of SMART MCR

    International Nuclear Information System (INIS)

    Lim, Jong Tae; Han, Kwan Ho; Yang, Seung Won

    2011-02-01

    An integrated system validation test bed for human factors engineering (HFE) validation tests is being developed. The goal of this study is to develop a software prototype for HFE validation of the SMART MCR design. To achieve this, prototype specifications of the software were first developed. Then software prototypes of the alarm reduction logic system, Plant Protection System, ESF-CCS, Elastic Tile Alarm Indication, and EID-based HSIs were implemented as code. Test procedures for the software prototypes were established to verify the completeness of the implemented code. Careful software testing has been done according to these test procedures, and the results were documented.

  1. Power, Avionics and Software - Phase 1.0: Subsystem Integration Test Report

    Science.gov (United States)

    Ivancic, William D.; Sands, Obed S.; Bakula, Casey J.; Oldham, Daniel R.; Wright, Ted; Bradish, Martin A.; Klebau, Joseph M.

    2014-01-01

    This report describes Power, Avionics and Software (PAS) 1.0 subsystem integration testing and test results that occurred in August and September of 2013. This report covers the capabilities of each PAS assembly to meet integration test objectives for non-safety critical, non-flight, non-human-rated hardware and software development. This test report is the outcome of the first integration of the PAS subsystem and is meant to provide data for subsequent designs, development and testing of the future PAS subsystems. The two main objectives were to assess the ability of the PAS assemblies to exchange messages and to perform audio testing of both inbound and outbound channels. This report describes each test performed, defines the test, the data, and provides conclusions and recommendations.

  2. Modular, Autonomous Command and Data Handling Software with Built-In Simulation and Test

    Science.gov (United States)

    Cuseo, John

    2012-01-01

    The spacecraft system that plays the greatest role throughout the program lifecycle is the Command and Data Handling System (C&DH), along with the associated algorithms and software. The C&DH takes on this role as cost driver because it is the brains of the spacecraft and is the element of the system that is primarily responsible for the integration and interoperability of all spacecraft subsystems. During design and development, many activities associated with mission design, system engineering, and subsystem development result in products that are directly supported by the C&DH, such as interfaces, algorithms, flight software (FSW), and parameter sets. A modular system architecture has been developed that provides a means for rapid spacecraft assembly, test, and integration. This modular C&DH software architecture, which can be targeted and adapted to a wide variety of spacecraft architectures, payloads, and mission requirements, eliminates the current practice of rewriting the spacecraft software and test environment for every mission. This software allows mission-specific software and algorithms to be rapidly integrated and tested, significantly decreasing time involved in the software development cycle. Additionally, the FSW includes an Onboard Dynamic Simulation System (ODySSy) that allows the C&DH software to support rapid integration and test. With this solution, the C&DH software capabilities will encompass all phases of the spacecraft lifecycle. ODySSy is an on-board simulation capability built directly into the FSW that provides dynamic built-in test capabilities as soon as the FSW image is loaded onto the processor. It includes a six-degrees-of-freedom, high-fidelity simulation that allows complete closed-loop and hardware-in-the-loop testing of a spacecraft in a ground processing environment without any additional external stimuli. ODySSy can intercept and modify sensor inputs using mathematical sensor models, and can intercept and respond to actuator

  3. Software control program for 25 kW breadboard testing. [spacecraft power supplies; high voltage batteries]

    Science.gov (United States)

    Pajak, J. A.

    1981-01-01

    A data acquisition software program developed to operate in conjunction with the automated control system of the 25 kW PM Electric Power System Breadboard Test facility is described. The program provides limited interactive control of the breadboard test while acquiring data and monitoring parameters, allowing unattended continuous operation. The breadboard test facility has two positions for operating separate configurations. The main variable in each test setup is the high voltage Ni-Cd battery.

  4. Prueba del software: más que una fase en el ciclo de vida/Software testing: more than a stage in the life cycle

    Directory of Open Access Journals (Sweden)

    Edgar Serna

    2011-12-01

    Full Text Available Software testing is probably the least understood part of the software development life cycle. In this work, by means of a four-stage methodological proposal, we show why it is difficult to detect and eliminate errors, why the testing process is complex, and why it needs more attention.

  5. An experimental evaluation of the effectiveness of random testing of fault-tolerant software

    Science.gov (United States)

    Vouk, Mladen A.; Mcallister, David F.; Tai, K. C.

    1986-01-01

    Results of a fault-tolerant software (FTS) experiment are used to show deficiencies of the simple random testing approach. Testing was performed using randomly generated test cases supplemented with extremal and special value (ESV) cases. Error detection efficiency of the random testing approach, with emphasis on correlated errors, was compared to the error detecting capabilities of the ESV data and found deficient. The use of carefully designed test cases as a supplement to random testing, as well as the use of structure based testing are recommended.
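
    As a minimal sketch of this supplementation strategy (the experiment's actual harness and input domains are not described here), random test inputs over a numeric domain can be combined with extremal and special values (ESV) at and near the boundaries:

```python
import random
from typing import List


def random_cases(n: int, lo: float, hi: float, seed: int = 42) -> List[float]:
    """Uniformly random test inputs over [lo, hi]."""
    rng = random.Random(seed)
    return [rng.uniform(lo, hi) for _ in range(n)]


def extremal_special_cases(lo: float, hi: float) -> List[float]:
    """Extremal and special values (ESV) that pure random sampling rarely hits."""
    return [lo, hi, 0.0, -0.0, lo + 1e-9, hi - 1e-9]


def build_test_suite(n_random: int, lo: float, hi: float) -> List[float]:
    # Random cases exercise the bulk of the input domain; ESV cases target the
    # boundaries, where correlated errors between program versions tend to cluster.
    return random_cases(n_random, lo, hi) + extremal_special_cases(lo, hi)


if __name__ == "__main__":
    suite = build_test_suite(100, lo=-1000.0, hi=1000.0)
    print(len(suite), "test cases")
```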

  6. A testing-coverage software reliability model considering fault removal efficiency and error generation.

    Science.gov (United States)

    Li, Qiuying; Pham, Hoang

    2017-01-01

    In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) during the testing phase, the fault detection rate commonly changes over time; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. Few SRGMs in the literature, however, differentiate between fault detection and fault removal, i.e. they seldom consider imperfect fault removal efficiency. In practice, fault removal efficiency cannot always be perfect: detected failures might not be removed completely, the original faults might remain, and new faults might be introduced in the process, which is referred to as the imperfect debugging phenomenon. In this study, a model that incorporates the fault introduction rate, fault removal efficiency, and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and fault removal efficiency to account for fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs on three sets of real failure data using five criteria. The results show that the model gives better fitting and predictive performance.
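
    For orientation only, here is the generic coverage-based NHPP formulation from the SRGM literature, not the specific model proposed above: the expected number of faults detected by testing time $t$, $m(t)$, is driven by a fault detection rate $b(t)$ expressed through the testing coverage $c(t)$:

```latex
% Generic NHPP reliability-growth framework (illustrative only):
%   m(t): expected cumulative number of faults detected by time t
%   a:    expected total fault content
%   c(t): testing coverage achieved by time t
\frac{dm(t)}{dt} = b(t)\,\bigl[a - m(t)\bigr],
\qquad
b(t) = \frac{c'(t)}{1 - c(t)}.
```

    With a constant detection rate $b$, this reduces to the familiar Goel-Okumoto model $m(t) = a\,(1 - e^{-bt})$; models such as the one proposed above additionally let the fault content grow through error generation and scale removal by an imperfect removal-efficiency factor.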

  7. A study of operational and testing reliability in software reliability analysis

    International Nuclear Information System (INIS)

    Yang, B.; Xie, M.

    2000-01-01

    Software reliability is an important aspect of any complex piece of equipment today. Software reliability is usually estimated based on reliability models such as nonhomogeneous Poisson process (NHPP) models. Software systems improve during the testing phase, while they normally do not change during the operational phase. Depending on whether reliability is to be predicted for the testing phase or the operational phase, different measures should be used. In this paper, two different reliability concepts, namely operational reliability and testing reliability, are clarified and studied in detail. These concepts have been mixed up or even misused in some of the existing literature. Using different reliability concepts leads to different estimated reliability values and, in turn, to different reliability-based decisions. The difference between the estimated reliabilities is studied and the effect on the optimal release time is investigated
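
    To make the distinction concrete, these are the standard NHPP expressions quoted as background (not reproduced from the paper): testing-phase reliability assumes fault removal continues after time $t$, whereas operational reliability freezes the failure intensity at its release value $\lambda(t) = m'(t)$:

```latex
% Reliability over the next x time units, given testing up to time t.
% Testing (growth) reliability: faults continue to be detected and removed.
R_{\mathrm{test}}(x \mid t) = \exp\!\bigl\{-\,[\,m(t+x) - m(t)\,]\bigr\}

% Operational reliability: the code is frozen at release time t, so failures
% occur at the (approximately constant) intensity \lambda(t) = m'(t).
R_{\mathrm{op}}(x \mid t) = \exp\!\bigl\{-\,\lambda(t)\,x\bigr\}
```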

  8. An Integrated Software Testing Framework for FPGA-Based Controllers in Nuclear Power Plants

    Directory of Open Access Journals (Sweden)

    Jaeyeob Kim

    2016-04-01

    Full Text Available Field-programmable gate arrays (FPGAs have received much attention from the nuclear industry as an alternative platform to programmable logic controllers for digital instrumentation and control. The software aspect of FPGA development consists of several steps of synthesis and refinement, and also requires verification activities, such as simulations that are performed individually at each step. This study proposed an integrated software-testing framework for simulating all artifacts of the FPGA software development simultaneously and evaluating whether all artifacts work correctly using common oracle programs. This method also generates a massive number of meaningful simulation scenarios that reflect reactor shutdown logics. The experiment, which was performed on two FPGA software implementations, showed that it can dramatically save both time and costs.
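
    A hedged sketch of the common-oracle idea, with all names hypothetical and no real FPGA simulator interfaces shown: each development artifact is exercised on the same scenarios and its output trace is compared against a single oracle:

```python
from typing import Callable, Dict, List, Sequence

# An "artifact" is anything that maps an input scenario (sensor readings) to a
# trip/no-trip decision trace: behavioural HDL simulation, netlist simulation,
# a reference model, etc. Everything here is a stand-in for illustration.
Artifact = Callable[[Sequence[float]], List[bool]]


def oracle(scenario: Sequence[float], trip_setpoint: float = 95.0) -> List[bool]:
    """Common oracle: trip whenever the process variable exceeds the setpoint."""
    return [value > trip_setpoint for value in scenario]


def check_artifacts(artifacts: Dict[str, Artifact],
                    scenarios: List[Sequence[float]]) -> Dict[str, int]:
    """Run every artifact on every scenario and count oracle mismatches."""
    mismatches = {name: 0 for name in artifacts}
    for scenario in scenarios:
        expected = oracle(scenario)
        for name, artifact in artifacts.items():
            if artifact(scenario) != expected:
                mismatches[name] += 1
    return mismatches


if __name__ == "__main__":
    # Stand-in artifact that happens to implement the same logic as the oracle.
    reference_model: Artifact = lambda s: [v > 95.0 for v in s]
    print(check_artifacts({"reference_model": reference_model},
                          [[90.0, 96.0, 94.0], [100.0, 80.0]]))
```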

  9. An integrated software testing framework for FPGA-based controllers in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eun Sub; Yoo, Jun Beom; Lee, Young Jun; Choi, Jong Gyun

    2016-01-01

    Field-programmable gate arrays (FPGAs) have received much attention from the nuclear industry as an alternative platform to programmable logic controllers for digital instrumentation and control. The software aspect of FPGA development consists of several steps of synthesis and refinement, and also requires verification activities, such as simulations that are performed individually at each step. This study proposed an integrated software-testing framework for simulating all artifacts of the FPGA software development simultaneously and evaluating whether all artifacts work correctly using common oracle programs. This method also generates a massive number of meaningful simulation scenarios that reflect reactor shutdown logics. The experiment, which was performed on two FPGA software implementations, showed that it can dramatically save both time and costs

  10. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data

    NARCIS (Netherlands)

    Oostenveld, R.; Fries, P.; Maris, E.G.G.; Schoffelen, J.M.

    2011-01-01

    This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow

  11. Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Lux, James P.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (VHSIC Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant with NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform specific hardware such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc. and the interconnections among

  12. Building virtual pentesting labs for advanced penetration testing

    CERN Document Server

    Cardwell, Kevin

    2014-01-01

    Written in an easy-to-follow approach using hands-on examples, this book helps you create virtual environments for advanced penetration testing, enabling you to build a multi-layered architecture to include firewalls, IDS/IPS, web application firewalls, and endpoint protection, which is essential in the penetration testing world. If you are a penetration tester, security consultant, security test engineer, or analyst who wants to practice and perfect penetration testing skills by building virtual pen testing labs in varying industry scenarios, this is the book for you. This book is ideal if yo

  13. The 1st VIPEX Software Testing Report (Version 3.1)

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Jung, Woo Sik; Seo, Jae Seung

    2009-09-01

    The purposes of this report are (1) to perform a Verification and Validation (V and V) test of the VIPEX (Vital-area Identification Package EXpert) software and (2) to improve software quality through the V and V test. The VIPEX was developed at the Korea Atomic Energy Research Institute (KAERI) for the Vital Area Identification (VAI) of nuclear power plants. The distributed version of the VIPEX is 3.1. The VIPEX was revised based on the first V and V test, and a second V and V test will be performed. We have performed the following tasks for the first V and V test on the Windows XP and VISTA operating systems: testing of basic functions, including fault tree editing, and writing of formal reports.

  14. Máquinas de arquitetura distribuída - considerações sobre testes de "software" [Distributed-architecture machines: considerations on software testing]

    OpenAIRE

    Mauro Mesquita Spinola

    1986-01-01

    Distributed systems have been used on an increasing scale in a wide range of applications. The differences between these systems and centralized systems affect the methodologies used to develop the software and its tests. A methodology for testing distributed software is presented, identifying techniques and tools that can be adopted to simplify this activity. This work deals with the development of a methodology for tes...

  15. Application of advanced non-destructive testing for testing the integrity of concrete foundations

    International Nuclear Information System (INIS)

    Nguyen Le Son; Nguyen Phuoc Lan; Pham The Hung; Vu Huy Thuc

    2004-01-01

    computer from the cross-hole sonic logging data by prepared software fit the expected range of Ultrasonic Pulse Velocity results from the laboratory tests and can improve the reliability of the interpreted quality. The acquired capabilities are a valuable asset for applying the cross-hole sonic method, an advanced non-destructive testing (NDT) technique, to testing the integrity of deep concrete foundations. (author)

  16. Advanced engineering software for in-space assembly and manned planetary spacecraft

    Science.gov (United States)

    Delaquil, Donald; Mah, Robert

    1990-01-01

    Meeting the objectives of the Lunar/Mars initiative to establish safe and cost-effective extraterrestrial bases requires an integrated software/hardware approach to operational definitions and systems implementation. This paper begins this process by taking a 'software-first' approach to systems design, for implementing specific mission scenarios in the domains of in-space assembly and operations of the manned Mars spacecraft. The technological barriers facing implementation of robust operational systems within these two domains are discussed, and preliminary software requirements and architectures that resolve these barriers are provided.

  17. A model independent S/W framework for search-based software testing.

    Science.gov (United States)

    Oh, Jungsup; Baik, Jongmoon; Lim, Sung-Hwa

    2014-01-01

    In Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model has changed from one to another, all functions of a search technique must be reimplemented because the types of models are different even if the same search technique has been applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce redundant works. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves the productivity by about 50% when changing the type of a model.
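
    As an illustrative sketch only (not the authors' framework), the model-independence idea can be captured by hiding the model behind a small abstract interface that a generic search algorithm calls, so that swapping model types does not require reimplementing the search:

```python
import random
from abc import ABC, abstractmethod
from typing import List


class TestModel(ABC):
    """Abstract model interface: the search never sees the concrete model type."""

    @abstractmethod
    def random_test(self) -> List[str]: ...

    @abstractmethod
    def mutate(self, test: List[str]) -> List[str]: ...

    @abstractmethod
    def fitness(self, test: List[str]) -> float: ...


def hill_climb(model: TestModel, iterations: int = 200, seed: int = 1) -> List[str]:
    """Generic search: works unchanged for any TestModel implementation."""
    random.seed(seed)
    best = model.random_test()
    best_fit = model.fitness(best)
    for _ in range(iterations):
        candidate = model.mutate(best)
        fit = model.fitness(candidate)
        if fit > best_fit:
            best, best_fit = candidate, fit
    return best


class ToyStateMachineModel(TestModel):
    """Hypothetical model: longer unique event sequences cover more transitions."""
    EVENTS = ["open", "read", "write", "close"]

    def random_test(self) -> List[str]:
        return [random.choice(self.EVENTS) for _ in range(4)]

    def mutate(self, test: List[str]) -> List[str]:
        mutated = list(test)
        mutated[random.randrange(len(mutated))] = random.choice(self.EVENTS)
        return mutated

    def fitness(self, test: List[str]) -> float:
        return float(len(set(test)))  # crude stand-in for transition coverage


if __name__ == "__main__":
    print(hill_climb(ToyStateMachineModel()))
```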

  18. Random Vibration Testing of Advanced Wet Tantalum Capacitors

    Science.gov (United States)

    Teverovsky, Alexander

    2015-01-01

    Advanced wet tantalum capacitors allow for improved performance of power supply systems along with substantial reduction of size and weight of the systems that is especially beneficial for space electronics. Due to launch-related stresses, acceptance testing of all space systems includes random vibration test (RVT). However, many types of advanced wet tantalum capacitors cannot pass consistently RVT at conditions specified in MIL-PRF-39006, which impedes their use in space projects. This requires a closer look at the existing requirements, modes and mechanisms of failures, specifics of test conditions, and acceptance criteria. In this work, different lots of advanced wet tantalum capacitors from four manufacturers have been tested at step stress random vibration conditions while their currents were monitored before, during, and after the testing. It has been shown that the robustness of the parts and their reliability are mostly due to effective self-healing processes and limited current spiking or minor scintillations caused by RVT do not increase the risk of failures during operation. A simple model for scintillations events has been used to simulate current spiking during RVT and optimize test conditions. The significance of scintillations and possible effects of gas generation have been discussed and test acceptance criteria for limited current spiking have been suggested.

  19. Testing the educational potential of 3D visualization software in oral radiographic interpretation.

    Science.gov (United States)

    Vuchkova, Julijana; Maybury, Terrence S; Farah, Camile S

    2011-11-01

    There is heightened optimism about the potential of 3D visualization software as an alternative learning resource in radiology education. The purpose of this study was to investigate the effect of 3D visualization software on students' learning of oral radiographic interpretation from 2D radiographic images. Fourth-year dental students underwent a learning intervention phase of radiographic interpretation of oral pathoses using 3D visualization software. The success of the educational intervention was assessed by quantitative means, using a radiographic interpretation test, and by qualitative means, using a structured Likert-scale survey, asking students to evaluate their own learning outcomes. It was anticipated that training with the rotational mode of 3D visualization software would provide additional depth cues, enabling students to create spatial-mental models of anatomy that they can apply to 2D radiographic interpretation of oral pathoses. Although quantitative assessment did not support this, questionnaire evaluations demonstrated a positive effect of the 3D visualization software by enhancing students' learning about radiographic interpretation. Despite much optimism about the educational potential of 3D visualization software, it is important to understand the interactions between learners and such new technologies in order to identify potential advantages and limitations prior to embracing them as learning resources.

  20. Digital System Reliability Test for the Evaluation of safety Critical Software of Digital Reactor Protection System

    Directory of Open Access Journals (Sweden)

    Hyun-Kook Shin

    2006-08-01

    Full Text Available A new Digital Reactor Protection System (DRPS) based on a VME bus Single Board Computer has been developed by KOPEC to prevent software common mode failure (CMF) inside the digital system. The new DRPS has been proved to be an effective digital safety system for preventing CMF by a Defense-in-Depth and Diversity (DID&D) analysis. However, for practical use in nuclear power plants, performance testing and reliability testing are essential for digital system qualification. In this study, a single channel of the DRPS prototype has been manufactured for the evaluation of DRPS capabilities. Integrated functional tests were performed and the system reliability was analyzed and tested. The results of the reliability test show that the application software of the DRPS has a very high reliability compared with analog reactor protection systems.

  1. Architecture and method for optimization of cloud resources used in software testing

    Directory of Open Access Journals (Sweden)

    Joana Coelho Vigário

    2016-03-01

    Full Text Available Nowadays systems can evolve quickly, and this growth is associated with, for example, the production of new features or even changes in system perspective required by the stakeholders. These conditions require software testing in order to validate the systems. Running a large battery of tests sequentially can take hours. However, tests can run faster in a distributed environment with rapid availability of pre-configured systems, such as cloud computing. There is increasing demand for automation of the entire process, including integration, build, test execution, and management of cloud resources. This paper aims to demonstrate the applicability of the practice of continuous integration (CI) in information systems for automating the build and software testing performed in a distributed cloud-computing environment, in order to achieve optimization and elasticity of the resources provided by the cloud.
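
    A minimal sketch of the distribution idea, assuming nothing about any particular CI tool or cloud API: the suite is split into batches that run concurrently on whatever workers the provisioned environment exposes (here, local processes stand in for cloud workers):

```python
import time
from concurrent.futures import ProcessPoolExecutor
from typing import Dict, List


def slow_test(name: str) -> Dict[str, str]:
    # Stand-in for a real test; in a CI pipeline this would be a whole test
    # module or a container provisioned on a cloud worker.
    time.sleep(0.1)
    return {"test": name, "result": "passed"}


def run_batch(batch: List[str]) -> List[Dict[str, str]]:
    return [slow_test(name) for name in batch]


def split(tests: List[str], n_workers: int) -> List[List[str]]:
    """Round-robin split of the suite into one batch per worker."""
    return [tests[i::n_workers] for i in range(n_workers)]


if __name__ == "__main__":
    suite = [f"test_{i}" for i in range(20)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = [r for batch_result in pool.map(run_batch, split(suite, 4))
                   for r in batch_result]
    print(len(results), "tests executed")
```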

  2. Accuracy of Dolphin visual treatment objective (VTO) prediction software on class III patients treated with maxillary advancement and mandibular setback.

    Science.gov (United States)

    Peterman, Robert J; Jiang, Shuying; Johe, Rene; Mukherjee, Padma M

    2016-12-01

    Dolphin® visual treatment objective (VTO) prediction software is routinely utilized by orthodontists during the treatment planning of orthognathic cases to help predict post-surgical soft tissue changes. Although surgical soft tissue prediction is considered to be a vital tool, its accuracy is not well understood in two-jaw surgical procedures. The objective of this study was to quantify the accuracy of Dolphin Imaging's VTO soft tissue prediction software on class III patients treated with maxillary advancement and mandibular setback and to validate the efficacy of the software in such complex cases. This retrospective study analyzed the records of 14 patients treated with comprehensive orthodontics in conjunction with two-jaw orthognathic surgery. Pre- and post-treatment radiographs were traced and superimposed to determine the actual skeletal movements achieved in surgery. This information was then used to simulate surgery in the software and generate a final soft tissue patient profile prediction. Prediction images were then compared to the actual post-treatment profile photos to determine differences. Dolphin Imaging's software was determined to be accurate within an error range of +/- 2 mm in the X-axis at most landmarks. The lower lip predictions were the most inaccurate. Clinically, the observed error suggests that the VTO may be used for demonstration and communication with a patient or consulting practitioner. However, Dolphin should not be relied upon for precise treatment planning of surgical movements. This program should be used with caution to prevent unrealistic patient expectations and dissatisfaction.

  3. Elevated Temperature Testing and Modeling of Advanced Toughened Ceramic Materials

    Science.gov (United States)

    Keith, Theo G.

    2005-01-01

    The purpose of this report is to provide a final report for the period of 12/1/03 through 11/30/04 for NASA Cooperative Agreement NCC3-776, entitled "Elevated Temperature Testing and Modeling of Advanced Toughened Ceramic Materials." During this final period, major efforts were focused on both the determination of mechanical properties of advanced ceramic materials and the development of mechanical test methodologies under several different programs of NASA Glenn. The important research activities carried out during this period are: 1) mechanical properties evaluation of two gas-turbine grade silicon nitrides; 2) mechanical testing of fuel-cell seal materials; 3) mechanical properties evaluation of thermal barrier coatings and CFCCs; and 4) foreign object damage (FOD) testing.

  4. Argonne to open new facility for advanced vehicle testing

    CERN Multimedia

    2002-01-01

    Argonne National Laboratory will open its Advanced Powertrain Research Facility on Friday, Nov. 15. The facility is North America's only public testing facility for engines, fuel cells, electric drives and energy storage. State-of-the-art performance and emissions measurement equipment is available to support model development and technology validation (1 page).

  5. Results of Laboratory Testing of Advanced Power Strips

    Energy Technology Data Exchange (ETDEWEB)

    Earle, L. [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Sparn, B. [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2012-08-01

    Presented at the ACEEE Summer Study on Energy Efficiency in Buildings on August 12-17, 2012, this presentation reports on laboratory tests of 20 currently available advanced power strip products, which reduce wasteful electricity use of miscellaneous electric loads in buildings.

  6. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  7. SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration

    Science.gov (United States)

    Han, Kyung T.

    2012-01-01

    Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…

  8. An analysis of unit tests of a flight software product line

    NARCIS (Netherlands)

    Ganesan, D.; Lindvall, M.; McComas, D.; Bartholomew, M.; Slegel, S.; Medina, B.; Krikhaar, R.; Verhoef, C.; Dharmalingam, G.; Montgomery, L.P.

    2013-01-01

    This paper presents an analysis of the unit testing approach developed and used by the Core Flight Software System (CFS) product line team at the NASA Goddard Space Flight Center (GSFC). The goal of the analysis is to understand, review, and recommend strategies for improving the CFS' existing unit

  9. The Impact of Success Maker Software on Grade 4 Math Proficiency on State Tests

    Science.gov (United States)

    Geer, Brandon Terrell

    2014-01-01

    Success Maker is educational software that differentiates and personalizes K-8 reading and math. Limited research has been conducted on the impact of Success Maker on Grade 4 math state tests. At the research site, located in the southeastern United States, 33.7% of fourth grade students did not pass the Palmetto Assessment of State Standards…

  10. A test profile analysis framework for assessing the reliability of software component assemblies

    OpenAIRE

    Padilla Zarate, Gerardo

    2007-01-01

    The objective of this research is to produce an early reliability assessment of a sequential assembly of software components using limited component execution-related information and considering the expected assembly use. Accomplishing this objective provides quantitative means to support design decisions and to improve the component selection process. The execution-related information, called execution traces, is gathered during the component testing process.

  11. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    Science.gov (United States)

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  12. Refinements and Tests of an Advanced Controller to Mitigate Fatigue Loads in the Controls Advanced Research Turbine: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wright, A.; Fleming, P.

    2010-12-01

    Wind turbines are complex, nonlinear, dynamic systems forced by aerodynamic, gravitational, centrifugal, and gyroscopic loads. The aerodynamics of wind turbines are nonlinear, unsteady, and complex. Turbine rotors are subjected to a complicated 3-D turbulent wind inflow field, with imbedded coherent vortices that drive fatigue loads and reduce lifetime. Design of control algorithms for wind turbines must account for multiple control objectives. Future large multi-megawatt turbines must be designed with lighter weight structures, using active controls to mitigate fatigue loads, while maximizing energy capture. Active damping should be added to these dynamic structures to maintain stability for operation in a complex environment. At the National Renewable Energy Laboratory (NREL), we have designed, implemented, and tested advanced controls to maximize energy extraction and reduce structural dynamic loads. These control designs are based on linear models of the turbine that are generated by specialized modeling software. In this paper, we present field test results of an advanced control algorithm to mitigate blade, tower, and drivetrain loads in Region 3.

  13. Usability Testing for Developing Effective Interactive Multimedia Software: Concepts, Dimensions, and Procedures

    Directory of Open Access Journals (Sweden)

    Sung Heum Lee

    1999-04-01

    Usability testing is a dynamic process that can be used throughout the process of developing interactive multimedia software. The purpose of usability testing is to find problems and make recommendations to improve the utility of a product during its design and development. For developing effective interactive multimedia software, dimensions of usability testing were classified into the general categories of learnability; performance effectiveness; flexibility; error tolerance and system integrity; and user satisfaction. In the process of usability testing, evaluation experts consider the nature of users and tasks, tradeoffs supported by the iterative design paradigm, and real-world constraints to effectively evaluate and improve interactive multimedia software. Different methods address different purposes and involve a combination of user and usability testing; however, usability practitioners follow the seven general procedures of usability testing for effective multimedia development. As the knowledge about usability testing grows, evaluation experts will be able to choose more effective and efficient methods and techniques that are appropriate to their goals.

  14. Advanced stellar compass deep space navigation, ground testing results

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Jørgensen, Peter Siegbjørn

    2006-01-01

    … Nevertheless, up to now, ground navigation has been the only possible solution. The technological breakthrough of advanced star trackers, like the micro-advanced stellar compass (μASC), might change this situation. Indeed, exploiting the capabilities of this instrument, the authors have devised a method … to determine the orbit of a spacecraft autonomously, on-board and without any a priori knowledge of any kind. The solution is robust, elegant and fast. This paper presents the preliminary performances obtained during the ground tests. The results are very positive and encouraging. …

  15. Development of Ada language control software for the NASA power management and distribution test bed

    Science.gov (United States)

    Wright, Ted; Mackin, Michael; Gantose, Dave

    1989-01-01

    The Ada language software developed to control the NASA Lewis Research Center's Power Management and Distribution testbed is described. The testbed is a reduced-scale prototype of the electric power system to be used on space station Freedom. It is designed to develop and test hardware and software for a 20-kHz power distribution system. The distributed, multiprocessor, testbed control system has an easy-to-use operator interface with an understandable English-text format. A simple interface for algorithm writers that uses the same commands as the operator interface is provided, encouraging interactive exploration of the system.

  16. Rotor Performance at High Advance Ratio: Theory versus Test

    Science.gov (United States)

    Harris, Franklin D.

    2008-01-01

    Five analytical tools have been used to study rotor performance at high advance ratio. One is representative of autogyro rotor theory in 1934 and four are representative of helicopter rotor theory in 2008. The five theories are measured against three sets of well documented, full-scale, isolated rotor performance experiments. The major finding of this study is that the decades spent by many rotorcraft theoreticians to improve prediction of basic rotor aerodynamic performance have paid off. This payoff, illustrated by comparing the CAMRAD II comprehensive code and Wheatley & Bailey theory to H-34 test data, shows that rational rotor lift-to-drag ratios are now predictable. The 1934 theory predicted L/D ratios as high as 15. CAMRAD II predictions compared well with H-34 test data having L/D ratios more on the order of 7 to 9. However, the detailed examination of the selected codes against H-34 test data indicates that not one of the codes can predict, to engineering accuracy, the control positions and shaft angle of attack required for a given lift above an advance ratio of 0.62. There is no full-scale rotor performance data available for advance ratios above 1.0, and extrapolation of currently available data to advance ratios on the order of 2.0 is unreasonable despite the needs of future rotorcraft. Therefore, it is recommended that an overly strong full-scale rotor blade set be obtained and tested in a suitable wind tunnel to at least an advance ratio of 2.5. A tail rotor from a Sikorsky CH-53 or other large single rotor helicopter should be adequate for this exploratory experiment.
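
    For readers outside rotorcraft aerodynamics, the advance ratio quoted throughout is the standard non-dimensional ratio of flight speed to rotor tip speed (a textbook definition, not a formula taken from this report):

    $$\mu = \frac{V\cos\alpha_s}{\Omega R},$$

    where $V$ is the flight speed, $\alpha_s$ the rotor shaft angle of attack, $\Omega$ the rotor rotational speed, and $R$ the blade radius.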

  17. Meta-DiSc: a software for meta-analysis of test accuracy data.

    Science.gov (United States)

    Zamora, Javier; Abraira, Victor; Muriel, Alfonso; Khan, Khalid; Coomarasamy, Arri

    2006-07-12

    Systematic reviews and meta-analyses of test accuracy studies are increasingly being recognised as central in guiding clinical practice. However, there is currently no dedicated and comprehensive software for meta-analysis of diagnostic data. In this article, we present Meta-DiSc, a Windows-based, user-friendly, freely available (for academic use) software that we have developed, piloted, and validated to perform diagnostic meta-analysis. Meta-DiSc a) allows exploration of heterogeneity, with a variety of statistics including chi-square, I-squared and Spearman correlation tests, b) implements meta-regression techniques to explore the relationships between study characteristics and accuracy estimates, c) performs statistical pooling of sensitivities, specificities, likelihood ratios and diagnostic odds ratios using fixed and random effects models, both overall and in subgroups and d) produces high quality figures, including forest plots and summary receiver operating characteristic curves that can be exported for use in manuscripts for publication. All computational algorithms have been validated through comparison with different statistical tools and published meta-analyses. Meta-DiSc has a Graphical User Interface with roll-down menus, dialog boxes, and online help facilities. Meta-DiSc is a comprehensive and dedicated test accuracy meta-analysis software. It has already been used and cited in several meta-analyses published in high-ranking journals. The software is publicly available at http://www.hrc.es/investigacion/metadisc_en.htm.
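
    For reference, the per-study quantities pooled by such tools are the standard 2×2-table measures (textbook definitions, not reproduced from the article), with TP, FP, FN, TN the true/false positive/negative counts:

    $$\mathrm{Se}=\frac{TP}{TP+FN},\qquad \mathrm{Sp}=\frac{TN}{TN+FP},\qquad LR^{+}=\frac{\mathrm{Se}}{1-\mathrm{Sp}},\qquad DOR=\frac{TP\cdot TN}{FP\cdot FN},$$

    while the heterogeneity statistic reported as I-squared is $I^2=\max\!\left(0,\,\tfrac{Q-\mathrm{df}}{Q}\right)\times 100\%$, with $Q$ Cochran's chi-square statistic on df degrees of freedom.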

  18. The nightly build and test system for LCG AA and LHCb software

    International Nuclear Information System (INIS)

    Kruzelecki, Karol; Roiser, Stefan; Degaudenzi, Hubert

    2010-01-01

    The core software stack, both from the LCG Application Area and LHCb, consists of more than 25 C++/Fortran/Python projects built for about 20 different configurations on Linux, Windows and MacOSX. To these projects one can also add about 70 external software packages (Boost, Python, Qt, CLHEP, ...) which also have to be built for the same configurations. In order to reduce the development cycle time and assure quality, a framework has been developed for the daily (in fact nightly) build and test of the software. Performing the build and the tests on several configurations and platforms increases the efficiency of the unit and integration tests. Main features: flexible and fine-grained setup (full or partial build) through a web interface; the possibility to build several 'slots' with different configurations; precise and highly granular reports on a web server; support for CMT projects (but not only) with their cross-dependencies; a scalable client-server architecture for the control machine and its build machines; and a copy of the results to a common place to allow an early view of the software stack. The nightly build framework is written in Python for portability and is easily extensible to accommodate new build procedures.
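
    As a purely illustrative sketch of the "slot" idea only (this is not the actual LCG/LHCb framework; the project names, platform tags, and actions below are hypothetical placeholders), one can picture a small driver that rebuilds and tests a configured list of projects for each platform of a slot:

```python
# Hypothetical sketch of a nightly "slot": a named set of projects built and
# tested for several platforms.  A real framework would dispatch these steps
# to dedicated build machines and publish granular web reports.
from dataclasses import dataclass

@dataclass
class Slot:
    name: str
    projects: list        # assumed already ordered to respect cross-dependencies
    platforms: list
    partial: bool = False # fine-grained setup: full vs. partial rebuild

def build_and_test(slot: Slot) -> None:
    for platform in slot.platforms:
        for project in slot.projects:
            # Placeholder actions; swap in the real build and test invocations.
            print(f"[{slot.name}] building {project} for {platform}")
            print(f"[{slot.name}] running unit/integration tests of {project}")

nightly = Slot(
    name="dev-slot",
    projects=["ProjectA", "ProjectB", "ProjectC"],
    platforms=["x86_64-linux-opt", "i686-win-dbg"],
)
build_and_test(nightly)
```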

  19. HITCal: a software tool for analysis of video head impulse test responses.

    Science.gov (United States)

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool in the analysis and measurement of saccadic video head impulse test (vHIT) responses, and based on the experience obtained during its use, the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. The objective was to develop a (software) method to analyze and explore the vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed; extended head impulse exploration and measurement tools were created; and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with data collected retrospectively in three reference centers. This HITs database was evaluated by humans and was also computed with HITCal. The authors have successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccade algorithm implemented in HITCal has good concordance with the assessment by human observers (Cohen's kappa coefficient = 0.7).
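
    The agreement figure quoted above is Cohen's kappa, whose standard definition (not specific to HITCal) is

    $$\kappa = \frac{p_o - p_e}{1 - p_e},$$

    where $p_o$ is the observed proportion of agreement between the automated algorithm and the human raters and $p_e$ the proportion of agreement expected by chance; values in the 0.61-0.80 range are conventionally read as substantial agreement.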

  20. Software engineering design theory and practice

    CERN Document Server

    Otero, Carlos

    2012-01-01

    … intended for use as a textbook for an advanced course in software design. Each chapter ends with review questions and references. … provides an overview of the software development process, something that would not be out of line in a course on software engineering including such topics as software process, software management, balancing conflicting values of stakeholders, testing, quality, and ethics. The author has principally focused on software design though, extracting the design phase from the surrounding software development lifecycle. … Software design strategies are addressed

  1. Hardware-Software Complex for Functional and Parametric Tests of ARM Microcontrollers STM32F1XX

    Directory of Open Access Journals (Sweden)

    Egorov Aleksey

    2016-01-01

    The article presents the hardware-software complex for functional and parametric tests of ARM microcontrollers STM32F1XX. The complex is based on PXI devices by National Instruments and the LabVIEW software environment. The data exchange procedure between a microcontroller under test and the complex hardware is described. Some test results are also presented.

  2. Developing and Testing of a Software Prototype to Support Diagnostic Reasoning of Nursing Students.

    Science.gov (United States)

    de Sousa, Vanessa Emille Carvalho; de Oliveira Lopes, Marcos Venícios; Keenan, Gail M; Lopez, Karen Dunn

    2018-04-01

    To design and test educational software to improve nursing students' diagnostic reasoning through NANDA-I-based clinical scenarios. A mixed-method approach was used and included content validation by a panel of 13 experts and prototype testing by a sample of 56 students. Experts' suggestions included writing adjustments, new response options, and replacement of clinical information in the scenarios. Percentages of students' correct answers were 65.7%, 62.2%, and 60.5% for related factors, defining characteristics, and nursing diagnoses, respectively. Full development of this software shows strong potential for enhancing students' diagnostic reasoning. New graduates may be able to apply diagnostic reasoning more rapidly by exercising their diagnostic skills within this software. © 2016 NANDA International, Inc.

  3. Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests

    Science.gov (United States)

    Douglas, Freddie; Bourgeois, Edit Kaminsky

    2005-01-01

    The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
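
    As a purely illustrative sketch of the kind of fuzzy inference that ANFIS builds on (zero-order Sugeno style, with hand-picked membership functions and rule constants; HACEM's actual inputs, rule base, and trained parameters are not reproduced here, and the neural-network training step that makes ANFIS adaptive is omitted):

```python
# Tiny zero-order Sugeno fuzzy inference of a test cost from two inputs.
# All membership-function centers, widths, and rule costs are invented.
import math

def gauss(x, center, sigma):
    """Gaussian membership degree of x in a fuzzy set."""
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

def estimate_cost(duration_s, thrust_klbf):
    # Each rule: (membership in a duration set, membership in a thrust set, consequent cost in $k).
    rules = [
        (gauss(duration_s, 100, 80),  gauss(thrust_klbf, 50, 40),   120.0),  # short test, low thrust
        (gauss(duration_s, 100, 80),  gauss(thrust_klbf, 300, 120), 400.0),  # short test, high thrust
        (gauss(duration_s, 600, 250), gauss(thrust_klbf, 50, 40),   350.0),  # long test, low thrust
        (gauss(duration_s, 600, 250), gauss(thrust_klbf, 300, 120), 900.0),  # long test, high thrust
    ]
    weights = [a * b for a, b, _ in rules]                  # rule firing strengths
    # Weighted average of rule consequents (zero-order Sugeno defuzzification).
    return sum(w * c for (_, _, c), w in zip(rules, weights)) / sum(weights)

print("estimated test cost ($k): %.1f" % estimate_cost(duration_s=300, thrust_klbf=200))
```

    In ANFIS proper, the membership-function parameters and rule consequents would be fitted to historical test-cost data by a neural-network-style learning procedure rather than set by hand.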

  4. Advanced In-pile Instrumentation for Material and Test Reactors

    International Nuclear Information System (INIS)

    Rempe, J.L.; Knudson, D.L.; Daw, J.E.; Unruh, T.C.; Chase, B.M.; Davis, K.L.; Palmer, A.J.; Schley, R.S.

    2013-06-01

    The US Department of Energy sponsors the Advanced Test Reactor (ATR) National Scientific User Facility (NSUF) program to promote U.S. research in nuclear science and technology. By attracting new research users - universities, laboratories, and industry - the ATR NSUF facilitates basic and applied nuclear research and development, advancing U.S. energy security needs. A key component of the ATR NSUF effort is to design, develop, and deploy new in-pile instrumentation techniques that are capable of providing real-time measurements of key parameters during irradiation. This paper describes the strategy developed by the Idaho National Laboratory (INL) for identifying instrumentation needed for ATR irradiation tests and the program initiated to obtain these sensors. New sensors developed from this effort are identified; and the progress of other development efforts is summarized. As reported in this paper, INL staff is currently involved in several tasks to deploy real-time length and flux detection sensors, and efforts have been initiated to develop a crack growth test rig. Tasks evaluating 'advanced' technologies, such as fiber-optics based length detection and ultrasonic thermometers are also underway. In addition, specialized sensors for real-time detection of temperature and thermal conductivity are not only being provided to NSUF reactors, but are also being provided to several international test reactors. (authors)

  5. Testing Product Generation in Software Product Lines Using Pairwise for Features Coverage

    Science.gov (United States)

    Pérez Lamancha, Beatriz; Polo Usaola, Macario

    A Software Product Line (SPL) is "a set of software-intensive systems sharing a common, managed set of features that satisfy the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way". Variability is a central concept that permits the generation of different products of the family by reusing core assets. It is captured through features which, for an SPL, define its scope. Features are represented in a feature model, which is later used to generate the products from the line. From the testing point of view, testing all the possible combinations in feature models is not practical because: (1) the number of possible combinations (i.e., combinations of features for composing products) may be intractable, and (2) some combinations may contain incompatible features. This paper therefore addresses the problem by implementing combinatorial testing techniques adapted to the SPL context.
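
    A minimal sketch of the pairwise (2-wise) selection idea, with a purely hypothetical three-feature model and no feature-model constraints (a real SPL tool would also filter out incompatible combinations):

```python
# Greedy selection of product configurations until every pair of
# (feature, option) choices is covered at least once.
from itertools import combinations, product

features = {                      # hypothetical feature model: feature -> options
    "os":      ["linux", "rtos"],
    "logging": ["on", "off"],
    "crypto":  ["aes", "none"],
}
names = list(features)

all_pairs = {                     # every feature-option pair that must be covered
    ((f1, o1), (f2, o2))
    for f1, f2 in combinations(names, 2)
    for o1 in features[f1]
    for o2 in features[f2]
}

def pairs_of(config):
    """Feature-option pairs exercised by one product configuration."""
    return set(combinations(zip(names, config), 2))

selected, uncovered = [], set(all_pairs)
for config in product(*features.values()):
    new = pairs_of(config) & uncovered    # keep a product only if it adds coverage
    if new:
        selected.append(dict(zip(names, config)))
        uncovered -= new
    if not uncovered:
        break

print(f"{len(selected)} products cover all {len(all_pairs)} pairs:")
for p in selected:
    print(p)
```

    This naive greedy pass only guarantees pair coverage; real combinatorial tools construct much smaller covering arrays and honour feature-model constraints.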

  6. On testing of functionally equivalent components of fault-tolerant software

    Science.gov (United States)

    Vouk, Mladen A.; Helsabeck, Michael L.; Tai, Kuo-Chung; Mcallister, David F.

    1986-01-01

    Six functionally equivalent programs were tested with specification based random and extremal/special value (ESV) test cases. Statement and branch coverage were used to measure and compare the attained testing effectiveness. It was observed that both measures reached a nearly steady state value after 25 to 75 random test cases. Coverage saturation curves appear to follow an exponential growth model. However, the steady state values for branch coverage of different components, but the same input cases, differed by as much as 22 percent. The effect is the result of the differences in the detailed structure of the components. Improvement in coverage provided by the random test data, after the ESV cases were executed, was only about 1 percent. Results indicate that extensive random testing can be a process of diminishing returns, and that in the FTS context functional ('black box') testing can provide a very uneven execution coverage of the functionally equivalent software, and therefore should be supplemented by structure based testing.
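
    A common way to formalise that observation (our notation; the paper's fitted parameters are not reproduced here) is an exponential-saturation curve

    $$C(n) = C_{\infty}\left(1 - e^{-\lambda n}\right),$$

    where $C(n)$ is the statement or branch coverage reached after $n$ random test cases, $C_{\infty}$ the component's steady-state coverage, and $\lambda$ a component-specific rate constant; the differing $C_{\infty}$ values across components are what produce the 22 percent spread noted above.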

  7. Roles for Software Technologies in Advancing Research and Theory in Educational Psychology

    Science.gov (United States)

    Hadwin, Allyson F.; Winne, Philip H.; Nesbit, John C.

    2005-01-01

    While reviews abound on theoretical topics in educational psychology, it is rare that we examine our field's instrumentation development, and what effects this has on educational psychology's evolution. To repair this gap, this paper investigates and reveals the implications of software technologies for researching and theorizing about core issues…

  8. Advanced Grid Support Functionality Testing for Florida Power and Light

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, Austin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Martin, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hurtt, James [Florida Power and Light, Juno Beach, FL (United States)

    2017-03-21

    This report describes the results of laboratory testing of advanced photovoltaic (PV) inverters undertaken by the National Renewable Energy Laboratory (NREL) on behalf of the Florida Power and Light Company (FPL). FPL recently commissioned a 1.1 MW-AC PV installation on a solar carport at the Daytona International Speedway in Daytona Beach, Florida. In addition to providing a source of clean energy production, the site serves as a live test bed with 36 different PV inverters from eight different manufacturers. Each inverter type has varied support for advanced grid support functions (GSFs) that are becoming increasingly commonplace and are being required through revised interconnection standards such as UL 1741, IEEE 1547, and California (CA) Rule 21. FPL is interested in evaluating the trade-offs between different GSFs, their compliance with emerging standards, and their effects on efficiency and reliability. NREL has provided a controlled laboratory environment to undertake such a study. This work covered nine different classes of tests to compare inverter capabilities and performance for four different inverters that were selected by FPL. The test inverters were all three-phase models rated between 24-36 kW and containing multiple PV input power point trackers. Advanced grid support functions were tested for functional behavior, and included fixed power factor operation, voltage ride-through, frequency ride-through, volt-var control, and frequency-Watt control. Response to abnormal grid conditions with GSFs enabled was studied through anti-islanding, fault, and load rejection overvoltage tests. Finally, efficiency was evaluated across a range of operating conditions that included power factor, output power, and input voltage variations. Test procedures were derived from requirements of a draft revision of UL 1741, CA Rule 21, and/or previous studies at NREL. This report summarizes the results of each test case, providing a comparative performance analysis.
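
    As an illustration of what one of these grid support functions computes, the sketch below evaluates a generic piecewise-linear volt-var droop curve; the voltage breakpoints and reactive-power limits are invented placeholders, not the UL 1741 / CA Rule 21 setpoints or the settings of any tested inverter.

```python
# Hypothetical volt-var droop: inject reactive power at low voltage, absorb at
# high voltage, with a dead band around nominal.  All numbers are invented.
import numpy as np

def volt_var(v_pu, q_max_pu=0.44):
    """Reactive-power command (per unit) as a function of terminal voltage (per unit)."""
    v_points = [0.90, 0.96, 1.04, 1.10]          # placeholder voltage breakpoints
    q_points = [q_max_pu, 0.0, 0.0, -q_max_pu]   # inject ... dead band ... absorb
    return float(np.interp(v_pu, v_points, q_points))

for v in (0.92, 1.00, 1.07):
    print(f"V = {v:.2f} pu -> Q = {volt_var(v):+.3f} pu")
```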

  9. LOFT advanced densitometer L1-4 test

    International Nuclear Information System (INIS)

    Wood, D.B.

    1978-01-01

    The report covers the PC-2, C-beam chordal average density measurement made on the loss-of-fluid test (LOFT) primary coolant system hot leg during the L1-4 nonnuclear loss-of-coolant accident (LOCA) test conducted May 3, 1977. The P-2, C-beam, or LOFT advanced densitometer used was of the pulse height analysis/energy discrimination, or nuclear hardened, type to be used for LOFT nuclear tests. The L1-4 test verified the applicability of pulse height analysis/energy discrimination techniques of the nuclear hardened gamma densitometer. Test results show that the reactor coolant fluid chordal average density can be calculated from gamma radiation source signal measured count rate data.
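
    The physical principle behind that calculation is ordinary gamma-ray attenuation (the standard Beer-Lambert relation, not a formula quoted from the report): for a beam of unattenuated count rate $I_0$ crossing a chord of length $L$ filled with fluid of average density $\bar{\rho}$,

    $$I = I_0\, e^{-\mu_m \bar{\rho} L} \qquad\Longrightarrow\qquad \bar{\rho} = \frac{1}{\mu_m L}\,\ln\frac{I_0}{I},$$

    where $\mu_m$ is the fluid's mass attenuation coefficient and $I$ the measured count rate.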

  10. Phase 1 Development Testing of the Advanced Manufacturing Demonstrator Engine

    Science.gov (United States)

    Case, Nicholas L.; Eddleman, David E.; Calvert, Marty R.; Bullard, David B.; Martin, Michael A.; Wall, Thomas R.

    2016-01-01

    The Additive Manufacturing Development Breadboard Engine (BBE) is a pressure-fed liquid oxygen/pump-fed liquid hydrogen (LOX/LH2) expander cycle engine that was built and operated by NASA at Marshall Space Flight Center's East Test Area. The breadboard engine was conceived as a technology demonstrator for the additive manufacturing technologies for an advanced upper stage prototype engine. The components tested on the breadboard engine included an ablative chamber, injector, main fuel valve, turbine bypass valve, a main oxidizer valve, a mixer and the fuel turbopump. All parts minus the ablative chamber were additively manufactured. The BBE was successfully hot fire tested seven times. Data collected from the test series will be used for follow on demonstration tests with a liquid oxygen turbopump and a regeneratively cooled chamber and nozzle.

  11. The HIE-ISOLDE alignment and monitoring system software and test mock up

    CERN Document Server

    Kautzmann, G; Kadi, Y; Leclercq, Y; Waniorek, S; Williams, L

    2012-01-01

    For the HIE Isolde project a superconducting linac will be built at CERN in the Isolde facility area. The linac will be based on the creation and installation of 2 high- β and 4 low- β cryomodules containing respectively 5 high-β superconducting cavities and 1 superconducting solenoid for the two first ones, 6 low-β superconducting cavities and 2 superconducting solenoids for the four other ones. An alignment and monitoring system of the RF cavities and solenoids placed inside the cryomodules is needed to reach the optimum linac working conditions. The alignment system is based on opto-electronics, optics and precise mechanical instrumentation. The geometrical frame configuration, the data acquisition and the 3D adjustment will be managed using a dedicated software application. In parallel to the software development, an alignment system test mock-up has been built for software validation and dimensional tests. This paper will present the software concept and the development status, and then will describe...

  12. Providing an empirical basis for optimizing the verification and testing phases of software development

    Science.gov (United States)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e. dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) to measure the software system to be considered; and (2) to build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. Also we evaluate the accuracy of the model and the insights it provides into the software process.
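
    A minimal sketch of the classification idea only (the metric names and synthetic data below are hypothetical, and a plain logistic-regression classifier stands in for the paper's multivariate stochastic models):

```python
# Hypothetical sketch: flag components as low/high fault density from a few
# static metrics so that testing effort can be concentrated where needed.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Columns: [lines of code, cyclomatic complexity, fan-out] for 80 components.
X = rng.normal(loc=[300.0, 12.0, 6.0], scale=[150.0, 5.0, 3.0], size=(80, 3))
# Labels: 1 = high fault density, 0 = low (synthetic rule plus noise).
y = (X[:, 1] + 0.01 * X[:, 0] + rng.normal(scale=2.0, size=80) > 16.0).astype(int)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5)          # rough accuracy estimate
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

model.fit(X, y)
print("predicted class of a new component:", model.predict([[450.0, 20.0, 8.0]])[0])
```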

  13. Utility advanced turbine systems (ATS) technology readiness testing

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-09-15

    The overall objective of the Advanced Turbine System (ATS) Phase 3 Cooperative Agreement between GE and the US Department of Energy (DOE) is the development of a highly efficient, environmentally superior, and cost-competitive utility ATS for base-load utility-scale power generation, the GE 7H (60 Hz) combined cycle power system, and related 9H (50 Hz) common technology. The major effort will be expended on detail design. Validation of critical components and technologies will be performed, including: hot gas path component testing, sub-scale compressor testing, steam purity test trials, and rotational heat transfer confirmation testing. Processes will be developed to support the manufacture of the first system, which was to have been sited and operated in Phase 4 but will now be sited and operated commercially by GE. This change has resulted from DOE's request to GE for deletion of Phase 4 in favor of a restructured Phase 3 (as Phase 3R) to include full speed, no load (FSNL) testing of the 7H gas turbine. Technology enhancements that are not required for the first machine design but will be critical for future ATS advances in performance, reliability, and costs will be initiated. Long-term tests of materials to confirm design life predictions will continue. A schematic of the GE H machine is shown.

  14. Utility Advanced Turbine Systems (ATS) technology readiness testing

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-05-01

    The overall objective of the Advanced Turbine System (ATS) Phase 3 Cooperative Agreement between GE and the US Department of Energy (DOE) is the development of the GE 7H and 9H combined cycle power systems. The major effort will be expended on detail design. Validation of critical components and technologies will be performed, including: hot gas path component testing, sub-scale compressor testing, steam purity test trials, and rotational heat transfer confirmation testing. Processes will be developed to support the manufacture of the first system, which was to have been sited and operated in Phase 4 but will now be sited and operated commercially by GE. This change has resulted from DOE's request to GE for deletion of Phase 4 in favor of a restructured Phase 3 (as Phase 3R) to include full speed, no load (FSNL) testing of the 7H gas turbine. Technology enhancements that are not required for the first machine design but will be critical for future ATS advances in performance, reliability, and costs will be initiated. Long-term tests of materials to confirm design life predictions will continue. A schematic of the GE H machine is shown.

  15. CSE database: extended annotations and new recommendations for ECG software testing.

    Science.gov (United States)

    Smíšek, Radovan; Maršánová, Lucie; Němcová, Andrea; Vítek, Martin; Kozumplík, Jiří; Nováková, Marie

    2017-08-01

    Nowadays, cardiovascular diseases represent the most common cause of death in western countries. Among various examination techniques, electrocardiography (ECG) is still a highly valuable tool used for the diagnosis of many cardiovascular disorders. In order to diagnose a person based on ECG, cardiologists can use automatic diagnostic algorithms. Research in this area is still necessary. In order to compare various algorithms correctly, it is necessary to test them on standard annotated databases, such as the Common Standards for Quantitative Electrocardiography (CSE) database. According to Scopus, the CSE database is the second most cited standard database. There were two main objectives in this work. First, new diagnoses were added to the CSE database, which extended its original annotations. Second, new recommendations for diagnostic software quality estimation were established. The ECG recordings were diagnosed by five new cardiologists independently, and in total, 59 different diagnoses were found. Such a large number of diagnoses is unique, even in terms of standard databases. Based on the cardiologists' diagnoses, a four-round consensus (4R consensus) was established. Such a 4R consensus means a correct final diagnosis, which should ideally be the output of any tested classification software. The accuracy of the cardiologists' diagnoses compared with the 4R consensus was the basis for the establishment of accuracy recommendations. The accuracy was determined in terms of sensitivity = 79.20-86.81%, positive predictive value = 79.10-87.11%, and the Jaccard coefficient = 72.21-81.14%, respectively. Within these ranges, the accuracy of the software is comparable with the accuracy of cardiologists. The accuracy quantification of the correct classification is unique. Diagnostic software developers can objectively evaluate the success of their algorithm and promote its further development. The annotations and recommendations proposed in this work will allow
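
    The three accuracy measures are the usual set-overlap statistics (standard definitions, not taken from the paper). Comparing a cardiologist's diagnosis set $A$ for a recording against the 4R-consensus set $B$:

    $$\mathrm{Se}=\frac{|A\cap B|}{|B|},\qquad \mathrm{PPV}=\frac{|A\cap B|}{|A|},\qquad J=\frac{|A\cap B|}{|A\cup B|}.$$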

  16. Architecture-Based Unit Testing of the Flight Software Product Line

    Science.gov (United States)

    Ganesan, Dharmalingam; Lindvall, Mikael; McComas, David; Bartholomew, Maureen; Slegel, Steve; Medina, Barbara

    2010-01-01

    This paper presents an analysis of the unit testing approach developed and used by the Core Flight Software (CFS) product line team at the NASA GSFC. The goal of the analysis is to understand, review, and recommend strategies for improving the existing unit testing infrastructure as well as to capture lessons learned and best practices that can be used by other product line teams for their unit testing. The CFS unit testing framework is designed and implemented as a set of variation points, and thus testing support is built into the product line architecture. The analysis found that the CFS unit testing approach has many practical and good solutions that are worth considering when deciding how to design the testing architecture for a product line, which are documented in this paper along with some suggested improvements.

  17. SmartUnit: Empirical Evaluations for Automated Unit Testing of Embedded Software in Industry

    OpenAIRE

    Zhang, Chengyu; Yan, Yichen; Zhou, Hanru; Yao, Yinbo; Wu, Ke; Su, Ting; Miao, Weikai; Pu, Geguang

    2018-01-01

    In this paper, we aim at the automated unit coverage-based testing for embedded software. To achieve the goal, by analyzing the industrial requirements and our previous work on automated unit testing tool CAUT, we rebuild a new tool, SmartUnit, to solve the engineering requirements that take place in our partner companies. SmartUnit is a dynamic symbolic execution implementation, which supports statement, branch, boundary value and MC/DC coverage. SmartUnit has been used to test more than one...
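
    A purely illustrative example (not taken from the SmartUnit paper) of what branch, boundary-value, and MC/DC coverage mean for a tiny embedded-style decision:

```python
# Illustrative unit tests for a small decision with two conditions.
def release_valve(pressure: float, armed: bool) -> bool:
    """Open the valve only if pressure exceeds the limit AND the system is armed."""
    return pressure > 100.0 and armed

# MC/DC for "A and B" needs each condition to independently toggle the outcome:
#   (A=T, B=T) -> True   baseline
#   (A=F, B=T) -> False  shows that condition A alone changes the outcome
#   (A=T, B=F) -> False  shows that condition B alone changes the outcome
def test_release_valve_mcdc():
    assert release_valve(150.0, True) is True
    assert release_valve(50.0, True) is False
    assert release_valve(150.0, False) is False

def test_release_valve_boundary():
    assert release_valve(100.0, True) is False    # boundary value: exactly at the limit
    assert release_valve(100.1, True) is True     # just above the limit

if __name__ == "__main__":
    test_release_valve_mcdc()
    test_release_valve_boundary()
    print("all illustrative tests passed")
```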

  18. Enhanced In-Pile Instrumentation at the Advanced Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    J. Rempe; D. Knudson; J. Daw; T. Unruh; B. Chase; K. Condie

    2011-06-01

    Many of the sensors deployed at materials and test reactors cannot withstand the high flux/high temperature test conditions often requested by users at U.S. test reactors, such as the Advanced Test Reactor (ATR) at the Idaho National Laboratory (INL). To address this issue, an instrumentation development effort was initiated as part of the ATR National Scientific User Facility (NSUF) in 2007 to support the development and deployment of enhanced in-pile sensors. This paper reports results from this effort. Specifically, this paper identifies the types of sensors currently available to support in-pile irradiations and those sensors currently available to ATR users. Accomplishments from new sensor technology deployment efforts are highlighted by describing new temperature and thermal conductivity sensors now available to ATR users. Efforts to deploy enhanced in-pile sensors for detecting elongation and real-time flux detectors are also reported, and recently-initiated research to evaluate the viability of advanced technologies to provide enhanced accuracy for measuring key parameters during irradiation testing are noted.

  19. Accuracy of Dolphin visual treatment objective (VTO prediction software on class III patients treated with maxillary advancement and mandibular setback

    Directory of Open Access Journals (Sweden)

    Robert J. Peterman

    2016-06-01

    Background: Dolphin® visual treatment objective (VTO) prediction software is routinely utilized by orthodontists during the treatment planning of orthognathic cases to help predict post-surgical soft tissue changes. Although surgical soft tissue prediction is considered to be a vital tool, its accuracy is not well understood in two-jaw surgical procedures. The objective of this study was to quantify the accuracy of Dolphin Imaging's VTO soft tissue prediction software on class III patients treated with maxillary advancement and mandibular setback and to validate the efficacy of the software in such complex cases. Methods: This retrospective study analyzed the records of 14 patients treated with comprehensive orthodontics in conjunction with two-jaw orthognathic surgery. Pre- and post-treatment radiographs were traced and superimposed to determine the actual skeletal movements achieved in surgery. This information was then used to simulate surgery in the software and generate a final soft tissue patient profile prediction. Prediction images were then compared to the actual post-treatment profile photos to determine differences. Results: Dolphin Imaging's software was determined to be accurate within an error range of ±2 mm in the X-axis at most landmarks. The lower lip predictions were the most inaccurate. Conclusions: Clinically, the observed error suggests that the VTO may be used for demonstration and communication with a patient or consulting practitioner. However, Dolphin should not be relied on for precise treatment planning of surgical movements. This program should be used with caution to prevent unrealistic patient expectations and dissatisfaction.

  20. Analysing sensory panel performance in a proficiency test using the PanelCheck software

    DEFF Research Database (Denmark)

    Tomic, O.; Luciano, G.; Nilsen, A.

    2010-01-01

    This paper discusses statistical methods and a workflow strategy for comparing performance across multiple sensory panels that participated in a proficiency test (also referred to as an inter-laboratory test). Performance comparison and analysis are based on a data set collected from 26 sensory panels. … Using the PanelCheck software, a workflow is proposed that guides the user through the data analysis process. This allows practitioners and non-statisticians to get an overview of panel performances in a rapid manner without the need to be familiar with details of the statistical methods. Visualisation of data analysis results plays an important role, as this provides a time-saving and efficient way of screening and investigating sensory panel performances. Most of the statistical methods used in this paper are available in the open source software PanelCheck, which may be downloaded and used for free.

  1. Casting directly from a computer model by using advanced simulation software FLOW-3D Cast ®

    Directory of Open Access Journals (Sweden)

    M. Sirviö

    2009-01-01

    ConiferRob, a patternless casting technique originally conceived at VTT Technical Research Centre of Finland and further developed at its spin-off company, Simtech Systems, offers up to 40% savings in product development costs, and up to two months shorter development times compared to conventional techniques. Savings of this order can be very valuable in today's highly competitive markets. Casting simulation is commonly used for designing casting systems. However, most of the software available today is old-fashioned and predicts just shrinkage porosity. Flow Science, VTT and Simtech have developed new software called FLOW-3D Cast®, which can simulate surface defects, air entrainment, filters, core gas problems and even cavitation.

  2. Advanced Decision-Oriented Software for the Management of Hazardous Substances. Part 1: Structure and Design

    OpenAIRE

    Fedra, K.

    1985-01-01

    Many industrial products and residuals such as hazardous and toxic substances are harmful to the basic life support system of the environment. In order to ensure a sustainable use of the biosphere for present and future generations, it is imperative that these substances are managed in a safe and systematic manner. The aim of this project is to provide software tools which can be used by those engaged in the management of the environment, industrial production, products, and waste streams, an...

  3. Casting directly from a computer model by using advanced simulation software FLOW-3D Cast ®

    OpenAIRE

    M. Sirviö; M. Woś

    2009-01-01

    ConiferRob, a patternless casting technique originally conceived at VTT Technical Research Centre of Finland and further developed at its spin-off company, Simtech Systems, offers up to 40% savings in product development costs, and up to two months shorter development times compared to conventional techniques. Savings of this order can be very valuable in today's highly competitive markets. Casting simulation is commonly used for designing casting systems. However, most of the software are ...

  4. Virtual test: A student-centered software to measure student's critical thinking on human disease

    Science.gov (United States)

    Rusyati, Lilit; Firman, Harry

    2016-02-01

    The study "Virtual Test: A Student-Centered Software to Measure Student's Critical Thinking on Human Disease" is descriptive research. The background is importance of computer-based test that use element and sub element of critical thinking. Aim of this study is development of multiple choices to measure critical thinking that made by student-centered software. Instruments to collect data are (1) construct validity sheet by expert judge (lecturer and medical doctor) and professional judge (science teacher); and (2) test legibility sheet by science teacher and junior high school student. Participants consisted of science teacher, lecturer, and medical doctor as validator; and the students as respondent. Result of this study are describe about characteristic of virtual test that use to measure student's critical thinking on human disease, analyze result of legibility test by students and science teachers, analyze result of expert judgment by science teachers and medical doctor, and analyze result of trial test of virtual test at junior high school. Generally, result analysis shown characteristic of multiple choices to measure critical thinking was made by eight elements and 26 sub elements that developed by Inch et al.; complete by relevant information; and have validity and reliability more than "enough". Furthermore, specific characteristic of multiple choices to measure critical thinking are information in form science comic, table, figure, article, and video; correct structure of language; add source of citation; and question can guide student to critical thinking logically.

  5. Development of a phantom to test fully automated breast density software – A work in progress

    International Nuclear Information System (INIS)

    Waade, G.G.; Hofvind, S.; Thompson, J.D.; Highnam, R.; Hogg, P.

    2017-01-01

    Objectives: Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role for stratified screening. Automated software can estimate MD, but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Methods: Several different configurations of poly(vinyl alcohol) (PVAL) phantoms were created. Three methods were used to estimate their density. Raw image data of mammographic images were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm³) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values of real breasts. Results: Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the same level of compression and thickness reduction experienced by female breasts during mammography. Conclusion: Our results are promising, as all phantoms resulted in valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density with the ability to be compressed to the same level as female breasts. Advances in knowledge: We are the first group to have produced deformable phantoms that are recognized as breasts by the Volpara software. Highlights: • Several phantoms of different configurations were created. • Three methods to assess phantom density were implemented. • All phantoms were identified as breasts by the Volpara software. • Reducing phantom thickness caused a change in phantom density.

  6. LHCb: The nightly build and test system for LCG AA and LHCb software

    CERN Multimedia

    Kruzelecki, K; Degaudenzi, H

    2009-01-01

    The core software stack, both from the LCG Application Area and LHCb, consists of more than 25 C++/Fortran/Python projects built for about 20 different configurations on Linux, Windows and MacOSX. To these projects one can also add about 20 external software packages (Boost, Python, Qt, CLHEP, ...) which also have to be built for the same configurations. In order to reduce the development cycle time and improve quality assurance, a framework has been developed for the daily (nightly, actually) build and test of the software. Performing the build and the tests on several configurations and platforms increases the efficiency of the unit and integration tests. Main features: flexible and fine-grained setup (full or partial build) through a web interface; the possibility to build several "slots" with different configurations; precise and highly granular reports on a web server; support for CMT projects (but not only) with their cross-dependencies; and a scalable client-server architecture for the co...

  7. The Alignment of Software Testing Skills of IS Students with Industry Practices--A South African Perspective

    Science.gov (United States)

    Scott, Elsje; Zadirov, Alexander; Feinberg, Sean; Jayakody, Ruwanga

    2004-01-01

    Software testing is a crucial component in the development of good quality systems in industry. For this reason it was considered important to investigate the extent to which the Information Systems (IS) syllabus at the University of Cape Town (UCT) was aligned with accepted software testing practices in South Africa. For students to be effective…

  8. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two other examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and on some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
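
    A textbook-style example of the kind of question DAISY answers (this is not one of the paper's benchmark models): consider the one-compartment system

    $$\dot{x}(t) = -k\,x(t) + u(t),\qquad y(t) = \frac{x(t)}{V},$$

    with unknown parameters $k$ and $V$, known input $u$ and measured output $y$. Differentiating the output gives $\dot{y} = -k\,y + u/V$, so the input-output behaviour determines $k$ and $1/V$ uniquely and both parameters are globally identifiable. DAISY automates this differential-algebra elimination for much larger polynomial and rational models.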

  9. Information on the Advanced Plant Experiment (APEX) Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    The purpose of this report is to provide information related to the design of the Oregon State University Advanced Plant Experiment (APEX) test facility. The information provided in this report has been drawn from the following sources:
    Reference 1: R. Nourgaliev et al., "Summary Report on NGSAC (Next-Generation Safety Analysis Code) Development and Testing," Idaho National Laboratory, 2011. Note that this report has not been released as an external report.
    Reference 2: O. Stevens, Characterization of the Advanced Plant Experiment (APEX) Passive Residual Heat Removal System Heat Exchanger, Master Thesis, June 1996.
    Reference 3: J. Reyes, Jr., Q. Wu, and J. King, Jr., Scaling Assessment for the Design of the OSU APEX-1000 Test Facility, OSU-APEX-03001 (Rev. 0), May 2003.
    Reference 4: J. Reyes et al., Final Report of the NRC AP600 Research Conducted at Oregon State University, NUREG/CR-6641, July 1999.
    Reference 5: K. Welter et al., APEX-1000 Confirmatory Testing to Support AP1000 Design Certification (non-proprietary), NUREG-1826, August 2005.

  10. UTILITY ADVANCED TURBINE SYSTEMS (ATS) TECHNOLOGY READINESS TESTING

    Energy Technology Data Exchange (ETDEWEB)

    Unknown

    1999-04-01

    The overall objective of the Advanced Turbine System (ATS) Phase 3 Cooperative Agreement between GE and the U.S. Department of Energy (DOE) is the development of the GE 7H and 9H combined cycle power systems. The major effort will be expended on detail design. Validation of critical components and technologies will be performed, including: hot gas path component testing, sub-scale compressor testing, steam purity test trials, and rotational heat transfer confirmation testing. Processes will be developed to support the manufacture of the first system, which was to have been sited and operated in Phase 4 but will now be sited and operated commercially by GE. This change has resulted from DOE's request to GE for deletion of Phase 4 in favor of a restructured Phase 3 (as Phase 3R) to include full speed, no load (FSNL) testing of the 7H gas turbine. Technology enhancements that are not required for the first machine design but will be critical for future ATS advances in performance, reliability, and costs will be initiated. Long-term tests of materials to confirm design life predictions will continue. The objective of this task is to design 7H and 9H compressor rotor and stator structures with the goal of achieving high efficiency at lower cost and greater durability by applying proven GE Power Systems (GEPS) heavy-duty use design practices. The designs will be based on the GE Aircraft Engines (GEAE) CF6-80C2 compressor. Transient and steady-state thermo-mechanical stress analyses will be run to ensure compliance with GEPS life standards. Drawings will be prepared for forgings, castings, machining, and instrumentation for full speed, no load (FSNL) tests of the first unit on both 9H and 7H applications.

  11. Features of the Upgraded Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) Software

    Science.gov (United States)

    Mason, Michelle L.; Rufer, Shann J.

    2016-01-01

    The Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) software is used at the NASA Langley Research Center to analyze global aeroheating data on wind tunnel models tested in the Langley Aerothermodynamics Laboratory. One-dimensional, semi-infinite heating data derived from IHEAT are used in the design of thermal protection systems for hypersonic vehicles that are exposed to severe aeroheating loads, such as reentry vehicles during descent and landing procedures. This software program originally was written in the PV-WAVE® programming language to analyze phosphor thermography data from the two-color, relative-intensity system developed at Langley. To increase the efficiency, functionality, and reliability of IHEAT, the program was migrated to MATLAB® syntax and compiled as a stand-alone executable file labeled version 4.0. New features of IHEAT 4.0 include the options to perform diagnostic checks of the accuracy of the acquired data during a wind tunnel test, to extract data along a specified multi-segment line following a feature such as a leading edge or a streamline, and to batch process all of the temporal frame data from a wind tunnel run. Results from IHEAT 4.0 were compared on a pixel level to the output images from the legacy software to validate the program. The absolute differences between the heat transfer data output from the two programs were on the order of 10^(-5) to 10^(-7). IHEAT 4.0 replaces the PV-WAVE® version as the production software for aeroheating experiments conducted in the hypersonic facilities at NASA Langley.
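
    The one-dimensional, semi-infinite data reduction mentioned above conventionally rests on the classical conduction solution for a semi-infinite solid suddenly exposed to convective heating (a standard result, not quoted from the paper):

    $$\frac{T_s(t)-T_i}{T_{aw}-T_i} \;=\; 1 - e^{\beta^{2}}\,\mathrm{erfc}(\beta),\qquad \beta = \frac{h\sqrt{t}}{\sqrt{\rho c k}},$$

    where $T_s$ is the measured surface temperature, $T_i$ the initial temperature, $T_{aw}$ the adiabatic-wall (driving) temperature, $h$ the heat-transfer coefficient being sought, and $\rho c k$ the thermal product of the model substrate.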

  12. Development of Software and Strategies for Battery Management System Testing on HIL Simulator

    DEFF Research Database (Denmark)

    Fleischer, Christian; Sauer, Dirk Uwe; Barreras, Jorge Varela

    2016-01-01

    In comparison with tests conducted on real Li-ion batteries, Battery Management System (BMS) tests conducted on a Hardware-In-the-Loop (HIL) battery simulator may be more cost and time effective, more flexible and traceable, easier to reproduce and safer beyond the normal range of operation. This is particularly the case for tests at early stages in the development process or during fault simulation. However, the use of a HIL battery simulator requires the development of software (SW) and strategies for testing. While the possibilities are immense, it should be noted that the greater the level of complexity of the tests, the higher the demands for ad hoc development of SW and strategies. With regard to the latter, there is not a universal definition and there are different points of view. Therefore different strategies may be followed, which can be classified in many different ways according …

  13. Advanced Test Reactor National Scientific User Facility Partnerships

    International Nuclear Information System (INIS)

    Marshall, Frances M.; Allen, Todd R.; Benson, Jeff B.; Cole, James I.; Thelen, Mary Catherine

    2012-01-01

    In 2007, the United States Department of Energy designated the Advanced Test Reactor (ATR), located at Idaho National Laboratory, as a National Scientific User Facility (NSUF). This designation made test space within the ATR and post-irradiation examination (PIE) equipment at INL available for use by researchers via a proposal and peer review process. The goal of the ATR NSUF is to provide researchers with the best ideas with access to the most advanced test capability, regardless of the proposer's physical location. Since 2007, the ATR NSUF has expanded its available reactor test space, and obtained access to additional PIE equipment. Recognizing that INL may not have all the desired PIE equipment, or that some equipment may become oversubscribed, the ATR NSUF established a Partnership Program. This program enables and facilitates user access to several universities and national laboratories. So far, seven universities and one national laboratory have been added to the ATR NSUF with capability that includes reactor-testing space, PIE equipment, and ion beam irradiation facilities. With the addition of these universities, irradiation can occur in multiple reactors and post-irradiation exams can be performed at multiple universities. In each case, the choice of facilities is based on the user's technical needs. Universities and laboratories included in the ATR NSUF partnership program are as follows: (1) Nuclear Services Laboratories at North Carolina State University; (2) PULSTAR Reactor Facility at North Carolina State University; (3) Michigan Ion Beam Laboratory (1.7 MV Tandetron accelerator) at the University of Michigan; (4) Irradiated Materials at the University of Michigan; (5) Harry Reid Center Radiochemistry Laboratories at the University of Nevada, Las Vegas; (6) Characterization Laboratory for Irradiated Materials at the University of Wisconsin-Madison; (7) Tandem Accelerator Ion Beam (1.7 MV terminal voltage tandem ion accelerator) at the University of Wisconsin

  14. Advanced functionality for radio analysis in the Offline software framework of the Pierre Auger Observatory

    Czech Academy of Sciences Publication Activity Database

    Abreu, P.; Aglietta, M.; Ahn, E.J.; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Kárová, Tatiana; Mandát, Dušan; Nečesal, Petr; Nožka, Libor; Nyklíček, Michal; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovancová, Jaroslava; Schovánek, Petr; Šmída, Radomír; Trávníček, Petr

    2011-01-01

    Roč. 635, č. 1 (2011), s. 92-102 ISSN 0168-9002 R&D Projects: GA MŠk LC527; GA MŠk(CZ) 1M06002; GA MŠk(CZ) LA08016; GA AV ČR KJB100100904; GA AV ČR KJB300100801 Institutional research plan: CEZ:AV0Z10100502; CEZ:AV0Z10100522 Keywords : cosmic rays * radio detection * analysis software * detector simulation Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.207, year: 2011

  15. Software Systems 2--Compiler and Operating Systems Lab--Advanced, Data Processing Technology: 8025.33.

    Science.gov (United States)

    Dade County Public Schools, Miami, FL.

    The course outline has been prepared as a guide to help the student develop the skills and knowledge necessary to succeed in the field of data processing. By learning the purpose and principles of compiler programs and operating systems, the student will become familiar with advanced data processing procedures that are representative of computer…

  16. UTILITY ADVANCED TURBINE SYSTEMS (ATS) TECHNOLOGY READINESS TESTING

    Energy Technology Data Exchange (ETDEWEB)

    Unknown

    1999-10-01

    The overall objective of the Advanced Turbine System (ATS) Phase 3 Cooperative Agreement between GE and the U.S. Department of Energy (DOE) is the development of a highly efficient, environmentally superior, and cost-competitive utility ATS for base-load utility-scale power generation, the GE 7H (60 Hz) combined cycle power system, and related 9H (50 Hz) common technology. The major effort will be expended on detail design. Validation of critical components and technologies will be performed, including: hot gas path component testing, sub-scale compressor testing, steam purity test trials, and rotational heat transfer confirmation testing. Processes will be developed to support the manufacture of the first system, which was to have been sited and operated in Phase 4 but will now be sited and operated commercially by GE. This change has resulted from DOE's request to GE for deletion of Phase 4 in favor of a restructured Phase 3 (as Phase 3R) to include full speed, no load (FSNL) testing of the 7H gas turbine. Technology enhancements that are not required for the first machine design but will be critical for future ATS advances in performance, reliability, and costs will be initiated. Long-term tests of materials to confirm design life predictions will continue. A schematic of the GE H machine is shown in Figure 1-1. Information specifically related to 9H production is presented for continuity in H program reporting, but lies outside the ATS program. This report summarizes work accomplished from 4Q98 through 3Q99. The most significant accomplishments are listed.

  17. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods and algorithms developed will, however, be of wider interest.
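
    The iterative solution schemes with preconditioners mentioned above are typified by the preconditioned conjugate gradient method. The following Python/NumPy sketch is included only as an illustration and is not drawn from the cited program; it applies a Jacobi (diagonal) preconditioner to a small symmetric positive-definite test system.

        import numpy as np

        def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
            """Conjugate gradient with a Jacobi (diagonal) preconditioner.
            A must be symmetric positive definite; minimal illustrative sketch."""
            x = np.zeros_like(b)
            M_inv = 1.0 / np.diag(A)          # Jacobi preconditioner: inverse of the diagonal
            r = b - A @ x
            z = M_inv * r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = M_inv * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        if __name__ == "__main__":
            # 1-D Poisson-like test matrix (tridiagonal, SPD)
            n = 50
            A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
            b = np.ones(n)
            x = jacobi_pcg(A, b)
            print("residual norm:", np.linalg.norm(b - A @ x))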

  18. Modding a free and open source software video game: "Play testing is hard work"

    Directory of Open Access Journals (Sweden)

    Giacomo Poderi

    2014-03-01

    Video game modding is a form of fan productivity in contemporary participatory culture. We see modding as an important way in which modders experience and conceptualize their work. By focusing on modding in a free and open source software video game, we analyze the practice of modding and the way it changes modders' relationship with their object of interest. The modders' involvement is not always associated with fun and creativity. Indeed, activities such as play testing often undermine these dimensions of modding. We present a case study of modding that is based on ethnographic research done for The Battle for Wesnoth, a free and open source software strategy video game entirely developed by a community of volunteers.

  19. Testing aspects of advanced coherent electron cooling technique

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.; Jing, Y.; Pinayev, I.; Wang, G.; Samulyak, R.; Ratner, D.

    2015-05-03

    An advanced version of the Coherent-electron Cooling (CeC) based on the micro-bunching instability was proposed. This approach promises a significant increase in the bandwidth of the CeC system and, therefore, a significant shortening of the cooling time in high-energy hadron colliders. In this paper we present our plans for simulating and testing the key aspects of this proposed technique using the set-up of the coherent-electron-cooling proof-of-principle experiment at BNL.

  20. Seismically induced accident sequence analysis of the advanced test reactor

    International Nuclear Information System (INIS)

    Khericha, S.T.; Henry, D.M.; Ravindra, M.K.; Hashimoto, P.S.; Griffin, M.J.; Tong, W.H.; Nafday, A.M.

    1991-01-01

    A seismic probabilistic risk assessment (PRA) was performed for the Department of Energy (DOE) Advanced Test Reactor (ATR) as part of the external events analysis. The risk from seismic events to the fuel in the core and in the fuel storage canal was evaluated. The key elements of this paper are the integration of seismically induced internal flood and internal fire, and the modeling of human error rates as a function of earthquake magnitude. The systems analysis was performed by EG&G Idaho, Inc. and the fragility analysis and quantification were performed by EQE International, Inc. (EQE)

  1. Astrochronologic Testing in Deep-Time Strata: Historical Overview and Recent Advances

    Science.gov (United States)

    Meyers, S. R.

    2014-12-01

    The quest for astronomical-climate rhythms ("Milankovitch cycles") in Phanerozoic strata is now commonplace, and has yielded fundamental advancements in our understanding of climate change, paleoceanography, astrodynamics, geochronology and chronostratigraphy. Of central importance to this success has been the development of astrochronologic testing methods for the evaluation of astronomical-climate influence on sedimentation; this can be especially challenging for deep-time strata that lack sufficient independent time control (e.g., radioisotopic data) to unambiguously calibrate observed spatial rhythms to temporal periods. Most deep-time (pre-Pleistocene) astrochronologic testing methods fall into one of two categories: (1) those that test for expected amplitude or frequency modulation imposed by an astronomical signal, or (2) those that test for bedding hierarchies ("frequency ratios") that are predicted by the dominant astronomical periods. In this talk, I discuss the historical context of these methods, recent advances that overcome subjective evaluation and circular reasoning, and their implementation in a new "open source" software package for astrochronology (Meyers, 2014, astrochron: An R Package for Astrochronology).
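
    The "frequency ratio" style of astrochronologic test described above can be illustrated with a toy calculation (this sketch is not taken from the cited astrochron package; the target periods, tolerances, and synthetic data are assumptions): estimate the dominant spatial frequencies of a depth series with a periodogram and compare their ratios with those expected for the main astronomical periods.

        import numpy as np

        # Toy "frequency ratio" check for a cyclostratigraphic depth series.
        # Assumed astronomical periods for illustration (~405, ~100, ~41, ~23 kyr cycles).
        EXPECTED_PERIODS_KYR = np.array([405.0, 100.0, 41.0, 23.0])

        def dominant_frequencies(series, dx, n_peaks=4):
            """Return the n_peaks strongest nonzero periodogram frequencies (cycles per meter)."""
            series = series - np.mean(series)
            power = np.abs(np.fft.rfft(series)) ** 2
            freqs = np.fft.rfftfreq(len(series), d=dx)
            order = np.argsort(power[1:])[::-1] + 1      # skip the zero frequency
            return np.sort(freqs[order[:n_peaks]])

        def ratio_test(series, dx):
            """Compare observed frequency ratios with those of the assumed astronomical periods."""
            observed = dominant_frequencies(series, dx, n_peaks=len(EXPECTED_PERIODS_KYR))
            obs_ratios = observed / observed.min()
            exp_ratios = EXPECTED_PERIODS_KYR.max() / EXPECTED_PERIODS_KYR
            return obs_ratios, np.sort(exp_ratios)

        if __name__ == "__main__":
            # Synthetic depth series whose cycle thicknesses mimic the assumed periods
            dx, sed_rate = 0.01, 0.02                    # sample spacing (m), sedimentation rate (m/kyr)
            depth = np.arange(0, 50, dx)
            series = sum(np.cos(2 * np.pi * depth / (p * sed_rate)) for p in EXPECTED_PERIODS_KYR)
            obs, exp = ratio_test(series, dx)
            print("observed ratios (approx.):", np.round(obs, 2))
            print("expected ratios:          ", np.round(exp, 2))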

  2. Advances in Genetic Testing for Hereditary Cancer Syndromes.

    Science.gov (United States)

    Thomas, Ellen; Mohammed, Shehla

    2016-01-01

    The ability to identify genetic mutations causing an increased risk of cancer represents the first widespread example of personalised medicine, in which genetic information is used to inform patients of their cancer risks and direct an appropriate strategy to minimise those risks. Increasingly, an understanding of the genetic basis of many cancers also facilitates selection of the most effective therapeutic options. The technology underlying genetic testing has been revolutionised in the years since the completion of the Human Genome Project in 2001. This has advanced knowledge of the genetic factors underlying familial cancer risk, and has also improved genetic testing capacity allowing a larger number of patients to be tested for a constitutional cancer predisposition. To use these tests safely and effectively, they must be assessed for their ability to provide accurate and useful results, and be requested and interpreted by health professionals with an understanding of their strengths and limitations. Genetic testing is increasing in its scope and ambition with each year that passes, requiring a greater proportion of the healthcare workforce to acquire a working knowledge of genetics and genetic testing to manage their patients safely and sensitively.

  3. Automated sequence analysis and editing software for HIV drug resistance testing.

    Science.gov (United States)

    Struck, Daniel; Wallis, Carole L; Denisov, Gennady; Lambert, Christine; Servais, Jean-Yves; Viana, Raquel V; Letsoalo, Esrom; Bronze, Michelle; Aitken, Sue C; Schuurman, Rob; Stevens, Wendy; Schmit, Jean Claude; Rinke de Wit, Tobias; Perez Bercoff, Danielle

    2012-05-01

    Access to antiretroviral treatment in resource-limited settings is inevitably paralleled by the emergence of HIV drug resistance. Monitoring treatment efficacy and HIV drug resistance testing are therefore of increasing importance in resource-limited settings. Yet low-cost technologies and procedures suited to the particular context and constraints of such settings are still lacking. The ART-A (Affordable Resistance Testing for Africa) consortium brought together public and private partners to address this issue. The aim was to develop automated sequence analysis and editing software to support high-throughput automated sequencing. The ART-A Software was designed to automatically process and edit ABI chromatograms or FASTA files from HIV-1 isolates. The ART-A Software performs the basecalling, assigns quality values, aligns query sequences against a set reference, infers a consensus sequence, identifies the HIV type and subtype, translates the nucleotide sequence to amino acids, and reports insertions/deletions, premature stop codons, ambiguities and mixed calls. The results can be automatically exported to Excel to identify mutations. Automated analysis was compared to manual analysis using a panel of 1624 PR-RT sequences generated in 3 different laboratories. Discrepancies between manual and automated sequence analysis were 0.69% at the nucleotide level and 0.57% at the amino acid level (668,047 AA analyzed), and discordances at major resistance mutations were recorded in 62 cases (4.83% of differences, 0.04% of all AA) for PR and 171 cases (6.18% of differences, 0.03% of all AA) for RT. The ART-A Software is a time-sparing tool for pre-analyzing HIV and viral quasispecies sequences in high-throughput laboratories and for highlighting positions requiring attention.
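
    As a rough illustration of one step the record describes, translating a nucleotide sequence to amino acids and flagging premature stop codons and ambiguities, the following sketch uses the standard genetic code only; it is not the ART-A implementation, and the toy sequence is made up.

        # Illustrative sketch: translate a nucleotide sequence and flag premature stop codons.
        # Not the ART-A implementation; standard genetic code, minimal ambiguity handling.

        BASES = "TCAG"
        AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
        CODON_TABLE = {a + b + c: AMINO_ACIDS[16 * i + 4 * j + k]
                       for i, a in enumerate(BASES)
                       for j, b in enumerate(BASES)
                       for k, c in enumerate(BASES)}

        def translate(nt_seq):
            """Return (protein, warnings): '*' marks stops, 'X' marks codons with ambiguous bases."""
            nt_seq = nt_seq.upper().replace("U", "T")
            protein, warnings = [], []
            for pos in range(0, len(nt_seq) - 2, 3):
                codon = nt_seq[pos:pos + 3]
                aa = CODON_TABLE.get(codon, "X")          # 'X' for codons containing N or gaps
                if aa == "*" and pos + 3 < len(nt_seq):
                    warnings.append(f"premature stop codon at nt position {pos + 1}")
                if aa == "X":
                    warnings.append(f"ambiguous codon {codon} at nt position {pos + 1}")
                protein.append(aa)
            return "".join(protein), warnings

        if __name__ == "__main__":
            seq = "ATGGCTAAATAGGGCCGN"   # toy sequence with an internal stop and an ambiguous codon
            prot, warn = translate(seq)
            print(prot)
            for w in warn:
                print("WARNING:", w)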

  4. Test Protocols for Advanced Inverter Interoperability Functions – Main Document

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already
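
    By way of illustration only (the specific function, curve points, and tolerance below are assumptions, not quoted from the SNL protocols), a verification step for a volt-var style grid support function can be reduced to checking that measured reactive power tracks a commanded piecewise-linear curve within a tolerance band:

        import numpy as np

        def expected_var(voltage_pu, curve_points):
            """Interpolate the commanded volt-var curve (piecewise linear) at a test voltage."""
            volts, vars_pu = zip(*curve_points)
            return np.interp(voltage_pu, volts, vars_pu)

        def check_volt_var(measurements, curve_points, tolerance_pu=0.05):
            """Return per test point: (voltage, measured_var, expected_var, pass/fail)."""
            results = []
            for v_pu, q_meas in measurements:
                q_exp = expected_var(v_pu, curve_points)
                results.append((v_pu, q_meas, q_exp, abs(q_meas - q_exp) <= tolerance_pu))
            return results

        if __name__ == "__main__":
            # Hypothetical commanded curve: full capacitive vars below 0.95 pu, zero in the
            # deadband, full inductive vars above 1.05 pu (per unit of rated reactive power).
            curve = [(0.90, 1.0), (0.95, 0.0), (1.05, 0.0), (1.10, -1.0)]
            measured = [(0.92, 0.57), (1.00, 0.01), (1.08, -0.62)]   # (voltage pu, measured Q pu)
            for v, qm, qe, ok in check_volt_var(measured, curve):
                print(f"V={v:.2f} pu  Q_meas={qm:+.2f}  Q_exp={qe:+.2f}  {'PASS' if ok else 'FAIL'}")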

  5. Test Protocols for Advanced Inverter Interoperability Functions - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ralph, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as

  6. Advanced E-O test capability for Army Next-Generation Automated Test System (NGATS)

    Science.gov (United States)

    Errea, S.; Grigor, J.; King, D. F.; Matis, G.; McHugh, S.; McKechnie, J.; Nehring, B.

    2015-05-01

    The Future E-O (FEO) program was established to develop a flexible, modular, automated test capability as part of the Next Generation Automatic Test System (NGATS) program to support the test and diagnostic needs of currently fielded U.S. Army electro-optical (E-O) devices, as well as being expandable to address the requirements of future Navy, Marine Corps and Air Force E-O systems. Santa Barbara Infrared (SBIR) has designed, fabricated, and delivered three (3) FEO prototypes for engineering and logistics evaluation prior to anticipated full-scale production beginning in 2016. In addition to presenting a detailed overview of the FEO system hardware design, features and testing capabilities, the integration of SBIR's EO-IR sensor and laser test software package, IRWindows 4™, into FEO to automate the test execution, data collection and analysis, archiving and reporting of results is also described.

  7. Software design of an auto-testing system for CAMAC modules

    International Nuclear Information System (INIS)

    Zhang Hao; Xu Jiajun

    1999-01-01

    The author introduces one of the software methods for providing a PC graphic interface when an MS-DOS driver is the only one available for the device. A human-machine interactive graphic interface is more popular nowadays since it offers rich pages and convenient input-output operations. The first step is to set up a data exchange between MS-DOS and MS-Windows through an interrupt service. The second step is to program a dynamic link library that VB can invoke. A VBX control can be used to extend the system functions. The testing system can automatically test the main performance of the IDIM, IDOM, PSC, SAM, and 3016 CAMAC modules. It appears to be the better way to test linearity, AC correction and so on. The testing system has proved usable in maintaining CAMAC modules of the BEPC control system

  8. Software design of an auto-testing system for CAMAC modules

    International Nuclear Information System (INIS)

    Zhang Hao; Xu Jiajun

    1997-01-01

    The author introduces one of the software methods for providing a PC graphic interface when an MS-DOS driver is the only one available for the device. A human-machine interactive graphic interface is more popular nowadays since it offers rich pages and convenient input/output operations. The first step is to set up a data exchange between MS-DOS and MS-Windows through an interrupt service. The second step is to program a dynamic link library that VB can invoke. A VBX control can be used to extend the system functions. The testing system can automatically test the main performance of the IDIM, IDOM, PSC, SAM, and 3016 CAMAC modules. It appears to be the better way to test linearity, AC correction and so on. The testing system has proved usable in maintaining CAMAC modules of the BEPC control system

  9. Quality Assurance Protocol for AFCI Advanced Structural Materials Testing

    International Nuclear Information System (INIS)

    Busby, Jeremy T.

    2009-01-01

    The objective of this letter is to inform you of recent progress on the development of advanced structural materials in support of advanced fast reactors and AFCI. As you know, the alloy development effort has been initiated in recent months with the procurement of adequate quantities of the NF616 and HT-UPS alloys. As the test alloys become available in the coming days, mechanical testing, evaluation of optimizing treatments, and screening of environmental effects will be possible at a larger scale. It is therefore important to establish proper quality assurance protocols for this testing effort in a timely manner to ensure high technical quality throughout testing. A properly implemented quality assurance effort will also enable preliminary data taken in this effort to be qualified as NQA-1 during any subsequent licensing discussions for an advanced design or actual prototype. The objective of this report is to describe the quality assurance protocols that will be used for this effort. An essential first step in evaluating quality protocols is assessing the end use of the data. Currently, the advanced structural materials effort is part of a long-range, basic research and development effort and not, as yet, involved in licensing discussions for a specific reactor design. After consultation with Mark Vance (an ORNL QA expert) and based on the recently-issued AFCI QA requirements, the application of NQA-1 quality requirements will follow the guidance provided in Part IV, Subpart 4.2 of the NQA-1 standard (Guidance on Graded Application of QA for Nuclear-Related Research and Development). This guidance mandates the application of sound scientific methodology and a robust peer review process in all phases, allowing for the data to be qualified for use even if the programmatic mission changes to include licensing discussions of a specific design or prototype. ORNL has previously implemented a QA program dedicated to GNEP activities and based on an appropriately graded

  10. Quality Assurance Protocol for AFCI Advanced Structural Materials Testing

    Energy Technology Data Exchange (ETDEWEB)

    Busby, Jeremy T [ORNL

    2009-05-01

    The objective of this letter is to inform you of recent progress on the development of advanced structural materials in support of advanced fast reactors and AFCI. As you know, the alloy development effort has been initiated in recent months with the procurement of adequate quantities of the NF616 and HT-UPS alloys. As the test alloys become available in the coming days, mechanical testing, evaluation of optimizing treatments, and screening of environmental effects will be possible at a larger scale. It is therefore important to establish proper quality assurance protocols for this testing effort in a timely manner to ensure high technical quality throughout testing. A properly implemented quality assurance effort will also enable preliminary data taken in this effort to be qualified as NQA-1 during any subsequent licensing discussions for an advanced design or actual prototype. The objective of this report is to describe the quality assurance protocols that will be used for this effort. An essential first step in evaluating quality protocols is assessing the end use of the data. Currently, the advanced structural materials effort is part of a long-range, basic research and development effort and not, as yet, involved in licensing discussions for a specific reactor design. After consultation with Mark Vance (an ORNL QA expert) and based on the recently-issued AFCI QA requirements, the application of NQA-1 quality requirements will follow the guidance provided in Part IV, Subpart 4.2 of the NQA-1 standard (Guidance on Graded Application of QA for Nuclear-Related Research and Development). This guidance mandates the application of sound scientific methodology and a robust peer review process in all phases, allowing for the data to be qualified for use even if the programmatic mission changes to include licensing discussions of a specific design or prototype. ORNL has previously implemented a QA program dedicated to GNEP activities and based on an appropriately graded

  11. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report consists of an overview of types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. Particularly with regard to safety assessments for nuclear power plants, many HRA methods have been developed. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the utilization of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of Expert System Technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.)
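
    As a purely illustrative sketch of the kind of calculation such HRA software automates (the adjustment formula and multiplier values below are generic assumptions in the spirit of multiplier-based HRA methods, not taken from the cited report), a nominal human error probability can be scaled by performance shaping factor multipliers:

        # Illustrative only: scale a nominal human error probability (HEP) by performance
        # shaping factor (PSF) multipliers. All values are assumptions, not from the report.

        NOMINAL_HEP = 1e-3   # assumed nominal probability for a routine post-diagnosis action

        # Hypothetical PSF multipliers (1.0 = nominal conditions)
        PSF_MULTIPLIERS = {
            "available_time": 1.0,
            "stress": 2.0,          # high stress
            "complexity": 2.0,      # moderately complex task
            "ergonomics": 1.0,
            "experience": 0.5,      # well-trained crew
        }

        def adjusted_hep(nominal, multipliers):
            """Multiply the nominal HEP by all PSF multipliers, capping the result at 1.0."""
            hep = nominal
            for factor in multipliers.values():
                hep *= factor
            return min(hep, 1.0)

        if __name__ == "__main__":
            hep = adjusted_hep(NOMINAL_HEP, PSF_MULTIPLIERS)
            print(f"adjusted HEP = {hep:.2e}")   # 1e-3 * 2 * 2 * 0.5 = 2e-3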

  12. Data collection and analysis software development for rotor dynamics testing in spin laboratory

    Science.gov (United States)

    Abdul-Aziz, Ali; Arble, Daniel; Woike, Mark

    2017-04-01

    Gas turbine engine components undergo high rotational loading and other complex environmental conditions. Such an operating environment can lead these components to develop damage and cracks that can cause catastrophic failure during flight. The traditional crack detection and health monitoring methodologies currently in use rely on periodic routine maintenance and nondestructive inspections that often involve engine and component disassembly. These methods also do not offer adequate information about the faults, especially if the faults are subsurface or not clearly evident. At NASA Glenn Research Center, the rotor dynamics laboratory is presently developing newer techniques that rely heavily on sensor technology to enable health monitoring and prediction of damage and cracks in rotor disks. These approaches are noninvasive and relatively economical. Spin tests are performed using a subscale test article mimicking a turbine rotor disk undergoing rotational load. Non-contact instruments such as capacitive and microwave sensors are used to measure the blade tip gap displacement and blade vibration characteristics in an attempt to develop a physics-based model to assess and predict faults in the rotor disk. Data collection is a major component of this experimental-analytical procedure, and as a result, an upgrade to an older version of the data acquisition software, which is based on the LabVIEW program, has been implemented to support efficient test running and analysis of the results. Outcomes obtained from the test data and related experimental and analytical rotor dynamics modeling, including key features of the updated software, are presented and discussed.
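
    As an illustration of the kind of processing such data collection software feeds (this sketch is generic and is not the LabVIEW code described above; the sampling rate, shaft speed, channel layout, and synthetic data are assumptions), blade tip clearance samples can be reduced to per-revolution statistics and a vibration spectrum:

        import numpy as np

        def tip_clearance_stats(gap_samples, samples_per_rev):
            """Mean and minimum blade tip gap per revolution from a capacitive-probe channel."""
            n_revs = len(gap_samples) // samples_per_rev
            revs = np.asarray(gap_samples[:n_revs * samples_per_rev]).reshape(n_revs, samples_per_rev)
            return revs.mean(axis=1), revs.min(axis=1)

        def vibration_spectrum(signal, sample_rate_hz):
            """One-sided amplitude spectrum of a blade vibration channel."""
            signal = np.asarray(signal) - np.mean(signal)
            amp = np.abs(np.fft.rfft(signal)) * 2.0 / len(signal)
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
            return freqs, amp

        if __name__ == "__main__":
            # Synthetic data: 200 Hz blade resonance riding on a nominal 2.0 mm tip gap
            fs, rpm = 10_000, 6_000                       # assumed sample rate (Hz) and shaft speed
            t = np.arange(0, 1.0, 1.0 / fs)
            gap = 2.0 + 0.05 * np.sin(2 * np.pi * 200 * t) + 0.01 * np.random.randn(t.size)
            samples_per_rev = int(fs / (rpm / 60.0))
            mean_gap, min_gap = tip_clearance_stats(gap, samples_per_rev)
            freqs, amp = vibration_spectrum(gap, fs)
            print(f"revolutions analyzed: {mean_gap.size}, worst-case gap: {min_gap.min():.3f} mm")
            print(f"dominant vibration component: {freqs[np.argmax(amp[1:]) + 1]:.0f} Hz")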

  13. Testing, installation and development of hardware and software components for the forward pixel detector of CMS

    CERN Document Server

    Florez Bustos, Carlos Andres

    2007-01-01

    The LHC (Large Hadron Collider) will be the particle accelerator with the highest collision energy ever. CMS (Compact Muon Solenoid) is one of the two largest experiments at the LHC. A main goal of CMS is to elucidate the electroweak symmetry breaking and determine if the Higgs mechanism is responsible for it. The pixel detector in CMS is the closest detector to the interaction point and is part of the tracker system. This thesis presents four different projects related to the forward pixel detector, performed as part of the testing and development of its hardware and software components. It presents the methods, implementation and results for the data acquisition and installation of the detector control system at the Meson Test Beam Facility of Fermilab for the beam test of the detector; the study of the CAEN power supply and the multi-service cable; the layout of the test stands for the assembly of the half-disk and half-service cylinder; and the development of a software interface to the data acquisition...

  14. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
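
    The "statistical disproportionality data mining signal scores" mentioned above are commonly computed from a 2x2 contingency table of drug-event co-occurrence counts; the following sketch shows a proportional reporting ratio (PRR) as one generic example (the formula choice and the counts are illustrative and are not the tool's actual scoring method or data):

        # Illustrative disproportionality score: proportional reporting ratio (PRR) from a
        # 2x2 table of counts. Counts below are made up; this is not the cited tool's code.

        def prr(a, b, c, d):
            """PRR = [a / (a + b)] / [c / (c + d)]
            a: reports with drug and event        b: reports with drug, without event
            c: reports without drug, with event   d: reports without drug, without event
            """
            drug_rate = a / (a + b)
            other_rate = c / (c + d)
            return drug_rate / other_rate

        if __name__ == "__main__":
            # Hypothetical counts of citations mentioning a drug-event pair
            a, b, c, d = 40, 960, 200, 48_800
            score = prr(a, b, c, d)
            print(f"PRR = {score:.2f}")     # values well above 1 suggest a possible signal
            # 40/1000 = 0.04 vs 200/49000 = approx. 0.0041  ->  PRR of roughly 9.8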

  15. Safety Assurance for Irradiating Experiments in the Advanced Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    T. A. Tomberlin; S. B. Grover

    2004-11-01

    The Advanced Test Reactor (ATR), located at the Idaho National Engineering and Environmental Laboratory (INEEL), was specifically designed to provide a high neutron flux test environment for conducting a variety of experiments. This paper addresses the safety assurance process for two general types of experiments conducted in the ATR facility and how the safety analyses for experiments are related to the ATR safety basis. One type of experiment is more routine and generally represents greater risks; therefore, this type of experiment is addressed in more detail in the ATR safety basis. This allows the individual safety analysis for this type of experiment to be more standardized. The second type of experiment is defined in more general terms in the ATR safety basis and is permitted under more general controls. Therefore, the individual safety analysis for the second type of experiment tends to be more unique and is tailored to each experiment.

  16. Safety Assurance for Irradiating Experiments in the Advanced Test Reactor

    International Nuclear Information System (INIS)

    T. A. Tomberlin; S. B. Grover

    2004-01-01

    The Advanced Test Reactor (ATR), located at the Idaho National Engineering and Environmental Laboratory (INEEL), was specifically designed to provide a high neutron flux test environment for conducting a variety of experiments. This paper addresses the safety assurance process for two general types of experiments conducted in the ATR facility and how the safety analyses for experiments are related to the ATR safety basis. One type of experiment is more routine and generally represents greater risks; therefore, this type of experiment is addressed in more detail in the ATR safety basis. This allows the individual safety analysis for this type of experiment to be more standardized. The second type of experiment is defined in more general terms in the ATR safety basis and is permitted under more general controls. Therefore, the individual safety analysis for the second type of experiment tends to be more unique and is tailored to each experiment

  17. Advances in p-Value Based Multiple Test Procedures.

    Science.gov (United States)

    Tamhane, Ajit C; Gou, Jiangtao

    2018-01-01

    In this article we review recent advances in p-value-based multiple test procedures (MTPs). We begin with a brief review of the basic tests of Bonferroni and Simes. Standard stepwise MTPs derived from them using the closure method of Marcus et al. (1976) are discussed next. They include the well-known MTPs of Holm (1979), Hochberg (1988) and Hommel (1988), and their extensions and improvements. This is followed by stepwise MTPs for a priori ordered hypotheses. Next we present gatekeeping MTPs (Dmitrienko and Tamhane, 2007) for hierarchically ordered families of hypotheses with logical relations among them. Finally, we give a brief review of the graphical approach (Bretz et al., 2009) to constructing and visualizing gatekeeping and other MTPs. Simple numerical examples are given to illustrate the various procedures.
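
    The stepwise procedures named in the record, for example Holm's step-down and Hochberg's step-up methods, can be sketched in a few lines; this is a generic textbook-style implementation, not code from the cited review, and the alpha level and p-values in the example are arbitrary.

        import numpy as np

        def holm(p_values, alpha=0.05):
            """Holm step-down procedure: returns a boolean rejection decision per hypothesis."""
            p = np.asarray(p_values, dtype=float)
            m = p.size
            order = np.argsort(p)
            reject = np.zeros(m, dtype=bool)
            for rank, idx in enumerate(order):           # rank 0 is the smallest p-value
                if p[idx] <= alpha / (m - rank):
                    reject[idx] = True
                else:
                    break                                # stop at the first non-rejection
            return reject

        def hochberg(p_values, alpha=0.05):
            """Hochberg step-up procedure: rejects all hypotheses up to the largest passing rank."""
            p = np.asarray(p_values, dtype=float)
            m = p.size
            order = np.argsort(p)
            reject = np.zeros(m, dtype=bool)
            for rank in range(m - 1, -1, -1):            # start from the largest p-value
                if p[order[rank]] <= alpha / (m - rank):
                    reject[order[:rank + 1]] = True      # reject this and all smaller p-values
                    break
            return reject

        if __name__ == "__main__":
            pvals = [0.010, 0.020, 0.021, 0.200]
            print("Holm:    ", holm(pvals))        # rejects only the smallest p-value here
            print("Hochberg:", hochberg(pvals))    # rejects the three smallest p-values here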

  18. Testing of Alternative Materials for Advanced Suit Bladders

    Science.gov (United States)

    Bue, Grant; Orndoff, Evelyne; Makinen, Janice; Tang, Henry

    2011-01-01

    Several candidate advanced pressure bladder membrane materials have been developed for NASA Johnson Space Center by DSM Biomedical for selective permeability of carbon dioxide and water vapor. These materials were elasthane and two other formulations of thermoplastic polyether polyurethane. Each material was tested in two thicknesses for permeability to carbon dioxide, oxygen and water vapor. Although oxygen leaks through the suit bladder would amount to only about 60 cc/hr in a full-size suit, the amount of carbon dioxide rejected by the system would not be significant enough to justify its use. While the ratio of carbon dioxide to oxygen permeability is about 48 to 1, this is offset by the small partial pressure of carbon dioxide in acceptable breathing atmospheres of the suit. Humidity management remains a possible use of the membranes, depending on the degree to which the water permeability is inhibited by cations in the sweat. Tests are underway to explore cation fouling from sweat.

  19. MCNP full-core modeling of the advanced test reactor

    International Nuclear Information System (INIS)

    Kim, S.S.; Jahshan, S.N.; Nielson, R.B.

    1993-01-01

    A full-core Monte Carlo neutron and photon (MCNP) transport model has been completed for the advanced test reactor (ATR) at Idaho National Engineering Laboratory. This new model is a complete three-dimensional model that represents fuel elements, core structures, and target regions in adequate detail. The model can be used in evaluating heating and reaction rates in various target regions of the core. This model is especially useful in physics analysis of an asymmetric experiment loading in the core. The ATR is a light-water-cooled thermal reactor with aluminum/uranium-aluminide fuel plates grouped in arcuate fuel elements that form a serpentine arrangement, as shown in Fig. 1. The core is surrounded by a beryllium reflector. Nine test loops are nestled in the lobes of the serpentine core, and numerous other irradiation holes with varying dimensions and radiation environments are located in the reflector and in the core interior

  20. Performance Evaluation and Robustness Testing of Advanced Oscilloscope Triggering Schemes

    Directory of Open Access Journals (Sweden)

    Shakeb A. KHAN

    2010-01-01

    In this paper, the performance and robustness of two advanced oscilloscope triggering schemes are evaluated. The problem of time-period measurement of complex waveforms can be solved using algorithms that utilize an associative memory network based weighted Hamming distance (Whd) and autocorrelation-based techniques. The robustness of both advanced techniques is then evaluated by the simulated addition of random noise of different levels to complex test signal waveforms, and the minimum value of Whd (Whdmin) and the peak value of the coefficient of correlation (COCmax) are computed over 10000 cycles of the selected test waveforms. The distance between the mean of the second-lowest values of Whd and Whdmin, and the distance between the second-highest value of the coefficient of correlation (COC) and COCmax, are used as parameters to analyze the robustness of the considered techniques. From the results, it is found that both techniques are capable of producing trigger pulses efficiently, but the correlation-based technique is found to be better from the robustness point of view.
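
    As a generic illustration of the two trigger criteria being compared (this is not the authors' implementation; the 1-bit quantization step, window length, waveform, and noise level are assumptions), a reference cycle can be slid along the incoming waveform and a trigger declared where the weighted Hamming distance is minimal or the correlation coefficient is maximal:

        import numpy as np

        def binarize(x):
            """1-bit quantization of a waveform about its mean (an assumed preprocessing step)."""
            return (x > np.mean(x)).astype(int)

        def whd_trigger(signal, template, weights=None):
            """Position and value of the minimum weighted Hamming distance to the template."""
            sig_bits, tmpl_bits = binarize(signal), binarize(template)
            n = len(tmpl_bits)
            if weights is None:
                weights = np.ones(n)
            distances = [np.sum(weights * (sig_bits[i:i + n] != tmpl_bits))
                         for i in range(len(sig_bits) - n + 1)]
            return int(np.argmin(distances)), min(distances)       # position of Whd_min

        def correlation_trigger(signal, template):
            """Position and value of the maximum correlation coefficient with the template."""
            n = len(template)
            cocs = [np.corrcoef(signal[i:i + n], template)[0, 1]
                    for i in range(len(signal) - n + 1)]
            return int(np.argmax(cocs)), max(cocs)                  # position of COC_max

        if __name__ == "__main__":
            t = np.linspace(0, 4, 800, endpoint=False)
            clean = np.sin(2 * np.pi * t) + 0.4 * np.sin(6 * np.pi * t)   # complex periodic waveform
            template = clean[:200]                                        # one reference cycle
            noisy = clean + 0.2 * np.random.randn(clean.size)             # assumed noise level
            print("Whd trigger at sample", whd_trigger(noisy, template)[0])
            print("COC trigger at sample", correlation_trigger(noisy, template)[0])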