WorldWideScience

Sample records for performance verification program

  1. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  2. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or "round robin" program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  3. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities depends in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  4. Performance characteristics of an independent dose verification program for helical tomotherapy

    Directory of Open Access Journals (Sweden)

    Isaac C. F. Chang

    2017-01-01

    Helical tomotherapy, with its advanced method of intensity-modulated radiation therapy delivery, has been used clinically for over 20 years. The standard delivery quality assurance procedure, which measures the accuracy of the delivered radiation dose from each treatment plan on a phantom, is time-consuming. RadCalc®, a radiotherapy dose verification software package, has released a module for tomotherapy plan dose calculation specifically for beta testing. RadCalc®'s accuracy for tomotherapy dose calculations was evaluated by examining point doses in ten lung and ten prostate clinical plans. Doses calculated by the TomoHDA™ tomotherapy treatment planning system were used as the baseline. For lung cases, RadCalc® overestimated point doses in the lung by an average of 13%. Doses within the spinal cord and esophagus were overestimated by 10%. Prostate plans showed better agreement, with overestimations of 6% in the prostate, bladder, and rectum. The systematic overestimation likely resulted from limitations of the pencil-beam dose calculation algorithm implemented by RadCalc®. These limitations were more severe in areas of greater inhomogeneity and less prominent in homogeneous regions with densities close to 1 g/cm³. Recommendations for RadCalc® dose calculation algorithms and anatomical representation were provided based on the results of the study.
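    The kind of point-dose comparison described can be sketched as a simple relative-deviation calculation. The dose values below are hypothetical and are not taken from the study:

```python
def percent_overestimate(independent, baseline):
    """Relative deviation of an independent point-dose check from the
    treatment planning system baseline, in percent."""
    return 100.0 * (independent - baseline) / baseline

# Hypothetical point doses in Gy (TPS baseline vs. independent recalculation)
baseline = {"lung": 2.00, "spinal cord": 1.50, "prostate": 2.00}
independent = {"lung": 2.26, "spinal cord": 1.65, "prostate": 2.12}

for site in baseline:
    dev = percent_overestimate(independent[site], baseline[site])
    print(f"{site}: {dev:+.1f}%")
```

    A systematic positive deviation across many plans, as reported here, points at the dose engine rather than at individual plans.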

  5. Performance Testing of Homeland Security Technologies in U.S. EPA's Environmental Technology Verification (ETV) Program

    National Research Council Canada - National Science Library

    Kelly, Thomas J; Hofacre, Kent C; Derringer, Tricia L; Riggs, Karen B; Koglin, Eric N

    2004-01-01

    (Reports and test plans are available at www.epa.gov/etv.) In the aftermath of the terrorist attacks of September 11, 2001, the ETV approach has also been employed in performance tests of technologies relevant to homeland security (HS...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: GREEN BUILDING TECHNOLOGIES

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  7. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek, Bohumir; Roussel, Jean-Marc; Kwiatkowska, Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...
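    Because each rung of a ladder program is a Boolean next-state function, a small rung model can be checked exhaustively against a requirement. The start/stop latch below is a textbook example sketched in Python, not the control program verified in the paper:

```python
from itertools import product

def rung(start, stop, motor):
    # Classic start/stop latch rung: MOTOR := (START OR MOTOR) AND NOT STOP
    return (start or motor) and not stop

# Requirement: whenever STOP is asserted, the next motor state is OFF.
# The state space is tiny, so every input combination can be enumerated.
violations = [combo for combo in product([False, True], repeat=3)
              if combo[1] and rung(*combo)]
print(violations)  # an empty list means the requirement holds for this rung
```

    Real verifiers build a transition system over all rungs and scan cycles; the exhaustive-enumeration idea is the same, applied with model-checking techniques to a much larger state space.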

  8. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level, and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system

  9. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  10. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  11. Optimized periodic verification testing blended risk and performance-based MOV inservice test program: an application of ASME Code Case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.; and others

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  12. Optimized periodic verification testing blended risk and performance-based MOV inservice test program: an application of ASME Code Case OMN-1

    International Nuclear Information System (INIS)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-01-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising

  13. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.
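    The verification approach described, comparing a code's thermal calculation against an independent reference, can be illustrated on a far smaller problem than FRAPCON's. The sketch below solves 1-D steady conduction in a heat-generating slab by finite differences and checks the result against the analytic solution; the material values are arbitrary, not fuel data:

```python
def solve_tridiag(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal linear system."""
    n = len(rhs)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        denom = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / denom
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1-D slab with uniform heat generation q and fixed surface temperature Ts:
#   -k * T''(x) = q,   T(0) = T(L) = Ts      (illustrative values)
k, q, L, Ts, n = 3.0, 2.0e7, 0.01, 600.0, 51
h = L / (n - 1)
m = n - 2                        # interior unknowns; boundaries are Dirichlet
sub = [1.0] * m
diag = [-2.0] * m
sup = [1.0] * m
rhs = [-q * h * h / k] * m
rhs[0] -= Ts                     # fold the known boundary temperatures
rhs[-1] -= Ts                    # into the right-hand side
T = [Ts] + solve_tridiag(sub, diag, sup, rhs) + [Ts]

# Independent reference: the exact solution T(x) = Ts + q*x*(L - x)/(2k)
T_exact = [Ts + q * (i * h) * (L - i * h) / (2 * k) for i in range(n)]
max_err = max(abs(a - b) for a, b in zip(T, T_exact))
print(max_err)  # central differences reproduce this quadratic profile exactly
```

    Code-to-code comparisons like FRAPCON vs. ABAQUS follow the same pattern, with the commercial code playing the role of the reference solution.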

  14. Verification of business rules programs

    CERN Document Server

    Silva, Bruno Berstel-Da

    2013-01-01

    Rules represent a simplified means of programming, congruent with our understanding of human brain constructs. With the advent of business rules management systems, it has been possible to introduce rule-based programming to nonprogrammers, allowing them to map expert intent into code in applications such as fraud detection, financial transactions, healthcare, retail, and marketing. However, a remaining concern is the quality, safety, and reliability of the resulting programs.  This book is on business rules programs, that is, rule programs as handled in business rules management systems. Its

  15. Runtime Verification of C Programs

    Science.gov (United States)

    Havelund, Klaus

    2008-01-01

    We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (non-graphical) format in files separate from the program. The state machine language has been inspired by RCAT, a graphical state machine language recently developed at the Jet Propulsion Laboratory as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over them. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to AspectJ's or AspectC's. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCaml, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
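    The core idea of monitoring an execution trace against a state machine can be sketched in a few lines. The monitor below is illustrative Python, not RMOR's textual syntax or its C instrumentation, and the acquire/use/release protocol is a made-up requirement:

```python
class Monitor:
    """Trace monitor driven by an explicit state machine."""
    def __init__(self, transitions, initial, error_states):
        self.transitions = transitions        # (state, event) -> next state
        self.state = initial
        self.error_states = set(error_states)

    def emit(self, event):
        # Unlisted (state, event) pairs leave the state unchanged.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state not in self.error_states

# Made-up requirement: a resource must be acquired before it is used.
transitions = {
    ("idle", "acquire"): "held",
    ("held", "release"): "idle",
    ("held", "use"): "held",
    ("idle", "use"): "error",   # use without acquire violates the property
}
good = Monitor(transitions, "idle", ["error"])
ok = all(good.emit(e) for e in ["acquire", "use", "release"])
bad = Monitor(transitions, "idle", ["error"])
ko = all(bad.emit(e) for e in ["use", "acquire"])
print(ok, ko)  # True False
```

    In an aspect-oriented setup, the `emit` calls are not written by hand; pointcuts weave them into the program at the code fragments bound to each abstract event.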

  16. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
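    A simplified version of the Cone of Influence reduction can be sketched as reverse reachability over the variable dependency relation: only variables that can influence the property need to stay in the model. The variable names below are invented for illustration:

```python
def cone_of_influence(deps, prop_vars):
    """Variables that can affect the property: reverse reachability over
    the next-state dependency relation (a simplified COI reduction)."""
    cone = set(prop_vars)
    frontier = list(prop_vars)
    while frontier:
        v = frontier.pop()
        for u in deps.get(v, ()):
            if u not in cone:
                cone.add(u)
                frontier.append(u)
    return cone

# The next-state value of each variable depends on the listed variables.
deps = {
    "alarm": ["sensor", "mode"],
    "sensor": ["input"],
    "lamp": ["timer"],    # unrelated to the property below
    "timer": ["clock"],
}
print(sorted(cone_of_influence(deps, ["alarm"])))
# lamp, timer and clock are sliced away before model checking
```

    Dropping the variables outside the cone shrinks the state space without changing the truth value of properties over `alarm`, which is why such reductions can speed up checking by orders of magnitude.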

  17. CIT photoheliograph functional verification unit test program

    Science.gov (United States)

    1973-01-01

    Tests of the 2/3-meter photoheliograph functional verification unit (FVU) were performed with the FVU installed in its Big Bear Solar Observatory vacuum chamber. Interferometric tests were run both in the Newtonian (f/3.85) and Gregorian (f/50) configurations. Tests were run in both configurations with the optical axis horizontal, vertical, and at 45 deg to attempt to determine any gravity effects on the system. Gravity effects, if present, were masked by scatter in the data associated with the system wavefront error of 0.16λ rms (λ = 6328 Å), apparently due to problems in the primary mirror. Tests showed that the redesigned secondary mirror assembly works well.

  18. U.S. ENVIRONMENTAL PROTECTION AGENCY (EPA) ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: ARSENIC MONITORING TECHNOLOGIES

    Science.gov (United States)

    The U.S. Environmental Protection Agency Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This technology ...

  19. U.S. ENVIRONMENTAL PROTECTION AGENCY (EPA) ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: ARSENIC TREATMENT TECHNOLOGIES

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  20. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, semantics-based unfold/fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. This means some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers how the finite-countermodel method for safety verification might be used in Turchin's supercompilation. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of the problems.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: PROTOCOL FOR THE VERIFICATION OF GROUTING MATERIALS FOR INFRASTRUCTURE REHABILITATION AT THE UNIVERSITY OF HOUSTON - CIGMAT

    Science.gov (United States)

    This protocol was developed under the Environmental Protection Agency's Environmental Technology Verification (ETV) Program, and is intended to be used as a guide in preparing laboratory test plans for the purpose of verifying the performance of grouting materials used for infra...

  2. Commitment to COT verification improves patient outcomes and financial performance.

    Science.gov (United States)

    Maggio, Paul M; Brundage, Susan I; Hernandez-Boussard, Tina; Spain, David A

    2009-07-01

    After an unsuccessful American College of Surgeons Committee on Trauma visit, our level I trauma center initiated an improvement program that included (1) hiring new personnel (trauma director and surgeons, nurse coordinator, orthopedic trauma surgeon, and registry staff), (2) correcting deficiencies in trauma quality assurance and process improvement programs, and (3) development of an outreach program. Subsequently, our trauma center had two successful verifications. We examined the longitudinal effects of these efforts on volume, patient outcomes, and finances. The Trauma Registry was used to derive data for all trauma patients evaluated in the emergency department from 2001 to 2007. Clinical data analyzed included number of admissions, interfacility transfers, injury severity scores (ISS), length of stay, and mortality for 2001 to 2007. Financial performance was assessed for fiscal years 2001 to 2007. Data were divided into patients discharged from the emergency department and those admitted to the hospital. Admissions increased 30%, representing a 7.6% annual increase (p = 0.004), mostly due to a nearly fivefold increase in interfacility transfers. Severe trauma patients (ISS >24) increased 106%, and the mortality rate for ISS >24 decreased by 47% to almost half the average of the National Trauma Database. There was a 78% increase in revenue and a sustained increase in hospital profitability. A major hospital commitment to Committee on Trauma verification had several salient outcomes: increased admissions, interfacility transfers, and acuity. Despite more seriously injured patients, there has been a major, sustained reduction in mortality and a trend toward decreased intensive care unit length of stay. This resulted in a substantial increase in contribution to margin (CTM), net profit, and revenues. With a high level of commitment and favorable payer mix, trauma center verification improves outcomes for both patients and the hospital.

  3. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Marques, Eduardo R. B.; Martins, Francisco

    2015-01-01

    We present ParTypes, a type-based methodology for the verification of Message Passing Interface (MPI) programs written in the C programming language. The aim is to statically verify programs against protocol specifications, enforcing properties such as fidelity and absence of deadlocks. We develo...

  4. Verification of Imperative Programs by Constraint Logic Program Transformation

    Directory of Open Access Journals (Sweden)

    Emanuele De Angelis

    2013-09-01

    We present a method for verifying partial correctness properties of imperative programs that manipulate integers and arrays by using techniques based on the transformation of constraint logic programs (CLP). We use CLP as a metalanguage for representing imperative programs, their executions, and their properties. First, we encode the correctness of an imperative program, say prog, as the negation of a predicate 'incorrect' defined by a CLP program T. By construction, 'incorrect' holds in the least model of T if and only if the execution of prog from an initial configuration eventually halts in an error configuration. Then, we apply to program T a sequence of transformations that preserve its least model semantics. These transformations are based on well-known transformation rules, such as unfolding and folding, guided by suitable transformation strategies, such as specialization and generalization. The objective of the transformations is to derive a new CLP program TransfT where the predicate 'incorrect' is defined either by (i) the fact 'incorrect.' (in which case prog is not correct), or by (ii) the empty set of clauses (in which case prog is correct). In the case where we derive a CLP program such that neither (i) nor (ii) holds, we iterate the transformation. Since the problem is undecidable, this process may not terminate. We show through examples that our method can be applied in a rather systematic way, and is amenable to automation by transferring to the field of program verification many techniques developed in the field of program transformation.
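    The key property of the CLP encoding, that 'incorrect' holds exactly when execution reaches an error configuration, can be illustrated with a plain reachability check over program configurations. This sketch is not the transformation-based method itself, which works symbolically on CLP clauses, but it shows the semantics the transformations preserve; the toy program is invented:

```python
def incorrect_reachable(init, step, is_error, limit=10000):
    """'incorrect' holds iff an error configuration is reachable from init.
    Explicit-state search with a bound, since the general problem is
    undecidable on infinite state spaces."""
    seen = {init}
    frontier = [init]
    while frontier and len(seen) < limit:
        cfg = frontier.pop()
        if is_error(cfg):
            return True
        for nxt in step(cfg):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# Toy program over configurations (pc, x):
#   pc 0:  while x < 3: x += 1
#   pc 1:  exit; the asserted property is x <= 3
def step(cfg):
    pc, x = cfg
    if pc == 0:
        return [(0, x + 1)] if x < 3 else [(1, x)]
    return []

print(incorrect_reachable((0, 0), step, lambda c: c[0] == 1 and c[1] > 3))
# False: the error configuration is unreachable, so the toy program is correct
```

    The CLP method reaches the same verdict without enumerating configurations: unfold/fold transformations either expose the fact 'incorrect.' or eliminate all its clauses.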

  5. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g., boolean and numeric constraints, dynamically allocated structures, and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.

  6. Electric and hybrid vehicle self-certification and verification procedures: Market Demonstration Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-03-01

    The process by which a manufacturer of an electric or hybrid vehicle certifies that his vehicle meets the DOE Performance Standards for Demonstration is described. Such certification is required for any vehicles to be purchased under the Market Demonstration Program. It also explains the verification testing process followed by DOE for testing to verify compliance. Finally, the document outlines manufacturer responsibilities and presents procedures for recertification of vehicles that have failed verification testing.

  7. On Construction and Verification of PLC-Programs

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2012-01-01

    We review some methods and approaches to programming discrete problems for Programmable Logic Controllers, using the construction of PLC programs for controlling a code lock as an example. For these approaches we evaluate the usability of the model checking method for the analysis of program correctness with the automatic verification tool Cadence SMV. Some possible PLC-program vulnerabilities arising in a number of these programming approaches are revealed.

  8. Verification and Planning Based on Coinductive Logic Programming

    Science.gov (United States)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations, and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution.
Implementations of co-SLD resolution
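    The co-SLD idea of succeeding when a goal recurs on an ancestor call can be sketched for an "always holds" check over a cyclic (rational) transition graph. This is an illustrative greatest-fixed-point check in Python, not a co-LP implementation:

```python
def always_holds(prop, graph, node, ancestors=frozenset()):
    """Greatest-fixed-point check that prop holds on every state reachable
    from node: a goal that recurs on an ancestor call succeeds (co-SLD
    style) instead of looping forever on the cycle."""
    if node in ancestors:
        return True                   # coinductive hypothesis applies
    if not prop(node):
        return False
    return all(always_holds(prop, graph, nxt, ancestors | {node})
               for nxt in graph[node])

# A rational (finitely representable) infinite behaviour: a -> b -> a -> ...
graph = {"a": ["b"], "b": ["a"]}
print(always_holds(lambda n: n in ("a", "b"), graph, "a"))  # True
```

    An inductive (least-fixed-point) reading of the same definition would never terminate on the cycle; treating recurrence on an ancestor as success is exactly what makes safety checks over infinite behaviours work.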

  9. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H.; and others

    2005-02-15

    No matter how strongly its importance is emphasized, the exact assessment of the absorbed doses administered to patients treated with radiotherapy for diseases such as the recently increasing malignant tumors is the most important factor. In reality, several cases of patients over-exposed during radiotherapy have become serious social issues. In particular, the development of a technology to accurately assess the high doses and high energies generated by radiation generators and irradiation equipment (doses administered in radiotherapy practice are very large, roughly three times higher than lethal doses) is a pressing task. More than fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy. However, the legal and regulatory systems needed to implement a quality assurance program are not sufficiently stipulated, nor are sufficient qualified personnel employed to run quality assurance and control programs for those generators and equipment in the medical facilities. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be established to accurately assess the doses administered to patients and to maintain the continuing performance of the radiotherapy machines and equipment. The QA program and procedures should ensure proper calibration of the machines and equipment and definitively establish the safety of patients in radiotherapy practice. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and accurate measurements, evaluations of the doses delivered to patients, and verification of the performance of the therapy machines and equipment are performed

  10. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  11. Effective verification of confidentiality for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Minh Tri; Stoelinga, Mariëlle Ida Antoinette; Huisman, Marieke

    2014-01-01

    This paper studies how confidentiality properties of multi-threaded programs can be verified efficiently by a combination of newly developed and existing model checking algorithms. In particular, we study the verification of scheduler-specific observational determinism (SSOD), a property that

  12. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  13. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

    Forrest B. Brown

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)
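    A typical way to automate such a regression suite is to compare each new tally against its reference within combined statistical uncertainty, since Monte Carlo results carry standard errors. The tally names and values below are hypothetical, not MCNP5 data:

```python
def agrees_within(new, ref, n_sigma=3.0):
    """Statistical comparison of two Monte Carlo tallies, each given as
    (mean, standard error): they agree if the difference lies within
    n_sigma combined standard deviations."""
    (m1, s1), (m2, s2) = new, ref
    return abs(m1 - m2) <= n_sigma * (s1 ** 2 + s2 ** 2) ** 0.5

# Hypothetical regression suite: reference results vs. new-version results
reference = {"keff": (1.0012, 0.0004), "flux": (3.45e8, 1.2e6)}
candidate = {"keff": (1.0015, 0.0004), "flux": (3.47e8, 1.1e6)}
report = {name: agrees_within(candidate[name], reference[name])
          for name in reference}
print(report)  # {'keff': True, 'flux': True}
```

    Running such a comparison over hundreds of verification problems gives the "same results as before" evidence the abstract describes, while tolerating the statistical noise inherent to Monte Carlo transport.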

  14. MCNP5 development, verification, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Laboratory (United States)]

    2003-07-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  15. Probabilistic Programming : A True Verification Challenge

    NARCIS (Netherlands)

    Katoen, Joost P.; Finkbeiner, Bernd; Pu, Geguang; Zhang, Lijun

    2015-01-01

    Probabilistic programs [6] are sequential programs, written in languages like C, Java, Scala, or ML, with two added constructs: (1) the ability to draw values at random from probability distributions, and (2) the ability to condition values of variables in a program through observations. For a
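    The two constructs named in the abstract, random draws and conditioning through observations, can be illustrated with a minimal rejection-sampling sketch. The example program and all names and numbers here are mine, not from the paper:

```python
import random

def posterior_first_die_is_three(n_samples=20000, seed=1):
    """Toy probabilistic program: draw two dice at random (construct 1),
    observe that their sum is 7 (construct 2), and estimate the posterior
    probability that the first die shows 3."""
    random.seed(seed)
    accepted = []
    while len(accepted) < n_samples:
        d1 = random.randint(1, 6)  # random draw from a distribution
        d2 = random.randint(1, 6)
        if d1 + d2 != 7:           # conditioning: reject any run that
            continue               # violates the observation
        accepted.append(d1)
    return sum(1 for d in accepted if d == 3) / len(accepted)
```

    The exact posterior is 1/6, since given a sum of 7 every value of the first die remains equally likely; verifying that an inference engine converges to such values is part of what makes these programs a verification challenge.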

  16. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

    applications, this implementation forms the basis of a verification technique for imperative programs that perform data-type operations using pointers. To achieve this, the basic logic is extended with layers of language abstractions. Also, a language for expressing data structures and operations along...

  17. Formal Specification and Verification of Concurrent Programs

    Science.gov (United States)

    1993-02-01

    This book describes operating systems via the construction of MINIX, a UNIX look-alike that runs on IBM-PC compatibles. The book contains a complete MINIX manual and a complete listing of its C code.

  18. Constraint-based verification of imperative programs

    OpenAIRE

    Beyene, Tewodros Awgichew

    2011-01-01

    Work presented in the context of the European Master's program in Computational Logic, as the partial requirement for obtaining the Master of Science degree in Computational Logic. The continuous reduction in the cost of computing ever since the first days of computers has resulted in the ubiquity of computing systems today; there is no sphere of life in the daily routine of human beings that is not directly or indirectly influenced by computer systems anymore. But this high reliance ...

  19. VBMC: a formal verification tool for VHDL programs

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-01-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into subparts and implementing each subpart in hardware and/or software as appropriate. With the increasing use of programmable devices like FPGAs, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since functional bugs in such hardware subsystems used in safety critical C and I systems have disastrous consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This paper describes an indigenously developed software tool named VBMC (VHDL Bounded Model Checker) for mathematically proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts the hardware design as a VHDL program file, a functional property in PSL, and a verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)
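    The bounded model checking idea behind a tool like VBMC, unrolling the design for a fixed number of cycles and searching that unrolling for a property violation, can be sketched in miniature. The example design (a saturating counter) and property are hypothetical, and a real checker reduces the search to SAT/SMT rather than enumerating input sequences:

```python
from itertools import product

def bounded_check(init, step, prop, inputs, bound):
    """Explore every input sequence of length <= bound; return the state
    trace of a counterexample violating `prop`, or None if the property
    holds within the bound."""
    for k in range(1, bound + 1):
        for seq in product(inputs, repeat=k):
            state, trace = init, [init]
            for inp in seq:
                state = step(state, inp)
                trace.append(state)
                if not prop(state):
                    return trace  # counterexample found within the bound
    return None

# 2-bit saturating counter; safety property "state < 3" fails within 3 cycles
cex = bounded_check(init=0,
                    step=lambda s, i: min(s + i, 3),
                    prop=lambda s: s < 3,
                    inputs=(0, 1),
                    bound=3)
# cex == [0, 1, 2, 3]: three consecutive 1-inputs drive the counter to 3
```

    As in VBMC, a "pass" here only proves the property up to the chosen bound; raising the bound enlarges the guarantee at exponential cost.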

  20. VBMC: a formal verification tool for VHDL program

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-08-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into sub-parts and implementing each sub-part in hardware and/or software as appropriate. With increasing use of programmable devices like FPGA, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since the functional bugs in such hardware subsystems used in safety critical C and I systems have serious consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This report describes the design of a software tool named VBMC (VHDL Bounded Model Checker). The capability of this tool is in proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts design as a VHDL program file, functional property in PSL, and verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason of violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)

  1. Data verification and evaluation techniques for groundwater monitoring programs

    International Nuclear Information System (INIS)

    Mercier, T.M.; Turner, R.R.

    1990-12-01

    To ensure that data resulting from groundwater monitoring programs are of the quality required to fulfill program objectives, it is suggested that a program of data verification and evaluation be implemented. These procedures are meant to supplement and support the existing laboratory quality control/quality assurance programs by identifying aberrant data resulting from a variety of unforeseen circumstances: sampling problems, data transformations in the lab, data input at the lab, data transfer, and end-user data input. Using common-sense principles, pattern recognition techniques, and hydrogeological principles, a computer program was written that scans the data for suspected abnormalities and produces a text file stating sample identifiers, the suspect data, and a statement of how the data have departed from the expected. The techniques described in this paper have been developed to support the Y-12 Plant Groundwater Protection Program Management Plan.
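    The kind of common-sense screening the abstract describes can be sketched with two rules: a plausibility-range check and a jump check against the previous result from the same well. The rule parameters, field names, and sample data below are illustrative assumptions, not the Y-12 program's actual criteria:

```python
def screen(samples, limits, jump_factor=5.0):
    """samples: (well_id, analyte, value) tuples in time order.
    Flags values outside plausibility limits, or values jumping more than
    jump_factor times the previous result for the same well/analyte."""
    flags, last = [], {}
    for well, analyte, value in samples:
        lo, hi = limits[analyte]
        if not lo <= value <= hi:
            flags.append(f"{well} {analyte}={value}: outside [{lo}, {hi}]")
        prev = last.get((well, analyte))
        if prev and value > jump_factor * prev:
            flags.append(f"{well} {analyte}={value}: >{jump_factor}x previous ({prev})")
        last[(well, analyte)] = value
    return flags

flags = screen(
    [("MW-1", "pH", 7.1), ("MW-1", "pH", 15.2),
     ("MW-2", "Cl_mg_L", 10.0), ("MW-2", "Cl_mg_L", 120.0)],
    limits={"pH": (0.0, 14.0), "Cl_mg_L": (0.0, 1000.0)},
)
```

    The returned lines mirror the text-file output format the abstract mentions: sample identifier, suspect value, and how it departed from the expected.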

  2. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

    Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is building the product correctly, and validation is building the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to be performed to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each one with a proper degree of stringency or rigor. This paper shows a practical approach to estimate the appropriate level of V and V, and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'what to do', that is, the selection of the V and V class. The main factors considered here are: required integrity, functional complexity, defense in depth, and development environment. A guideline to classify the particular system using these factors and show how they lead to the selection of the V and V class is presented. The second step is to determine 'how to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods that are recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'what to do', as well as for specialists interested in 'how to do it'. Finally ...
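    A classification step of the kind described, mapping the four factors to a V and V class, can be sketched as a scoring function. The factor names come from the abstract, but the weights and thresholds below are illustrative assumptions, not the paper's guideline:

```python
def select_vv_class(required_integrity, functional_complexity,
                    defense_in_depth, development_environment):
    """Each factor is scored 0 (low concern) to 2 (high concern); required
    integrity is weighted most heavily. Returns V&V class 1 (most rigorous)
    through 3 (least rigorous)."""
    score = (3 * required_integrity + functional_complexity
             + defense_in_depth + development_environment)
    if score >= 7:
        return 1
    if score >= 4:
        return 2
    return 3

# A high-integrity, complex system lands in the most stringent class:
assert select_vv_class(2, 2, 1, 1) == 1
```

    In a real graded program the class would then index into a table of recommended V and V methods per life-cycle stage, the paper's second step.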

  3. Assertion checking environment (ACE) for formal verification of C programs

    International Nuclear Information System (INIS)

    Sharma, Babita; Dhodapkar, S.D.; Ramesh, S.

    2003-01-01

    In this paper we describe an Assertion Checking Environment (ACE) for compositional verification of programs written in an industrially sponsored safe subset of the C programming language called MISRA C [Guidelines for the Use of the C Language in Vehicle Based Software, 1998]. The theory is based on Hoare logic [Commun. ACM 12 (1969) 576], and the C programs are verified using a static assertion checking technique. First, the functional specifications of the program, captured in the form of pre- and post-conditions for each C function, are derived from the system specifications. These pre- and post-conditions are then introduced as assertions (also called annotations or formal comments) in the program code. The assertions are then proved formally using ACE and a theorem-proving tool, the Stanford Temporal Prover [The Stanford Temporal Prover User's Manual, 1998]. ACE has been developed by us and consists mainly of a translator c2spl, a GUI and some utility programs. The technique and tools developed are targeted towards verification of real-time embedded software
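    The pre-/post-condition annotation style the abstract describes can be illustrated with a small function. ACE proves such assertions statically via a theorem prover; the sketch below (a hypothetical example of mine, not from the paper) merely checks them at runtime to show where the annotations sit:

```python
def integer_sqrt(n):
    # Pre-condition (the function's contract): n is a non-negative integer.
    assert isinstance(n, int) and n >= 0, "pre-condition: n >= 0"
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    # Post-condition: r is the floor of the square root of n.
    assert r * r <= n < (r + 1) * (r + 1), "post-condition violated"
    return r
```

    Compositional verification then reasons about each caller using only the callee's contract, never its body.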

  4. Transforming PLC Programs into Formal Models for Verification Purposes

    CERN Document Server

    Darvas, D; Blanco, E

    2013-01-01

    Most of CERN’s industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, a precise, mathematically based method for automatically checking formalized requirements against the system.

  5. Performance verification of 3D printers

    OpenAIRE

    Hansen, Hans Nørgaard; Nielsen, Jakob Skov; Rasmussen, Jakob; Pedersen, David Bue

    2014-01-01

    Additive Manufacturing continues to gain momentum as the next industrial revolution. While these layering technologies have demonstrated significant time and cost savings for prototype efforts, and enabled new designs with performance benefits, additive manufacturing has not been affiliated with 'precision' applications. In order to understand additive manufacturing's capabilities or shortcomings with regard to precision applications, it is important to understand the mechanics of the proces...

  6. Performance verification of 3D printers

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Nielsen, Jakob Skov; Rasmussen, Jakob

    2014-01-01

    Additive Manufacturing continues to gain momentum as the next industrial revolution. While these layering technologies have demonstrated significant time and cost savings for prototype efforts, and enabled new designs with performance benefits, additive manufacturing has not been affiliated with 'precision' applications. In order to understand additive manufacturing's capabilities or shortcomings with regard to precision applications, it is important to understand the mechanics of the process. GE Aviation's Additive Development Center [ADC] is in a unique position to comment on additive metal... This paper and presentation will take a deep dive into the hardware and mechanics of the modern-day DMLM machine from three of the largest equipment manufacturers. We will also look at typical post processes including the heat treats that are commonly applied to DMLM metal parts. Along the way, we'll mention...

  7. Performance Verification for Safety Injection Tank with Fluidic Device

    International Nuclear Information System (INIS)

    Yune, Seok Jeong; Kim, Da Yong

    2014-01-01

    In a LBLOCA, the SITs of a conventional nuclear power plant deliver excessive cooling water to the reactor vessel, causing the water to flow into the containment atmosphere. In an effort to make injection more efficient, a Fluidic Device (FD) is installed inside each SIT of the Advanced Power Reactor 1400 (APR 1400). The FD, a completely passive controller that requires no actuating power, controls injection flow rates, which depend on the flow resistance inside the vortex chamber of the FD. When the SIT Emergency Core Cooling (ECC) water level is above the top of the stand pipe, the water enters the vortex chamber through both the top of the stand pipe and the control ports, resulting in injection of the water at a large flow rate. When the water level drops below the top of the stand pipe, the water enters the vortex chamber only through the control ports, resulting in vortex formation in the vortex chamber and a relatively small injection flow. Performance verification of the SITs shall be carried out because SITs play an integral role in mitigating accidents. In this paper, the performance verification method for the SIT with FD is presented: the equations for calculation of the flow resistance coefficient (K) are derived to evaluate the on-site performance of the APR 1400 SIT with FD. The equations are then applied to the performance verification of the SIT with FD, and good results are obtained
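    The APR 1400-specific equations are not given in the record, but the flow resistance coefficient K is conventionally defined as the pressure drop normalized by dynamic pressure. A minimal sketch under that standard definition, with hypothetical numbers (not plant data):

```python
def flow_resistance_coefficient(delta_p_pa, density_kg_m3, flow_m3_s, area_m2):
    """Dimensionless loss coefficient K = dP / (rho * v^2 / 2), with v the
    mean velocity through the chosen reference area."""
    v = flow_m3_s / area_m2
    return delta_p_pa / (0.5 * density_kg_m3 * v * v)

# Hypothetical high-flow phase: 200 kPa drop at 2 m^3/s through 0.05 m^2
k_high_flow = flow_resistance_coefficient(2.0e5, 1000.0, 2.0, 0.05)
```

    The FD's two phases then show up as two K values: small while the stand pipe feeds the vortex chamber, large once vortex formation throttles the control-port flow.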

  8. Practical Formal Verification of MPI and Thread Programs

    Science.gov (United States)

    Gopalakrishnan, Ganesh; Kirby, Robert M.

    Large-scale simulation codes in science and engineering are written using the Message Passing Interface (MPI). Shared memory threads are widely used directly, or to implement higher level programming abstractions. Traditional debugging methods for MPI or thread programs are incapable of providing useful formal guarantees about coverage. They get bogged down in the sheer number of interleavings (schedules), often missing shallow bugs. In this tutorial we will introduce two practical formal verification tools: ISP (for MPI C programs) and Inspect (for Pthread C programs). Unlike other formal verification tools, ISP and Inspect run directly on user source codes (much like a debugger). They pursue only the relevant set of process interleavings, using our own customized Dynamic Partial Order Reduction algorithms. For a given test harness, DPOR allows these tools to guarantee the absence of deadlocks, instrumented MPI object leaks and communication races (using ISP), and shared memory races (using Inspect). ISP and Inspect have been used to verify large pieces of code: in excess of 10,000 lines of MPI/C for ISP in under 5 seconds, and about 5,000 lines of Pthread/C code in a few hours (and much faster with the use of a cluster or by exploiting special cases such as symmetry) for Inspect. We will also demonstrate the Microsoft Visual Studio and Eclipse Parallel Tools Platform integrations of ISP (these will be available on the LiveCD).
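    The schedule explosion these tools confront, and the shared-memory races they guarantee to find, can be shown by exhaustively enumerating the interleavings of a non-atomic increment. This toy model is mine and enumerates everything; ISP and Inspect instead prune equivalent schedules with dynamic partial order reduction:

```python
def interleavings(a, b):
    """All schedules of two threads' instruction lists, preserving each
    thread's program order."""
    if not a:
        return [b]
    if not b:
        return [a]
    return [[a[0]] + s for s in interleavings(a[1:], b)] + \
           [[b[0]] + s for s in interleavings(a, b[1:])]

def run(schedule):
    st = {"x": 0, "r1": 0, "r2": 0}
    for op in schedule:
        op(st)
    return st["x"]

# Each thread performs a non-atomic increment of x: load, then store.
t1 = [lambda s: s.__setitem__("r1", s["x"]),
      lambda s: s.__setitem__("x", s["r1"] + 1)]
t2 = [lambda s: s.__setitem__("r2", s["x"]),
      lambda s: s.__setitem__("x", s["r2"] + 1)]

finals = {run(sched) for sched in interleavings(t1, t2)}
# Exhaustive exploration exposes the lost update: finals == {1, 2}
```

    A debugger that samples a few runs can easily see only the value 2; covering all six schedules is what certifies that the racy outcome 1 exists.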

  9. DOE-EPRI distributed wind Turbine Verification Program (TVP III)

    Energy Technology Data Exchange (ETDEWEB)

    McGowin, C.; DeMeo, E. [Electric Power Research Institute, Palo Alto, CA (United States); Calvert, S. [Dept. of Energy, Washington, DC (United States)] [and others]

    1997-12-31

    In 1992, the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DOE) initiated the Utility Wind Turbine Verification Program (TVP). The goal of the program is to evaluate prototype advanced wind turbines at several sites developed by U.S. electric utility companies. Two 6-MW wind projects have been installed under the TVP program by Central and South West Services in Fort Davis, Texas and Green Mountain Power Corporation in Searsburg, Vermont. In early 1997, DOE and EPRI selected five more utility projects to evaluate distributed wind generation using smaller 'clusters' of wind turbines connected directly to the electricity distribution system. This paper presents an overview of the objectives, scope, and status of the EPRI-DOE TVP program and the existing and planned TVP projects.

  10. Field Verification Program for Small Wind Turbines, Quarterly Report: 2nd Quarter, Issue No.1, October 2000

    Energy Technology Data Exchange (ETDEWEB)

    Tu, P.; Forsyth, T.

    2000-11-02

    The Field Verification Program for Small Wind Turbines quarterly report provides industry members with a description of the program, its mission, and purpose. It also provides a vehicle for participants to report performance data, activities, and issues during quarterly test periods.

  11. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb3Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb3Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb3Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  12. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  13. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  14. Proceedings Fifth International Workshop on Verification and Program Transformation

    OpenAIRE

    Lisitsa, Alexei; Nemytykh, Andrei P.; Proietti, Maurizio

    2017-01-01

    We extend a technique called Compiling Control. The technique transforms coroutining logic programs into logic programs that, when executed under the standard left-to-right selection rule (and not using any delay features) have the same computational behavior as the coroutining program. In recent work, we revised Compiling Control and reformulated it as an instance of Abstract Conjunctive Partial Deduction. This work was mostly focused on the program analysis performed in Compiling Control. I...

  15. Quality verification at Arkansas Nuclear One using performance-based concepts

    International Nuclear Information System (INIS)

    Cooper, R.M.

    1990-01-01

    Performance-based auditing is beginning to make an impact within the nuclear industry. Its use provides performance assessments of the operating plant. In the past, this company, along with most other nuclear utilities, performed compliance-based audits. These audits focused on paper reviews of past activities and were completed in weeks or months. This type of audit did not provide a comprehensive assessment of the effectiveness of an activity's performance, nor was it able to identify any performance problems that may have occurred. To address this shortcoming, a comprehensive overhaul of quality assurance (QA) assessment programs was developed. The first major change was to develop a technical specification (tech spec) audit program, with the objective of auditing each tech spec line item every 5 yr. To achieve performance-based results within the tech spec audit program, a tech spec surveillance program was implemented whose goal is to observe 75% of the tech-spec-required tests every 5 yr. The next major change was to develop a QA surveillance program that would provide surveillance coverage for the remainder of the plant not covered by the tech spec surveillance program. One other improvement was to merge the QA/quality control (QC) functions into one nuclear quality group. The final part of the quality verification effort is trending of the quality performance-based data (including US Nuclear Regulatory Commission (NRC) violations)

  16. R high performance programming

    CERN Document Server

    Lim, Aloysius

    2015-01-01

    This book is for programmers and developers who want to improve the performance of their R programs by making them run faster with large data sets or who are trying to solve a pesky performance problem.

  17. Verification Test of Hydraulic Performance for Reactor Coolant Pump

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jun; Kim, Jae Shin; Ryu, In Wan; Ko, Bok Seong; Song, Keun Myung [Samjin Ind. Co., Seoul (Korea, Republic of)

    2010-01-15

    According to this project, basic design for the prototype pump and model pump of the reactor coolant pump and the test facilities has been completed. Basic design of the prototype pump to establish structure, dimensions and hydraulic performance has been completed, and through primary flow analysis by computational fluid dynamics (CFD), flow characteristics and hydraulic performance have been established. This pump was designed as a mixed flow pump with the following design requirements: specific speed (Ns), 1080.9 (rpm·m³/min·m); capacity, 3115 m³/h; total head, 26.3 m; pump speed, 1710 rpm; pump efficiency, 77.0%; impeller outer diameter, 349 mm; motor output, 360 kW; design pressure, 17 MPaG. The features of the pump are freedom from leakage, due to the absence of a mechanical seal on the pump shaft, which ensures reactor safety, and low noise and vibration levels, due to the absence of a cooling fan on the motor, which makes it an eco-friendly product. The model pump size was reduced to 44% of the prototype pump for the verification test of the hydraulic performance of the reactor coolant pump, and it was designed as a mixed flow pump with a canned motor and the following design requirements: specific speed (Ns), 1060.9 (rpm·m³/min·m); capacity, 539.4 m³/h; total head, 21.0 m; pump speed, 3476 rpm; pump efficiency, 72.9%; impeller outer diameter, 154 mm; motor output, 55 kW; design pressure, 1.0 MPaG. The test facilities were designed for verification tests of hydraulic performance, suitable for the pump performance test, homologous test, NPSH (cavitation) test, coast-down test, and pressure pulsation test of the inlet and outlet ports. The test tank was designed with a testing capacity of up to 2000 m³/h and a design pressure of 1.0 MPaG. The auxiliary pump was designed as a centrifugal pump with capacity 1100 m³/h, total head 42.0 m, and motor output 190 kW
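    The specific-speed figures in the record can be cross-checked with the usual pump formula Ns = N·√Q / H^0.75. The unit of Q is garbled in the record; taking Q in m³/min (an assumption on my part) reproduces the model-pump figure to within a few tenths of a percent:

```python
import math

def specific_speed(rpm, flow_m3_per_h, head_m):
    """Ns = N * sqrt(Q) / H**0.75 with Q converted to m^3/min
    (unit assumption; the record's units are garbled)."""
    q = flow_m3_per_h / 60.0
    return rpm * math.sqrt(q) / head_m ** 0.75

ns_model = specific_speed(3476, 539.4, 21.0)   # record reports Ns = 1060.9
ns_proto = specific_speed(1710, 3115.0, 26.3)  # record reports Ns = 1080.9
```

    Both computed values land near 1061, which matches the model-pump figure closely; the prototype figure agrees only to within about 2%, so either the record's value or the unit convention differs slightly.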

  18. Translation of PLC Programs to x86 for Simulation and Verification

    CERN Document Server

    Sallai, Gyula

    2017-01-01

    PLC programs are written in special languages, variants of the languages defined in the IEC 61131 standard. These programs cannot be directly executed on personal computers (on x86 architecture). To perform simulation of the PLC program or diagnostics during development, either a real PLC or a PLC simulator has to be used. However, these solutions are often inflexible and they do not provide appropriate performance. By generating x86-representations (semantically equivalent programs which can be executed on PCs, e.g. written in C, C++ or Java) of the PLC programs, some of these challenges could be met. PLCverif is a PLC program verification tool developed at CERN which includes a parser for Siemens PLC programs. In this work, we describe a code generator based on this parser of PLCverif. This work explores the possibilities and challenges of generating programs in widely-used general purpose languages from PLC programs, and provides a proof-of-concept code generation implementation. The presented solution dem...

  19. Characterization of Rocks and Grouts to Support DNA's Verification Program

    National Research Council Canada - National Science Library

    Martin, J

    2000-01-01

    .... Specifically, test data was generated for use with DNA's HYDROPLUS program. The tests performed included unconfined compression tests, uniaxial strain tests, physical properties, ultrasonic velocities, XRD mineralogy, and lithologic descriptions...

  20. A program for verification of phylogenetic network models.

    Science.gov (United States)

    Gunawan, Andreas D M; Lu, Bingxin; Zhang, Louxin

    2016-09-01

    Genetic material is transferred in a non-reproductive manner across species more frequently than commonly thought, particularly in the bacteria kingdom. On the one hand, extant genomes are thus more properly considered as a fusion product of both reproductive and non-reproductive genetic transfers. This has motivated researchers to adopt phylogenetic networks to study genome evolution. On the other hand, a gene's evolution is usually tree-like and has been studied for over half a century. Accordingly, the relationships between phylogenetic trees and networks are the basis for the reconstruction and verification of phylogenetic networks. One important problem in verifying a network model is determining whether or not certain existing phylogenetic trees are displayed in a phylogenetic network. This problem is formally called the tree containment problem. It is NP-complete even for binary phylogenetic networks. We design an exponential-time but efficient method for determining whether or not a phylogenetic tree is displayed in an arbitrary phylogenetic network. It is developed on the basis of the so-called reticulation-visible property of phylogenetic networks. A C program is available for download at http://www.math.nus.edu.sg/∼matzlx/tcp_package. Contact: matzlx@nus.edu.sg. Supplementary data are available at Bioinformatics online.
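    The tree containment problem can be made concrete with a naive brute force: keep one incoming edge per reticulation node, then compare the leaf clusters of the resulting embedded tree with those of the query tree. The data structures and example network below are mine; the paper's method exploits reticulation visibility rather than enumerating all 2^k switchings:

```python
from itertools import product

def leaf_clusters(children, root, is_leaf):
    """Leaf set below every node of a rooted tree given as child-adjacency
    lists; empty clusters (from pruned-away branches) are dropped."""
    out = set()
    def below(v):
        if is_leaf(v):
            cl = frozenset([v])
        else:
            cl = frozenset().union(*(below(k) for k in children.get(v, [])))
        if cl:
            out.add(cl)
        return cl
    below(root)
    return out

def is_displayed(tree, tree_root, net, net_root):
    """Brute-force tree containment: try every way of keeping one parent
    per reticulation and compare cluster sets."""
    parents = {}
    for u, kids in net.items():
        for k in kids:
            parents.setdefault(k, []).append(u)
    retics = [v for v, ps in parents.items() if len(ps) > 1]
    target = leaf_clusters(tree, tree_root, lambda v: not tree.get(v))
    for choice in product(*(parents[v] for v in retics)):
        kept = dict(zip(retics, choice))
        pruned = {u: [k for k in kids if k not in kept or kept[k] == u]
                  for u, kids in net.items()}
        if leaf_clusters(pruned, net_root, lambda v: not net.get(v)) == target:
            return True
    return False

# Network with one reticulation h (parents x and y) above leaf b:
net = {"r": ["x", "y"], "x": ["a", "h"], "y": ["h", "c"], "h": ["b"]}
# ((a,b),c) is displayed by routing h through x:
displayed = is_displayed({"r": ["u", "c"], "u": ["a", "b"]}, "r", net, "r")
```

    With k reticulations this checks 2^k embedded trees, which is exactly the exponential blow-up that motivates structure-aware algorithms.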

  1. Development of NSSS Control System Performance Verification Tool

    International Nuclear Information System (INIS)

    Sohn, Suk Whun; Song, Myung Jun

    2007-01-01

    Thanks to many control systems and control components, a nuclear power plant can be operated safely and efficiently under transient conditions as well as steady-state conditions. If a fault or an error exists in the control systems, the nuclear power plant may experience unwanted and unexpected transient conditions. Therefore, the performance of these control systems and control components should be completely verified through the power ascension tests of the startup period. However, there are many needs to replace control components, to modify control logic, or to change setpoints. To make these changes practical, it is important to verify the performance of the changed control system without redoing the power ascension tests. Up to now, a simulation method with the computer codes used for the design of nuclear power plants was commonly used to verify performance. But if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not effective for verifying the performance of the changed control system. Many tests related to V and V (Verification and Validation) are performed in the factory as well as in the plant to eliminate errors which might be generated in hardware manufacturing or software coding. This reveals that field tests and the simulation method are insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this fact. One occurred at 50% reactor power and caused a reactor trip. The other occurred during the 70% loss-of-main-feedwater-pump test and caused an excess turbine runback

  2. Safety performance indicators program

    International Nuclear Information System (INIS)

    Vidal, Patricia G.

    2004-01-01

    In 1997 the Nuclear Regulatory Authority (ARN) initiated a program to define and implement a Safety Performance Indicators System for the two operating nuclear power plants, Atucha I and Embalse. The objective of the program was to incorporate a set of safety performance indicators to be used as a new regulatory tool providing an additional view of the operational performance of the nuclear power plants, improving the ability to detect degradation on safety related areas. A set of twenty-four safety performance indicators was developed and improved throughout pilot implementation initiated in July 1998. This paper summarises the program development, the main criteria applied in each stage and the results obtained. (author)

  3. Measurement and verification of low income energy efficiency programs in Brazil: Methodological challenges

    Energy Technology Data Exchange (ETDEWEB)

    Martino Jannuzzi, Gilberto De; Rodrigues da Silva, Ana Lucia; Melo, Conrado Augustus de; Paccola, Jose Angelo; Dourado Maia Gomes, Rodolfo (State Univ. of Campinas, International Energy Initiative (Brazil))

    2009-07-01

    Electric utilities in Brazil are investing about 80 million dollars annually in low-income energy efficiency programs, about half of their total compulsory investments in end-use efficiency programs under current regulation. Since 2007 the regulator has enforced the need to provide evaluation plans for the programs delivered. This paper presents the measurement and verification (M&V) methodology that has been developed to accommodate the characteristics of lighting and refrigerator programs that have been introduced in Brazilian urban and peri-urban slums. A combination of household surveys, end-use measurements and metering at the transformer and grid levels was performed before and after the program implementation. The methodology has to accommodate the dynamics, housing, electrical wiring and connections of the population as well as their ability to pay for the electricity and program participation. Results obtained in slums in Rio de Janeiro are presented. Impacts of the programs were evaluated in energy terms for households and utilities. Feedback from the evaluations performed also permitted improvement in the design of new programs for low-income households.
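    At its core, the before/after M&V comparison described here reduces to a baseline-minus-post savings calculation, usually adjusted for how many installed measures stay in service. The formula shape is standard M&V practice, but all figures and the adjustment factor below are hypothetical, not the paper's results:

```python
def annual_savings_kwh(baseline_kwh_day, post_kwh_day, households, in_service_rate):
    """Gross annual program savings from before/after daily consumption,
    with a simple in-service (persistence) adjustment."""
    per_household = (baseline_kwh_day - post_kwh_day) * 365.0
    return per_household * households * in_service_rate

# Hypothetical refrigerator-replacement figures:
savings = annual_savings_kwh(baseline_kwh_day=2.0, post_kwh_day=1.2,
                             households=1000, in_service_rate=0.9)
```

    The hard part the paper addresses is obtaining trustworthy baseline and post values in informal settlements, where metering, wiring, and participation are all in flux.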

  4. SU-E-T-455: Impact of Different Independent Dose Verification Software Programs for Secondary Check

    International Nuclear Information System (INIS)

    Itano, M; Yamazaki, T; Kosaka, M; Kobayashi, N; Yamashita, M; Ishibashi, S; Higuchi, Y; Tachibana, H

    2015-01-01

    Purpose: There have been many reports on different dose calculation algorithms for treatment planning systems (TPSs). An independent dose verification program (IndpPro) is essential to verify clinical plans from the TPS. However, the accuracy of different independent dose verification programs was not evident. We conducted a multi-institutional study to reveal the impact of different IndpPros used with different TPSs. Methods: Three institutes participated in this study. They used two different IndpPros, RADCALC and Simple MU Analysis (SMU), both of which implement the Clarkson algorithm. RADCALC required the radiological path length (RPL) computed by the TPS (Eclipse or Pinnacle3), whereas SMU computed the RPL independently of the TPS from CT images. An ion-chamber measurement in a water-equivalent phantom was performed to evaluate the accuracy of the two IndpPros and the TPS at each institute. Next, the accuracy of dose calculation using the two IndpPros compared to the TPS was assessed in clinical plans. Results: The accuracy of the IndpPros and the TPSs in the homogeneous phantom was within ±1% of the measurement. 1543 treatment fields were collected from the patients treated at the institutes. RADCALC showed better accuracy (0.9 ± 2.2%) than SMU (1.7 ± 2.1%). However, the accuracy was dependent on the TPS (Eclipse: 0.5%, Pinnacle3: 1.0%). The accuracy of RADCALC with Eclipse was similar to that of SMU at one of the institutes. Conclusion: Depending on the independent dose verification program, the accuracy shows systematic dose accuracy variation even though the measurement comparison showed a similar variation. The variation was affected by the radiological path length calculation. An IndpPro with Pinnacle3 shows a different variation because Pinnacle3 computes the RPL using physical density, whereas Eclipse and SMU use electron density, though
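    A secondary check of this kind ultimately reduces to comparing the independent dose against the TPS dose field by field and flagging deviations beyond an action level. A minimal sketch, assuming a hypothetical 3% tolerance (the tolerance value and function names are illustrative, not taken from the study):

```python
def percent_diff(d_indep, d_tps):
    """Relative deviation of the independent dose from the TPS dose, in percent."""
    return 100.0 * (d_indep - d_tps) / d_tps

def passes_secondary_check(d_indep, d_tps, tol_percent=3.0):
    """Accept the field when the secondary check agrees within tol_percent."""
    return abs(percent_diff(d_indep, d_tps)) <= tol_percent

# Example: TPS reports 2.00 Gy, the independent program computes 2.03 Gy
print(passes_secondary_check(2.03, 2.00))  # 1.5 % deviation -> True
print(passes_secondary_check(2.12, 2.00))  # 6.0 % deviation -> False
```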

  5. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  6. Program Correctness, Verification and Testing for Exascale (Corvette)

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Koushik [Univ. of California, Berkeley, CA (United States); Iancu, Costin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Demmel, James W [UC Berkeley

    2018-01-26

    The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1. Low overhead automated and precise detection of concurrency bugs at scale. 2. Using low overhead bug detection tools to guide speculative program transformations for performance. 3. Techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay. 4. Techniques to provide reproducible execution of floating point programs. 5. Techniques for tuning the floating point precision used in codes.

  7. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases

  8. Database principles programming performance

    CERN Document Server

    O'Neil, Patrick

    2014-01-01

    Database: Principles Programming Performance provides an introduction to the fundamental principles of database systems. This book focuses on database programming and the relationships between principles, programming, and performance.Organized into 10 chapters, this book begins with an overview of database design principles and presents a comprehensive introduction to the concepts used by a DBA. This text then provides grounding in many abstract concepts of the relational model. Other chapters introduce SQL, describing its capabilities and covering the statements and functions of the programmi

  9. Sheath insulator final test report, TFE Verification Program

    International Nuclear Information System (INIS)

    1994-07-01

    The sheath insulator in a thermionic cell has two functions. First, the sheath insulator must electrically isolate the collector from the outer containment sheath tube that is in contact with the reactor liquid metal coolant. Second, the sheath insulator must provide high, uniform thermal conductance between the collector and the reactor coolant to remove waste heat. The goals of the sheath insulator test program were to demonstrate that suitable ceramic materials and fabrication processes were available, and to validate the performance of the sheath insulator against TFE-VP requirements. This report discusses the objectives of the test program, fabrication development, the ex-reactor test program, the in-reactor test program, and the insulator seal specifications

  10. Sheath insulator final test report, TFE Verification Program

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The sheath insulator in a thermionic cell has two functions. First, the sheath insulator must electrically isolate the collector from the outer containment sheath tube that is in contact with the reactor liquid metal coolant. Second, the sheath insulator must provide high, uniform thermal conductance between the collector and the reactor coolant to remove waste heat. The goals of the sheath insulator test program were to demonstrate that suitable ceramic materials and fabrication processes were available, and to validate the performance of the sheath insulator against TFE-VP requirements. This report discusses the objectives of the test program, fabrication development, the ex-reactor test program, the in-reactor test program, and the insulator seal specifications.

  11. Containment performance improvement program

    International Nuclear Information System (INIS)

    Beckner, W.; Mitchell, J.; Soffer, L.; Chow, E.; Lane, J.; Ridgely, J.

    1990-01-01

    The Containment Performance Improvement (CPI) program has been one of the main elements in the US Nuclear Regulatory Commission's (NRC's) integrated approach to closure of severe accident issues for US nuclear power plants. During the course of the program, results from various probabilistic risk assessment (PRA) studies and from severe accident research programs for the five US containment types have been examined to identify significant containment challenges and to evaluate potential improvements. The five containment types considered are: the boiling water reactor (BWR) Mark I containment, the BWR Mark II containment, the BWR Mark III containment, the pressurized water reactor (PWR) ice condenser containment, and the PWR dry containments (including both subatmospheric and large subtypes). The focus of the CPI program has been containment performance and accident mitigation; however, insights are also being obtained in the areas of accident prevention and accident management

  12. 50 CFR 216.93 - Tracking and verification program.

    Science.gov (United States)

    2010-10-01

    .... purse seine fishing, processing, and marketing in the United States and abroad. Verification of tracking... the Agreement on the IDCP. (d) Tracking cannery operations. (1) Whenever a U.S. tuna canning company..., canning, sale, rejection, etc.). (4) During canning activities, non-dolphin-safe tuna may not be mixed in...

  13. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío

    2014-10-01

    Full Text Available XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define what tool to embed in XJML to execute FSPV.
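    JML contracts attach pre- and postconditions to Java methods, which FSPV tools then attempt to prove statically. As a loose, runtime-checked analogue (not the XJML mechanism itself; the decorator and function names below are invented for illustration), a Python sketch:

```python
def contract(pre=None, post=None):
    """JML-style contract as a decorator: check a precondition on the
    arguments and a postcondition on the result at run time."""
    def wrap(fn):
        def inner(*args):
            if pre is not None:
                assert pre(*args), "precondition violated"
            result = fn(*args)
            if post is not None:
                assert post(result, *args), "postcondition violated"
            return result
        return inner
    return wrap

# requires x >= 0; ensures result * result == x (within tolerance)
@contract(pre=lambda x: x >= 0, post=lambda r, x: abs(r * r - x) < 1e-9)
def sqrt_newton(x):
    r = x or 1.0
    for _ in range(60):          # Newton iteration for the square root
        r = 0.5 * (r + x / r)
    return r
```

Unlike FSPV, this only detects violations on executions that actually occur; static verification proves the contract for all inputs.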

  14. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Science.gov (United States)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  15. Self-Verification of Ability through Biased Performance Memory.

    Science.gov (United States)

    Karabenick, Stuart A.; LeBlanc, Daniel

    Evidence points to a pervasive tendency for persons to behave to maintain their existing cognitive structures. One strategy by which this self-verification is made more probable involves information processing. Through attention, encoding and retrieval, and the interpretation of events, persons process information so that self-confirmatory…

  16. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  17. 76 FR 41186 - Salmonella Verification Sampling Program: Response to Comments on New Agency Policies and...

    Science.gov (United States)

    2011-07-13

    ... Service [Docket No. FSIS-2008-0008] Salmonella Verification Sampling Program: Response to Comments on New Agency Policies and Clarification of Timeline for the Salmonella Initiative Program (SIP) AGENCY: Food... Federal Register notice (73 FR 4767- 4774), which described upcoming policy changes in the FSIS Salmonella...

  18. Reliability program plan for the Kilowatt Isotope Power System (KIPS) technology verification phase

    International Nuclear Information System (INIS)

    1978-01-01

    This document is an integral part of the Kilowatt Isotope Power System (KIPS) Program Plan. This document defines the KIPS Reliability Program Plan for the Technology Verification Phase and delineates the reliability assurance tasks that are to be accomplished by Sundstrand and its suppliers during the design, fabrication and testing of the KIPS

  19. REQUIREMENT VERIFICATION AND SYSTEMS ENGINEERING TECHNICAL REVIEW (SETR) ON A COMMERCIAL DERIVATIVE AIRCRAFT (CDA) PROGRAM

    Science.gov (United States)

    2017-09-01

    Requirement Verification and Systems Engineering Technical Review (SETR) on a Commercial Derivative Aircraft (CDA) Program, by Theresa L. Thomas, September 2017. Abstract: The Naval Air Systems Command (NAVAIR) systems engineering technical review (SETR) process does not

  20. NRC performance assessment program

    International Nuclear Information System (INIS)

    Coplan, S.M.

    1986-01-01

    The U.S. Nuclear Regulatory Commission's (NRC) performance assessment program includes the development of guidance to the U.S. Department of Energy (DOE) on preparation of a license application and on conducting the studies to support a license application. The nature of the licensing requirements of 10 CFR Part 60 creates a need for performance assessments by the DOE. The NRC and DOE staffs each have specific roles in assuring the adequacy of those assessments. Performance allocation is an approach for determining what testing and analysis will be needed during site characterization to assure that an adequate database is available to support the necessary performance assessments. From the standpoint of establishing an implementable methodology, the most challenging performance assessment needed for licensing is the one that will be used to determine compliance with the U.S. Environmental Protection Agency's (EPA) containment requirement

  1. Gamma-ray isotopic ratio measurements for the plutonium inventory verification program

    International Nuclear Information System (INIS)

    Lemming, J.F.; Haas, F.X.; Jarvis, J.Y.

    1976-01-01

    The Plutonium Inventory Verification Program at Mound Laboratory provides a nondestructive means of assaying bulk plutonium-bearing material. The assay is performed by combining the calorimetrically determined heat output of the sample and the relative abundances of the heat-producing isotopes. This report describes the method used for the nondestructive determination of plutonium-238, -240, -241 and americium-241 relative to plutonium-239, using gamma-ray spectroscopy, for 93 percent plutonium-239 material. Comparison of chemical data on aliquots of samples to the nondestructive data shows accuracies of ±7 percent for 238Pu/239Pu, ±15 percent for 240Pu/239Pu, ±3 percent for 241Pu/239Pu, and ±7 percent for 241Am/239Pu
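    The calorimetric assay described here combines the measured sample power with an effective specific power computed from the isotopic abundances. A sketch under stated assumptions (the specific-power constants are nominal literature values and the isotopic fractions in the example are invented, not Mound data):

```python
# Nominal specific powers in W/g for the heat-producing isotopes
# (standard calorimetry values, used here for illustration only)
SPECIFIC_POWER = {
    "Pu238": 0.5675,
    "Pu239": 0.001929,
    "Pu240": 0.007082,
    "Pu241": 0.003235,
    "Am241": 0.1142,
}

def effective_specific_power(mass_fractions):
    """P_eff = sum over isotopes of (mass fraction x specific power), in W/g."""
    return sum(f * SPECIFIC_POWER[iso] for iso, f in mass_fractions.items())

def plutonium_mass(measured_watts, mass_fractions):
    """Bulk Pu mass in grams from calorimeter power and gamma-derived isotopics."""
    return measured_watts / effective_specific_power(mass_fractions)

# Invented isotopics for ~93 percent Pu-239 material
fractions = {"Pu238": 0.0002, "Pu239": 0.93, "Pu240": 0.06,
             "Pu241": 0.008, "Am241": 0.0018}
```

The stated uncertainties on the isotopic ratios propagate into P_eff, which is why the accuracy of the ratio measurements dominates the assay accuracy.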

  2. Fundamentals of successful monitoring, reporting, and verification under a cap-and-trade program

    Energy Technology Data Exchange (ETDEWEB)

    John Schakenbach; Robert Vollaro; Reynaldo Forte [U.S. Environmental Protection Agency, Office of Atmospheric Programs, Washington, DC (United States)

    2006-11-15

    The U.S. Environmental Protection Agency (EPA) developed and implemented the Acid Rain Program (ARP), and NOx Budget Trading Programs (NBTP) using several fundamental monitoring, reporting, and verification (MRV) elements: (1) compliance assurance through incentives and automatic penalties; (2) strong quality assurance (QA); (3) collaborative approach with a petition process; (4) standardized electronic reporting; (5) compliance flexibility for low-emitting sources; (6) complete emissions data record required; (7) centralized administration; (8) level playing field; (9) publicly available data; (10) performance-based approach; and (11) reducing conflicts of interest. Each of these elements is discussed in the context of the authors' experience under two U.S. cap-and-trade programs and their potential application to other cap-and-trade programs. The U.S. Office of Management and Budget found that the Acid Rain Program has accounted for the largest quantified human health benefits of any federal regulatory program implemented in the last 10 yr, with annual benefits exceeding costs by > 40 to 1. The authors believe that the elements described in this paper greatly contributed to this success. EPA has used the ARP fundamental elements as a model for other cap-and-trade programs, including the NBTP, which went into effect in 2003, and the recently published Clean Air Interstate Rule and Clean Air Mercury Rule. The authors believe that using these fundamental elements to develop and implement the MRV portion of their cap-and-trade programs has resulted in public confidence in the programs, highly accurate and complete emissions data, and a high compliance rate. 2 refs.

  3. NRC performance indicator program

    International Nuclear Information System (INIS)

    Singh, R.N.

    1987-01-01

    The performance indicator development work of the US Nuclear Regulatory Commission (NRC) interoffice task group involved several major activities that included selection of candidate indicators for a trial program, data collection and review, validation of the trial indicators, display method development, interactions with the industry, and selection of an optimum set of indicators for the program. After evaluating 27 potential indicators against certain ideal attributes, the task group selected 17 for the trial program. The pertinent data for these indicators were then collected from 50 plants at 30 sites. The validation of the indicators consisted of two primary processes: logical validity and statistical analysis. The six indicators currently in the program are scrams, safety system actuations, significant events, safety system failures, forced outage rate, and equipment forced outages per 100 critical hours. A report containing data on the six performance indicators and some supplemental information is issued on a quarterly basis. The NRC staff is also working on refinements of existing indicators and development of additional indicators as directed by the commission

  4. Python high performance programming

    CERN Document Server

    Lanaro, Gabriele

    2013-01-01

    An exciting, easy-to-follow guide illustrating the techniques to boost the performance of Python code, and their applications with plenty of hands-on examples.If you are a programmer who likes the power and simplicity of Python and would like to use this language for performance-critical applications, this book is ideal for you. All that is required is a basic knowledge of the Python programming language. The book will cover basic and advanced topics so will be great for you whether you are a new or a seasoned Python developer.

  5. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes, such as RELAP5, TRAC, and CATHARE, have been developed in Fortran during the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance; TRACE, RELAP5-3D and MARS are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s; many modern programming languages now support OOP. Although recent Fortran also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented design for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of design feasibility
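    The OOP techniques the abstract lists (encapsulation, inheritance, polymorphism) map naturally onto hydraulic components that each own their state and contribute to a system-level calculation. A minimal sketch, in Python rather than the C-family language used in the work, with invented component classes and a placeholder residual:

```python
from abc import ABC, abstractmethod

class Component(ABC):
    """Base class: each hydraulic component encapsulates its own state."""
    def __init__(self, name):
        self.name = name

    @abstractmethod
    def residual(self, dt):
        """Return this component's contribution to the system residual."""

class Pipe(Component):
    def __init__(self, name, flow, friction):
        super().__init__(name)
        self.flow = flow
        self.friction = friction

    def residual(self, dt):
        # Placeholder momentum residual: friction loss only
        return -self.friction * self.flow * abs(self.flow)

class Pump(Component):
    def __init__(self, name, head):
        super().__init__(name)
        self.head = head

    def residual(self, dt):
        return self.head

def system_residual(components, dt):
    """Polymorphic dispatch: the solver loop needn't know component types."""
    return sum(c.residual(dt) for c in components)
```

Adding a new component type then requires no change to the solver loop, which is the maintenance benefit the abstract is pursuing.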

  6. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression testing that is needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and human resources required, and to prevent wasted time, non-regression tests need to be automated. As a tool to develop an automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)
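    The core of such an automated NRT is a tolerance-based comparison of a new run against stored reference results; only the bookkeeping differs between a VBA/Excel host and any other. A minimal sketch in Python (the variable names and the 1e-6 relative tolerance are illustrative assumptions):

```python
def non_regression_check(reference, candidate, rel_tol=1e-6):
    """Compare a candidate run against stored reference values.

    Returns the list of variables whose relative deviation exceeds rel_tol,
    so an empty list means the non-regression test passed.
    """
    failures = []
    for key, ref in reference.items():
        new = candidate.get(key)
        if new is None or abs(new - ref) > rel_tol * max(abs(ref), 1e-30):
            failures.append(key)
    return failures

# Reference values saved from a previously accepted code version
ref = {"peak_clad_temp": 1185.2, "break_flow": 34.7}
assert non_regression_check(ref, {"peak_clad_temp": 1185.2, "break_flow": 34.7}) == []
assert non_regression_check(ref, {"peak_clad_temp": 1190.0, "break_flow": 34.7}) == ["peak_clad_temp"]
```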

  7. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression testing that is needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and human resources required, and to prevent wasted time, non-regression tests need to be automated. As a tool to develop an automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)

  8. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  9. On Verification of PLC-Programs Written in the LD-Language

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2012-01-01

    Full Text Available We discuss some questions connected with the construction of a technology for analysing the correctness of Programmable Logic Controller (PLC) programs. We consider an example of modeling and automated verification of PLC programs written in the Ladder Diagram language (including timed function blocks) of the IEC 61131-3 standard. We use Cadence SMV for symbolic model checking. Program properties are written in the linear-time temporal logic LTL.

  10. Review of Evaluation, Measurement and Verification Approaches Used to Estimate the Load Impacts and Effectiveness of Energy Efficiency Programs

    Energy Technology Data Exchange (ETDEWEB)

    Messenger, Mike; Bharvirkar, Ranjit; Golemboski, Bill; Goldman, Charles A.; Schiller, Steven R.

    2010-04-14

    Efficiency (2007) presented commonly used definitions for EM&V in the context of energy efficiency programs: (1) Evaluation (E) - The performance of studies and activities aimed at determining the effects and effectiveness of EE programs; (2) Measurement and Verification (M&V) - Data collection, monitoring, and analysis associated with the calculation of gross energy and demand savings from individual measures, sites or projects. M&V can be a subset of program evaluation; and (3) Evaluation, Measurement, and Verification (EM&V) - This term is frequently seen in evaluation literature. EM&V is a catchall acronym for determining both the effectiveness of program designs and estimates of load impacts at the portfolio, program and project level. This report is a scoping study that assesses current practices and methods in the evaluation, measurement and verification (EM&V) of ratepayer-funded energy efficiency programs, with a focus on methods and practices currently used for determining whether projected (ex-ante) energy and demand savings have been achieved (ex-post). M&V practices for privately-funded energy efficiency projects (e.g., ESCO projects) or programs where the primary focus is greenhouse gas reductions were not part of the scope of this study. We identify and discuss key purposes and uses of current evaluations of end-use energy efficiency programs, methods used to evaluate these programs, processes used to determine those methods, and key issues that need to be addressed now and in the future, based on discussions with regulatory agencies, policymakers, program administrators, and evaluation practitioners in 14 states and national experts in the evaluation field. We also explore how EM&V may evolve in a future in which efficiency funding increases significantly, innovative mechanisms for rewarding program performance are adopted, the role of efficiency in greenhouse gas mitigation is more closely linked, and programs are increasingly funded from multiple sources

  11. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    International Nuclear Information System (INIS)

    Gauld, Ian C.; Hu, Jianwei; De Baere, P.; Tobin, Stephen

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy-EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  12. In-Field Performance Testing of the Fork Detector for Quantitative Spent Fuel Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Jianwei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); De Baere, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Vaccaro, S. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Schwalbach, P. [European Commission (Luxembourg). DG Energy, Directorate Nuclear Safeguards; Liljenfeldt, Henrik [Swedish Nuclear Fuel and Waste Management Company (Sweden); Tobin, Stephen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-01

    Expanding spent fuel dry storage activities worldwide are increasing demands on safeguards authorities that perform inspections. The European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) require measurements to verify declarations when spent fuel is transferred to difficult-to-access locations, such as dry storage casks and the repositories planned in Finland and Sweden. EURATOM makes routine use of the Fork detector to obtain gross gamma and total neutron measurements during spent fuel inspections. Data analysis is performed by modules in the integrated Review and Analysis Program (iRAP) software, developed jointly by EURATOM and the IAEA. Under the framework of the US Department of Energy–EURATOM cooperation agreement, a module for automated Fork detector data analysis has been developed by Oak Ridge National Laboratory (ORNL) using the ORIGEN code from the SCALE code system and implemented in iRAP. EURATOM and ORNL recently performed measurements on 30 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel (Clab), operated by the Swedish Nuclear Fuel and Waste Management Company (SKB). The measured assemblies represent a broad range of fuel characteristics. Neutron count rates for 15 measured pressurized water reactor assemblies are predicted with an average relative standard deviation of 4.6%, and gamma signals are predicted on average within 2.6% of the measurement. The 15 measured boiling water reactor assemblies exhibit slightly larger deviations of 5.2% for the gamma signals and 5.7% for the neutron count rates, compared to measurements. These findings suggest that with improved analysis of the measurement data, existing instruments can provide increased verification of operator declarations of the spent fuel and thereby also provide greater ability to confirm integrity of an assembly. These results support the application of the Fork detector as a fully quantitative spent fuel

  13. Automata-Based Verification of Temporal Properties on Running Programs

    Science.gov (United States)

    Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)

    2001-01-01

    This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Buchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
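    The observer idea can be illustrated on the common response property G(request -> F ack): translated to a two-state automaton, it accepts a finite trace iff no request is left unanswered when the trace ends. A small Python sketch (the event names and the finite-trace acceptance convention are illustrative, not JPaX's):

```python
def monitor_response(trace, trigger="request", response="ack"):
    """Finite-trace check of the LTL property G(trigger -> F response).

    A two-state observer: 'idle' until a trigger occurs, 'waiting' until the
    matching response is seen. On a finite trace the property holds iff the
    observer ends in the 'idle' state.
    """
    waiting = False
    for event in trace:
        if event == trigger:
            waiting = True
        elif event == response:
            waiting = False
    return not waiting

assert monitor_response(["request", "work", "ack"])      # every request answered
assert not monitor_response(["request", "work"])         # pending request at end
```

Full LTL-to-automaton translation generalizes this hand-built observer to arbitrary formulae, which is what the paper's algorithm automates.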

  14. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power-system with an organic working fluid. The technology verification phase's purposes are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  15. Development of a standard for computer program verification and control

    International Nuclear Information System (INIS)

    Dunn, T.E.; Ozer, O.

    1980-01-01

    It is expected that adherence to the guidelines of ANS 10.4 will: 1. Provide confidence that the program conforms to its requirements specification; 2. Provide confidence that the computer program has been adequately evaluated and tested; 3. Provide confidence that program changes are adequately evaluated, tested, and controlled; and 4. Enhance assurance that reliable data will be produced for engineering, scientific, and safety analysis purposes.

  16. Cloud computing platform for real-time measurement and verification of energy performance

    International Nuclear Information System (INIS)

    Ke, Ming-Tsun; Yeh, Chia-Hung; Su, Cheng-Jie

    2017-01-01

    Highlights: • Application of PSO algorithm can improve the accuracy of the baseline model. • M&V cloud platform automatically calculates energy performance. • M&V cloud platform can be applied in all energy conservation measures. • Real-time operational performance can be monitored through the proposed platform. • M&V cloud platform facilitates the development of EE programs and ESCO industries. - Abstract: Nations worldwide are vigorously promoting policies to improve energy efficiency. The use of measurement and verification (M&V) procedures to quantify energy performance is an essential topic in this field. Currently, energy performance M&V is accomplished via a combination of short-term on-site measurements and engineering calculations. This requires extensive amounts of time and labor and can result in a discrepancy between actual energy savings and calculated results. In addition, because the M&V period typically lasts several months or up to a year, the failure to immediately detect abnormal energy performance not only reduces energy savings but also prevents timely correction and misses the best opportunity to adjust or repair equipment and systems. In this study, a cloud computing platform for the real-time M&V of energy performance is developed. On this platform, particle swarm optimization and multivariate regression analysis are used to construct accurate baseline models. Instantaneous and automatic calculations of the energy performance and access to long-term, cumulative information about the energy performance are provided via a feature that allows direct uploads of the energy consumption data. Finally, the feasibility of this real-time M&V cloud platform is tested for a case study involving improvements to a cold storage system in a hypermarket. The cloud computing platform for real-time energy performance M&V is applicable to any industry and energy conservation measure. With the M&V cloud platform, real
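
    The core of a baseline-model M&V calculation, fitting a multivariate regression on pre-retrofit data and then subtracting measured post-retrofit consumption from the baseline prediction, can be sketched as follows. All numbers are invented, and the paper's PSO refinement is replaced here by plain least squares:

```python
import numpy as np

# Hypothetical pre-retrofit data: daily energy (kWh) vs. outdoor temperature
# and production volume.
temp = np.array([20., 22., 25., 28., 30., 24.])
prod = np.array([100., 110., 90., 120., 115., 105.])
energy = np.array([510., 540., 520., 600., 610., 545.])

# Multivariate regression baseline: energy ~ b0 + b1*temp + b2*prod.
X = np.column_stack([np.ones_like(temp), temp, prod])
beta, *_ = np.linalg.lstsq(X, energy, rcond=None)

# Post-retrofit period: predict what the old system would have consumed,
# then subtract the measured consumption to obtain the avoided energy.
post_X = np.array([[1., 26., 108.], [1., 23., 95.]])
measured_post = np.array([520., 470.])
savings = post_X @ beta - measured_post
print(savings.round(1))
```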

  17. North Korea's nuclear weapons program: verification priorities and new challenges.

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Duk-ho (Korean Consulate General in New York)

    2003-12-01

    A comprehensive settlement of the North Korean nuclear issue may involve military, economic, political, and diplomatic components, many of which will require verification to ensure reciprocal implementation. This paper sets out potential verification methodologies that might address a wide range of objectives. The inspection requirements set by the International Atomic Energy Agency form the foundation, first as defined at the time of the Agreed Framework in 1994, and now as modified by the events since revelation of the North Korean uranium enrichment program in October 2002. In addition, refreezing the reprocessing facility and 5 MWe reactor, taking possession of possible weapons components and destroying weaponization capabilities add many new verification tasks. The paper also considers several measures for the short-term freezing of the North's nuclear weapon program during the process of negotiations, should that process be protracted. New inspection technologies and monitoring tools are applicable to North Korean facilities and may offer improved approaches over those envisioned just a few years ago. These are noted, and potential bilateral and regional verification regimes are examined.

  18. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    Full Text Available The weakest pre-expectation calculus has been proved to be a mature theory to analyze quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA) and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.
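
    The backwards fixed-point idea behind such expected-running-time analyses can be illustrated on the simplest possible example: the expected number of trials of a loop that terminates with probability p on each iteration. This is a toy sketch, not the paper's abstract-interpretation machinery:

```python
# Iterative fixed-point computation of the expected running time of the
# probabilistic loop "while coin(p) != heads: tick". One backwards step of
# the expectation transformer is E' = 1 + (1 - p) * E: cost 1 for this
# trial, plus the remaining expectation with probability (1 - p).
def expected_iterations(p, tol=1e-12, max_iter=100000):
    x = 0.0
    for _ in range(max_iter):
        nxt = 1.0 + (1.0 - p) * x
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    return x

print(round(expected_iterations(0.25), 6))  # converges to 1/p = 4.0
```

The fixed point satisfies E = 1 + (1 − p)E, so E = 1/p, which the iteration approaches geometrically.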

  19. TRACEABILITY OF PRECISION MEASUREMENTS ON COORDINATE MEASURING MACHINES – PERFORMANCE VERIFICATION OF CMMs

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Sobiecki, René; Tosello, Guido

    This document is used in connection with one exercise of 30 minutes duration as a part of the course VISION ONLINE – One week course on Precision & Nanometrology. The exercise concerns performance verification of the volumetric measuring capability of a small volume coordinate measuring machine...

  20. Wind turbine power performance verification in complex terrain and wind farms

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Gjerding, S.; Enevoldsen, P.

    2002-01-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure of whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis... then been investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guaranties on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark...

  1. Human Performance Westinghouse Program

    International Nuclear Information System (INIS)

    Garcia Gutierrez, A.; Gil, C.

    2010-01-01

    The objective of the program is performance excellence, achieving client success through flawless project execution. The program consists of several basic elements to reduce human error: the HuP tools, coaching, learning clocks and the iKnow website. There is also a document file for consultation and practice. All these elements are presented in this paper.

  2. WNetKAT: A Weighted SDN Programming and Verification Language

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Schmid, Stefan; Xue, Bingtian

    2017-01-01

    Programmability and verifiability lie at the heart of the software-defined networking paradigm. While OpenFlow and its match-action concept provide primitive operations to manipulate hardware configurations, over the last years, several more expressive network programming languages have been developed. This paper presents WNetKAT, the first network programming language accounting for the fact that networks are inherently weighted, and communications subject to capacity constraints (e.g., in terms of bandwidth) and costs (e.g., latency or monetary costs). WNetKAT is based on a syntactic... generalize to more complex (and stateful) network functions and service chains. For example, WNetKAT allows to model flows which need to traverse certain waypoint functions, which can change the traffic rate. This paper also shows the relationship between the equivalence problem of WNet...

  3. TFE design package final report, TFE Verification Program

    International Nuclear Information System (INIS)

    1994-06-01

    The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. A TFE for a megawatt class system is described. Only six cells are considered for simplicity; a megawatt class TFE would have many more cells, the exact number dependent on optimization trade studies

  4. Verification of Concurrent Programs. Part II. Temporal Proof Principles.

    Science.gov (United States)

    1981-09-01

    not modify any of the shared program variables. In order to ensure the correct synchronization between the processes we use three semaphore variables...direct, simple, and intuitive rules for the establishment of these properties. They usually replace long but repetitively similar chains of primitive ...modify the variables on which Q actually depends. A typical case is that of semaphores. We have the following property: The Semaphore Variable Rule

  5. Preparation of a program for the independent verification of the brachytherapy planning systems calculations

    International Nuclear Information System (INIS)

    Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-01-01

    In this work a program is presented that independently checks, for each patient, the treatment planning system calculations in low dose rate, high dose rate and pulsed dose rate brachytherapy. The treatment planning system output text files are automatically loaded into this program in order to obtain the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are entered by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).
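
    An independent check of this kind typically re-computes the dose at each calculation point by summing contributions from every dwell position. A heavily simplified sketch using the TG-43 point-source approximation, with no anisotropy, a unit radial dose function, and hypothetical source-strength values:

```python
import math

# Simplified TG-43 point-source dose rate (cGy/h):
# dose_rate = Sk * Lambda * (r0 / r)^2 * g(r); anisotropy is ignored.
def dose_rate(sk_U, Lambda_cGy_per_hU, r_cm, g=lambda r: 1.0, r0_cm=1.0):
    return sk_U * Lambda_cGy_per_hU * (r0_cm / r_cm) ** 2 * g(r_cm)

# Sum contributions from each dwell position to one calculation point.
dwells = [((0.0, 0.0), 10.0), ((0.5, 0.0), 12.0)]  # (position in cm, dwell time in s)
point = (2.0, 0.0)
sk, Lam = 40000.0, 1.108  # hypothetical HDR source strength and dose-rate constant

total = 0.0  # accumulated dose in cGy
for (x, y), t in dwells:
    r = math.dist((x, y), point)
    total += dose_rate(sk, Lam, r) * (t / 3600.0)  # convert seconds to hours
print(round(total, 2))
```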

  6. Verification of a Program for the Control of a Robotic Workcell with the Use of AR

    Directory of Open Access Journals (Sweden)

    Jozef Novak-Marcincin

    2012-08-01

    Full Text Available This paper contributes a theoretical discussion and, through a practical example, information about the possibilities of using elements of augmented reality to create programs for the control of a robotic workplace and to verify them in simulation. It begins with an overview of the current state of robotic systems that use virtual objects and describes existing and envisaged approaches. The next part describes an experimental robotic workplace. It then explains the realization of a new way of verifying the program for robotic workplace control and outlines possibilities for further development of the working concepts created.

  7. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  8. Verification of Memory Performance Contracts with KeY

    OpenAIRE

    Engel, Christian

    2007-01-01

    Determining the worst case memory consumption is an important issue for real-time Java applications. This work describes a methodology for formally verifying worst case memory performance constraints and proposes extensions to Java Modeling Language (JML) facilitating better verifiability of JML performance specifications.

  9. Construction and Verification of PLC LD-programs by LTL-specification

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2013-01-01

    Full Text Available An approach to construction and verification of PLC LD-programs for discrete problems is proposed. For the specification of the program behavior, we use the linear-time temporal logic LTL. Programming is carried out in the LD-language (Ladder Diagram) according to an LTL-specification. The correctness analysis of an LTL-specification is carried out by the symbolic model checking tool Cadence SMV. A new approach to programming and verification of PLC LD-programs is shown by an example. For a discrete problem, we give a LD-program, its LTL-specification and an SMV-model. The purpose of the article is to describe an approach to programming PLC, which would provide a possibility of LD-program correctness analysis by the model checking method. Under the proposed approach, the change of the value of each program variable is described by a pair of LTL-formulas. The first LTL-formula describes situations which increase the value of the corresponding variable, the second LTL-formula specifies conditions leading to a decrease of the variable value. The LTL-formulas (used for specification of the corresponding variable behavior) are constructive in the sense that they construct the PLC-program (LD-program), which satisfies temporal properties expressed by these formulas. Thus, the programming of PLC is reduced to the construction of LTL-specification of the behavior of each program variable. In addition, an SMV-model of a PLC LD-program is constructed according to the LTL-specification. Then, the SMV-model is analysed by the symbolic model checking tool Cadence SMV.

  10. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L

    2016-06-03

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
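
    The essence of the method, ranking the turbines (with the guaranteed power curve treated as one more turbine) within each wind-speed bin and testing the rank sums, can be sketched with SciPy's implementation of the Friedman test. The power values below are invented:

```python
from scipy.stats import friedmanchisquare

# Hypothetical power outputs (kW) of three turbines plus the guaranteed power
# curve, each observed over the same five wind-speed bins (the blocks).
guaranteed = [150, 420, 780, 1150, 1480]
turbine_a  = [148, 415, 770, 1140, 1475]
turbine_b  = [152, 425, 785, 1155, 1490]
turbine_c  = [120, 360, 700, 1050, 1380]  # consistent underperformer

stat, p_value = friedmanchisquare(guaranteed, turbine_a, turbine_b, turbine_c)
# A small p-value indicates that at least one turbine's power performance
# differs significantly from the others across the wind bins.
print(p_value < 0.05)  # True for these numbers
```

A significant result would then be followed by the multiple-comparison step described in the abstract to locate the deviating turbine.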

  11. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628

  12. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2016-06-01

    Full Text Available In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.

  13. Field Verification Program for Small Wind Turbines: Quarterly Report for January-March 2001; 1st Quarter, Issue No.4

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, T.; Cardinal, J.

    2001-10-30

    This newsletter provides a brief overview of the Field Verification Program for Small Wind Turbines conducted out of the NWTC and a description of current activities. The newsletter also contains case studies of current projects.

  14. Field Verification Program for Small Wind Turbines: Quarterly Report for October-December 2000; 4th Quarter, Iss. No.3

    Energy Technology Data Exchange (ETDEWEB)

    Cardinal, J.

    2001-07-03

    This newsletter provides a brief overview of the Field Verification Program for Small Wind Turbines conducted out of the NWTC and a description of current activities. The newsletter also contains case studies of current projects.

  15. Field Verification Program for Small Wind Turbines, Quarterly Report: 3rd Quarter, Issue No.2, July-September 2000

    Energy Technology Data Exchange (ETDEWEB)

    Cardinal. J.; Tu, P.

    2001-05-16

    This newsletter provides a brief overview of the Field Verification Program for Small Wind Turbines conducted out of the NWTC and a description of current activities. The newsletter also contains case studies of current projects.

  16. MSFC Turbine Performance Optimization (TPO) Technology Verification Status

    Science.gov (United States)

    Griffin, Lisa W.; Dorney, Daniel J.; Snellgrove, Lauren M.; Zoladz, Thomas F.; Stroud, Richard T.; Turner, James E. (Technical Monitor)

    2002-01-01

    Capability to optimize for turbine performance and accurately predict unsteady loads will allow for increased reliability, Isp, and thrust-to-weight. The development of a fast, accurate, validated aerodynamic design, analysis, and optimization system is required.

  17. Towards deductive verification of MPI programs against session types

    Directory of Open Access Journals (Sweden)

    Eduardo R. B. Marques

    2013-12-01

    Full Text Available The Message Passing Interface (MPI) is the de facto standard message-passing infrastructure for developing parallel applications. Two decades after the first version of the library specification, MPI-based applications are nowadays routinely deployed on supercomputers and clusters. These applications, written in C or Fortran, exhibit intricate message passing behaviours, making it hard to statically verify important properties such as the absence of deadlocks. Our work builds on session types, a theory for describing protocols that provides for correct-by-construction guarantees in this regard. We annotate MPI primitives and C code with session type contracts, written in the language of a software verifier for C. Annotated code is then checked for correctness with the software verifier. We present preliminary results and discuss the challenges that lie ahead for verifying realistic MPI program compliance against session types.

  18. Bringing Automated Formal Verification to PLC Program Development

    CERN Document Server

    Fernández Adiego, Borja; Blanco Viñuela, Enrique

    Automation is the field of engineering that deals with the development of control systems for operating systems such as industrial processes, railways, machinery or aircraft without human intervention. In most cases, a failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. For that reason, providing safe, reliable and robust control systems is a first-priority goal for control engineers. Ideally, control engineers should be able to guarantee that both software and hardware fulfill the design requirements. This is an enormous challenge on which industry and academia have been working and making progress in recent decades. This thesis focuses on one particular type of control system that operates industrial processes, the PLC (Programmable Logic Controller) - based control system. Moreover, it targets one of the main challenges for these systems: guaranteeing that PLC programs are compliant with their specifications. Traditionally ...

  19. Verification of OpenSSL version via hardware performance counters

    Science.gov (United States)

    Bruska, James; Blasingame, Zander; Liu, Chen

    2017-05-01

    Many forms of malware and security breaches exist today. One type of breach downgrades a cryptographic program by employing a man-in-the-middle attack. In this work, we explore the utilization of hardware events in conjunction with machine learning algorithms to detect which version of OpenSSL is being run during the encryption process. This allows for the immediate detection of any unknown downgrade attacks in real time. Our experimental results indicated this detection method is both feasible and practical. When trained with normal TLS and SSL data, our classifier was able to detect which protocol was being used with 99.995% accuracy. After the scope of the hardware event recording was enlarged, the accuracy diminished greatly, to 53.244%. Upon removal of TLS 1.1 from the data set, the accuracy returned to 99.905%.
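
    The detection pipeline, sampling hardware event counts during a handshake and classifying the resulting feature vector, can be sketched with synthetic data. The event set, the counts, and the classifier below are stand-ins, not the paper's actual setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic hardware-event vectors (e.g. instructions, branches, cache misses)
# standing in for counters sampled during TLS vs. SSL handshakes.
tls = rng.normal(loc=[100.0, 40.0, 5.0], scale=2.0, size=(50, 3))
ssl = rng.normal(loc=[120.0, 55.0, 9.0], scale=2.0, size=(50, 3))
X = np.vstack([tls, ssl])
y = np.array([0] * 50 + [1] * 50)  # 0 = TLS, 1 = SSL

# Train a classifier to label the protocol from the counter vector.
clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy on the synthetic data
```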

  20. Experimental Verification Of Hyper-V Performance Isolation Level

    Directory of Open Access Journals (Sweden)

    Krzysztof Rzecki

    2014-01-01

    Full Text Available The need for cost optimization in a broad sense constitutes the basis of operation of every enterprise. In the case of IT infrastructure, which is present in almost every field of activity these days, one of the most commonly applied technologies leading to a good cost-to-profit adjustment is virtualization. It consists in hosting several operating systems, together with their IT systems, on a single server. In order for such optimization to be carried out correctly, it has to be strictly controlled by means of allocating access to resources, which is known as performance isolation. Modern virtualizers allow this allocation to be set up in quantitative terms (the number of processors, size of RAM, or disc space). It appears, however, that in qualitative terms (processor time, RAM or hard disc bandwidth) the actual allocation of resources does not always correspond with this configuration. This paper provides an experimental presentation of the achievable level of performance isolation of the Hyper-V virtualizer.

  1. Verification and Performance Evaluation of Timed Game Strategies

    DEFF Research Database (Denmark)

    David, Alexandre; Fang, Huixing; Larsen, Kim Guldstrand

    2014-01-01

    Control synthesis techniques, based on timed games, derive strategies to ensure a given control objective, e.g., time-bounded reachability. Model checking verifies correctness properties of systems. Statistical model checking can be used to analyse performance aspects of systems, e.g., energy consumption. In this work, we propose to combine these three techniques. In particular, given a strategy synthesized for a timed game and a given control objective, we want to make a deeper examination of the consequences of adopting this strategy. Firstly, we want to apply model checking to the timed game under the synthesized strategy in order to verify additional correctness properties. Secondly, we want to apply statistical model checking to evaluate various performance aspects of the synthesized strategy. For this, the underlying timed game is extended with relevant price and stochastic information...

  2. Performance Verification on UWB Antennas for Breast Cancer Detection

    Directory of Open Access Journals (Sweden)

    Vijayasarveswari V.

    2017-01-01

    Full Text Available Breast cancer is a common disease among women, and the death toll continues to increase. Early detection of breast cancer is therefore very important. Ultra wide-band (UWB) is a promising candidate for short-range communication applications. This paper presents the performance of different types of UWB antennas for breast cancer detection. Two types of antennas are used, i.e., a UWB pyramidal antenna and a UWB horn antenna. These antennas are used to transmit and receive the UWB signal. The collected signals are fed into the developed neural network module to measure the detection efficiency of each antenna. The average detection efficiency is 88.46% for the UWB pyramidal antenna and 87.55% for the UWB horn antenna. These antennas can be used to detect breast cancer at an early stage and save precious lives.

  3. Clojure high performance programming

    CERN Document Server

    Kumar, Shantanu

    2013-01-01

    This is a short, practical guide that will teach you everything you need to know to start writing high performance Clojure code.This book is ideal for intermediate Clojure developers who are looking to get a good grip on how to achieve optimum performance. You should already have some experience with Clojure and it would help if you already know a little bit of Java. Knowledge of performance analysis and engineering is not required. For hands-on practice, you should have access to Clojure REPL with Leiningen.

  4. Instrument performance and simulation verification of the POLAR detector

    Science.gov (United States)

    Kole, M.; Li, Z. H.; Produit, N.; Tymieniecka, T.; Zhang, J.; Zwolinska, A.; Bao, T. W.; Bernasconi, T.; Cadoux, F.; Feng, M. Z.; Gauvin, N.; Hajdas, W.; Kong, S. W.; Li, H. C.; Li, L.; Liu, X.; Marcinkowski, R.; Orsi, S.; Pohl, M.; Rybka, D.; Sun, J. C.; Song, L. M.; Szabelski, J.; Wang, R. J.; Wang, Y. H.; Wen, X.; Wu, B. B.; Wu, X.; Xiao, H. L.; Xiong, S. L.; Zhang, L.; Zhang, L. Y.; Zhang, S. N.; Zhang, X. F.; Zhang, Y. J.; Zhao, Y.

    2017-11-01

    POLAR is a new satellite-borne detector aiming to measure the polarization of an unprecedented number of Gamma-Ray Bursts in the 50-500 keV energy range. The instrument, launched on board the Tiangong-2 Chinese space lab on the 15th of September 2016, is designed to measure the polarization of the hard X-ray flux by measuring the distribution of the azimuthal scattering angles of the incoming photons. A detailed understanding of the polarimeter, and specifically of the systematic effects induced by the instrument's non-uniformity, is required for this purpose. In order to study the instrument's response to polarization, POLAR underwent a beam test at the European Synchrotron Radiation Facility in France. In this paper both the beam test and the instrument performance will be described. This is followed by an overview of the Monte Carlo simulation tools developed for the instrument. Finally a comparison of the measured and simulated instrument performance will be provided and the instrument response to polarization will be presented.

  5. Research on Elemental Technology of Advanced Nuclear Fuel Performance Verification

    International Nuclear Information System (INIS)

    Kim, Yong Soo; Lee, Dong Uk; Jean, Sang Hwan; Koo, Min

    2003-04-01

    Most of the current property models and fuel performance models used in performance evaluation codes are based on in-pile data up to 33,000 MWd/MtU. Therefore, international experts are investigating property changes and developing advanced prediction models for high burn-up applications. The current research develops a high burn-up fission gas release model for the code and supports the code development activities by collecting data and models, reviewing and assessing them together, and benchmarking the selected models against appropriate in-pile data. For high burn-up applications, a two-stage, two-step fission gas release model is developed, based on the two real diffusion processes of the fission gases in the grain lattice and grain boundaries and on the observed accelerated release rate at high burn-up. The prediction of this model is found to be in excellent agreement with in-pile measurement results, not only at low burn-up but also at high burn-up. This research again highlights the importance of the thermal conductivity of oxide fuel, especially at high burn-up. Even the temperature-dependent models differ from one another, and most of them overestimate the conductivity at high burn-up. An in-pile data benchmark of a high-LHGR fuel rod shows that the difference can reach 30%~40%, corresponding to a predicted temperature 400 °C lower than the real fuel centerline temperature. Recent models of the thermal expansion and heat capacity of oxide fuel are found to be well defined. Irradiation swelling of oxide fuel is now well understood: in most cases in LWRs, solid fission product swelling is dominant. Thus, the accumulation of in-pile data can enhance the accuracy of model predictions more than further theoretical modeling work. The thermo-physical properties of Zircaloy cladding are also well defined and well understood, except for the thermal expansion. However, it turns out that even the

  6. A study on periodic safety verification on MOV performance

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Du Eon; Park, Jong Ho; Han, Jae Seob; Kang, Hyeon Taek; Lee, Jeong Min; Song, Kyu Jo; Shin, Wan Sun; Lee, Taek Sang [Chungnam National Univ., Taejon (Korea, Republic of)

    2000-03-15

    The objectives of this study, therefore, are to define the optimized valve diagnostic variables that provide early detection of abnormal conditions during valve surveillance and consequently reduce radiation exposure. The main thrust of the development is to detect valve degradation in advance by monitoring the motor current and power signals, which can be obtained remotely at the Motor Control Center (MCC). A series of valve operation experiments was performed under several kinds of abnormal conditions using a test apparatus consisting of a 3-inch gate valve, a motor (0.33 Hp, 460 V, 0.8 A, 1560 rpm), an actuator (SMB-000-2 type), measuring devices (power analyzer, oscilloscope, data recorder, current transformer, and AC current and voltage transducers), and connection cables.
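    As a minimal illustration of the monitoring idea (not the study's diagnostic method; the tolerance and signal values are invented), an abnormal stroke can be flagged by comparing the motor-current trace sampled at the MCC against a baseline band:

```python
def flag_anomalies(current, baseline, tol=0.15):
    """current, baseline: motor-current samples (A) at matching points
    in the valve stroke. Returns indices where the measured current
    deviates from the baseline by more than the fractional tolerance."""
    return [i for i, (c, b) in enumerate(zip(current, baseline))
            if abs(c - b) > tol * b]

# A stroke whose mid-travel current jumps well above baseline is flagged:
bad = flag_anomalies([0.62, 0.64, 0.80, 0.63],
                     [0.60, 0.60, 0.60, 0.60])  # -> [2]
```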

  7. Verification of the CONPAS (CONtainment Performance Analysis System) code package

    International Nuclear Information System (INIS)

    Kim, See Darl; Ahn, Kwang Il; Song, Yong Man; Choi, Young; Park, Soo Yong; Kim, Dong Ha; Jin, Young Ho.

    1997-09-01

    CONPAS is a computer code package that integrates the numerical, graphical, and results-oriented aspects of Level 2 probabilistic safety assessment (PSA) for nuclear power plants automatically under a PC Windows environment. For the integrated analysis of Level 2 PSA, the code utilizes four distinct but closely related modules: (1) ET Editor, (2) Computer, (3) Text Editor, and (4) Mechanistic Code Plotter. Compared with other existing computer codes for Level 2 PSA, the CONPAS code provides several advanced features: computational aspects including systematic uncertainty analysis, importance analysis, sensitivity analysis, and data interpretation; reporting aspects including tables and graphics; and a user-friendly interface. The computational performance of CONPAS has been verified through a Level 2 PSA for a reference plant. The results of the CONPAS code were compared with an existing Level 2 PSA code (NUCAP+), and the comparison proves that CONPAS is appropriate for Level 2 PSA. (author). 9 refs., 8 tabs., 14 figs

  8. Performance verification of the CMS Phase-1 Upgrade Pixel detector

    Science.gov (United States)

    Veszpremi, V.

    2017-12-01

    The CMS tracker consists of two tracking systems utilizing semiconductor technology: the inner pixel and the outer strip detectors. The tracker detectors occupy the volume around the beam interaction region between 3 cm and 110 cm in radius and up to 280 cm along the beam axis. The pixel detector consists of 124 million pixels, corresponding to about 2 m2 of total area. It plays a vital role in the seeding of the track reconstruction algorithms and in the reconstruction of primary interactions and secondary decay vertices. It is surrounded by the strip tracker with 10 million read-out channels, corresponding to 200 m2 of total area. The tracker is operated in the high-occupancy, high-radiation environment established by particle collisions in the LHC. The current strip detector continues to perform very well. The pixel detector that was used in Run 1 and in the first half of Run 2 was, however, replaced with the so-called Phase-1 Upgrade detector. The new system is better suited to match the increased instantaneous luminosity the LHC will reach before 2023. It was built to operate at an instantaneous luminosity of around 2×10^34 cm^-2 s^-1. The detector's new layout has an additional inner layer with respect to the previous one; it allows for more efficient tracking with a smaller fake rate at higher event pile-up. The paper focuses on the first results obtained during the commissioning of the new detector. It also covers challenges faced during the first data taking in reaching optimal measurement efficiency. Details are given on the performance at high occupancy with respect to observables such as data rate, hit reconstruction efficiency, and resolution.

  9. Measurement and Verification of Energy Savings and Performance from Advanced Lighting Controls

    Energy Technology Data Exchange (ETDEWEB)

    PNNL

    2016-02-21

    This document provides a framework for measurement and verification (M&V) of energy savings, performance, and user satisfaction from lighting retrofit projects involving occupancy-sensor-based, daylighting, and/or other types of automatic lighting controls. It was developed to provide site owners, contractors, and other involved organizations with the essential elements of a robust M&V plan for retrofit projects and to assist in developing specific project M&V plans.

  10. On Demand Internal Short Circuit Device Enables Verification of Safer, Higher Performing Battery Designs

    Energy Technology Data Exchange (ETDEWEB)

    Darcy, Eric; Keyser, Matthew

    2017-05-15

    The Internal Short Circuit (ISC) device enables critical battery safety verification. With the aluminum interstitial heat sink between the cells, normal trigger cells cannot be driven into thermal runaway without excessive temperature bias of adjacent cells. With an implantable, on-demand ISC device, thermal runaway tests show that the conductive heat sinks protected adjacent cells from propagation. High heat dissipation and structural support of Al heat sinks show high promise for safer, higher performing batteries.

  11. TRACEABILITY OF COORDINATE MEASURING MACHINES – CALIBRATION AND PERFORMANCE VERIFICATION

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Savio, Enrico; Bariani, Paolo

    This document is used in connection with three exercises, each of 45 minutes duration, as part of the course GEOMETRICAL METROLOGY AND MACHINE TESTING. The exercises concern three aspects of coordinate measurement traceability: 1) performance verification of a CMM using a ball bar; 2) calibration of an optical coordinate measuring machine; 3) uncertainty assessment using the ISO 15530-3 “Calibrated workpieces” procedure.

  12. Modelling and Formal Verification of Timing Aspects in Large PLC Programs

    CERN Document Server

    Fernandez Adiego, B; Blanco Vinuela, E; Tournier, J-C; Gonzalez Suarez, V M; Blech, J O

    2014-01-01

    One of the main obstacles that prevents model checking from being widely used in industrial control systems is the complexity of building formal models out of PLC programs, especially when timing aspects need to be integrated. This paper addresses this obstacle by proposing a methodology to model and verify timing aspects of PLC programs. Two approaches are proposed to allow users to balance the trade-off between the complexity of the model, i.e. its number of states, and the set of specifications that can be verified. A tool supporting the methodology, which produces models for different model checkers directly from PLC programs, has been developed. Verification of timing aspects for real-life PLC programs is presented in this paper using NuSMV.
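    The kind of check such a tool ultimately hands to a model checker can be sketched in a few lines. The toy model below is invented for illustration (a TP-style pulse timer abstracted to PLC scan cycles), not the paper's formalism: it does explicit-state reachability over (timer, output) states and searches for a violation of a simple timing safety property, "the output is never ON at or past the preset".

```python
from collections import deque

PRESET = 3  # assumed timer preset, in scan cycles

def step(state):
    """One PLC scan cycle: (timer, output) -> successor states for
    both possible input values (abstracting the environment)."""
    timer, out = state
    succs = []
    for inp in (False, True):
        if inp:
            t = min(timer + 1, PRESET)
            succs.append((t, t < PRESET))  # pulse timer: ON while timing
        else:
            succs.append((0, False))       # input drop resets the timer
    return succs

def check_safety(init):
    """Breadth-first exploration of the reachable state space,
    returning (True, None) or (False, counterexample_state)."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        timer, out = s
        if out and timer >= PRESET:
            return False, s
        for n in step(s):
            if n not in seen:
                seen.add(n)
                frontier.append(n)
    return True, None

ok, cex = check_safety((0, False))
```

    A real model checker adds symbolic state representation and temporal-logic specifications, but the reachability core is the same.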

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  14. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
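    The iterative data-flow verification the record refers to can be sketched for a toy stack machine (opcodes and the type-join rule are invented for the example): each instruction's incoming stack typestate is joined with what its predecessors deliver, until a fixpoint is reached or a type error is found.

```python
TOP = "conflict"  # lattice top: a value of no usable type

def join(a, b):
    return a if a == b else TOP

def verify(code):
    """code: list of (opcode, arg). Returns the stack typestate at
    each instruction, or raises if an opcode sees the wrong types.
    (Assumes matching stack depth at joins, as a real verifier
    would enforce.)"""
    states = {0: ()}           # instruction index -> stack type tuple
    work = [0]
    while work:
        pc = work.pop()
        stack = states[pc]
        op, arg = code[pc]
        if op == "push_int":
            out, targets = stack + ("int",), [pc + 1]
        elif op == "iadd":
            if stack[-2:] != ("int", "int"):
                raise TypeError(f"iadd on {stack[-2:]} at {pc}")
            out, targets = stack[:-2] + ("int",), [pc + 1]
        elif op == "goto":
            out, targets = stack, [arg]
        elif op == "halt":
            continue
        for t in targets:
            merged = (out if t not in states
                      else tuple(map(join, states[t], out)))
            if states.get(t) != merged:
                states[t] = merged
                work.append(t)   # re-analyse until a fixpoint
    return states

prog = [("push_int", None), ("push_int", None),
        ("iadd", None), ("halt", None)]
states = verify(prog)
```

    The abstract-interpretation approach of the paper derives comparable definition/use information in a single integrated pass.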

  15. Program Performance Inventory: Six Juvenile Offender Programs.

    Science.gov (United States)

    Thomalla, Terri Groff; Dougherty, Victoria J.

    This report describes the performance of 6 Connecticut juvenile justice alternative sanction programs in 14 qualitative areas: community reintegration; outcomes and evaluation; assessment methods; risk factors; escalation of criminal activity; family involvement; community involvement; work ethic and vocational training; education and life skills;…

  16. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) the Comprehensive Test Ban Treaty and (2) the Stockpile Life Extension Program, which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve confidence in the computational results by demonstrating and documenting the predictive capability of electrical circuit codes and of the underlying conceptual, mathematical, and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan.

  17. EPRI MOV performance prediction program

    International Nuclear Information System (INIS)

    Hosler, J.F.; Damerell, P.S.; Eidson, M.G.; Estep, N.E.

    1994-01-01

    An overview of the EPRI Motor-Operated Valve (MOV) Performance Prediction Program is presented. The objectives of this Program are to better understand the factors affecting the performance of MOVs and to develop and validate methodologies to predict MOV performance. The Program involves valve analytical modeling, separate-effects testing to refine the models, and flow-loop and in-plant MOV testing to provide a basis for model validation. The ultimate product of the Program is an MOV Performance Prediction Methodology applicable to common gate, globe, and butterfly valves. The methodology predicts thrust and torque requirements at design-basis flow and differential pressure conditions, assesses the potential for gate valve internal damage, and provides test methods to quantify potential variations in actuator output thrust with loading condition. Key findings and their potential impact on MOV design and engineering application are summarized.

  18. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  19. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved, such as the difficulty of checking safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as an intermediate model (IM) to transform PLC programs written in the ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  20. EPID-based verification of the MLC performance for dynamic IMRT and VMAT

    International Nuclear Information System (INIS)

    Rowshanfarzad, Pejman; Sabet, Mahsheed; Barnes, Michael P.; O’Connor, Daryl J.; Greer, Peter B.

    2012-01-01

    Purpose: In advanced radiotherapy treatments such as intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), verification of the performance of the multileaf collimator (MLC) is an essential part of the linac QA program. The purpose of this study is to take the existing measurement methods for geometric QA of the MLCs, extend them into more comprehensive evaluation techniques, and develop dedicated robust algorithms to quantitatively investigate MLC performance in a fast, accurate, and efficient manner. Methods: The behavior of the leaves was investigated in step-and-shoot mode by analysis of integrated electronic portal imaging device (EPID) images acquired during picket fence tests at fixed gantry angles and during arc delivery. The MLC was also studied in dynamic mode by analysis of cine EPID images of a sliding gap pattern delivered under a variety of conditions, including different leaf speeds, delivery at fixed gantry angles or in arc mode, and reversal of the direction of leaf motion. The accuracy of the method was tested by detection of intentionally inserted errors in the delivery patterns. Results: The algorithm developed for the picket fence analysis was able to find each individual leaf position, gap width, and leaf bank skewness, in addition to the deviations from expected leaf positions with respect to the beam central axis, with sub-pixel accuracy. For the three tested linacs over a period of 5 months, the maximum change in the gap width was 0.5 mm, the maximum deviation from the expected leaf positions was 0.1 mm, and the MLC skewness was up to 0.2°. The algorithm developed for the sliding gap analysis could determine the velocity and acceleration/deceleration of each individual leaf as well as the gap width. There was a slight decrease in the accuracy of leaf performance with increasing leaf speed. The analysis results were presented through several graphs. The accuracy of the method was assessed as 0.01 mm
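    One building block of such an analysis, sub-pixel localisation of a picket in an EPID profile, can be illustrated with an intensity-weighted centroid (a simplified stand-in for the authors' algorithm, shown on made-up pixel values):

```python
def subpixel_peak(profile):
    """profile: list of pixel intensities across one picket.
    Returns the intensity-weighted centroid position in
    (fractional) pixel units."""
    total = sum(profile)
    return sum(i * v for i, v in enumerate(profile)) / total

# A symmetric profile centres on the middle pixel; shifting
# intensity toward one side moves the estimate by a sub-pixel amount.
pos = subpixel_peak([0, 10, 50, 10, 0])   # -> 2.0
pos2 = subpixel_peak([0, 10, 50, 20, 0])  # -> 2.125
```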

  1. Calibration And Performance Verification Of LSC Packard 1900TR AFTER REPAIRING

    International Nuclear Information System (INIS)

    Satrio; Evarista-Ristin; Syafalni; Alip

    2003-01-01

    The calibration and repeated verification of the LSC Packard 1900TR at the Hydrology Section, P3TIR, have been carried out. In the period from mid-1997 to July 2000, the counting system of the instrument was damaged and repaired several times. After repair, the system was recalibrated and then verified. The calibration and verification were conducted using unquenched 3H and 14C standards and background. The results of the calibration show background count rates for 3H and 14C of 12.3 ± 0.79 cpm and 18.24 ± 0.69 cpm, respectively; FOM values for 3H and 14C of 285.03 ± 15.95 and 641.06 ± 16.45, respectively; and 3H and 14C efficiencies of 59.13 ± 0.28% and 95.09 ± 0.31%, respectively. From the verification data, the SIS and tSIE parameters for 14C are within their limits, the 3H and 14C efficiencies are still above the minimum limits, and the background fluctuation still shows normal conditions. It can be concluded that the performance of the LSC Packard 1900TR remains good and the instrument can be used for counting. (author)
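    The figure of merit quoted above follows the usual LSC definition FOM = E²/B, with E the counting efficiency in % and B the background count rate in cpm. A quick arithmetic check against the record's numbers: the 3H value reproduces the reported 285.03 ± 15.95; the reported 14C FOM evidently uses a different background window, so no agreement is claimed for it here.

```python
def fom(efficiency_pct, bkg_cpm):
    """Figure of merit as conventionally defined for liquid
    scintillation counting: E^2 / B."""
    return efficiency_pct ** 2 / bkg_cpm

fom_h3 = fom(59.13, 12.3)    # ~284.3, within the reported 285.03 +/- 15.95
fom_c14 = fom(95.09, 18.24)
```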

  2. Reinforcing of QA/QC programs in radiotherapy departments in Croatia: Results of treatment planning system verification

    Energy Technology Data Exchange (ETDEWEB)

    Jurković, Slaven; Švabić, Manda; Diklić, Ana; Smilović Radojčić, Đeni; Dundara, Dea [Clinic for Radiotherapy and Oncology, Physics Division, University Hospital Rijeka, Rijeka (Croatia); Kasabašić, Mladen; Ivković, Ana [Department for Radiotherapy and Oncology, University Hospital Osijek, Osijek (Croatia); Faj, Dario, E-mail: dariofaj@mefos.hr [Department of Physics, School of Medicine, University of Osijek, Osijek (Croatia)

    2013-04-01

    Implementation of advanced techniques in clinical practice can greatly improve the outcome of radiation therapy, but it also makes the process much more complex, with a lot of room for errors. An important part of the quality assurance program is verification of the treatment planning system (TPS). Dosimetric verifications in an anthropomorphic phantom were performed in 4 centers where new systems were installed. A total of 14 tests for 2 photon energies and multigrid superposition algorithms were conducted using the CMS XiO TPS. Evaluation criteria as specified in the International Atomic Energy Agency Technical Reports Series (IAEA TRS) 430 were employed. Results of the measurements are grouped according to the placement of the measuring point and the beam energy. The majority of differences between calculated and measured doses in the water-equivalent part of the phantom were within tolerance. Significantly more out-of-tolerance values were observed in “nonwater-equivalent” parts of the phantom, especially for higher-energy photon beams. This survey was done as part of a continuous effort to build awareness of the importance of quality assurance/quality control (QA/QC) in the Croatian radiotherapy community. Understanding the limitations of different parts of the various systems used in radiation therapy can systematically improve quality as well.

  3. Wind turbine power performance verification in complex terrain and wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Friis Pedersen, T.; Gjerding, S.; Ingham, P.; Enevoldsen, P.; Kjaer Hansen, J.; Kanstrup Joergensen, H.

    2002-04-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was done to support the basis for this standardisation work. The work drew on experience from several national and international research projects and on contractual and field experience gained within the wind energy community on this matter. The work was wide ranging and addressed 'grey' areas of knowledge regarding existing methodologies, which were then investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark; and anemometry and the influence of inclined flow. (au)
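    The core of a power performance measurement in this family of standards is the "method of bins": sorting measured (wind speed, power) pairs into wind-speed bins and averaging within each bin. A minimal sketch (bin width and sample data are invented; the standard also prescribes air-density normalisation and minimum data-count criteria omitted here):

```python
def power_curve(samples, bin_width=0.5):
    """samples: iterable of (wind speed in m/s, power in kW).
    Returns {bin_center: mean power} for the bins that have data."""
    bins = {}
    for v, p in samples:
        idx = int(v / bin_width)          # bin [idx*w, (idx+1)*w)
        bins.setdefault(idx, []).append(p)
    return {(i + 0.5) * bin_width: sum(ps) / len(ps)
            for i, ps in sorted(bins.items())}

curve = power_curve([(5.1, 310.0), (5.3, 330.0), (7.8, 900.0)])
# bin 5.0-5.5 m/s -> 320.0 kW at center 5.25; bin 7.5-8.0 m/s -> 900.0 kW
```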

  4. Environmental Technology Verification Report - Electric Power and Heat Production Using Renewable Biogas at Patterson Farms

    Science.gov (United States)

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  5. Environmental Technology Verification: Baghouse Filtration Products--TDC Filter Manufacturing, Inc., SB025 Filtration Media

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  6. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA’s Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies’ performance.

  7. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and hardware/software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is centred on automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  8. Overview of the Hanford Site Performance Assurance Program

    International Nuclear Information System (INIS)

    Duncan, M.R.; Billings, M.P.; Delvin, W.L.; Scott, D.D.; Weatherby, J.W.

    1991-01-01

    This paper reports on a safeguards and security performance assurance program, which encompasses the routine and special activities carried out to assure that safeguards and security subsystems and components are operating in an effective and reliable manner. At the Hanford Site, performance assurance involves widely varied activities, e.g., force-on-force exercises, functional testing of security components, and limited-scope performance testing of material control and accountability subsystems. These activities belong to one of four categories: performance testing, functional testing, inspection, and preventive maintenance. Using categories has aided in identifying and assessing the relevant contribution each activity makes to the performance assurance program. Efforts have progressed toward incorporating performance assurance activities into the assessment of protection effectiveness required for Master Safeguards and Security Agreement development and its associated verification and validation process

  9. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  10. Performance Demonstration Program Management Plan

    International Nuclear Information System (INIS)

    2005-01-01

    To demonstrate compliance with the Waste Isolation Pilot Plant (WIPP) waste characterization program, each testing and analytical facility performing waste characterization activities participates in the Performance Demonstration Program (PDP). The PDP serves as a quality control check against expected results and provides information about the quality of data generated in the characterization of waste destined for WIPP. Single blind audit samples are prepared and distributed by an independent organization to each of the facilities participating in the PDP. There are three elements within the PDP: analysis of simulated headspace gases, analysis of solids for Resource Conservation and Recovery Act (RCRA) constituents, and analysis for transuranic (TRU) radionuclides using nondestructive assay (NDA) techniques. Because the analysis for TRU radionuclides using NDA techniques involves both the counting of drums and standard waste boxes, four PDP plans are required to describe the activities of the three PDP elements. In accordance with these PDP plans, the reviewing and approving authority for PDP results and for the overall program is the CBFO PDP Appointee. The CBFO PDP Appointee is responsible for ensuring the implementation of each of these plans by concurring with the designation of the Program Coordinator and by providing technical oversight and coordination for the program. The Program Coordinator will designate the PDP Manager, who will coordinate the three elements of the PDP. The purpose of this management plan is to identify how the requirements applicable to the PDP are implemented during the management and coordination of PDP activities. The other participants in the program (organizations that perform site implementation and activities under CBFO contracts or interoffice work orders) are not covered under this management plan. 
Those activities are governed by the organization's quality assurance (QA) program and procedures or as otherwise directed by CBFO.

  11. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    Science.gov (United States)

    1989-01-01

    This document defines the design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels and correlates these requirements to the development demonstrations that provide verification that design objectives are achieved. The high pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  12. Linear models to perform treaty verification tasks for enhanced information security

    International Nuclear Information System (INIS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-01-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
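    The Hotelling observer described above is a linear discriminant: its weights are w = S⁻¹(μ₁ − μ₀), with S the average intra-class covariance, and the scalar test statistic t = w·g is thresholded to make the decision. A self-contained two-bin sketch with toy data (not the GEANT4-simulated measurements of the study):

```python
def mean(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(len(vs[0]))]

def cov2(vs, mu):
    """2x2 sample covariance (1/n normalisation) of 2-vectors."""
    c = [[0.0, 0.0], [0.0, 0.0]]
    for v in vs:
        d = [v[0] - mu[0], v[1] - mu[1]]
        for i in range(2):
            for j in range(2):
                c[i][j] += d[i] * d[j] / len(vs)
    return c

def hotelling_weights(class0, class1):
    """w = S^-1 (mu1 - mu0) for two-bin data, with S the average of
    the two class covariances (2x2 inverse written out explicitly)."""
    mu0, mu1 = mean(class0), mean(class1)
    c0, c1 = cov2(class0, mu0), cov2(class1, mu1)
    S = [[0.5 * (c0[i][j] + c1[i][j]) for j in range(2)] for i in range(2)]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    dmu = [mu1[0] - mu0[0], mu1[1] - mu0[1]]
    return [(S[1][1] * dmu[0] - S[0][1] * dmu[1]) / det,
            (-S[1][0] * dmu[0] + S[0][0] * dmu[1]) / det]

# Toy binned data for "not accountable" (class 0) vs "accountable" (1):
class0 = [(0, 0), (1, 1), (0, 1), (1, 0)]
class1 = [(2, 2), (3, 3), (2, 3), (3, 2)]
w = hotelling_weights(class0, class1)
t0 = [w[0] * g[0] + w[1] * g[1] for g in class0]  # test statistics
t1 = [w[0] * g[0] + w[1] * g[1] for g in class1]
# On this toy data the two classes separate completely: max(t0) < min(t1).
```

    The channelized variant of the paper inserts a channelizing matrix before this step, reducing the dimensionality of g before the weights are applied.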

  13. Linear models to perform treaty verification tasks for enhanced information security

    Energy Technology Data Exchange (ETDEWEB)

    MacGahan, Christopher J., E-mail: cmacgahan@optics.arizona.edu [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Sandia National Laboratories, Livermore, CA 94551 (United States); Kupinski, Matthew A. [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  14. The data requirements for the verification and validation of a fuel performance code - the transuranus perspective

    International Nuclear Information System (INIS)

    Schubert, A.; Di Marcello, V.; Rondinella, V.; Van De Laar, J.; Van Uffelen, P.

    2013-01-01

    In general, the verification and validation (V and V) of a fuel performance code like TRANSURANUS consists of three basic steps: a) verifying the correctness and numerical stability of the sub-models; b) comparing the sub-models with experimental data; c) comparing the results of the integral fuel performance code with experimental data. Only the second and third steps of the V and V rely on experimental information. This scheme can be further detailed according to the physical origin of the data: on one hand, in-reactor ('in-pile') experimental data are generated in the course of the irradiation; on the other hand, ex-reactor ('out-of-pile') experimental data are obtained, for instance, from various post-irradiation examinations (PIE) or dedicated experiments with fresh samples. For both categories, we first discuss the V and V of sub-models of TRANSURANUS related to separate aspects of the fuel behaviour: this includes the radial variation of the composition and fissile isotopes, the thermal properties of the fuel (e.g. thermal conductivity, melting temperature, etc.), the mechanical properties of fuel and cladding (e.g. elastic constants, creep properties), as well as the models for the fission product behaviour. Secondly, the integral code verification is addressed, as it treats various aspects of the fuel behaviour, including the geometrical changes in the fuel and the gas pressure and composition of the free volume in the rod. (authors)

  15. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.

  16. Optimal trajectories for flexible-link manipulator slewing using recursive quadratic programming: Experimental verification

    International Nuclear Information System (INIS)

    Parker, G.G.; Eisler, G.R.; Feddema, J.T.

    1994-01-01

    Procedures for trajectory planning and control of flexible link robots are becoming increasingly important to satisfy performance requirements of hazardous waste removal efforts. It has been shown that utilizing link flexibility in designing open loop joint commands can result in improved performance as opposed to damping vibration throughout a trajectory. The efficient use of link compliance is exploited in this work. Specifically, experimental verification of minimum time, straight line tracking using a two-link planar flexible robot is presented. A numerical optimization process, using an experimentally verified modal model, is used for obtaining minimum time joint torque and angle histories. The optimal joint states are used as commands to the proportional-derivative servo actuated joints. These commands are precompensated for the nonnegligible joint servo actuator dynamics. Using the precompensated joint commands, the optimal joint angles are tracked with such fidelity that the tip tracking error is less than 2.5 cm
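The full flexible-link optimization in this record is not reproducible here, but the underlying minimum-time idea can be illustrated with the classic rigid-joint, torque-limited bang-bang profile (full torque to the midpoint, then full braking). The inertia, torque limit, and slew angle below are hypothetical:

```python
import math

# Hypothetical rigid-joint baseline for a minimum-time slewing problem:
# a rest-to-rest rotation under a symmetric torque limit has the bang-bang
# solution t_f = 2 * sqrt(theta_f * I / tau_max).
def min_time_bang_bang(theta_f, inertia, tau_max):
    alpha = tau_max / inertia            # angular acceleration [rad/s^2]
    t_half = math.sqrt(theta_f / alpha)  # covers half the angle accelerating
    return 2.0 * t_half

# Example: 90-degree slew, I = 0.5 kg m^2, tau_max = 2 N m.
t_f = min_time_bang_bang(math.pi / 2, 0.5, 2.0)
print(f"minimum slew time: {t_f:.3f} s")
```

The flexible-link problem adds modal dynamics and tip-tracking constraints on top of this, which is why the study resorts to recursive quadratic programming rather than a closed form.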

  17. Free and Reduced-Price Meal Application and Income Verification Practices in School Nutrition Programs in the United States

    Science.gov (United States)

    Kwon, Junehee; Lee, Yee Ming; Park, Eunhye; Wang, Yujia; Rushing, Keith

    2017-01-01

    Purpose/Objectives: This study assessed current practices and attitudes of school nutrition program (SNP) management staff regarding free and reduced-price (F-RP) meal application and verification in SNPs. Methods: A stratified random sample of 1,500 SNP management staff in 14 states received a link to an online questionnaire and/or a printed…

  18. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  19. Towards measurement and verification of energy performance under the framework of the European directive for energy performance of buildings

    International Nuclear Information System (INIS)

    Burman, Esfand; Mumovic, Dejan; Kimpian, Judit

    2014-01-01

    Directive 2002/91/EC of the European Parliament and Council on the Energy Performance of Buildings has led to major developments in energy policies followed by the EU Member States. The national energy performance targets for the built environment are mostly rooted in the Building Regulations that are shaped by this Directive. Article 3 of this Directive requires a methodology to calculate energy performance of buildings under standardised operating conditions. Overwhelming evidence suggests that actual energy performance is often significantly higher than this standardised and theoretical performance. The risk is that national energy-saving targets may not be achieved in practice. The UK evidence for the education and office sectors is presented in this paper. A measurement and verification plan is proposed to compare actual energy performance of a building with its theoretical performance using calibrated thermal modelling. Consequently, the intended vs. actual energy performance can be established under identical operating conditions. This can help identify the shortcomings of the construction process and building procurement. Once the energy performance gap is determined with reasonable accuracy and its root causes identified, effective measures could be adopted to remedy or offset this gap. - Highlights: • Building energy performance gap is a negative externality that must be addressed. • A method is proposed to link actual performance to building compliance calculation. • Energy performance gap is divided into procurement and operational gaps. • This framework enables policy makers to measure and address procurement gap. • Building fine-tuning by construction teams could also narrow operational gap
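The proposed decomposition of the performance gap can be sketched numerically. The consumption figures below are hypothetical, chosen only to show how procurement and operational gaps would be separated once a calibrated thermal model is available:

```python
# Hypothetical illustration of the performance-gap decomposition described
# above: compliance model vs. calibrated model vs. metered use (kWh/m2/yr).
compliance_kwh = 120.0   # standardised (regulatory) calculation
calibrated_kwh = 165.0   # calibrated model, as-built, standard operation
metered_kwh    = 190.0   # actual metered consumption

procurement_gap = calibrated_kwh - compliance_kwh  # design/construction shortfall
operational_gap = metered_kwh - calibrated_kwh     # operation/occupancy effects
total_gap       = metered_kwh - compliance_kwh

print(f"procurement gap: {procurement_gap:.0f} kWh/m2/yr")
print(f"operational gap: {operational_gap:.0f} kWh/m2/yr")
print(f"total gap: {total_gap:.0f} kWh/m2/yr "
      f"({100 * total_gap / compliance_kwh:.0f}% over compliance)")
```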

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, UNIVERSITY OF TENNESSEE RESEARCH CORPORATION, SPATIAL ANALYSIS AND DECISION ASSISTANCE (SADA)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES HIGH EFFICIENCY MINI PLEAT

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  2. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    International Nuclear Information System (INIS)

    Kawai, D; Takahashi, R; Kamima, T; Baba, H; Yamamoto, T; Kubo, Y; Ishibashi, S; Higuchi, Y; Takahashi, H; Tachibana, H

    2015-01-01

    Purpose: The accuracy of the dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans as described in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average±2SD) in dose between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC), and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows a large systematic variation because of its inhomogeneity correction. The AAA also showed a significant variation. Thus, the differences in inhomogeneity correction, as well as the dependence on the dose calculation engine, must be considered.
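The confidence-limit statistic used in this study (CL = average ± 2SD) is straightforward to reproduce. A minimal sketch on hypothetical per-beam dose differences (the values below are illustrative, not from the study):

```python
import statistics

# Hypothetical per-beam dose differences (%) between a treatment planning
# algorithm and an independent Clarkson-based check, evaluated with the
# confidence limit CL = average +/- 2*SD (AAPM TG-114 style).
diffs = [2.1, 3.4, 1.8, 2.9, 4.0, 2.5, 3.1, 1.2, 2.7, 3.6]

mean = statistics.mean(diffs)
sd = statistics.stdev(diffs)          # sample standard deviation
cl_low, cl_high = mean - 2 * sd, mean + 2 * sd

print(f"CL = {mean:.1f} +/- {2 * sd:.1f} %  ->  [{cl_low:.1f}, {cl_high:.1f}] %")
```

A secondary check flags a plan when its dose difference falls outside this interval, so a wider CL (as found for lung SBRT here) directly loosens the achievable tolerance level.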

  3. SU-E-T-50: A Multi-Institutional Study of Independent Dose Verification Software Program for Lung SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa-prefecture (Japan); Takahashi, R; Kamima, T [The Cancer Institute Hospital of JFCR, Koutou-ku, Tokyo (Japan); Baba, H [The National Cancer Center Hospital East, Kashiwa-city, Chiba prefecture (Japan); Yamamoto, T; Kubo, Y [Otemae Hospital, Chuou-ku, Osaka-city (Japan); Ishibashi, S; Higuchi, Y [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Takahashi, H [St Lukes International Hospital, Chuou-ku, Tokyo (Japan); Tachibana, H [National Cancer Center Hospital East, Kashiwa, Chiba (Japan)

    2015-06-15

    Purpose: The accuracy of the dose distribution depends on the treatment planning system, especially in heterogeneous regions. The tolerance level (TL) of the secondary check using independent dose verification may be variable in lung SBRT plans. We conducted a multi-institutional study to evaluate the tolerance level of lung SBRT plans as described in AAPM TG-114. Methods: Five institutes in Japan participated in this study. All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. Analytical Anisotropic Algorithm (AAA), Pencil Beam Convolution with the modified Batho method (PBC-B) and Adaptive Convolve (AC) were used for lung SBRT planning. A measurement using an ion chamber was performed in a heterogeneous phantom to compare doses from the three different algorithms and the SMU to the measured dose. In addition, a retrospective analysis using clinical lung SBRT plans (547 beams from 77 patients) was conducted to evaluate the confidence limit (CL, average±2SD) in dose between the three algorithms and the SMU. Results: Compared to the measurement, the AAA showed a larger systematic dose error (2.9±3.2%) than PBC-B and AC. The Clarkson-based SMU showed a larger error of 5.8±3.8%. The CLs for clinical plans were 7.7±6.0% (AAA), 5.3±3.3% (AC), and 5.7±3.4% (PBC-B), respectively. Conclusion: The TLs were evaluated from the CLs. A Clarkson-based system shows a large systematic variation because of its inhomogeneity correction. The AAA also showed a significant variation. Thus, the differences in inhomogeneity correction, as well as the dependence on the dose calculation engine, must be considered.

  4. Ostomy Home Skills Program

    Medline Plus


  5. Numerical verification of equilibrium chemistry software within nuclear fuel performance codes

    International Nuclear Information System (INIS)

    Piro, M.H.; Lewis, B.J.; Thompson, W.T.; Simunovic, S.; Besmann, T.M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing transport source terms, material properties, and boundary conditions in heat and mass transport modules. Consequently, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method called the Gibbs Criteria is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes. (author)
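The Gibbs Criteria referenced above amount to checking that no candidate phase lies below the Gibbs-energy plane defined by the element chemical potentials (stable phases lie exactly on it). A hedged sketch of that check, with entirely illustrative species, potentials, and energies (not taken from the cited software):

```python
# Sketch of a Gibbs-criteria style verification: at a valid equilibrium the
# driving force pi = g_phase - sum(nu_e * Gamma_e) must be >= 0 for every
# candidate phase, and zero for the phases predicted stable. All numbers
# below are illustrative, not from TRANSURANUS or the code in this record.

element_potentials = {"U": -550.0, "O": -260.0}   # Gamma_e, kJ/mol (hypothetical)

# Candidate phases: stoichiometry and molar Gibbs energy (kJ/mol), hypothetical.
phases = {
    "UO2":  ({"U": 1, "O": 2}, -1070.0),   # pi = 0 -> on the plane, stable
    "U3O8": ({"U": 3, "O": 8}, -3700.0),   # pi > 0 -> above the plane, unstable
}

def driving_force(stoich, g_phase):
    """pi = g_phase - sum(nu_e * Gamma_e); pi >= 0 at a valid equilibrium."""
    plane = sum(n * element_potentials[e] for e, n in stoich.items())
    return g_phase - plane

for name, (stoich, g) in phases.items():
    pi = driving_force(stoich, g)
    print(f"{name}: pi = {pi:+.1f} kJ/mol", "(stable)" if abs(pi) < 1e-6 else "")
    assert pi >= -1e-6, f"{name} violates the Gibbs criteria"
```

A solver that reports an equilibrium violating this inequality for any phase has produced a numerically invalid result, which is the quality-assurance role the abstract describes.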

  6. Performance Assessment and Scooter Verification of Nano-Alumina Engine Oil

    Directory of Open Access Journals (Sweden)

    Yu-Feng Lue

    2016-09-01

    The performance assessment and vehicle verification of nano-alumina (Al2O3) engine oil (NAEO) were conducted in this study. The NAEO was produced by mixing Al2O3 nanoparticles with engine oil using a two-step synthesis method. The weight fractions of the Al2O3 nanoparticles in the four test samples were 0 (base oil), 0.5, 1.5, and 2.5 wt. %. The measurement of basic properties included: (1) density and (2) viscosity at various sample temperatures (20–80 °C). A rotary tribology testing machine with a pin-on-disk apparatus was used for the wear test. The before-and-after weight difference of the specimen (disk) in the wear test indicates that the NAEO with 1.5 wt. % Al2O3 nanoparticles (1.5 wt. % NAEO) was the chosen candidate for further study. For the scooter verification on an auto-pilot dynamometer, there were three tests: (1) the European driving cycle (ECE40); (2) constant speed (50 km/h); and (3) constant throttle positions (20%, 40%, 60%, and 90%). For the ECE40 driving cycle and the constant-speed tests, fuel consumption was decreased on average by 2.75%, while it was decreased by 3.57% for the constant-throttle case. The experimental results show that the engine oil with added Al2O3 nanoparticles significantly decreased fuel consumption. In the future, property tests of other nano-engine oils and a performance assessment of the nano-engine-fuel will be conducted.

  7. The DECADE performance assessment program

    International Nuclear Information System (INIS)

    Weber, B.V.; Ottinger, P.F.; Commisso, R.J.; Thompson, J.; Rowley, J.E.; Filios, P.; Babineau, M.A.

    1996-01-01

    Previous analyses of DECADE Module 1 experiments indicated significant current loss between the plasma opening switch (POS) and an electron-beam load. A program was initiated to diagnose and improve the power flow to assess the performance of a multi-module DECADE system. Power flow measurements between the POS and load indicate high vacuum flow, distributed current loss and azimuthal asymmetries. A decreased load impedance reduces the fraction of the load current flowing in vacuum. Improved plasma source symmetry reduces losses near the load for long conduction times. Increased POS impedance is required to significantly improve the power coupling to the load. (author). 6 figs., 9 refs

  8. The DECADE performance assessment program

    Energy Technology Data Exchange (ETDEWEB)

    Weber, B V; Ottinger, P F; Commisso, R J [Naval Research Lab., Washington, DC (United States). Plasma Physics Div.; Goyer, J R; Kortbawi, D [Physics International Co., Berkeley, CA (United States); Thompson, J [Maxwell Labs., San Diego, CA (United States); Rowley, J E; Filios, P [Defense Nuclear Agency, Alexandria, VA (United States); Babineau, M A [Sverdlup Technology, Tullahoma, TN (United States)

    1997-12-31

    Previous analyses of DECADE Module 1 experiments indicated significant current loss between the plasma opening switch (POS) and an electron-beam load. A program was initiated to diagnose and improve the power flow to assess the performance of a multi-module DECADE system. Power flow measurements between the POS and load indicate high vacuum flow, distributed current loss and azimuthal asymmetries. A decreased load impedance reduces the fraction of the load current flowing in vacuum. Improved plasma source symmetry reduces losses near the load for long conduction times. Increased POS impedance is required to significantly improve the power coupling to the load. (author). 6 figs., 9 refs.

  9. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

    Out-pile tests with a full-scale fuel assembly verify the design and evaluate the performance of the final products. HTL for hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform bending and vibration tests. Verification of the developed facilities was carried out by comparison with reference data for the fuel assembly obtained at the Westinghouse Co.; the compared data agreed well within uncertainties. FRETONUS was developed as a high-temperature, high-pressure fretting wear simulator. A performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. Computational technology for turbulent flow analysis and finite element analysis was also developed. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  10. The Learner Verification of Series r: The New Macmillan Reading Program; Highlights.

    Science.gov (United States)

    National Evaluation Systems, Inc., Amherst, MA.

    National Evaluation Systems, Inc., has developed curriculum evaluation techniques, in terms of learner verification, which may be used to help the curriculum-development efforts of publishing companies, state education departments, and universities. This document includes a summary of the learner-verification approach, with data collected about a…

  11. 78 FR 28812 - Energy Efficiency Program for Industrial Equipment: Petition of UL Verification Services Inc. for...

    Science.gov (United States)

    2013-05-16

    ... are engineers. UL today is comprised of five businesses, Product Safety, Verification Services, Life..., Director--Global Technical Research, UL Verification Services. Subscribed and sworn to before me this 20... (431.447(c)(4)) General Personnel Overview UL is a global independent safety science company with more...

  12. 7 CFR 272.11 - Systematic Alien Verification for Entitlements (SAVE) Program.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false Systematic Alien Verification for Entitlements (SAVE... FOR PARTICIPATING STATE AGENCIES § 272.11 Systematic Alien Verification for Entitlements (SAVE... and Naturalization Service (INS), in order to verify the validity of documents provided by aliens...

  13. Programs to improve plant performance

    International Nuclear Information System (INIS)

    Felmus, N.L.

    1987-01-01

    Looking toward the 1990's, we see a period in which our industry will face the challenge of improving the performance of the nuclear plants which are built and operating. The skills and technology are at hand to make good plant performance a reality and we believe the time has come to use them to achieve that end. As reserve margins decline, utilities and their regulators will increasingly seek to tap the unexploited capacity tied up in plants operating below their optimum availability. This paper describes a number of the programs, plant improvements and operations improvements which can yield a significant increase in nuclear plant availability and capacity factor now and into the 1990's. (author)

  14. Performance-based planning and programming guidebook.

    Science.gov (United States)

    2013-09-01

    "Performance-based planning and programming (PBPP) refers to the application of performance management principles within the planning and programming processes of transportation agencies to achieve desired performance outcomes for the multimodal tran...

  15. Development, Verification and Validation of Parallel, Scalable Volume of Fluid CFD Program for Propulsion Applications

    Science.gov (United States)

    West, Jeff; Yang, H. Q.

    2014-01-01

    There are many instances involving liquid/gas interfaces and their dynamics in the design of liquid engine powered rockets such as the Space Launch System (SLS). Some examples of these applications are: propellant tank draining and slosh, subcritical condition injector analysis for gas generators, preburners and thrust chambers, water deluge mitigation for launch induced environments, and even solid rocket motor liquid slag dynamics. Commercially available CFD programs simulating gas/liquid interfaces using the Volume of Fluid approach are currently limited in their parallel scalability. In 2010, for instance, an internal NASA/MSFC review of three commercial tools revealed that parallel scalability was seriously compromised at 8 cpus and no additional speedup was possible after 32 cpus. Other non-interface CFD applications at the time were demonstrating useful parallel scalability up to 4,096 processors or more. Based on this review, NASA/MSFC initiated an effort to implement a Volume of Fluid capability within the unstructured-mesh, pressure-based CFD program Loci-STREAM. After verification was achieved by comparing results to the commercial CFD program CFD-Ace+, and validation by direct comparison with data, Loci-STREAM-VoF is now the production CFD tool for propellant slosh force and slosh damping rate simulations at NASA/MSFC. In these applications, good parallel scalability has been demonstrated for problem sizes of tens of millions of cells and thousands of cpu cores. Ongoing efforts are focused on the application of Loci-STREAM-VoF to predict the transient flow patterns of water on the SLS Mobile Launch Platform in order to support the phasing of water for launch environment mitigation so that detrimental effects on the vehicle are not realized.
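The scalability plateau described in the review (no additional speedup beyond 32 cpus) is the classic Amdahl's-law picture: with a serial fraction s, speedup saturates at 1/s regardless of core count. A sketch assuming a hypothetical 3% serial fraction, not a measured value for any of the codes mentioned:

```python
# Amdahl's-law sketch of the scalability plateau described above:
# speedup(n) = 1 / (s + (1 - s)/n), saturating at 1/s as n grows.
def amdahl_speedup(n_cores, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

s = 0.03  # assumed serial (non-parallelizable) fraction
for n in (1, 8, 32, 1024, 4096):
    print(f"{n:>5} cores -> speedup {amdahl_speedup(n, s):6.1f} "
          f"(asymptotic limit {1 / s:.1f})")
```

Reducing the effective serial fraction (communication, global reductions, interface reconstruction steps) is what allowed Loci-STREAM-VoF to scale to thousands of cores.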

  16. The effect of two complexity factors on the performance of emergency tasks-An experimental verification

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Jung, Kwangtae

    2008-01-01

    It is well known that the use of procedures is very important in securing the safety of process systems, since good procedures effectively guide human operators by providing 'what should be done' and 'how to do it', especially under stressful conditions. At the same time, it has been emphasized that the use of complicated procedures can drastically impair operators' performance. This means that a systematic approach that can properly evaluate the complexity of procedures is indispensable for minimizing the side effects of complicated procedures. For this reason, Park et al. developed a task complexity measure called TACOM that can be used to quantify the complexity of tasks stipulated in the emergency operating procedures (EOPs) of nuclear power plants (NPPs). The TACOM measure consists of five sub-measures that cover five important factors that make the performance of emergency tasks complicated. However, verification for two of these complexity factors, the level of abstraction hierarchy (AH) and engineering decision (ED), seemed to be insufficient. In this study, therefore, an experiment was conducted using a low-fidelity simulator in order to clarify the appropriateness of these complexity factors. As a result, it appears that subjects' performance data are affected by the level of AH as well as ED. It is therefore anticipated that both the level of AH and ED will play an important role in evaluating the complexity of EOPs.

  17. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    Science.gov (United States)

    2016-04-01

    seL4 security verification [18] avoids this issue in the same way. In that work, the authors frame their solution as a restriction that disallows...identical: (σ, σ′1) ∈ TM ∧ (σ, σ′2) ∈ TM =⇒ Ol(σ′1) = Ol(σ′2) The successful security verifications of both seL4 and mCertiKOS provide reasonable...evidence that this restriction on specifications is not a major hindrance for usability. Unlike the seL4 verification, however, our framework runs into a

  18. Independent verification of a material balance at a LEU fuel fabrication plant. Program for technical assistance to IAEA safeguards

    International Nuclear Information System (INIS)

    Sorenson, R.J.; McSweeney, T.I.; Hartman, M.G.; Brouns, R.J.; Stewart, K.B.; Granquist, D.P.

    1977-11-01

    This report describes the application of methodology for planning an inspection according to the procedures of the International Atomic Energy Agency (IAEA), and an example evaluation of data representative of low-enriched uranium fuel fabrication facilities. Included are the inspection plan test criteria, the inspection sampling plans, the sample data collected during the inspection, acceptance testing of physical inventories with test equipment, material unaccounted for (MUF) evaluation, and quantitative statements of the results and conclusions that could be derived from the inspection. The analysis in this report demonstrates the application of inspection strategies which produce quantitative results. A facility model was used that is representative of large low-enriched uranium fuel fabrication plants with material flows, inventory sizes, and compositions of material representative of operating commercial facilities. The principal objective was to determine and illustrate the degree of assurance against a diversion of special nuclear materials (SNM) that can be achieved by an inspection and the verification of material flows and inventories. This work was performed as part of the USA program for technical assistance to the IAEA. 10 figs, 14 tables
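The material-balance evaluation at the heart of such an inspection reduces to the standard MUF equation, MUF = beginning inventory + receipts − shipments − ending inventory, tested against its propagated measurement uncertainty. A sketch with purely illustrative quantities:

```python
# Hypothetical material-balance (MUF) evaluation of the kind described above.
# All quantities in kg of uranium; the numbers are illustrative only.
beginning_inventory = 1250.0
receipts            = 400.0
shipments           = 380.0
ending_inventory    = 1265.0

muf = beginning_inventory + receipts - shipments - ending_inventory
print(f"MUF = {muf:.1f} kg")

# Compare MUF to its measurement uncertainty (sigma_MUF would come from
# error propagation over all measurements; here it is simply assumed).
# An |MUF| below ~2 sigma gives no statistical indication of diversion.
sigma_muf = 4.0
significant = abs(muf) > 2.0 * sigma_muf
print("statistically significant:", significant)
```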

  19. Performance verification of the Gravity and Extreme Magnetism Small explorer (GEMS) x-ray polarimeter

    Science.gov (United States)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kaneko, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi; Marlowe, Hannah; Griffiths, Scott; Kaaret, Philip E.; Kenward, David; Khalid, Syed

    2014-07-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor >=35% above 4 keV was obtained with the expected polarization angle. At energies below 4 keV where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves an expected modulation angle, and the expected modulation factor, ~20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).
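The sensitivity figure quoted here, minimum detectable polarization, is conventionally computed with the standard MDP formula used across X-ray polarimetry. The count rates and observation time below are illustrative assumptions; the 35% modulation factor comes from the record above:

```python
import math

# Standard minimum-detectable-polarization (99% confidence) formula for
# photoelectric X-ray polarimeters like the one described above:
#   MDP99 = 4.29 / (mu * R_s) * sqrt((R_s + R_b) / T)
# mu = modulation factor, R_s / R_b = source / background count rates
# (counts/s), T = observation time (s). The rates and time are assumptions.
def mdp99(mu, rate_src, rate_bkg, t_obs):
    return 4.29 / (mu * rate_src) * math.sqrt((rate_src + rate_bkg) / t_obs)

mu = 0.35  # modulation factor reported above 4 keV
mdp = mdp99(mu, rate_src=5.0, rate_bkg=0.5, t_obs=1e5)
print(f"MDP99 = {100 * mdp:.2f} %")
```

Folding the measured modulation factors through an instrument model in this way is how the abstract's PDR sensitivity comparison is made.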

  20. Performance verification and system parameter identification of spacecraft tape recorder control servo

    Science.gov (United States)

    Mukhopadhyay, A. K.

    1979-01-01

    Design adequacy of the lead-lag compensator of the frequency loop, accuracy checking of the analytical expression for the electrical motor transfer function, and performance evaluation of the speed control servo of the digital tape recorder used on-board the 1976 Viking Mars Orbiters and Voyager 1977 Jupiter-Saturn flyby spacecraft are analyzed. The transfer functions of the most important parts of a simplified frequency loop used for test simulation are described and ten simulation cases are reported. The first four of these cases illustrate the method of selecting the most suitable transfer function for the hysteresis synchronous motor, while the rest verify and determine the servo performance parameters and alternative servo compensation schemes. It is concluded that the linear methods provide a starting point for the final verification/refinement of servo design by nonlinear time response simulation and that the variation of the parameters of the static/dynamic Coulomb friction is as expected in a long-life space mission environment.
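The lead-lag compensation analyzed above can be illustrated with a first-order lead network, C(s) = K(s + z)/(s + p) with z < p, which contributes positive phase between its two corner frequencies. The gain and corner frequencies below are hypothetical, not values from the Viking/Voyager servo:

```python
import cmath
import math

# Frequency response of a first-order lead compensator C(s) = K*(s+z)/(s+p),
# the kind of lead-lag shaping analyzed for the frequency loop above.
# With z < p the network adds phase lead, improving loop stability margins.
def lead_lag_response(omega, K=1.0, z=10.0, p=100.0):
    s = 1j * omega
    return K * (s + z) / (s + p)

# Maximum phase lead occurs at the geometric mean of the two corners.
omega_m = math.sqrt(10.0 * 100.0)
phase_deg = math.degrees(cmath.phase(lead_lag_response(omega_m)))
print(f"phase lead at w = {omega_m:.1f} rad/s: {phase_deg:.1f} deg")
```

For these corners the maximum lead is arcsin((p − z)/(p + z)) ≈ 55 degrees, which such a linear analysis would then verify against time-domain simulation, as the abstract concludes.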

  1. Performance Verification of the Gravity and Extreme Magnetism Small Explorer GEMS X-Ray Polarimeter

    Science.gov (United States)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kaneko, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi

    2014-01-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor greater than or equal to 35% above 4 keV was obtained with the expected polarization angle. At energies below 4 keV where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves an expected modulation angle, and the expected modulation factor, approximately 20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).

  2. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.
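The verification-condition generation step that such a tool performs can be illustrated in miniature by the classic weakest-precondition rule for assignments (a toy sketch, not HDM/Pascal itself; statements and predicates are modeled as Python functions over a state dictionary):

```python
def wp(stmts, post):
    """Weakest precondition over straight-line assignments, computed
    backwards: wp(x := f(s), Q) = Q with x replaced by f(s).
    Each statement is (var_name, update_fn: state -> value); predicates
    are functions state -> bool."""
    pred = post
    for var, fn in reversed(stmts):
        prev = pred
        # Bind loop variables via a factory so each closure is independent.
        pred = (lambda v, f, q: lambda s: q({**s, v: f(s)}))(var, fn, prev)
    return pred

# Tiny program:  y := x + 1;  x := y * y
prog = [("y", lambda s: s["x"] + 1),
        ("x", lambda s: s["y"] * s["y"])]
post = lambda s: s["x"] >= 1                 # postcondition: x >= 1

pre = wp(prog, post)                         # wp(prog, post) == (x+1)^2 >= 1
# Brute-force validity check of the VC "x >= 0 -> wp" on a small domain
valid = all(pre({"x": x, "y": y}) for x in range(0, 10) for y in range(-5, 5))
print("VC valid on sampled domain:", valid)
```

A real system like the one described discharges such conditions with a theorem prover (here, STP) rather than by enumeration.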

  3. Development and performance validation of a cryogenic linear stage for SPICA-SAFARI verification

    Science.gov (United States)

    Ferrari, Lorenza; Smit, H. P.; Eggens, M.; Keizer, G.; de Jonge, A. W.; Detrain, A.; de Jonge, C.; Laauwen, W. M.; Dieleman, P.

    2014-07-01

    In the context of the SAFARI instrument (SpicA FAR-infrared Instrument) SRON is developing a test environment to verify the SAFARI performance. The characterization of the detector focal plane will be performed with a back-illuminated pinhole over a re-imaged SAFARI focal plane by an XYZ scanning mechanism that consists of three linear stages stacked together. In order to reduce background radiation that can couple into the high-sensitivity cryogenic detectors (goal NEP of 2×10⁻¹⁹ W/√Hz and saturation power of a few femtowatts) the scanner is mounted inside the cryostat in the 4 K environment. The required readout accuracy is 3 μm, with a reproducibility of 1 μm, along the total travel of 32 mm. The stage will be operated in "on the fly" mode to prevent vibrations of the scanner mechanism and will move with a constant speed varying from 60 μm/s to 400 μm/s. In order to meet the requirements of large stroke, low dissipation (low friction) and high accuracy, a DC motor plus spindle stage solution has been chosen. In this paper we present the stage design and characterization, also describing the measurement setup. The room-temperature performance has been measured with a 3D measuring machine cross-calibrated with a laser interferometer and a 2-axis tilt sensor. The low-temperature verification has been performed in a wet 4 K cryostat using a laser interferometer for measuring the linear displacements and a theodolite for measuring the angular displacements. The angular displacements could be calibrated with a precision of 4 arcsec and the position determined with high accuracy. The presence of friction caused higher values of torque than predicted and consequently higher dissipation. The thermal model of the stage has also been verified at 4 K.

  4. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-03-01

    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, based on the streamline curvature method, is tested against a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free vortex design method. Then the detailed blading design is carried out by using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity and stagger angle, a number of correlation equations are developed from the experimental database and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. An experimental measurement is conducted under a non-cavitating condition to obtain the off-design performance curve, and a cavitation test is carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions by the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
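The free vortex design step mentioned above fixes the swirl distribution as r·Vθ = const with a uniform axial velocity; a minimal sketch (with hypothetical machine numbers, not the paper's pump data) of the resulting relative flow angles across the span:

```python
import math

def relative_flow_angle(r, omega, vx, rv_theta):
    """Relative flow angle (degrees from axial) for a free-vortex design:
    swirl Vt = K/r, blade speed U = omega*r, tan(beta) = (U - Vt) / Vx."""
    u = omega * r
    vt = rv_theta / r
    return math.degrees(math.atan2(u - vt, vx))

# Illustrative (hypothetical) numbers: 3000 rpm, axial velocity Vx = 10 m/s
omega = 3000 * 2 * math.pi / 60
for r in (0.10, 0.15, 0.20):                        # hub, mid, tip radii [m]
    b1 = relative_flow_angle(r, omega, 10.0, 0.0)   # inlet: no swirl
    b2 = relative_flow_angle(r, omega, 10.0, 0.5)   # exit: rV_theta = 0.5 m^2/s
    print(f"r = {r:.2f} m  beta1 = {b1:.1f} deg  beta2 = {b2:.1f} deg")
```

The angles increase toward the tip (higher blade speed), which is why the blade must be twisted; the difference beta1 − beta2 reflects the flow turning that produces the Euler head.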

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: MOBILE SOURCE RETROFIT AIR POLLUTION CONTROL DEVICES: CLEAN CLEAR FUEL TECHNOLOGIES, INC.’S, UNIVERSAL FUEL CELL

    Science.gov (United States)

    The U.S. EPA's Office of Research and Development operates the Environmental Technology Verification (ETV) program to facilitate the deployment of innovative technologies through performance verification and information dissemination. Congress funds ETV in response to the belief ...

  7. Calibrations and verifications performed in view of the ILA reinstatement at JET

    Energy Technology Data Exchange (ETDEWEB)

    Dumortier, P., E-mail: pierre.dumortier@rma.ac.be; Durodié, F. [LPP-ERM-KMS, TEC partner, Brussels (Belgium); Helou, W. [CEA, IRFM, F-13108 St-Paul-Lez-Durance (France); Monakhov, I.; Noble, C.; Wooldridge, E.; Blackman, T.; Graham, M. [CCFE, Culham Science Centre, Abingdon (United Kingdom); Collaboration: EUROfusion Consortium

    2015-12-10

    The calibrations and verifications that are performed in preparation of the ITER-Like Antenna (ILA) reinstatement at JET are reviewed. A brief reminder of the ILA system layout is given. The different calibration methods and results are then discussed. They encompass the calibrations of the directional couplers present in the system, the determination of the relation between the capacitor position readings and the capacitance value, the voltage probe calibration inside the antenna housing, the RF cable characterization and the acquisition electronics circuit calibration. Earlier experience with the ILA has shown that accurate calibrations are essential for the control of the full ILA close-packed antenna array, its protection through the S-Matrix Arc Detection and the new second-stage matching algorithm to be implemented. Finally, the voltage stand-off of the capacitors is checked and the phase range achievable with the system is verified. The system layout is modified so as to allow dipole operation over the whole operating frequency range when operating with the 3dB combiner-splitters.

  8. SU-F-J-116: Clinical Experience-Based Verification and Improvement of a 4DCT Program

    Energy Technology Data Exchange (ETDEWEB)

    Fogg, P; West, M; Aland, T [Genesis Cancer Care, Auchenflower, Qld (Australia)

    2016-06-15

    Purpose: To demonstrate the role of continuous improvement fulfilled by the Medical Physicist in clinical 4DCT and CBCT scanning. Methods: Lung (SABR and standard) patients' 4D respiratory motion and image data were reviewed over 3, 6 and 12 month periods following commissioning testing. By identifying trends of clinically relevant parameters and respiratory motions, variables were tested with a programmable motion phantom and assessed. Patient traces were imported to a motion phantom and 4DCT and CBCT imaging were performed. Cos6 surrogate and sup-inf motion was also programmed into the phantom to simulate the long exhale of patients for image contrast tests. Results: Patient surrogate motion amplitudes were 9.9±5.2 mm (range 3–35) at 18±6 bpm (range 6–30). Expiration/inspiration time ratios of 1.4±0.5 (range 0.6–2.9) showed image contrast effects evident in the AveCT and 3D CBCT images. Small differences were found for patients with multiple 4DCT data sets. Patient motion assessments were simulated and verified with the phantom within 2 mm. Initial image reviews to check for reconstruction artefacts and data loss identified a small number of patients with irregularities in the automatic placement of inspiration and expiration points. Conclusion: The Physicist's involvement in the continuous improvement of a clinically commissioned technique, its processes and workflows continues beyond the commissioning stage of a project. Our experience with our clinical 4DCT program shows that Physics presence is required at the clinical 4DCT scan to assist with technical aspects of the scan and for clinical image quality assessment prior to voluming. The results of this work enabled the sharing of information from the Medical Physics group with the Radiation Oncologists and Radiation Therapists, resulting in an improved awareness of clinical patient respiration variables and how they may affect 4D simulation images and treatment verification images.

  9. Verification testing of the compression performance of the HEVC screen content coding extensions

    Science.gov (United States)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

    This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provided consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60% to 90% relative to the JM and from 40% to 80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.
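The Bjøntegaard-delta bit-rate metric used in the evaluation is conventionally computed by fitting log-rate as a cubic polynomial in PSNR for each codec and integrating the difference over the overlapping quality range; a rough sketch (synthetic rate-distortion points, not the paper's data):

```python
import numpy as np

def bd_rate_percent(rate_ref, psnr_ref, rate_test, psnr_test):
    """Bjontegaard-delta bit-rate: fit log10(rate) as a cubic in PSNR for
    each codec, average the difference over the overlapping PSNR range,
    and convert the mean log offset to a percentage rate change."""
    p_ref = np.polyfit(psnr_ref, np.log10(rate_ref), 3)
    p_test = np.polyfit(psnr_test, np.log10(rate_test), 3)
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    int_ref = np.polyval(np.polyint(p_ref), [lo, hi])
    int_test = np.polyval(np.polyint(p_test), [lo, hi])
    avg_diff = ((int_test[1] - int_test[0]) - (int_ref[1] - int_ref[0])) / (hi - lo)
    return (10.0 ** avg_diff - 1.0) * 100.0

# Synthetic check: the test codec needs exactly half the rate at every PSNR,
# so the BD-rate should come out at -50%.
psnr = np.array([32.0, 35.0, 38.0, 41.0])
rate = np.array([1000.0, 2000.0, 4000.0, 8000.0])  # kbps
print(f"BD-rate: {bd_rate_percent(rate, psnr, rate / 2.0, psnr):.1f}%")
```

Negative BD-rate means the test codec needs less rate for equal PSNR, which is the sense in which the 40-90% savings above are reported.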

  10. Verification Process of Behavioral Consistency between Design and Implementation programs of pSET using HW-CBMC

    International Nuclear Information System (INIS)

    Lee, Dong Ah; Lee, Jong Hoon; Yoo, Jun Beom

    2011-01-01

    Controllers in safety-critical systems such as nuclear power plants often use Function Block Diagrams (FBDs) to design embedded software. The design is implemented using programming languages such as C and compiled for the particular target hardware. The implementation must have the same behavior as the design, and this behavior should be verified explicitly. For example, the pSET (POSAFE-Q Software Engineering Tool) is loader software for programming the POSAFE-Q PLC (Programmable Logic Controller), developed as a part of the KNICS (Korea Nuclear Instrumentation and Control System R and D Center) project. It uses FBDs to design the PLC software and generates ANSI-C code to compile into specific machine code. To verify the equivalence between the FBDs and the ANSI-C code, a mathematical proof of the code generator or a verification tool such as RETRANS can help guarantee the equivalence. Mathematical proof, however, has the weakness of requiring high expenditure and repeated effort whenever the translator is modified. On the other hand, RETRANS reconstructs the generated source code without consideration of the generator; it also has the weakness that the reconstruction of the generated code needs additional analysis. This paper introduces a verification process of behavioral consistency between the design and its implementation in pSET using HW-CBMC. HW-CBMC is a formal verification tool verifying equivalence between hardware and software descriptions. It requires two inputs for checking equivalence: Verilog for hardware and ANSI-C for software. In this approach, FBDs are translated into a semantically equivalent Verilog program, and HW-CBMC verifies equivalence between the Verilog program and the ANSI-C program which is generated from the FBDs.
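The equivalence check that HW-CBMC performs symbolically can be illustrated with a brute-force "miter" over a toy block (a hypothetical LIMIT function block and its hand-written C-style translation, not the actual pSET FBDs):

```python
def limiter_design(x, lo, hi):
    """Reference semantics of a hypothetical FBD LIMIT block."""
    return min(max(x, lo), hi)

def limiter_impl(x, lo, hi):
    """Hand-written 'generated code' translation of the same block."""
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# Miter-style bounded equivalence check: enumerate a small input space and
# demand the outputs agree everywhere. HW-CBMC does this symbolically over
# Verilog/C instead of by enumeration, so it scales to real word widths.
mismatches = [
    (x, lo, hi)
    for x in range(-8, 8)
    for lo in range(-4, 4)
    for hi in range(lo, 4)          # keep lo <= hi, as the block assumes
    if limiter_design(x, lo, hi) != limiter_impl(x, lo, hi)
]
print("equivalent" if not mismatches else f"counterexamples: {mismatches[:3]}")
```

If the translation were wrong (say, `x >= hi` instead of `x > hi` with differing boundary semantics elsewhere), the list would contain concrete counterexamples, which is exactly the artifact a model checker returns.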

  12. Initial Clinical Experience Performing Patient Treatment Verification With an Electronic Portal Imaging Device Transit Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Sean L., E-mail: BerryS@MSKCC.org [Department of Applied Physics and Applied Mathematics, Columbia University, New York, New York (United States); Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Polvorosa, Cynthia; Cheng, Simon; Deutsch, Israel; Chao, K. S. Clifford; Wuu, Cheng-Shie [Department of Radiation Oncology, Columbia University, New York, New York (United States)

    2014-01-01

    Purpose: To prospectively evaluate a 2-dimensional transit dosimetry algorithm's performance on a patient population and to analyze the issues that would arise in a widespread clinical adoption of transit electronic portal imaging device (EPID) dosimetry. Methods and Materials: Eleven patients were enrolled on the protocol; 9 completed and were analyzed. Pretreatment intensity modulated radiation therapy (IMRT) patient-specific quality assurance was performed using a stringent local 3%, 3-mm γ criterion to verify that the planned fluence had been appropriately transferred to and delivered by the linear accelerator. Transit dosimetric EPID images were then acquired during treatment and compared offline with predicted transit images using a global 5%, 3-mm γ criterion. Results: There were 288 transit images analyzed. The overall γ pass rate was 89.1% ± 9.8% (average ± 1 SD). For the subset of images for which the linear accelerator couch did not interfere with the measurement, the γ pass rate was 95.7% ± 2.4%. A case study is presented in which the transit dosimetry algorithm was able to identify that a lung patient's bilateral pleural effusion had resolved in the time between the planning CT scan and the treatment. Conclusions: The EPID transit dosimetry algorithm under consideration, previously described and verified in a phantom study, is feasible for use in treatment delivery verification for real patients. Two-dimensional EPID transit dosimetry can play an important role in indicating when a treatment delivery is inconsistent with the original plan.
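The γ criterion used for both the pretreatment QA and the transit comparison combines a dose-difference tolerance with a distance-to-agreement (DTA); a simplified 1-D sketch with synthetic profiles (clinical tools evaluate this in 2-D or 3-D):

```python
import numpy as np

def gamma_pass_rate(dose_ref, dose_meas, positions, dose_crit, dist_crit):
    """Simplified 1-D global gamma analysis: for each measured point, gamma is
    the minimum over reference points of
    sqrt((dose_diff/dose_crit)^2 + (dist/dist_crit)^2); it passes if gamma <= 1.
    dose_crit is an absolute dose tolerance (e.g. 5% of max dose),
    dist_crit the DTA in the same units as positions (e.g. 3 mm)."""
    passes = 0
    for xi, di in zip(positions, dose_meas):
        dd = (dose_ref - di) / dose_crit
        dx = (positions - xi) / dist_crit
        passes += np.sqrt(dd**2 + dx**2).min() <= 1.0
    return 100.0 * passes / len(dose_meas)

x = np.linspace(-50.0, 50.0, 101)                 # positions [mm]
ref = np.exp(-(x / 20.0) ** 2)                    # predicted dose profile
meas = np.exp(-((x - 1.0) / 20.0) ** 2)           # measured, shifted by 1 mm
rate = gamma_pass_rate(ref, meas, x, 0.05 * ref.max(), 3.0)
print(f"gamma pass rate: {rate:.1f}%")            # 1 mm shift is within 3 mm DTA
```

A small spatial shift passes a 5%/3 mm criterion because the DTA term absorbs it, while a genuine anatomical change (like the resolved pleural effusion described above) produces dose differences no nearby reference point can explain.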

  13. Verification of data files of TREF computer program [Suitability study of the control files of the TREF software]

    Energy Technology Data Exchange (ETDEWEB)

    Ruottu, S.; Halme, A.; Ruottu, A. [Einco Oy, Karhula (Finland)

    1996-12-01

    Originally the aim of the Y43 project was to verify TREF data files for several different processes. However, it appeared that deficient or missing coordination between experimental and theoretical works made meaningful verifications impossible in some cases. Therefore, verification calculations were focused on the catalytic cracking reactor developed by Neste. The studied reactor consisted of prefluidisation and reaction zones. Verification calculations concentrated mainly on physical phenomena like vaporization near the oil injection zone. The main steps of the cracking process can be described as follows: oil (liquid) -> oil (gas) -> oil (catal) -> product (catal) + char (catal) -> product (gas). The catalytic nature of the cracking reaction was accounted for by defining the cracking pseudoreaction in the catalyst phase. This simplified reaction model was valid only for the vaporization zone. The applied fluid dynamic theory was based on the results of EINCO's earlier LIEKKI projects. (author)

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: EXEL INDUSTRIAL AIRMIX SPRAY GUN

    Science.gov (United States)

    The Environmental Technology Verification Program has partnered with Concurrent Technologies Corp. to verify innovative coatings and coating equipment technologies for reducing air emissions. This report describes the performance of EXEL Industrial's Kremlin Airmix high transfer ...

  15. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program

    Energy Technology Data Exchange (ETDEWEB)

    Whitaker, R.W.; Noel, S.D.

    1992-12-01

    The summary report by Tom Weaver gives the overall background for the series of IVE (Integrated Verification Experiment) experiments, including information on the full set of measurements made. This appendix presents details of the infrasound data and discusses certain aspects of a few special experiments. Prior to FY90, the emphasis of the Infrasound Program was on underground nuclear test (UGT) detection and yield estimation. During this time the Infrasound Program was a separate program at Los Alamos, and it was suggested to DOE/OAC that a regional infrasound network be established around NTS. The IVE experiments took place in a time frame that allowed simultaneous testing of possible network sites and examination of propagation in different directions. Whenever possible, infrasound stations were combined with seismic stations so that a large number could be efficiently fielded. The regional infrasound network was not pursued by DOE, as world events began to change the direction of verification toward non-proliferation. Starting in FY90 the infrasound activity became part of the Source Region Program, which has a goal of understanding how energy is transported from the UGT to a variety of measurement locations.

  16. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    International Nuclear Information System (INIS)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M.

    1995-03-01

    A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE; the previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan, with revisions to incorporate lessons learned from the previous effort. Likewise, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0, with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified.

  17. Research on non-uniform strain profile reconstruction along fiber Bragg grating via genetic programming algorithm and interrelated experimental verification

    Science.gov (United States)

    Zheng, Shijie; Zhang, Nan; Xia, Yanjun; Wang, Hongtao

    2014-03-01

    A new heuristic strategy for non-uniform strain profile reconstruction along Fiber Bragg Gratings is proposed in this paper, based on the modified transfer matrix and a Genetic Programming (GP) algorithm. The present method uses Genetic Programming to determine the applied strain field as a function of position along the fiber length. The structures that undergo adaptation in genetic programming are hierarchical, unlike the strings operated on by conventional genetic algorithms. GP regresses the strain profile function that best matches the 'measured' spectrum and makes the spatial resolution of the strain reconstruction arbitrarily high, or even infinite. This paper also presents an experimental verification of the reconstruction of non-homogeneous strain fields using GP. The results are compared with numerical calculations by the finite element method. Both the simulation examples and the experimental results demonstrate that Genetic Programming can effectively reconstruct a continuous profile expression along the whole FBG, and greatly improves computational efficiency and accuracy.
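The forward model that GP evaluates for each candidate strain profile can be sketched as a piecewise-uniform coupled-mode transfer-matrix calculation: the grating is split into sections, each strained section gets a shifted local Bragg wavelength, and the 2×2 section matrices are multiplied to obtain the reflectance (illustrative parameters, not those of the paper):

```python
import numpy as np

def fbg_reflectivity(wavelengths, strain_profile, n_eff=1.447, lambda_b0=1550e-9,
                     kappa=200.0, length=0.01, pe=0.78):
    """Piecewise-uniform transfer-matrix model of an FBG under a non-uniform
    axial strain profile (one 2x2 coupled-mode matrix per section).
    kappa: coupling coefficient [1/m]; pe: effective strain-optic factor."""
    dz = length / len(strain_profile)
    refl = []
    for lam in wavelengths:
        t = np.eye(2, dtype=complex)
        for eps in strain_profile:
            lam_b = lambda_b0 * (1.0 + pe * eps)                     # local Bragg wavelength
            sigma = 2.0 * np.pi * n_eff * (1.0 / lam - 1.0 / lam_b)  # detuning
            g = np.sqrt(complex(kappa**2 - sigma**2))
            m = np.array([[np.cosh(g*dz) - 1j*(sigma/g)*np.sinh(g*dz),
                           -1j*(kappa/g)*np.sinh(g*dz)],
                          [1j*(kappa/g)*np.sinh(g*dz),
                           np.cosh(g*dz) + 1j*(sigma/g)*np.sinh(g*dz)]])
            t = m @ t
        refl.append(abs(t[1, 0] / t[0, 0]) ** 2)                     # power reflectance
    return np.array(refl)

lam = np.linspace(1549e-9, 1551e-9, 201)
r_uniform = fbg_reflectivity(lam, [0.0] * 20)    # unstrained: peak R = tanh^2(kappa*L)
```

GP's role in the paper is the inverse problem: proposing strain-profile expressions, running a forward model like this, and scoring the fit against the measured spectrum.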

  18. International Performance Measurement & Verification Protocol: Concepts and Practices for Improved Indoor Environmental Quality, Volume II (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    2002-03-01

    This protocol serves as a framework to determine energy and water savings resulting from the implementation of an energy efficiency program. It is also intended to help monitor the performance of renewable energy systems and to enhance indoor environmental quality in buildings.
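The core calculation behind such protocols is avoided energy use: savings = (baseline energy adjusted to reporting-period conditions) − (reporting-period energy). A minimal sketch with a synthetic degree-day regression (illustrative numbers, not a protocol-compliant analysis):

```python
import numpy as np

# Baseline year: monthly heating degree-days and metered energy [kWh]
hdd_base = np.array([400, 350, 300, 150, 50, 10, 0, 0, 60, 180, 320, 410])
kwh_base = 5.0 * hdd_base + 2000.0               # synthetic: 5 kWh/HDD + base load

# Fit the baseline model E = b*HDD + a (the "adjusted baseline" model)
b, a = np.polyfit(hdd_base, kwh_base, 1)

# Reporting year after an efficiency retrofit: different weather, lower use
hdd_post = np.array([380, 360, 280, 170, 40, 5, 0, 0, 80, 200, 300, 390])
kwh_post = 4.0 * hdd_post + 1800.0               # synthetic retrofit performance

adjusted_baseline = b * hdd_post + a             # what we *would* have used
savings = adjusted_baseline.sum() - kwh_post.sum()
print(f"avoided energy use: {savings:.0f} kWh")
```

Adjusting the baseline to the reporting-period weather is what separates genuine program savings from a mild winter.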

  19. Development and Implementation of Cgcre Accreditation Program for Greenhouse Gas Verification Bodies

    International Nuclear Information System (INIS)

    Fermam, Ricardo Kropf Santos; De Queiroz, Andrea Barroso Melo Monteiro

    2016-01-01

    An organizational innovation is defined as the implementation of a new organizational method in a firm's business practices, workplace organization or external relations. This work illustrates a Cgcre innovation by presenting the development process of a greenhouse gas verification body accreditation program in Brazil under the Brazilian accreditation body, the General Coordination for Accreditation (Cgcre). (paper)

  20. Computer program for regional assessment of lung perfusion defect. Part II - verification of the algorithm

    International Nuclear Information System (INIS)

    Stefaniak, B.

    2002-01-01

    As described earlier, a dedicated computer program was developed for quantitative evaluation of regional lung perfusion defects visualized by pulmonary scintigraphy. Before the program could be applied in clinical routine, the correctness of the basic assumptions used to construct the algorithms, and of all program functions, needed to be checked. The aim of this study was to verify the program using various software instruments and physical models. Evaluation of the proposed method was performed using software procedures, a physical lung phantom, and selected lung images. The reproducibility of lung regions defined by the program was found to be excellent. No significant distortion of registered data was observed after ROI transformation into a circle and retransformation into the original shape. The obtained results comprised a parametric presentation of activity defects as well as a set of numerical indices defining the extent and intensity of decreased count density. Among these indices, PD2 and DM* proved the most suitable for the above purposes. The obtained results indicate that the algorithms used to construct the program were correct and suitable for the aim of the study. They enable the function under study to be presented graphically, with true imaging of activity distribution, and numerical indices defining the extent and intensity of activity defects to be calculated. (author)

  1. USING PERFLUOROCARBON TRACERS FOR VERIFICATION OF CAP AND COVER SYSTEMS PERFORMANCE

    International Nuclear Information System (INIS)

    HEISER, J.; SULLIVAN, T.

    2001-01-01

    The Department of Energy (DOE) Environmental Management (EM) office has committed itself to an accelerated cleanup of its national facilities. The goal is to have much of the DOE legacy waste sites remediated by 2006. This includes closure of several sites (e.g., Rocky Flats and Fernald). With the increased focus on accelerated cleanup, there has been considerable concern about long-term stewardship issues in general, and verification and long-term monitoring (LTM) of caps and covers, in particular. Cap and cover systems (covers) are vital remedial options that will be extensively used in meeting these 2006 cleanup goals. Every buried waste site within the DOE complex will require some form of cover system. These covers are expected to last from 100 to 1000 years or more. The stakeholders can be expected to focus on system durability and sustained performance. DOE EM has set up a national committee of experts to develop a long-term capping (LTC) guidance document. Covers are subject to subsidence, erosion, desiccation, animal intrusion, plant root infiltration, etc., all of which will affect the overall performance of the cover. Very little is available in terms of long-term monitoring other than downstream groundwater or surface water monitoring. By its very nature, this can only indicate that failure of the cover system has already occurred and contaminants have been transported away from the site. This is unacceptable. Methods that indicate early cover failure (prior to contaminant release) or predict approaching cover failure are needed. The LTC committee has identified predictive monitoring technologies as a high priority need for DOE, both for new covers as well as existing covers. The same committee identified a Brookhaven National Laboratory (BNL) technology as one approach that may be capable of meeting the requirements for LTM. The Environmental Research and Technology Division (ERTD) at BNL developed a novel methodology for verifying and monitoring

  2. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
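The grouping idea in the abstract (threads whose chains of calling-instruction addresses are identical collapse into one group) can be sketched directly, with hypothetical addresses:

```python
from collections import defaultdict

def group_threads_by_callsite(thread_stacks):
    """Group threads whose call chains (lists of caller addresses) are
    identical. At large thread counts this is what makes the display
    tractable: thousands of threads usually collapse into a handful of
    distinct call-path groups, and the odd-one-out group is the suspect."""
    groups = defaultdict(list)
    for tid, addrs in thread_stacks.items():
        groups[tuple(addrs)].append(tid)
    return dict(groups)

# Hypothetical snapshot: addresses of calling instructions per thread
stacks = {
    0: [0x4008F0, 0x401A20],       # main -> compute
    1: [0x4008F0, 0x401A20],
    2: [0x4008F0, 0x402B10],       # main -> mpi_wait  (possibly stuck)
    3: [0x4008F0, 0x401A20],
}
for path, tids in group_threads_by_callsite(stacks).items():
    print([hex(a) for a in path], "->", tids)
```

Here the lone thread parked at a different call site stands out immediately, which is the defect-spotting workflow the patent abstract describes.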

  3. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and of the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
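A paired remeasurement comparison of the kind described might be summarized as a mean relative difference and its standard error, flagging a possible systematic error when the mean is large relative to its uncertainty (illustrative numbers and a simplified statistic, not an actual inspection procedure):

```python
import math
import statistics

def remeasurement_bias(operator_vals, reference_vals):
    """Paired comparison of facility measurements against reference-lab
    remeasurements: returns the mean relative difference and its standard
    error. |mean| much larger than the SE suggests a systematic (bias)
    error rather than random measurement scatter."""
    rel_diff = [(o - r) / r for o, r in zip(operator_vals, reference_vals)]
    mean = statistics.mean(rel_diff)
    se = statistics.stdev(rel_diff) / math.sqrt(len(rel_diff))
    return mean, se

# Hypothetical paired masses [kg]: the facility scale reads ~0.4% high
facility = [10.04, 20.09, 15.06, 30.11, 25.10]
reference = [10.00, 20.00, 15.00, 30.00, 25.00]
mean, se = remeasurement_bias(facility, reference)
print(f"mean relative difference {mean:.4%} ± {se:.4%}")
```

A consistent offset like this, well outside the standard error, points at the facility's measurement system rather than at any individual item.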

  4. Overview of the NRC performance monitoring program

    International Nuclear Information System (INIS)

    Jordan, E.L.

    1987-01-01

    In response to the accident at Three Mile Island, the NRC developed the Systematic Assessment of Licensee Performance (SALP) Program to aid in the identification of those licensees that were more likely than others to have safety problems and to provide a rational basis for allocation of inspection resources. The NRC also has an ongoing program of screening and evaluating operating reactor event reports on a daily basis for promptly identifying safety problems. Although the SALP and event report evaluation programs have been successful in identifying potential performance problems, a concern developed recently about the adequacy and timeliness of NRC programs to detect poor or declining performance. The performance indicator program as approved by the commission is in the implementation phase. The program is expected to undergo refinements as new indicators are developed and experience is gained in the use of indicators

  5. Performance Verification of GOSAT-2 FTS-2 Simulator and Sensitivity Analysis for Greenhouse Gases Retrieval

    Science.gov (United States)

    Kamei, A.; Yoshida, Y.; Dupuy, E.; Hiraki, K.; Matsunaga, T.

    2015-12-01

    performance verification of the GOSAT-2 FTS-2 simulator and describe the future prospects for Level 2 retrieval. Besides, we will present the various sensitivity analyses relating to the engineering parameters and the atmospheric conditions on Level 1 processing for greenhouse gases retrieval.

  6. [Verification of doubtful PAP smear results of women included in the screening program in the Podlaskie province].

    Science.gov (United States)

    Błońska, Ewa; Knapp, Piotr Andrzej

    2013-08-01

    Verification of uncertain PAP-smear results in a group of women covered by the cervical screening program in the Podlaskie province. The main aim of the study was to identify CIN (cervical intraepithelial neoplasia) lesions of varying degrees of severity in women with cytological diagnoses of ASCUS (atypical squamous cells of undetermined significance), LSIL (low-grade squamous intraepithelial lesion), and ASC-H (atypical squamous cells - cannot exclude high-grade squamous intraepithelial lesion). The study evaluated 101 cervical smears taken from the vaginal part of the cervix in a group of screened women in the Podlaskie province. Cytological evaluation was performed according to the Bethesda System. We analyzed abnormal smears selected from a total of 7296 cytological examinations performed during 2012 at the University Center for Pathomorphological and Genetic-Molecular Diagnosis, Medical University in Białystok. The cytological results of interest included 19 cases with ASCUS, 59 with LSIL, and 23 with ASC-H, as well as cases with morphological features of the presence of human papilloma virus (HPV). Staining was performed using the CINtecPLUS test according to the manufacturer's instructions. CINtecPLUS is an immunocytochemical test based on specially designed monoclonal antibodies (E6H4TM) that identify the protein p16ink4a within the cervical smear. Additionally, the diagnostic kit was provided with antibodies for detecting the presence of the Ki-67 protein, a known marker of cell proliferation. The result was considered positive when staining of the nucleus and the cytoplasm appeared in red and brown, respectively. All abnormal results were eventually verified by histological examination of tissue taken from cervical lesions in a diagnostic-therapeutic procedure following colposcopic evaluation of cervical lesion topography. In the group of cytological smears with ASCUS, the diagnosis was positive in 5 cases (26.3%), negative in 14 (73

  7. Quality Assurance in Environmental Technology Verification (ETV): Analysis and Impact on the EU ETV Pilot Programme Performance

    Science.gov (United States)

    Molenda, Michał; Ratman-Kłosińska, Izabela

    2018-03-01

    Many innovative environmental technologies never reach the market because they are new and cannot demonstrate a successful track record of previous applications. This fact is a serious obstacle on their way to the market. Lack of credible data on the performance of a technology causes mistrust of investors in innovations, especially from the public sector, who seek effective solutions but without taking on undue technical and financial risks associated with their implementation. Environmental technology verification (ETV) offers a credible, robust and transparent process that results in a third-party confirmation of the claims made by the providers about the performance of novel environmental technologies. Verifications of performance are supported by high-quality, independent test data. In that way, ETV helps establish vendor credibility and buyer confidence. Several countries across the world have implemented ETV in the form of national or regional programmes. ETV in the European Union was implemented as a voluntary scheme in the form of a pilot programme. The European Commission launched the Environmental Technology Verification Pilot Programme of the European Union (EU ETV) in 2011. The paper describes the European model of ETV set up and put into operation under the Pilot Programme of Environmental Technologies Verification of the European Union. The goal, objectives, technological scope, and involved entities are presented. An attempt has been made to summarise the results of the EU ETV scheme's performance for the period from 2012, when the programme became fully operational, until the first half of 2016. The study was aimed at analysing the overall organisation and efficiency of the EU ETV Pilot Programme. The study was based on the analysis of the documents governing the operation of the EU ETV system. For this purpose, a relevant statistical analysis of the data on the performance of the EU ETV system provided by the European Commission was carried out.

  8. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    International Nuclear Information System (INIS)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook

    2016-01-01

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results

  9. On the safety and performance demonstration tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and validation and verification of computational codes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Bum; Jeong, Ji Young; Lee, Tae Ho; Kim, Sung Kyun; Euh, Dong Jin; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V and V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V and V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  10. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  11. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna including support and satellite structure with appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...
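The two-step idea combines a quantity measured on the feed with a quantity computed for the full antenna. As a purely schematic illustration (the simple efficiency-times-directivity gain formula and all numbers are assumptions, not values from the BIOMASS work):

```python
import math

# Schematic of the two-step verification: a measured feed radiation
# efficiency (step 1) combined with a computed full-antenna directivity
# (step 2). The formula and numbers are illustrative only.

def antenna_gain_dbi(feed_efficiency, computed_directivity_dbi):
    """Gain = radiation efficiency x directivity, expressed in dBi."""
    return computed_directivity_dbi + 10 * math.log10(feed_efficiency)

print(round(antenna_gain_dbi(0.95, 30.0), 2))  # 29.78
```

A 5% feed loss thus costs roughly 0.22 dB of realized gain in this toy model.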

  12. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock-and-tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF

  13. School Breakfast Program and School Performance.

    Science.gov (United States)

    Meyers, Alan; And Others

    Children who participate in the School Breakfast Program show significant improvement in academic performance and tardiness rates, and a trend toward improvement in absenteeism. The School Breakfast Program was created by Congress in 1966 to provide a breakfast on school days for low income children who would otherwise have none. Children…

  14. Provider Customer Service Program - Performance Data

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS is continuously analyzing performance and quality of the Provider Customer Service Programs (PCSPs) of the contractors and will be identifying trends and making...

  15. Comparison and verification of two computer programs used to analyze ventilation systems under accident conditions

    International Nuclear Information System (INIS)

    Hartig, S.H.; Wurz, D.E.; Arnitz, T.; Ruedinger, V.

    1985-01-01

    Two computer codes, TVENT and EVENT, which were developed at the Los Alamos National Laboratory (LANL) for the analysis of ventilation systems, have been modified to model air-cleaning systems that include active components with time-dependent flow-resistance characteristics. With both modified programs, fluid-dynamic transients were calculated for a test facility used to simulate accident conditions in air-cleaning systems. Experiments were performed in the test facility whereby flow and pressure transients were generated with the help of two quick-actuating air-stream control valves. The numerical calculations are compared with the test results. Although EVENT makes use of a more complex theoretical flow model than TVENT, the numerical simulations of both codes were found to be very similar for the flow conditions studied and to closely follow the experimental results

  16. Alloy development for irradiation performance: program strategy

    International Nuclear Information System (INIS)

    Bloom, E.E.; Stiegler, J.O.; Wiffen, F.W.; Dalder, E.N.C.; Reuther, T.C.; Gold, R.E.; Holmes, J.J.; Kummer, D.L.; Nolfi, F.V.

    1978-01-01

    The objective of the Alloy Development for Irradiation Performance Program is the development of structural materials for use in the first wall and blanket region of fusion reactors. The goal of the program is a material that will survive an exposure of 40 MW·yr/m² at a temperature which will allow use of a liquid-H₂O heat transport system. Although the ultimate aim of the program is development of materials for commercial reactors by the end of this century, activities are organized to provide materials data for the relatively low performance interim machines that will precede commercial reactors

  17. Verification and validation of predictive computer programs describing the near and far-field chemistry of radioactive waste disposal systems

    International Nuclear Information System (INIS)

    Read, D.; Broyd, T.W.

    1988-01-01

    This paper provides an introduction to CHEMVAL, an international project concerned with establishing the applicability of chemical speciation and coupled transport models to the simulation of realistic waste disposal situations. The project aims to validate computer-based models quantitatively by comparison with laboratory and field experiments. Verification of the various computer programs employed by research organisations within the European Community is ensured through close inter-laboratory collaboration. The compilation and review of thermodynamic data forms an essential aspect of this work and has led to the production of an internally consistent standard CHEMVAL database. The sensitivity of results to variation in fundamental constants is being monitored at each stage of the project and, where feasible, complementary laboratory studies are used to improve the data set. Currently, thirteen organisations from five countries are participating in CHEMVAL which forms part of the Commission of European Communities' MIRAGE 2 programme of research. (orig.)

  18. Plant performance monitoring program at Krsko NPP

    International Nuclear Information System (INIS)

    Bach, B.; Kavsek, D.

    2004-01-01

    A high level of nuclear safety and plant reliability results from the complex interaction of good design, operational safety, and human performance. This is the reason for establishing a set of operational plant safety performance indicators, to enable monitoring of both plant performance and progress. Performance indicators are also used for setting challenging targets and goals for improvement, to gain additional perspective on performance relative to other plants, and to provide an indication of a potential need to adjust priorities and resources to achieve improved overall plant performance. A specific indicator trend over a certain period can provide an early warning to plant management to evaluate the causes behind the observed changes. In addition to monitoring changes and trends, it is also necessary to compare the indicators with identified targets and goals to evaluate performance strengths and weaknesses. The Plant Performance Monitoring Program at Krsko NPP defines and ensures consistent collection, processing, analysis, and use of predefined relevant plant operational data, providing a quantitative indication of nuclear power plant performance. When the program was developed, the conceptual framework described in IAEA TECDOC-1141, Operational Safety Performance Indicators for Nuclear Power Plants, was used as its basis in order to ensure that a reasonable set of quantitative indications of operational safety performance would be established. Safe, conservative, cautious, and reliable operation of the Krsko NPP is a common goal for all plant personnel. It is provided by continuous assurance of the health and safety of both the public and employees, according to the plant policy stated in program MD-1, Notranje usmeritve in cilji NEK (Internal Directions and Goals of NEK), which is the top-level plant program. Establishing a program for monitoring and assessing operational plant safety performance indicators reflects the effective safety culture of the plant personnel. (author)

  19. Cell verification of parallel burnup calculation program MCBMPI based on MPI

    International Nuclear Information System (INIS)

    Yang Wankui; Liu Yaoguang; Ma Jimin; Wang Guanbo; Yang Xin; She Ding

    2014-01-01

    The parallel burnup calculation program MCBMPI was developed. The program was modularized. The parallelized MCNP5 program MCNP5MPI was employed as the neutron transport calculation module, and a composite of three solution methods was used to solve the burnup equations, i.e. the matrix exponential technique, the TTA analytical solution, and Gauss-Seidel iteration. An MPI parallel zone decomposition strategy was included in the program. The program system consists only of MCNP5MPI and a burnup subroutine. The latter achieves three main functions, i.e. zone decomposition, nuclide transfer and decay, and data exchange with MCNP5MPI. Also, the program was verified with the pressurized water reactor (PWR) cell burnup benchmark. The results show that the program is applicable to burnup calculations of multiple zones and that its computational efficiency can be improved significantly with the development of computer hardware. (authors)
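The abstract does not give implementation details of the zone decomposition; as a rough sketch of the idea (function names and the single-nuclide depletion stand-in are hypothetical, not from MCBMPI), each rank owns a contiguous block of burnup zones and depletes them independently between transport solves:

```python
import math

# Hypothetical sketch (names not from MCBMPI) of MPI zone decomposition:
# each rank depletes its own contiguous block of zones between transport
# solves; real MPI communication calls are omitted here.

def decompose_zones(n_zones, n_ranks):
    """Split zone indices into contiguous blocks, one block per rank."""
    base, extra = divmod(n_zones, n_ranks)
    blocks, start = [], 0
    for rank in range(n_ranks):
        size = base + (1 if rank < extra else 0)  # spread the remainder
        blocks.append(list(range(start, start + size)))
        start += size
    return blocks

def deplete_zone(n0, lam, dt):
    """Single-nuclide decay step N = N0*exp(-lambda*dt), a stand-in for
    the per-zone burnup solve (matrix exponential, TTA, Gauss-Seidel)."""
    return n0 * math.exp(-lam * dt)

blocks = decompose_zones(n_zones=10, n_ranks=4)
print(blocks)  # [[0, 1, 2], [3, 4, 5], [6, 7], [8, 9]]
# every zone is owned by exactly one rank
assert sorted(z for b in blocks for z in b) == list(range(10))
```

In a real run, each rank would deplete only its own block and then exchange updated nuclide densities with the transport module.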

  20. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  1. General-Purpose Heat Source Safety Verification Test program: Edge-on flyer plate tests

    International Nuclear Information System (INIS)

    George, T.G.

    1987-03-01

    The radioisotope thermoelectric generator (RTG) that will supply power for the Galileo and Ulysses space missions contains 18 General-Purpose Heat Source (GPHS) modules. The GPHS modules provide power by transmitting the heat of ²³⁸Pu α-decay to an array of thermoelectric elements. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because the possibility of a launch vehicle explosion always exists, and because such an explosion could generate a field of high-energy fragments, the fueled clads within each GPHS module must survive fragment impact. The edge-on flyer plate tests were included in the Safety Verification Test series to provide information on the module/clad response to the impact of high-energy plate fragments. The test results indicate that the edge-on impact of a 3.2-mm-thick, aluminum-alloy (2219-T87) plate traveling at 915 m/s causes the complete release of fuel from capsules contained within a bare GPHS module, and that the threshold velocity sufficient to cause the breach of a bare, simulant-fueled clad impacted by a 3.5-mm-thick, aluminum-alloy (5052-T0) plate is approximately 140 m/s

  2. Integration of KESS III models in ATHLET-CD and contributions to program verification. Final report

    International Nuclear Information System (INIS)

    Bruder, M.; Schatz, A.

    1994-07-01

    The development of the computer code ATHLET-CD is a contribution to reactor safety research. ATHLET-CD is an extension of the system code ATHLET with core degradation models, especially from the modular software package KESS. The aim of the ATHLET-CD development is the simulation of severe accident sequences from their initiation to severe core degradation in a continuous manner. In the framework of this project, the ATHLET-CD development has been focused on the integration of KESS models such as the control rod model, as well as the models describing chemical interactions, material relocation along a rod, and fission product release. The present ATHLET-CD version is able to describe severe accidents in a PWR up to early core degradation (relocation of material along a rod surface in the axial direction). Contributions to the verification of ATHLET-CD comprised calculations of the experiments PHEBUS AIC and PBF SFD 1-4. The PHEBUS AIC calculation was focused on the examination of the control rod model, whereas the PBF SFD 1-4 calculation served to check the models describing melting, material relocation, and fission product release. (orig.)

  3. Elasto-plastic benchmark calculations. Step 1: verification of the numerical accuracy of the computer programs

    International Nuclear Information System (INIS)

    Corsi, F.

    1985-01-01

    In connection with the design of nuclear reactor components operating at elevated temperature, design criteria require a degree of realism in the prediction of inelastic structural behaviour. This leads to the necessity of developing non-linear computer programmes and, as a consequence, to the problems of verifying and qualifying these tools. Benchmark calculations allow these two actions to be carried out, providing at the same time an increased level of confidence in the analysis of complex phenomena and in inelastic design calculations. With the financial and programmatic support of the Commission of the European Communities (CEC), a programme of elasto-plastic benchmark calculations relevant to the design of structural components for LMFBR has been undertaken by those Member States which are developing a fast reactor project. Four principal progressive aims were initially pointed out, which led to the decision to subdivide the benchmark effort into a series of four sequential calculation steps: Steps 1 to 4. The present document summarizes Step 1 of the benchmark exercise, derives some conclusions on Step 1 by comparing the results obtained with the various codes, and offers some concluding comments on this first action. It should be pointed out that even though the work was designed to test the capabilities of the computer codes, another aim was to increase the skill of the users concerned

  4. Explosion overpressure test series: General-Purpose Heat Source development: Safety Verification Test program

    International Nuclear Information System (INIS)

    Cull, T.A.; George, T.G.; Pavone, D.

    1986-09-01

    The General-Purpose Heat Source (GPHS) is a modular, radioisotope heat source that will be used in radioisotope thermoelectric generators (RTGs) to supply electric power for space missions. The first two uses will be the NASA Galileo and the ESA Ulysses missions. The RTG for these missions will contain 18 GPHS modules, each of which contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). A series of Safety Verification Tests (SVTs) was conducted to assess the ability of the GPHS modules to contain the plutonia in accident environments. Because a launch pad or postlaunch explosion of the Space Transportation System vehicle (space shuttle) is a conceivable accident, the SVT plan included a series of tests that simulated the overpressure exposure the RTG and GPHS modules could experience in such an event. Results of these tests, in which we used depleted UO₂ as a fuel simulant, suggest that exposure to overpressures as high as 15.2 MPa (2200 psi), without subsequent impact, does not result in a release of fuel

  5. Development and benchmark verification of a parallelized Monte Carlo burnup calculation program MCBMPI

    International Nuclear Information System (INIS)

    Yang Wankui; Liu Yaoguang; Ma Jimin; Yang Xin; Wang Guanbo

    2014-01-01

    MCBMPI, a parallelized burnup calculation program, was developed. The program is modularized: the neutron transport calculation module employs the parallelized MCNP5 program MCNP5MPI, and the burnup calculation module employs ORIGEN2, with an MPI parallel zone decomposition strategy. The program system consists only of MCNP5MPI and an interface subroutine. The interface subroutine achieves three main functions, i.e. zone decomposition, nuclide transfer and decay, and data exchange with MCNP5MPI. The program was verified with the pressurized water reactor (PWR) cell burnup benchmark; the results showed that the program is applicable to burnup calculations of multiple zones and that its computational efficiency can be improved significantly with the development of computer hardware. (authors)

  6. Experimental verification of photon: A program for use in x-ray shielding calculations

    International Nuclear Information System (INIS)

    Brauer, E.; Thomlinson, W.

    1987-01-01

    At the National Synchrotron Light Source, a computer program named PHOTON has been developed to calculate radiation dose values around a beam line. The output from the program must be an accurate guide to beam line shielding. To test the program, a series of measurements of radiation dose were carried out using existing beam lines; the results were compared to the theoretical calculations of PHOTON. Several different scattering geometries, scattering materials, and sets of walls and shielding materials were studied. Results of the measurements allowed many advances to be made in the program, ultimately resulting in good agreement between the theory and experiment. 3 refs., 6 figs
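The abstract does not reproduce PHOTON's scattering model; at the core of any such shielding-dose estimate, however, is exponential attenuation of the photon flux through the wall material. A minimal sketch, with an illustrative half-value layer rather than PHOTON's actual data:

```python
import math

# Illustrative-only sketch of the attenuation step inside a shielding-dose
# calculation; the half-value layer below is hypothetical, not PHOTON data.

def transmitted_dose(dose_in, mu_per_cm, thickness_cm):
    """Narrow-beam attenuation: D = D0 * exp(-mu * x)."""
    return dose_in * math.exp(-mu_per_cm * thickness_cm)

# With mu = ln(2)/x_half, one half-value layer transmits 50% of the dose.
x_half = 1.2                                   # cm, hypothetical value
mu = math.log(2) / x_half
print(round(transmitted_dose(100.0, mu, x_half), 1))      # 50.0
print(round(transmitted_dose(100.0, mu, 3 * x_half), 1))  # 12.5
```

A real beam-line calculation adds scattering geometry and buildup factors on top of this attenuation term.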

  7. On the Safety and Performance Demonstration Tests of Prototype Gen-IV Sodium-Cooled Fast Reactor and Validation and Verification of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jong-Bum Kim

    2016-10-01

    The design of Prototype Gen-IV Sodium-Cooled Fast Reactor (PGSFR) has been developed and the validation and verification (V&V) activities to demonstrate the system performance and safety are in progress. In this paper, the current status of test activities is described briefly and significant results are discussed. The large-scale sodium thermal-hydraulic test program, Sodium Test Loop for Safety Simulation and Assessment-1 (STELLA-1), produced satisfactory results, which were used for the computer codes V&V, and the performance test results of the model pump in sodium showed good agreement with those in water. The second phase of the STELLA program with the integral effect tests facility, STELLA-2, is in the detailed design stage of the design process. The sodium thermal-hydraulic experiment loop for finned-tube sodium-to-air heat exchanger performance test, the intermediate heat exchanger test facility, and the test facility for the reactor flow distribution are underway. Flow characteristics test in subchannels of a wire-wrapped rod bundle has been carried out for safety analysis in the core and the dynamic characteristic test of upper internal structure has been performed for the seismic analysis model for the PGSFR. The performance tests for control rod assemblies (CRAs) have been conducted for control rod drive mechanism driving parts and drop tests of the CRA under scram condition were performed. Finally, three types of inspection sensors under development for the safe operation of the PGSFR were explained with significant results.

  8. Ostomy Home Skills Program

    Medline Plus


  9. Scheduler-specific Confidentiality for Multi-Threaded Programs and Its Logic-Based Verification

    NARCIS (Netherlands)

    Huisman, Marieke; Ngo, Minh Tri

    2011-01-01

    Observational determinism has been proposed in the literature as a way to ensure confidentiality for multi-threaded programs. Intuitively, a program is observationally deterministic if the behavior of the public variables is deterministic, i.e., independent of the private variables and the

  10. Scheduler-Specific Confidentiality for Multi-Threaded Programs and Its Logic-Based Verification

    NARCIS (Netherlands)

    Huisman, Marieke; Ngo, Minh Tri; Beckert, B.; Damiani, F.; Gurov, D.

    2012-01-01

    Observational determinism has been proposed in the literature as a way to ensure confidentiality for multi-threaded programs. Intuitively, a program is observationally deterministic if the behavior of the public variables is deterministic, i.e., independent of the private variables and the scheduling
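Observational determinism can also be tested dynamically (though not proved, unlike the logic-based verification these papers develop): run the program under varying private inputs and schedules and compare the traces of the public variables. A toy sketch, with all names and the simulated scheduler being assumptions of this illustration:

```python
import random

def run(private_value, schedule_seed):
    """Toy concurrent program: two 'threads' write to public and private
    variables; the interleaving is chosen pseudo-randomly by schedule_seed."""
    rng = random.Random(schedule_seed)
    public, private = [], []
    thread_a = [("public", 1), ("public", 2)]   # deterministic public writes
    thread_b = [("private", private_value)]     # private data never leaks
    a, b = thread_a[:], thread_b[:]
    while a or b:  # naive seed-driven interleaving of the two threads
        pick_a = a and (not b or rng.random() < 0.5)
        var, val = a.pop(0) if pick_a else b.pop(0)
        (public if var == "public" else private).append(val)
    return public  # the observable behavior

def observationally_deterministic(runs=50):
    """Check the public trace is identical across private inputs/schedules."""
    baseline = run(0, 0)
    return all(run(p, s) == baseline for p in (0, 7, 42) for s in range(runs))

print(observationally_deterministic())  # True for this program
```

For this program the public trace is always [1, 2], whatever the private value or schedule; a program that branched its public writes on `private_value` would fail the check.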

  11. Intel Xeon Phi coprocessor high performance programming

    CERN Document Server

    Jeffers, James

    2013-01-01

    Authors Jim Jeffers and James Reinders spent two years helping educate customers about the prototype and pre-production hardware before Intel introduced the first Intel Xeon Phi coprocessor. They have distilled their own experiences coupled with insights from many expert customers, Intel Field Engineers, Application Engineers and Technical Consulting Engineers, to create this authoritative first book on the essentials of programming for this new architecture and these new products. This book is useful even before you ever touch a system with an Intel Xeon Phi coprocessor. To ensure that your applications run at maximum efficiency, the authors emphasize key techniques for programming any modern parallel computing system whether based on Intel Xeon processors, Intel Xeon Phi coprocessors, or other high performance microprocessors. Applying these techniques will generally increase your program performance on any system, and better prepare you for Intel Xeon Phi coprocessors and the Intel MIC architecture. It off...

  12. State-of-the-art report for the testing and formal verification methods for FBD program

    International Nuclear Information System (INIS)

    Jee, Eun Kyoung; Lee, Jang Soo; Lee, Young Jun; Yoo, Jun Beom

    2011-10-01

    The importance of PLC testing has increased in the nuclear I and C domain. While regulation authorities require both functional and structural testing for safety system software, FBD testing relies only on functional testing and there has been little research on structural testing techniques for FBD programs. We aim to analyze current techniques related to FBD testing and develop a structural testing technique appropriate to FBD programs. We developed structural test coverage criteria applicable to FBD programs, focusing on data paths from input edges to output edges of FBD programs. A data path condition (DPC), under which input data can flow into the output edge, is defined for each data path. We defined basic coverage, input condition coverage and complex condition coverage criteria based on the formal definition of DPC. We also developed a measurement procedure for FBD testing adequacy and a supporting tool prototype
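The coverage criteria above are defined over data path conditions (DPCs). As a hypothetical illustration (the actual derivation of DPCs from an FBD network is more involved than this), basic coverage can be measured by evaluating each DPC predicate against the test suite:

```python
# Hypothetical sketch: each data path condition (DPC) is modeled as a
# predicate over one scan cycle's inputs; basic coverage is the fraction
# of DPCs satisfied by at least one test case in the suite.

def dpc_coverage(dpcs, test_suite):
    covered = sum(1 for dpc in dpcs if any(dpc(t) for t in test_suite))
    return covered / len(dpcs)

# Toy FBD selector block: OUT := IN1 if SEL else IN2 -> two data paths.
dpcs = [
    lambda t: t["SEL"],        # DPC for the path IN1 -> OUT
    lambda t: not t["SEL"],    # DPC for the path IN2 -> OUT
]
suite = [{"SEL": True, "IN1": 5, "IN2": 0}]
print(dpc_coverage(dpcs, suite))       # 0.5
suite.append({"SEL": False, "IN1": 0, "IN2": 9})
print(dpc_coverage(dpcs, suite))       # 1.0
```

Input condition and complex condition coverage would, analogously, require finer-grained predicates per input edge rather than one predicate per path.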

  13. TFE Verification Program: Semiannual report for the period ending March 31, 1987

    International Nuclear Information System (INIS)

    1987-04-01

    The objective of the TFE program is to demonstrate the technological readiness of a thermionic fuel element suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MWe range, with a full-power life of 7 years. This report summarizes the technical results obtained in this program. Information presented here contains evaluated test data, designs, and experimental results

  14. Structural performance evaluation on aging underground reinforced concrete structures. Part 6. An estimation method of threshold value in performance verification taking reinforcing steel corrosion

    International Nuclear Information System (INIS)

    Matsuo, Toyofumi; Matsumura, Takuro; Miyagawa, Yoshinori

    2009-01-01

    This paper discusses the applicability of a material degradation model for reinforcing steel corrosion to RC box-culverts with corroded reinforcement, and an estimation method for the threshold value in performance verification that reflects reinforcing steel corrosion. First, in FEM analyses, the loss of reinforcement section area and the initial tension strain arising from reinforcing steel corrosion, as well as the deteriorated bond characteristics between reinforcement and concrete, were considered. Full-scale loading tests using corroded RC box-culverts were numerically analyzed. As a result, the analyzed crack patterns and load-strain relationships were in close agreement with the experimental results up to a maximum corrosion ratio of 15% of the primary reinforcement. We thus showed that this modeling can estimate the load carrying capacity of corroded RC box-culverts. Second, a parametric study was carried out for corroded RC box-culverts with various sizes, reinforcement ratios, levels of steel corrosion, etc. Furthermore, drawing on the analytical results and various experimental investigations, we suggested allowable degradation ratios for modifying the threshold value, corresponding to the chloride-induced deterioration progress that is widely accepted in maintenance practice for civil engineering reinforced concrete structures. Finally, based on these findings, we developed two estimation methods for the threshold value in performance verification: 1) a structural analysis method using nonlinear FEM that includes modeling of material degradation, and 2) a practical method in which a threshold value, determined by structural analyses of RC box-culverts in sound condition, is multiplied by the allowable degradation ratio. (author)

  15. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  16. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
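The swarm idea, reduced to its essence, is to replace one exhaustive search with many small, independently randomized searches and to pool what they reach. The Python sketch below is only a toy analogue under that assumption: the state space and all names are made up, and a real swarm run diversifies full model checker configurations rather than shuffling a bounded DFS.

```python
# Toy illustration of swarm search: many independently seeded, budget-limited
# searches over the same state space, with results taken as a union.
import random

def successors(state):
    # Tiny example state space: counter pairs (a, b) with a, b < 4.
    a, b = state
    return [(x, y) for x, y in ((a + 1, b), (a, b + 1)) if x < 4 and y < 4]

def random_dfs(seed, budget, start=(0, 0)):
    """One 'worker': a depth-first search with a randomized visit order."""
    rng = random.Random(seed)
    visited, stack = set(), [start]
    while stack and len(visited) < budget:
        state = stack.pop()
        if state in visited:
            continue
        visited.add(state)
        succ = successors(state)
        rng.shuffle(succ)        # diversify the search order per worker
        stack.extend(succ)
    return visited

# Eight diversified workers, each allowed to visit only 6 of the 16 states.
swarm = set().union(*(random_dfs(seed, budget=6) for seed in range(8)))
print(len(swarm))  # the union typically covers more than any single bounded run
```

Each worker alone explores only a sliver of the space; the diversity of search orders is what makes the union effective.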

  17. Behavioral patterns of environmental performance evaluation programs.

    Science.gov (United States)

    Li, Wanxin; Mauerhofer, Volker

    2016-11-01

    During the past decades numerous environmental performance evaluation programs have been developed and implemented on different geographic scales. This paper develops a taxonomy of environmental management behavioral patterns in order to provide a practical comparison tool for environmental performance evaluation programs. Ten such programs, purposively selected, are mapped against the four identified behavioral patterns: diagnosis, negotiation, learning, and socialization and learning. Overall, we found that schemes which serve to diagnose environmental abnormalities are mainly externally imposed and have been developed as a result of technical debates concerning data sources, methodology and ranking criteria. Learning-oriented schemes are characterized by processes through which free exchange of ideas and mutual, adaptive learning can occur. Schemes developed by a higher authority to influence the behavior of lower levels of government have been adopted by the evaluated parties to signal their excellent environmental performance. The evaluation schemes classified as socializing and learning have incorporated dialogue, participation, and capacity building in program design. In conclusion we consider the 'fitness for purpose' of the various schemes, the merits of our analytical model and the future possibilities of fostering capacity building in the realm of wicked environmental challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Fault detection and initial state verification by linear programming for a class of Petri nets

    Science.gov (United States)

    Rachell, Traxon; Meyer, David G.

    1992-01-01

    The authors present an algorithmic approach to determining when the marking of an LSMG (live safe marked graph) or an LSFC (live safe free choice) net is in the set of live safe markings M. Hence, once the marking of a net is determined to be in M, if at some later time the marking is determined not to be in M, this indicates a fault. It is shown how linear programming can be used to determine whether m is an element of M. The worst-case computational complexity of each algorithm is bounded by the number of linear programs that must be solved.
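The linear-programming membership test rests on the Petri net state equation m = m0 + C·x, where C is the incidence matrix. A cheap necessary condition that can be checked without an LP solver is invariant conservation: for any P-invariant y (a vector with y·C = 0), every reachable marking must satisfy y·m = y·m0. The toy marked graph below is our own illustration, not taken from the paper.

```python
# Invariant-based sanity check for a 3-place, 3-transition cyclic marked
# graph. A marking that violates a P-invariant cannot be reachable from m0
# and therefore signals a fault. Everything here is an invented example.

def conserves_invariant(y, m0, m):
    """True iff marking m preserves the weighted token count y.m == y.m0."""
    return sum(yi * mi for yi, mi in zip(y, m)) == sum(yi * mi for yi, mi in zip(y, m0))

C = [
    [-1,  0,  1],   # p1: consumed by t1, produced by t3
    [ 1, -1,  0],   # p2: produced by t1, consumed by t2
    [ 0,  1, -1],   # p3: produced by t2, consumed by t3
]
y = (1, 1, 1)
# y is a P-invariant: y.C == 0 for every transition column.
assert all(sum(y[i] * C[i][j] for i in range(3)) == 0 for j in range(3))

m0 = (1, 0, 0)
print(conserves_invariant(y, m0, (0, 1, 0)))  # True: token count preserved
print(conserves_invariant(y, m0, (1, 1, 0)))  # False: extra token flags a fault
```

The full method in the paper is stronger, since linear programming can also check nonnegativity of the firing-count vector x, not just invariant conservation.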

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - TETRATEC PTFE TECHNOLOGIES TETRATEX 8005

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  20. Verification of the calculation program for brachytherapy planning system of high dose rate (PLATO)

    International Nuclear Information System (INIS)

    Almansa, J.; Alaman, C.; Perez-Alija, J.; Herrero, C.; Real, R. del; Ososrio, J. L.

    2011-01-01

    High-dose-rate brachytherapy treatments have been performed at our center since 2007. The procedures performed include gynecological intracavitary and interstitial treatments. The treatments use an Ir-192 source with an activity between 5 and 10 Ci, such that small variations in treatment times can cause harm to the patient. In addition, Royal Decree 1566/1998 on Quality Criteria in Radiotherapy establishes the need to verify the monitor units or treatment time in radiotherapy and brachytherapy. All this justifies the existence of a redundant system for brachytherapy dose calculation that can reveal any abnormality that is present.

  1. The relative importance of managerial competencies for predicting the perceived job performance of Broad-Based Black Economic Empowerment verification practitioners

    Directory of Open Access Journals (Sweden)

    Barbara M. Seate

    2016-04-01

    Full Text Available Orientation: There is a need for the growing Broad-Based Black Economic Empowerment (B-BBEE) verification industry to assess competencies and determine skills gaps for the management of the verification practitioners' perceived job performance. Knowing which managerial competencies are important for different managerial functions is vital for developing and improving training and development programmes. Research purpose: The purpose of this study was to determine the managerial capabilities that are required of B-BBEE verification practitioners in order to improve their perceived job performance. Motivation for the study: The growing number of B-BBEE verification practitioners calls for more focused training and development. Generating such a training and development programme demands empirical research into the relative importance of managerial competencies. Research approach, design and method: A quantitative design using the survey approach was adopted. A questionnaire was administered to a stratified sample of 87 B-BBEE verification practitioners. Data were analysed using the Statistical Package for the Social Sciences (version 22.0) and Smart Partial Least Squares software. Main findings: The results of the correlation analysis revealed strong and positive associations between technical skills, interpersonal skills, compliance to standards and ethics, managerial skills and perceived job performance. Results of the regression analysis showed that managerial skills, compliance to standards and ethics, and interpersonal skills were statistically significant in predicting perceived job performance. However, technical skills were insignificant in predicting perceived job performance. Practical/managerial implications: The study has shown that the B-BBEE verification industry, insofar as the technical skills of the practitioners are concerned, does have suitably qualified staff with the requisite educational qualifications.
At

  2. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    Science.gov (United States)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  3. U.S. experience in seismic re-evaluation and verification programs

    International Nuclear Information System (INIS)

    Stevenson, J.D.

    1995-01-01

    The purpose of this paper is to present a summary of the development of a seismic re-evaluation program for older nuclear power plants in the U.S. The principal focus of this reevaluation is the use of actual strong motion earthquake response data for structures and mechanical and electrical systems and components. These data are supplemented by generic shake table test results. Use of this type of seismic re-evaluation has led to major cost reductions as compared to more conventional analytical and component specific testing procedures. (author)

  4. Seismic Evaluation of a Multitower Connected Building by Using Three Software Programs with Experimental Verification

    Directory of Open Access Journals (Sweden)

    Deyuan Zhou

    2016-01-01

    Full Text Available Shanghai International Design Center (SHIDC) is a hybrid structure of steel frame and reinforced concrete core tube (SF-RCC). It is an unequal-height two-tower system, and the story lateral stiffness of the two towers differs, which may result in torsion effects. To fully evaluate the structural behavior of SHIDC under earthquakes, NosaCAD, ABAQUS, and Perform-3D, which are widely applied for nonlinear structural analysis, were used to perform elastoplastic time history analyses. Numerical results were compared with those of shake table testing. NosaCAD has function modules for transforming the nonlinear analysis model to Perform-3D and ABAQUS, and these models were used in ABAQUS or Perform-3D directly. With the model transformation, the seismic performance of SHIDC was fully investigated. Analyses have shown that the maximum interstory drift satisfies the limits specified in the Chinese code and that the failure sequence of structural members was reasonable, meaning that the earthquake input energy can be well dissipated. The structure remains undamaged under frequent earthquakes and does not collapse under rare earthquakes; the seismic design target is therefore satisfied. The integrated use of multiple software packages, validated by shake table testing, provides confidence for the safe design of such a complex structure.

  5. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured by drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation from taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development, before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  6. Identification and verification of critical performance dimensions. Phase 1 of the systematic process redesign of drug distribution.

    Science.gov (United States)

    Colen, Hadewig B; Neef, Cees; Schuring, Roel W

    2003-06-01

    Worldwide patient safety has become a major social policy problem for healthcare organisations. As in other organisations, the patients in our hospital also suffer from an inadequate distribution process, as becomes clear from incident reports involving medication errors. Medisch Spectrum Twente is a top primary-care, clinical, teaching hospital. The hospital pharmacy takes care of 1070 internal beds and 1120 beds in an affiliated psychiatric hospital and nursing homes. In the beginning of 1999, our pharmacy group started a large interdisciplinary research project to develop a safe, effective and efficient drug distribution system by using systematic process redesign. The process redesign includes both organisational and technological components. This article describes the identification and verification of critical performance dimensions for the design of drug distribution processes in hospitals (phase 1 of the systematic process redesign of drug distribution). Based on reported errors and related causes, we suggested six generic performance domains. To assess the role of the performance dimensions, we used three approaches: flowcharts, interviews with stakeholders and review of the existing performance using time studies and medication error studies. We were able to set targets for costs, quality of information, responsiveness, employee satisfaction, and degree of innovation. We still have to establish what drug distribution system, in respect of quality and cost-effectiveness, represents the best and most cost-effective way of preventing medication errors. We intend to develop an evaluation model, using the critical performance dimensions as a starting point. This model can be used as a simulation template to compare different drug distribution concepts in order to define the differences in quality and cost-effectiveness.

  7. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  8. 2011 Annual Criticality Safety Program Performance Summary

    Energy Technology Data Exchange (ETDEWEB)

    Andrea Hoffman

    2011-12-01

    The 2011 review of the INL Criticality Safety Program has determined that the program is robust and effective. The review was prepared for, and fulfills Contract Data Requirements List (CDRL) item H.20, 'Annual Criticality Safety Program performance summary that includes the status of assessments, issues, corrective actions, infractions, requirements management, training, and programmatic support.' This performance summary addresses the status of these important elements of the INL Criticality Safety Program. Assessments - Assessments in 2011 were planned and scheduled. The scheduled assessments included a Criticality Safety Program Effectiveness Review, Criticality Control Area Inspections, a Protection of Controlled Unclassified Information Inspection, an Assessment of Criticality Safety SQA, and this management assessment of the Criticality Safety Program. All of the assessments were completed with the exception of the 'Effectiveness Review' for SSPSF, which was delayed due to emerging work. Although minor issues were identified in the assessments, no issues or combination of issues indicated that the INL Criticality Safety Program was ineffective. The identification of issues demonstrates the importance of an assessment program to the overall health and effectiveness of the INL Criticality Safety Program. Issues and Corrective Actions - There are relatively few criticality safety related issues in the Laboratory ICAMS system. Most were identified by Criticality Safety Program assessments. No issues indicate ineffectiveness in the INL Criticality Safety Program. All of the issues are being worked and there are no imminent criticality concerns. Infractions - There was one criticality safety related violation in 2011. On January 18, 2011, it was discovered that a fuel plate bundle in the Nuclear Materials Inspection and Storage (NMIS) facility exceeded the fissionable mass limit, resulting in a technical safety requirement (TSR) violation. The

  9. Performance expectations of measurement control programs

    International Nuclear Information System (INIS)

    Hammond, G.A.

    1985-01-01

    The principal index for designing and assessing the effectiveness of safeguards is the sensitivity and reliability of gauging the true status of material balances involving material flows, transfers, inventories, and process holdup. The measurement system must not only be capable of characterizing the material for gradation or intensity of protection, but also be responsive to needs for detection and localization of losses, provide confirmation that no diversion has occurred, and help meet requirements for process control, health and safety. Consequently, the judicious application of a measurement control and quality assurance program is vital to a complete understanding of the capabilities and limitations of the measurement system including systematic and random components of error for weight, volume, sampling, chemical, isotopic, and nondestructive determinations of material quantities in each material balance area. This paper describes performance expectations or criteria for a measurement control program in terms of ''what'' is desired and ''why'', relative to safeguards and security objectives

  10. Analytical Performance Verification of FCS-MPC Applied to Power Electronic Converters

    DEFF Research Database (Denmark)

    Novak, Mateja; Dragicevic, Tomislav; Blaabjerg, Frede

    2017-01-01

    Since the introduction of finite control set model predictive control (FCS-MPC) in power electronics, the algorithm has been missing an important aspect that would speed up its implementation in industry: a simple method to verify the algorithm's performance. This paper proposes to use a statistical model checking (SMC) method for performance evaluation of the algorithm applied to power electronics converters. SMC is simple to implement, intuitive, and it requires only an operational model of the system that can be simulated and checked against properties. Device under test for control algorithm...
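A minimal sketch of the SMC recipe described above, under our own assumptions: draw N simulated runs, check the property on each, and report the empirical satisfaction probability, with N chosen from the Chernoff-Hoeffding bound N ≥ ln(2/δ)/(2ε²) for accuracy ε and confidence 1 − δ. The "converter" model here is a deliberately crude stand-in, not the paper's power electronics simulation.

```python
# Statistical model checking by Monte Carlo simulation: estimate the
# probability that a run of the system satisfies a property. All model
# details below are invented stand-ins for a real converter simulation.
import math
import random

def smc_estimate(simulate, prop, eps=0.05, delta=0.01, seed=1):
    """Estimate P(prop) to within eps with confidence 1 - delta."""
    n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    rng = random.Random(seed)
    hits = sum(prop(simulate(rng)) for _ in range(n))
    return hits / n, n

def simulate(rng, steps=50):
    # Toy stand-in: noisy first-order response toward a reference of 1.0.
    v, trace = 0.0, []
    for _ in range(steps):
        v += 0.3 * (1.0 - v) + rng.gauss(0, 0.02)
        trace.append(v)
    return trace

# Property: after a settling period, the output stays within a +/-0.1 band.
within_band = lambda trace: all(abs(x - 1.0) < 0.1 for x in trace[20:])
p_hat, n = smc_estimate(simulate, within_band)
print(f"P(property) ~ {p_hat:.3f} from {n} runs (eps=0.05, delta=0.01)")
```

The appeal for control verification is exactly what the abstract notes: only a simulatable model and a checkable property are needed, no closed-form analysis of the closed loop.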

  11. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code is described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input data bases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
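The Richardson extrapolation step such a code performs can be shown on manufactured data: given solutions on three systematically refined grids, estimate the observed order of accuracy and an extrapolated exact value. This is a generic sketch of the standard procedure, not code from the tool described in the report.

```python
# Richardson extrapolation from three grid solutions f1 (finest), f2, f3
# with a constant refinement ratio r between successive grids.
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy p from three systematically refined grids."""
    return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

def extrapolate(f1, f2, r, p):
    """Richardson-extrapolated estimate of the exact (zero-spacing) solution."""
    return f1 + (f1 - f2) / (r ** p - 1)

# Manufactured data: exact value 2.0 with a second-order error c*h^2
# on grids of spacing h = 1, 2, 4.
exact, c = 2.0, 0.1
f1, f2, f3 = (exact + c * h ** 2 for h in (1.0, 2.0, 4.0))
p = observed_order(f1, f2, f3, r=2.0)
print(round(p, 6), round(extrapolate(f1, f2, 2.0, p), 6))  # 2.0 2.0
```

When the observed order matches the formal order of the discretization, the implementation is behaving as designed; a mismatch is the signal such verification codes exist to catch.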

  12. Improving Speaker Verification Performance in Presence of Spoofing Attacks Using Out-of-Domain Spoofed Data

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Sahidullah, Md; Tan, Zheng-Hua

    2017-01-01

    of the two systems is challenging and often leads to increased false rejection rates. Furthermore, the performance of CM severely degrades if in-domain development data are unavailable. In this study, therefore, we propose a solution that uses two separate background models – one from human speech...

  13. Design and performance verification of advanced multistage depressed collectors. [traveling wave tubes for ECM

    Science.gov (United States)

    Kosmahl, H.; Ramins, P.

    1975-01-01

    Design and performance of a small size, 4-stage depressed collector are discussed. The collector and a spent beam refocusing section preceding it are intended for efficiency enhancement of octave bandwidth, high CW power traveling wave tubes for use in ECM.

  14. Camera calibration in a hazardous environment performed in situ with automated analysis and verification

    International Nuclear Information System (INIS)

    DePiero, F.W.; Kress, R.L.

    1993-01-01

    Camera calibration using the method of Two Planes is discussed. An implementation of the technique is described that may be performed in situ, e.g., in a hazardous or contaminated environment, thus eliminating the need for decontamination of camera systems before recalibration. Companion analysis techniques used for verifying the correctness of the calibration are presented
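The Two Planes method can be reduced to this idea: calibration fits a mapping from image coordinates to world points on each of two known planes, and a pixel's line of sight is then the 3-D line through its two mapped points. In the sketch below the affine maps are invented stand-ins for fitted calibration functions; a real implementation would fit them from observations of calibration targets.

```python
# Conceptual sketch of the Two Planes camera model. The coefficients below
# are hypothetical, standing in for mappings fitted during calibration.

def make_affine(a, b, c):
    """Build a map (u, v) -> (a*u + b*v + c), componentwise over 3-vectors."""
    return lambda u, v: tuple(ai * u + bi * v + ci for ai, bi, ci in zip(a, b, c))

# Hypothetical fitted maps for calibration planes at z = 0 and z = 100 (mm).
plane_near = make_affine((0.5, 0.0, 0.0), (0.0, 0.5, 0.0), (-160.0, -120.0, 0.0))
plane_far  = make_affine((0.25, 0.0, 0.0), (0.0, 0.25, 0.0), (-80.0, -60.0, 100.0))

def line_of_sight(u, v):
    """A pixel's viewing ray: the line through its points on the two planes."""
    p0, p1 = plane_near(u, v), plane_far(u, v)
    direction = tuple(b - a for a, b in zip(p0, p1))
    return p0, direction     # parametric ray: p0 + t * direction

origin, d = line_of_sight(320, 240)
print(origin, d)  # the image center looks straight down the z-axis here
```

Verifying such a calibration in situ amounts to checking that rays reconstructed for known targets pass close to their surveyed 3-D positions.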

  15. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. Sufficient data were collected and analyzed to satisfy the research team that the HTFOS operated properly while the engine was running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  16. Description and performance characteristics for the neutron Coincidence Collar for the verification of reactor fuel assemblies

    International Nuclear Information System (INIS)

    Menlove, H.O.

    1981-08-01

    An active neutron interrogation method has been developed for the measurement of 235 U content in fresh fuel assemblies. The neutron Coincidence Collar uses neutron interrogation with an AmLi neutron source and coincidence counting the induced fission reaction neutrons from the 235 U. This manual describes the system components, operation, and performance characteristics. Applications of the Coincidence Collar to PWR and BWR types of reactor fuel assemblies are described
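The time-correlation principle behind coincidence counting can be illustrated in a few lines: induced fission emits several neutrons nearly simultaneously, so detection pairs falling within a short coincidence gate are counted, while uncorrelated background events contribute few such pairs. Everything below, including the event times, is an invented illustration, not the Collar's actual electronics or analysis.

```python
# Count detection pairs whose arrival times fall within a coincidence gate.
# Correlated (fission) bursts produce pairs; isolated events do not.

def coincident_pairs(times, gate):
    """Number of event pairs (i, j) with |t_i - t_j| <= gate."""
    times = sorted(times)
    count = start = 0
    for i, t in enumerate(times):
        while t - times[start] > gate:
            start += 1
        count += i - start   # pairs formed by event i with events in the window
    return count

# Hypothetical event times in microseconds: two correlated bursts plus a
# lone background event.
events = [0.0, 0.1, 5.0, 5.05, 9.0]
print(coincident_pairs(events, gate=0.5))  # 2
```

Real systems use shift-register electronics and accidental-coincidence subtraction, but the quantity being extracted is this same excess of closely spaced pairs.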

  17. Modeling and experimental verification of proof mass effects on vibration energy harvester performance

    International Nuclear Information System (INIS)

    Kim, Miso; Hoegen, Mathias; Dugundji, John; Wardle, Brian L

    2010-01-01

    An electromechanically coupled model for a cantilevered piezoelectric energy harvester with a proof mass is presented. Proof masses are essential in microscale devices to move device resonances towards optimal frequency points for harvesting. Such devices with proof masses have not been rigorously modeled previously; instead, lumped mass or concentrated point masses at arbitrary points on the beam have been used. Thus, this work focuses on the exact vibration analysis of cantilevered energy harvester devices including a tip proof mass. The model is based not only on a detailed modal analysis, but also on a thorough investigation of damping ratios that can significantly affect device performance. A model with multiple degrees of freedom is developed and then reduced to a single-mode model, yielding convenient closed-form normalized predictions of device performance. In order to verify the analytical model, experimental tests are undertaken on a macroscale, symmetric, bimorph, piezoelectric energy harvester with proof masses of different geometries. The model accurately captures all aspects of the measured response, including the location of peak-power operating points at resonance and anti-resonance, and trends such as the dependence of the maximal power harvested on the frequency. It is observed that even a small change in proof mass geometry results in a substantial change of device performance due not only to the frequency shift, but also to the effect on the strain distribution along the device length. Future work will include the optimal design of devices for various applications, and quantification of the importance of nonlinearities (structural and piezoelectric coupling) for device performance

  18. Lay out, test verification and in orbit performance of HELIOS a temperature control system

    Science.gov (United States)

    Brungs, W.

    1975-01-01

    The HELIOS temperature control system is described. The main design features, and the impact of interactions between experiment, spacecraft system, and temperature control system requirements on the design, are discussed. The major limitations of the thermal design regarding a closer sun approach are given and related to test experience and performance data obtained in orbit. Finally, the validity of the test results achieved with the prototype and flight spacecraft is evaluated by comparison of test data, orbit temperature predictions, and flight data.

  19. Presentation and verification of a simple mathematical model foridentification of the areas behind noise barrierwith the highest performance

    Directory of Open Access Journals (Sweden)

    M. Monazzam

    2009-07-01

    Full Text Available Background and aims: Traffic noise barriers are the most important measure for controlling environmental noise pollution. Diffraction over the top edge of a noise barrier is the most important indirect path by which sound reaches the receiver; therefore, most studies focus on improving barrier performance against it. Methods: T-shape profile barriers are among the most successful of the many profiles studied. This investigation uses the theory of destructive interference between the wave diffracted at the real edge of the barrier and the wave diffracted from the barrier's image, with a phase difference of π radians. First, a simple mathematical representation of the zones behind rigid and absorbent T-shape barriers with the highest insertion loss is introduced, using the destructive effect of the indirect path via the barrier image. Two barriers with different profiles, one reflective and one absorptive, are then used for verification of the introduced model. Results: The results are compared with those of a verified two-dimensional boundary element method at 1/3 octave band frequencies over a wide field behind the barriers. A very good agreement between the results has been achieved. In this method, an effective height is used for barriers of any profile. Conclusion: The introduced model is simple, flexible and fast, and can be used to choose the best location of rigid and absorptive profile barriers to achieve the highest performance.
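The destructive-interference argument can be checked numerically: the edge-diffracted wave and the wave attributed to the barrier's image cancel most strongly when their path difference equals half a wavelength, i.e., a phase difference of π radians. The geometry below is a made-up example, not taken from the paper.

```python
# Phase difference between two propagation paths, and the frequency at
# which a given path difference produces full cancellation. The 0.1 m
# path difference is a hypothetical geometry.
import math

def phase_difference(path_a, path_b, freq, c=343.0):
    """Phase difference in radians between two paths at frequency freq (Hz)."""
    return 2 * math.pi * freq * abs(path_a - path_b) / c

delta = 0.1                            # image path is 0.1 m longer (assumed)
f_destructive = 343.0 / (2 * delta)    # path difference = lambda/2 -> 1715 Hz
print(round(phase_difference(1.0, 1.0 + delta, f_destructive), 3))  # 3.142
```

At this frequency the two contributions arrive in antiphase, which is the mechanism the paper exploits to identify the high-performance zones behind the barrier.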

  20. Verification of the 2.00 WAPPA-B [Waste Package Performance Assessment-B version] code

    International Nuclear Information System (INIS)

    Tylock, B.; Jansen, G.; Raines, G.E.

    1987-07-01

    The old version of the Waste Package Performance Assessment (WAPPA) code has been modified into a new code version, 2.00 WAPPA-B. The input files and the results for two benchmarks at repository conditions are fully documented in the appendixes of the EA reference report. The 2.00 WAPPA-B version of the code is suitable for computation of barrier failure due to uniform corrosion; however, an improved sub-version, 2.01 WAPPA-B, is recommended for general use due to minor errors found in 2.00 WAPPA-B during its verification procedures. The input files and input echoes have been modified to include behavior of both radionuclides and elements, but the 2.00 WAPPA-B version of the WAPPA code is not recommended for computation of radionuclide releases. The 2.00 WAPPA-B version computes only mass balances and the initial presence of radionuclides that can be released. Future code development in the 3.00 WAPPA-C version will include radionuclide release computations. 19 refs., 10 figs., 1 tab

  1. Performance in the WIPP nondestructive assay performance demonstration program

    Energy Technology Data Exchange (ETDEWEB)

    Marcinkiewicz, C.J. [Consolidated Technical Services, Inc., Frederick, MD (United States); Connolly, M.J.; Becker, G.K. [Lockheed Martin Idaho Technologies Company, Idaho Falls, ID (United States)

    1997-11-01

    Measurement facilities performing nondestructive assay (NDA) of wastes intended for disposal at the United States Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP) are required to demonstrate their ability to meet specific Quality Assurance Objectives (QAOs). This demonstration is performed, in part, by participation in the NDA Performance Demonstration Program (PDP). The PDP is funded and managed by the Carlsbad Area Office (CAO) of DOE and is conducted by the Idaho National Engineering Laboratory. It tests the characteristics of precision, system bias and/or total uncertainty through the measurement of variable, blind combinations of simulated waste drums and certified radioactive standards. Each facility must successfully participate in the PDP using each different type of measurement system planned for use in waste characterization. The first cycle of the PDP was completed in July 1996 and the second is scheduled for completion by December 1996. Seven sites reported data in cycle 1 for 11 different measurement systems. This paper describes the design and operation of the PDP and provides the performance data from cycle 1. It also describes the preliminary results from cycle 2 and updates the status and future plans for the NDA PDP. 4 refs., 9 figs., 11 tabs.

  2. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  3. Assessment and Verification of SLS Block 1-B Exploration Upper Stage State and Stage Disposal Performance

    Science.gov (United States)

    Patrick, Sean; Oliver, Emerson

    2018-01-01

    One of the SLS Navigation System's key performance requirements is a constraint on the payload system's delta-v allocation to correct for insertion errors due to vehicle state uncertainty at payload separation. The SLS navigation team has developed a Delta-Delta-V analysis approach to assess the effect on trajectory correction maneuver (TCM) design needed to correct for navigation errors. This approach differs from traditional covariance-analysis-based methods and makes no assumptions with regard to the propagation of the state dynamics. This allows for consideration of non-linearity in the propagation of state uncertainties. The Delta-Delta-V analysis approach re-optimizes perturbed SLS mission trajectories by varying key mission states in accordance with an assumed state error. The state error is developed from detailed vehicle 6-DOF Monte Carlo analysis or generated using covariance analysis. These perturbed trajectories are compared to a nominal trajectory to determine the necessary TCM design. To implement this analysis approach, a tool set was developed which combines the functionality of a 3-DOF trajectory optimization tool, Copernicus, and a detailed 6-DOF vehicle simulation tool, Marshall Aerospace Vehicle Representation in C (MAVERIC). In addition to delta-v allocation constraints on SLS navigation performance, SLS mission requirements dictate successful upper stage disposal. Due to engine and propellant constraints, the SLS Exploration Upper Stage (EUS) must dispose into heliocentric space by means of a lunar fly-by maneuver. As with payload delta-v allocation, upper stage disposal maneuvers must place the EUS on a trajectory that maximizes the probability of achieving a heliocentric orbit post lunar fly-by, considering all sources of vehicle state uncertainty prior to the maneuver. To ensure disposal, the SLS navigation team has developed an analysis approach to derive optimal disposal guidance targets. This approach maximizes the state error covariance prior

  4. Assessment and Verification of SLS Block 1-B Exploration Upper Stage and Stage Disposal Performance

    Science.gov (United States)

    Patrick, Sean; Oliver, T. Emerson; Anzalone, Evan J.

    2018-01-01

    Delta-v allocation to correct for insertion errors caused by state uncertainty is one of the key performance requirements imposed on the SLS Navigation System. Additionally, SLS mission requirements include the need for the Exploration Upper Stage (EUS) to be disposed of successfully. To assess these requirements, the SLS navigation team has developed and implemented a series of analysis methods. Here the authors detail the Delta-Delta-V approach to assessing delta-v allocation as well as the EUS disposal optimization approach.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, CUMMINS EMISSION SOLUTIONS AND CUMMINS FILTRATION DIESEL OXIDATION CATALYST AND CLOSED CRANKCASE VENTILATION SYSTEM

    Science.gov (United States)

    The U.S. EPA has created the Environmental Technology Verification (ETV) Program. ETV seeks to provide high-quality, peer-reviewed data on technology performance. The Air Pollution Control Technology (APCT) Verification Center, a center under the ETV Program, is operated by Res...

  6. Grazing Incidence Wavefront Sensing and Verification of X-Ray Optics Performance

    Science.gov (United States)

    Saha, Timo T.; Rohrbach, Scott; Zhang, William W.

    2011-01-01

    Evaluation of interferometrically measured mirror metrology data and characterization of a telescope wavefront can be powerful tools in understanding the image characteristics of an x-ray optical system. In the development of the soft x-ray telescope for the International X-Ray Observatory (IXO), we have developed new approaches to support the telescope development process. Interferometric measurement of the optical components over all relevant spatial frequencies can be used to evaluate and predict the performance of an x-ray telescope. Typically, the mirrors are measured using a mount that minimizes mount- and gravity-induced errors. In the assembly and mounting process the shape of the mirror segments can change dramatically. We have developed wavefront sensing techniques suitable for x-ray optical components to aid us in the characterization and evaluation of these changes. Hartmann sensing of a telescope and its components is a simple method that can be used to evaluate low-order mirror surface errors and alignment errors. Phase retrieval techniques can also be used to assess and estimate the low-order axial errors of the primary and secondary mirror segments. In this paper we describe the mathematical foundation of our Hartmann and phase retrieval sensing techniques. We show how these techniques can be used in the evaluation and performance prediction process of x-ray telescopes.
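The core arithmetic behind Hartmann sensing is simple: a spot displacement on the detector divided by the lever arm gives a local wavefront slope, and integrating slopes recovers the low-order wavefront. The sketch below illustrates this for a 1-D row of apertures; the lever arm, pitch, and displacements are made-up values, not numbers from the record.

```python
import numpy as np

# Hypothetical Hartmann-test geometry: spot displacements (m) measured at
# the detector are converted to local wavefront slopes via the lever arm L.
L = 10.0  # distance from Hartmann mask to detector (m), assumed value

# Measured spot displacements for a 1-D row of apertures (m)
dx = np.array([0.0, 1e-5, 2e-5, 3e-5, 4e-5])

slopes = dx / L       # local wavefront slopes (rad)
pitch = 0.01          # aperture spacing (m), assumed value
# Trapezoidal integration of the slopes recovers the low-order wavefront shape
wavefront = np.concatenate(([0.0], np.cumsum((slopes[1:] + slopes[:-1]) / 2 * pitch)))
print(wavefront)
```

Linearly increasing slopes, as here, integrate to a parabolic wavefront, i.e. a defocus-like low-order error of the kind the record's technique is meant to expose.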

  7. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
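To make the idea of model checking a PLC program concrete, here is a toy illustration (not the CERN methodology or any IEC 61131-3 tooling): a scan-cycle function maps inputs and state to the next state, and an exhaustive exploration of reachable states checks a safety invariant. All names are hypothetical.

```python
from itertools import product

def scan_cycle(state, inputs):
    """One PLC scan: a latch that starts a motor, with a stop override."""
    start, stop = inputs
    running = (state["running"] or start) and not stop
    return {"running": running}

def check_invariant(initial, invariant):
    """Exhaustive (depth-first) exploration of reachable states over all inputs."""
    seen, frontier = set(), [initial]
    while frontier:
        state = frontier.pop()
        key = tuple(sorted(state.items()))
        if key in seen:
            continue
        seen.add(key)
        if not invariant(state):
            return False  # counterexample state found
        for inputs in product([False, True], repeat=2):
            frontier.append(scan_cycle(state, inputs))
    return True

# Property "the motor can never run" is violated: pressing start latches it,
# so the checker finds the reachable violating state and reports False.
print(check_invariant({"running": False}, lambda s: not s["running"]))
```

Real model checkers express such properties in CTL or LTL and use symbolic rather than explicit state exploration, but the reachability core is the same.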

  8. Confidence Intervals Verification for Simulated Error Rate Performance of Wireless Communication System

    KAUST Repository

    Smadi, Mahmoud A.

    2012-12-06

    In this paper, we derive an efficient simulation method to evaluate the error rate of a wireless communication system. A coherent binary phase-shift keying system is considered with imperfect channel phase recovery. The results presented demonstrate the system performance under very realistic Nakagami-m fading and additive white Gaussian noise channel conditions. The accuracy of the obtained results is verified by running the simulation with a 95% confidence interval. We see that as the number of simulation runs N increases, the simulated error rate approaches the actual one and the confidence interval narrows. Hence our results are expected to be of significant practical use for such scenarios. © 2012 Springer Science+Business Media New York.
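The simulation-plus-confidence-interval check described in this record can be sketched as follows. This is a simplified stand-in, not the paper's method: AWGN only, ideal phase recovery, no Nakagami-m fading; the 1.96 factor gives the normal-approximation 95% interval on the estimated error rate.

```python
import math
import random

def simulate_bpsk_ber(ebn0_db, n_bits=200_000, seed=1):
    """Monte Carlo BER of coherent BPSK over AWGN with a 95% confidence
    interval (normal approximation)."""
    rng = random.Random(seed)
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))    # noise std dev for unit-energy bits
    errors = sum(1 for _ in range(n_bits)
                 if 1.0 + rng.gauss(0.0, sigma) < 0.0)  # send +1, decide on sign
    p = errors / n_bits
    half = 1.96 * math.sqrt(p * (1 - p) / n_bits)       # 95% CI half-width
    return p, (max(p - half, 0.0), p + half)

ber, ci = simulate_bpsk_ber(6.0)
theory = 0.5 * math.erfc(math.sqrt(10 ** 0.6))  # Q(sqrt(2*Eb/N0)) for BPSK/AWGN
print(ber, ci, theory)
```

As N grows, the interval half-width shrinks like 1/sqrt(N), which is the behavior the record describes.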

  9. Analytical model for performance verification of liquid poison injection system of a nuclear reactor

    International Nuclear Information System (INIS)

    Kansal, Anuj Kumar; Maheshwari, Naresh Kumar; Vijayan, Pallippattu Krishnan

    2014-01-01

    Highlights: • One-dimensional modelling of shut down system-2. • Semi-empirical correlation for poison jet progression. • Validation of the code. - Abstract: Shut down system-2 (SDS-2) in an advanced vertical pressure tube type reactor provides rapid reactor shutdown by high-pressure injection of a neutron-absorbing liquid, called poison, into the moderator in the calandria. Poison inside the calandria is distributed by poison jets issued from holes provided in the injection tubes. Effectiveness of the system depends on the rate and spread of the poison in the moderator. In this study, a transient one-dimensional (1D) hydraulic code, COPJET, is developed to predict the performance of the system by predicting the progression of the poison jet with time. Validation of COPJET is done with the data available in the literature. Thereafter, it is applied to the advanced vertical pressure tube type reactor.

  10. Front-End Electronics for Verification Measurements: Performance Evaluation and Viability of Advanced Tamper Indicating Measures

    International Nuclear Information System (INIS)

    Smith, E.; Conrad, R.; Morris, S.; Ramuhalli, P.; Sheen, D.; Schanfein, M.; Ianakiev, K.; Browne, M.; Svoboda, J.

    2015-01-01

    The International Atomic Energy Agency (IAEA) continues to expand its use of unattended, remotely monitored measurement systems. An increasing number of systems and an expanding family of instruments create challenges in terms of deployment efficiency and the implementation of data authentication measures. A collaboration between Pacific Northwest National Laboratory (PNNL), Idaho National Laboratory (INL), and Los Alamos National Laboratory (LANL) is working to advance the IAEA's capabilities in these areas. The first objective of the project is to perform a comprehensive evaluation of a prototype front-end electronics package, as specified by the IAEA and procured from a commercial vendor. This evaluation begins with an assessment against the IAEA's original technical specifications and expands to consider the strengths and limitations over a broad range of important parameters that include: sensor types, cable types, and the spectrum of industrial electromagnetic noise that can degrade signals from remotely located detectors. A second objective of the collaboration is to explore advanced tamper-indicating (TI) measures that could help to address some of the long-standing data authentication challenges with IAEA's unattended systems. The collaboration has defined high-priority tampering scenarios to consider (e.g., replacement of sensor, intrusion into cable), and drafted preliminary requirements for advanced TI measures. The collaborators are performing independent TI investigations of different candidate approaches: active time-domain reflectometry (PNNL), passive noise analysis (INL), and pulse-by-pulse analysis and correction (LANL). The initial investigations focus on scenarios where new TI measures are retrofitted into existing IAEA UMS deployments; subsequent work will consider the integration of advanced TI methods into new IAEA UMS deployments where the detector is separated from the front-end electronics. In this paper, project progress

  11. Performance test and verification of an off-the-shelf automated avian radar tracking system.

    Science.gov (United States)

    May, Roel; Steinheim, Yngve; Kvaløy, Pål; Vang, Roald; Hanssen, Frank

    2017-08-01

    Microwave radar is an important tool for observation of birds in flight and represents a tremendous increase in observation capability in terms of the amount of surveillance space that can be covered at relatively low cost. Based on off-the-shelf radar hardware, automated radar tracking systems have been developed for monitoring avian movements. However, radar used as an observation instrument in biological research has limitations that are important to be aware of when analyzing recorded radar data. This article describes a method for exploring the detection capabilities of a dedicated short-range avian radar system used inside the operational Smøla wind-power plant. The purpose of the testing was to find the maximum detection range for various sized birds, while controlling for the effects of flight tortuosity, flight orientation relative to the radar, and ground clutter. The method was to use a dedicated test target in the form of a remotely controlled unmanned aerial vehicle (UAV) with a calibrated radar cross section (RCS), which enabled the design of virtually any test flight pattern within the area of interest. The UAV had a detection probability of 0.5 within a range of 2,340 m from the radar. The detection performance obtained with the RCS-calibrated test target (-11 dBm², 0.08 m² RCS) was then extrapolated to find the corresponding performance for differently sized birds. Detection range depends on system sensitivity, the environment within which the radar is placed, and the spatial distribution of birds. The avian radar under study enables continuous monitoring of bird activity within a maximum range of up to 2 km, depending on the size of the birds in question. While small bird species may be detected up to 0.5-1 km, larger species may be detected up to 1.5-2 km from the radar.
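The extrapolation step in this record follows from the radar equation: for fixed system sensitivity, maximum detection range scales with the fourth root of the target's RCS. The sketch below uses the record's UAV reference values; the bird RCS figures are illustrative assumptions.

```python
# Extrapolating detection range from a calibrated test target to differently
# sized birds: R_max scales as RCS^(1/4) at fixed radar sensitivity.

def extrapolated_range(r_ref_m, rcs_ref_m2, rcs_target_m2):
    return r_ref_m * (rcs_target_m2 / rcs_ref_m2) ** 0.25

R_REF = 2340.0    # UAV detection range at P(detect) = 0.5 (m), from the record
RCS_REF = 0.08    # calibrated UAV RCS (m^2), i.e. -11 dBm^2, from the record

# Illustrative bird RCS values (m^2), not measurements from the study
for label, rcs in [("small passerine", 0.001), ("medium gull", 0.01), ("large raptor", 0.1)]:
    print(label, round(extrapolated_range(R_REF, RCS_REF, rcs)), "m")
```

With these assumed RCS values the small-bird range comes out near 0.8 km and the large-bird range near 2.5 km, broadly consistent with the 0.5-1 km and 1.5-2 km figures quoted in the record.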

  12. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models
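The benchmark comparisons described in this record amount to checking one code's estimates against another code's within an acceptance limit. A minimal sketch of that kind of check is below; the dose values and the 20% limit are illustrative assumptions, not numbers from the report.

```python
# Flag each test-code estimate whose relative difference from the reference
# code exceeds an acceptance limit. Values and tolerance are hypothetical.

def within_limits(test_values, reference_values, rel_tol=0.20):
    results = []
    for t, r in zip(test_values, reference_values):
        rel_diff = abs(t - r) / r
        results.append(rel_diff <= rel_tol)
    return results

riskind_doses = [1.2e-4, 3.5e-3, 7.9e-2]   # hypothetical dose estimates (Sv)
radtran_doses = [1.1e-4, 3.2e-3, 9.9e-2]   # hypothetical reference values (Sv)
print(within_limits(riskind_doses, radtran_doses))
```

A disagreement outside the limit, like the third pair here, is what triggers the closer model-by-model investigation a verification report documents.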

  13. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  14. Laboratory Testing and Performance Verification of the CHARIS Integral Field Spectrograph

    Science.gov (United States)

    Groff, Tyler D.; Chilcote, Jeffrey; Kasdin, N. Jeremy; Galvin, Michael; Loomis, Craig; Carr, Michael A.; Brandt, Timothy; Knapp, Gillian; Limbach, Mary Anne; Guyon, Olivier; hide

    2016-01-01

    The Coronagraphic High Angular Resolution Imaging Spectrograph (CHARIS) is an integral field spectrograph (IFS) that has been built for the Subaru telescope. CHARIS has two imaging modes; the high-resolution mode is R82, R69, and R82 in J, H, and K bands respectively, while the low-resolution discovery mode uses a second low-resolution prism with R19 spanning 1.15-2.37 microns (J+H+K bands). The discovery mode is meant to augment the low inner working angle of the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) adaptive optics system, which feeds CHARIS a coronagraphic image. The goal is to detect and characterize brown dwarfs and hot Jovian planets down to contrasts five orders of magnitude dimmer than their parent star at an inner working angle as low as 80 milliarcseconds. CHARIS constrains spectral crosstalk through several key aspects of the optical design. Additionally, the repeatability of alignment of certain optical components is critical to the calibrations required for the data pipeline. Specifically, the relative alignment of the lenslet array, prism, and detector must be highly stable and repeatable between imaging modes. We report on the measured repeatability and stability of these mechanisms, measurements of spectral crosstalk in the instrument, and the propagation of these errors through the data pipeline. Another key design feature of CHARIS is the prism, which pairs barium fluoride with Ohara L-BBH2 high-index glass. The dispersion of the prism is significantly more uniform than other glass choices, and the CHARIS prisms represent the first NIR astronomical instrument that uses L-BBH2 as the high-index material. This material choice was key to the utility of the discovery mode, so significant efforts were put into cryogenic characterization of the material. The final performance of the prism assemblies in their operating environment is described in detail. The spectrograph is going through final alignment, cryogenic cycling, and is being

  15. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders

    Science.gov (United States)

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  16. Performance verification of network function virtualization in software defined optical transport networks

    Science.gov (United States)

    Zhao, Yongli; Hu, Liyazhou; Wang, Wei; Li, Yajie; Zhang, Jie

    2017-01-01

    With the continuous opening up of resource acquisition and application, a large variety of network hardware appliances are deployed as communication infrastructure. Launching a new network application often implies replacing obsolete devices and requires the space and power to accommodate them, which increases the energy and capital investment. Network function virtualization (NFV) aims to address these problems by consolidating many types of network equipment onto industry-standard elements such as servers, switches and storage. Many types of IT resources have been deployed to run Virtual Network Functions (vNFs), such as virtual switches and routers. How to deploy NFV in optical transport networks is therefore a problem of great importance. This paper focuses on this problem and gives an implementation architecture for NFV-enabled optical transport networks based on Software Defined Optical Networking (SDON) with the procedure of vNF call and return. In particular, an implementation solution for an NFV-enabled optical transport node is designed, and a parallel processing method for NFV-enabled OTN nodes is proposed. To verify the performance of NFV-enabled SDON, the protocol interaction procedures of control function virtualization and node function virtualization are demonstrated on an SDON testbed. Finally, the benefits and challenges of the parallel processing method for NFV-enabled OTN nodes are simulated and analyzed.

  17. arXiv Performance verification of the CMS Phase-1 Upgrade Pixel detector

    CERN Document Server

    Veszpremi, Viktor

    2017-12-04

    The CMS tracker consists of two tracking systems utilizing semiconductor technology: the inner pixel and the outer strip detectors. The tracker detectors occupy the volume around the beam interaction region between 3 cm and 110 cm in radius and up to 280 cm along the beam axis. The pixel detector consists of 124 million pixels, corresponding to about 2 m² total area. It plays a vital role in the seeding of the track reconstruction algorithms and in the reconstruction of primary interactions and secondary decay vertices. It is surrounded by the strip tracker with 10 million read-out channels, corresponding to 200 m² total area. The tracker is operated in a high-occupancy and high-radiation environment established by particle collisions in the LHC. The current strip detector continues to perform very well. The pixel detector that has been used in Run 1 and in the first half of Run 2 was, however, replaced with the so-called Phase-1 Upgrade detector. The new system is better suited to match the increased inst...

  18. SU-E-T-350: Verification of Gating Performance of a New Elekta Gating Solution: Response Kit and Catalyst System

    Energy Technology Data Exchange (ETDEWEB)

    Xie, X; Cao, D; Housley, D; Mehta, V; Shepard, D [Swedish Cancer Institute, Seattle, WA (United States)

    2014-06-01

    Purpose: In this work, we have tested the performance of new respiratory gating solutions for Elekta linacs. These solutions include the Response gating kit and the C-RAD Catalyst surface mapping system. Verification measurements have been performed for a series of clinical cases. We also examined the beam-on latency of the system and its impact on delivery efficiency. Methods: To verify the benefits of tighter gating windows, a Quasar Respiratory Motion Platform was used. Its vertically moving plate acted as a respiration surrogate and was tracked by the Catalyst system to generate gating signals. A MatriXX ion-chamber array was mounted on its longitudinally moving platform. Clinical plans were delivered to a stationary and a moving MatriXX array at 100%, 50% and 30% gating windows, and gamma scores were calculated comparing the moving delivery results to the stationary result. It is important to note that as one moves to tighter gating windows, the delivery efficiency is impacted by the linac's beam-on latency. Using a specialized software package, we generated beam-on signals of lengths 1000 ms, 600 ms, 450 ms, 400 ms, 350 ms and 300 ms. As the gating windows get tighter, one can expect to reach a point where the dose rate falls to nearly zero, indicating that the gating window is close to the beam-on latency. A clinically useful gating window needs to be significantly longer than the latency of the linac. Results: As expected, the use of tighter gating windows improved delivery accuracy. However, a lower limit on the gating window, largely defined by the linac beam-on latency, exists at around 300 ms. Conclusion: The Response gating kit, combined with the C-RAD Catalyst, provides an effective solution for respiratory-gated treatment delivery. Careful patient selection, gating window design, and even visual/audio coaching may be necessary to ensure both delivery quality and efficiency. This research project is funded by Elekta.
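The efficiency trade-off this record describes can be put in one line of arithmetic: in each breathing cycle the beam is requested for the gating window, but the first `latency` seconds of every window are lost before the beam turns on. The breathing period below is an assumed value; the ~300 ms latency figure comes from the record.

```python
# Effective dose-delivery duty cycle for a gated delivery with beam-on latency.

def effective_duty_cycle(period_s, gate_fraction, latency_s):
    """Fraction of each breathing cycle during which dose is delivered."""
    window = period_s * gate_fraction
    return max(window - latency_s, 0.0) / period_s

PERIOD = 4.0     # assumed breathing period (s)
LATENCY = 0.3    # beam-on latency (s), order of the record's ~300 ms limit

for gate in (1.0, 0.5, 0.3):
    print(f"{gate:.0%} window -> {effective_duty_cycle(PERIOD, gate, LATENCY):.1%} delivered")
```

With a 4 s breathing period, a 30% gating window is already a 1.2 s window, so a 0.3 s latency consumes a quarter of it; once the window shrinks to the latency itself, the delivered fraction goes to zero, which is the practical lower limit the record identifies.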

  19. Development and verification of a high performance multi-group SP3 transport capability in the ARTEMIS core simulator

    International Nuclear Information System (INIS)

    Van Geemert, Rene

    2008-01-01

    For satisfaction of future global customer needs, dedicated efforts are being coordinated internationally and pursued continuously at AREVA NP. The currently ongoing CONVERGENCE project is committed to the development of the ARCADIA® next-generation core simulation software package. ARCADIA® will be put to global use by all AREVA NP business regions, for the entire spectrum of core design processes, licensing computations and safety studies. As part of the currently ongoing trend towards more sophisticated neutronics methodologies, an SP3 nodal transport concept has been developed for ARTEMIS, which is the steady-state and transient core simulation part of ARCADIA®. For enabling a high computational performance, the SPN calculations are accelerated by applying multi-level coarse mesh re-balancing. In the current implementation, SP3 is about 1.4 times as expensive computationally as SP1 (diffusion). The developed SP3 solution concept is foreseen as the future computational workhorse for many-group 3D pin-by-pin full core computations by ARCADIA®. With the entire numerical workload being highly parallelizable through domain decomposition techniques, associated CPU-time requirements that adhere to the efficiency needs in the nuclear industry can be expected to become feasible in the near future. The accuracy enhancement obtainable by using SP3 instead of SP1 has been verified by a detailed comparison of ARTEMIS 16-group pin-by-pin SPN results with KAERI's DeCart reference results for the 2D pin-by-pin Purdue UO2/MOX benchmark. This article presents the accuracy enhancement verification and quantifies the achieved ARTEMIS-SP3 computational performance for a number of 2D and 3D multi-group and multi-box (up to pin-by-pin) core computations. (authors)

  20. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single-shell tanks, i.e., the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by 'smart' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  1. Development of an Enhanced Payback Function for the Superior Energy Performance Program

    Energy Technology Data Exchange (ETDEWEB)

    Therkelsen, Peter; Rao, Prakash; McKane, Aimee; Sabouni, Ridah; Sheihing, Paul

    2015-08-03

    The U.S. DOE Superior Energy Performance (SEP) program provides recognition to industrial and commercial facilities that achieve certification to the ISO 50001 energy management system standard and third-party verification of energy performance improvements. Over 50 industrial facilities are participating and 28 facilities have been certified in the SEP program. These facilities find value in the robust, data-driven energy performance improvement result that the SEP program delivers. Previous analysis of SEP-certified facility data demonstrated the cost effectiveness of SEP and identified internal staff time to be the largest cost component related to SEP implementation and certification. This paper analyzes previously reported and newly collected data on the costs and benefits associated with ISO 50001 implementation and SEP certification. By disaggregating “sunk energy management system (EnMS) labor costs”, this analysis results in a more accurate and detailed understanding of the costs and benefits of SEP participation. SEP is shown to significantly improve and sustain energy performance and energy cost savings, resulting in a highly attractive return on investment. To illustrate these results, a payback function has been developed and is presented. On average, facilities with an annual energy spend greater than $2M can expect to implement SEP with a payback of less than 1.5 years. Finally, this paper also observes and details decreasing facility costs associated with implementing ISO 50001 and certifying to the SEP program, as the program has improved from pilot, to demonstration, to full launch.
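For intuition, simple payback is implementation cost divided by annual savings. The sketch below is a toy version of that calculation, not the record's actual payback function (which disaggregates labor and other costs); all input numbers are hypothetical.

```python
# Illustrative simple-payback calculation. Inputs are hypothetical; the SEP
# payback function described above models costs in much more detail.

def simple_payback_years(implementation_cost, annual_energy_spend, savings_fraction):
    annual_savings = annual_energy_spend * savings_fraction
    return implementation_cost / annual_savings

# A facility with a $2M annual energy spend achieving 10% savings against
# $250k of assumed implementation and certification cost:
print(round(simple_payback_years(250_000, 2_000_000, 0.10), 2))
```

Under these assumed inputs the payback lands at 1.25 years, in the same range as the "less than 1.5 years" figure the record quotes for facilities above $2M annual energy spend.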

  2. Hanford Site performance summary: EM funded programs

    International Nuclear Information System (INIS)

    Edwards, C.

    1995-09-01

    Hanford performance at fiscal year end reflects a three percent unfavorable schedule variance ($46.3 million*), which was an improvement over August 1995 ($46.3 million for September versus $65.9 million for August) and is below established reporting thresholds (greater than 3 percent). The majority of the behind-schedule condition (53 percent) is attributed to EM-40 (Office of Environmental Restoration [ER]) and is a result of late receipt of funds, procurement delays, and US Army Corps of Engineers (USACE) work planned but not accomplished. Other primary contributors to the behind-schedule condition are associated with tank farm upgrades, high-level waste disposal, and work for others (support to the US Department of Energy-Headquarters [DOE-HQ]). The remaining behind-schedule condition is distributed throughout the remaining Hanford programs and does not share common causes. A breakdown of individuals listed on page 8

  3. [Determinants of task preferences when performance is indicative of individual characteristics: self-assessment motivation and self-verification motivation].

    Science.gov (United States)

    Numazaki, M; Kudo, E

    1995-04-01

    The present study was conducted to examine determinants of information-gathering behavior with regard to one's own characteristics. Four tasks with different self-congruent and self-incongruent diagnosticity were presented to subjects. As self-assessment theory predicted, high-diagnosticity tasks were preferred to low-diagnosticity tasks. And as self-verification theory predicted, self-congruent diagnosticity had a stronger effect on task preference than self-incongruent diagnosticity. In addition, subjects who perceived the relevant characteristics as important were more inclined to choose self-assessment behavior than those who did not. Likewise, subjects who were certain of their self-concept were more inclined to choose self-verification behavior than those who were not. These results suggest that both self-assessment and self-verification motivations play important roles in information-gathering behavior regarding one's characteristics, and that the strength of these motivations is determined by the importance of the relevant characteristics or the certainty of the self-concept.

  4. International Performance Measurement and Verification Protocol: Concepts and Options for Determining Energy and Water Savings, Volume I (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    2002-03-01

    This protocol serves as a framework to determine energy and water savings resulting from the implementation of an energy efficiency program. It is also intended to help monitor the performance of renewable energy systems and to enhance indoor environmental quality in buildings.

  5. ANIMAL WASTE IMPACT ON SOURCE WATERS AIDED BY EPA/NSF ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) SOURCE WATER PROTECTION PILOT

    Science.gov (United States)

    The Environmental Technology Verification Program (ETV) was established in 1995 by the U.S. Environmental Protection Agency to encourage the development and commercialization of new environmental technologies through third-party testing and reporting of performance data. By ensur...

  6. Comparability of the performance of in-line computer vision for geometrical verification of parts, produced by Additive Manufacturing

    DEFF Research Database (Denmark)

    Pedersen, David B.; Hansen, Hans N.

    2014-01-01

    The field of Additive Manufacturing is growing at an accelerated rate, as prototyping is left in favor of direct manufacturing of components for industry and consumers. A consequence of mass-customization and component complexity is an adverse geometrical verification challenge. Mass...

  7. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENT STRIP™ TEST KIT

    Science.gov (United States)

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  8. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-A1, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  9. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.; hide

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on board the aircraft. The current NAS relies on pilots' vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume against which systems can be designed and measured. Extended research efforts have been conducted to understand and quantify the system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. These efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear from intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot

  10. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  11. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient, and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core that traditional physics testing programs require. This program also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)
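    The measured-versus-predicted comparison described above can be sketched in miniature. The function name, detector readings, and 5% acceptance tolerance below are hypothetical illustrations, not values taken from the INCISV method:

```python
# Hypothetical sketch of a measured-vs-predicted flux comparison: flag any
# detector whose integrated-flux signal deviates from the design prediction
# by more than an acceptance tolerance (readings and tolerance are made up).
def flag_flux_deviations(measured, predicted, tolerance=0.05):
    """Return (index, relative_deviation) pairs exceeding the tolerance."""
    flagged = []
    for i, (m, p) in enumerate(zip(measured, predicted)):
        deviation = abs(m - p) / p
        if deviation > tolerance:
            flagged.append((i, deviation))
    return flagged

# Detector 2 deviates by 10% and is flagged; the others pass.
print(flag_flux_deviations([1.00, 0.98, 1.10], [1.00, 1.00, 1.00]))
```

    An actual verification would compare full axial flux distributions at multiple rodded conditions and temperatures, but the pass/fail logic per measurement point has this general shape.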

  12. Human Performance Westinghouse Program; Programa Human Performance de Westinghouse

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Gutierrez, A.; Gil, C.

    2010-07-01

    The objective of the program is performance excellence: achieving customer success through flawless project execution. The program consists of several basic elements intended to reduce human error: the HuP tools, coaching, learning clocks, and the Know website. A document file is also available for consultation and practice. All of these elements are presented in this paper.

  13. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
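    The transfer of Weibull statistics from elementary specimen data to a full-scale structure mentioned above is commonly expressed through weakest-link volume scaling. The standard two-parameter form is shown below; the guideline's exact formulation may differ in detail:

```latex
% Failure probability of a component of stressed volume V under stress \sigma,
% given the Weibull modulus m, characteristic strength \sigma_0, and reference
% volume V_0 obtained from elementary specimen tests:
P_f(\sigma, V) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]
```

    A larger stressed volume raises the failure probability at a given stress, which is why elementary test data cannot be applied directly to a full-scale ceramic structure without such a transfer.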

  14. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J

    2005-12-21

    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta Georgia area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency was measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  15. Finite element program ARKAS: verification for IAEA benchmark problem analysis on core-wide mechanical analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Nakagawa, M.; Tsuboi, Y.

    1990-01-01

    ''ARKAS'' code verification, with the problems set in the International Working Group on Fast Reactors (IWGFR) Coordinated Research Programme (CRP) on the inter-comparison between liquid metal cooled fast breeder reactor (LMFBR) Core Mechanics Codes, is discussed. The CRP was co-ordinated by the IWGFR around problems set by Dr. R.G. Anderson (UKAEA) and arose from the IWGFR specialists' meeting on The Predictions and Experience of Core Distortion Behaviour (ref. 2). The problems for the verification (''code against code'') and validation (''code against experiment'') were set and calculated by eleven core mechanics codes from nine countries. All the problems have been completed and were solved with the core structural mechanics code ARKAS. Predictions by ARKAS agreed very well with other solutions for the well-defined verification problems. For the validation problems based on Japanese ex-reactor 2-D thermo-elastic experiments, the agreements between measured and calculated values were fairly good. This paper briefly describes the numerical model of the ARKAS code, and discusses some typical results. (author)

  16. Assessing the Costs and Benefits of the Superior Energy Performance Program

    Energy Technology Data Exchange (ETDEWEB)

    Therkelsen, Peter; McKane, Aimee; Sabouini, Ridah; Evans, Tracy

    2013-07-01

    Industrial companies are seeking to manage energy consumption and costs, mitigate risks associated with energy, and introduce transparency into reports of their energy performance achievements. Forty industrial facilities are participating in the U.S. DOE supported Superior Energy Performance (SEP) program, in which facilities implement an energy management system based on the ISO 50001 standard and pursue third-party verification of their energy performance improvements. SEP certification provides industrial facilities recognition for implementing a consistent, rigorous, internationally recognized business process for continually improving energy performance and achieving established energy performance improvement targets. This paper focuses on the business value of SEP and ISO 50001, providing an assessment of the costs and benefits associated with SEP implementation at nine SEP-certified facilities across a variety of industrial sectors. These cost-benefit analyses are part of the U.S. DOE's contribution to the Global Superior Energy Performance (GSEP) partnership, a multi-country effort to demonstrate, using facility data, that energy management system implementation enables companies to improve their energy performance with a greater return on investment than business-as-usual (BAU) activity. To examine the business value of SEP certification, interviews were conducted with SEP-certified facilities. The costs of implementing the SEP program, including internal facility staff time, are described, and a marginal payback of SEP certification has been determined. Additionally, more qualitative factors regarding the business value and challenges related to SEP and ISO 50001 implementation are summarized.

  17. The Skills Enhancement Training Program. Performance Report.

    Science.gov (United States)

    Food and Beverage Workers Union, Local 32, Washington, DC.

    This report describes a joint labor-management workplace literacy program called SET (Skills Enhancement Training) that targeted the more than 2,000 unionized employees of food service contractors at U.S. government institutions in Washington, D.C. Nineteen classes were offered and a total of 191 people self-selected themselves into the program.…

  18. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system; subsequent versions were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: reason(s) why a posteriori verification is to be performed; scope and objectives for the level of verification selected; development products to be used for the review; availability and use of user experience; and actions to be taken to supplement missing or unavailable development products. The purpose, scope, and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  19. A Preliminary Verification and Validation (V and V) Methodology for the Artifacts Programmed with a Hardware Description Language (HDL)

    International Nuclear Information System (INIS)

    Suh, Yong Suk; Keum, Jong Yong; Park, Je Youn; Jo, Ki Ho; Jo, Chang Whan

    2008-01-01

    Nowadays, the FPGA (Field Programmable Gate Array) is widely used in various fields of industry. The FPGA evolved from the technology of the PLD (Programmable Logic Device). The FPGA provides more logic gates than the PLD, integrating millions of programmable logic gates into a chip. It also provides massive, fast, and reliable processing performance. Thus, a system's functions can be integrated into an FPGA, which can serve as a SoC (System on Chip). Furthermore, an FPGA-based DSP can be made, in which the DSP functions are implemented with an FPGA. With these merits, the FPGA is also used in the nuclear industry; for example, safety-critical I and C components are manufactured with FPGAs. The FPGA is programmed with an HDL. The quality of the artifacts programmed with an HDL can affect the quality of an FPGA: if a hazardous fault exists in an FPGA artifact and is activated during operation, an accident caused by the fault in the FPGA occurs. It is therefore necessary to ensure the quality of the artifacts. This paper, for the purpose of applying it to the SMART (System-integrated Modular Advanced ReacTor) MMIS project, presents a preliminary V and V methodology for artifacts programmed with an HDL. To this end, we reviewed the following items: characteristics of HDL programming; applicable requirements for an HDL program used in safety-critical systems; and fault modes of an FPGA. Based on the review, we establish the preliminary V and V methodology

  20. School Breakfast Program and School Performance

    OpenAIRE

    J Gordon Millichap

    1989-01-01

    The effects of participation in the school breakfast program by low income children on academic achievement and rates of absence and tardiness are reported from the Department of Pediatrics, Boston City Hospital, Boston, MA.

  1. Soil conservation: Market failure and program performance

    OpenAIRE

    Paul Gary Wyckoff

    1983-01-01

    An examination of the economic rationale behind soil conservation programs, an assessment of the magnitude of the soil erosion problem, and an evaluation of the effectiveness of U.S. soil conservation policies.

  2. School Breakfast Program and school performance.

    Science.gov (United States)

    Meyers, A F; Sampson, A E; Weitzman, M; Rogers, B L; Kayne, H

    1989-10-01

    To test the hypothesis that participation in the School Breakfast Program by low-income children is associated with improvements in standardized achievement test scores and in rates of absence and tardiness, children in grades 3 through 6 were studied in the Lawrence, Mass, public schools, where the School Breakfast Program was begun at the start of the second semester of the 1986-1987 school year. The changes in scores on a standardized achievement test and in rates of absence and tardiness before and after the implementation of the School Breakfast Program for children participating in the program were compared with those of children who also qualified but did not participate. Controlling for other factors, participation in the School Breakfast Program contributed positively to the 1987 Comprehensive Tests of Basic Skills battery total scale score and negatively to 1987 tardiness and absence rates. These findings suggest that participation in the School Breakfast Program is associated with significant improvements in academic functioning among low-income elementary school children.

  3. Investigation and Verification of the Aerodynamic Performance of a Fan/Booster with Through-flow Method

    Science.gov (United States)

    Liu, Xiaoheng; Jin, Donghai; Gui, Xingmin

    2018-04-01

    The through-flow method is still widely applied in the design of turbomachinery, as it can provide not merely the performance characteristics but also the flow field. In this study, a program based on the through-flow method was developed and verified against many other numerical examples. To improve the accuracy of the calculation, abundant loss and deviation models dependent on the real engine geometry were employed, such as viscous losses, overflow in gaps, and leakage through seals. By means of this program, the aerodynamic performance of a certain high through-flow commercial fan/booster was investigated. Based on the radial distributions of the relevant parameters, flow deterioration in this machine was suspected. To confirm this surmise, 3-D numerical simulation was carried out with the help of the NUMECA software. Detailed analysis confirmed the speculation, providing sufficient evidence for the conclusion that the through-flow method is an essential and effective method for the performance prediction of the fan/booster.

  4. Performance Verification of Production-Scalable Energy-Efficient Solutions: Winchester/Camberley Homes Mixed-Humid Climate

    Energy Technology Data Exchange (ETDEWEB)

    Mallay, D.; Wiehagen, J.

    2014-07-01

    Winchester/Camberley Homes collaborated with the Building America program and its NAHB Research Center Industry Partnership to develop a new set of high-performance home designs that could be applicable on a production scale. The new home designs are to be constructed in mixed-humid climate zone four and could eventually apply to all of the builder's home designs to meet or exceed future energy codes or performance-based programs. However, the builder recognized that the combination of new wall framing designs and materials, higher levels of insulation in the wall cavity, and more detailed air sealing to achieve lower infiltration rates changes the moisture characteristics of the wall system. In order to ensure long-term durability and repeatable, successful implementation with few call-backs, this report demonstrates through measured data that the wall system functions as a dynamic system, responding to changing interior and outdoor environmental conditions within recognized limits of the materials that make up the wall system. A similar investigation was made with respect to the complete redesign of the heating, cooling, air distribution, and ventilation systems, intended to optimize the equipment size and configuration to significantly improve efficiency while maintaining indoor comfort. Recognizing the need to demonstrate the benefits of these efficiency features, the builder offered a new house model to serve as a test case to develop framing designs; evaluate material selections and installation requirements, changes to work scopes, and contractor learning curves; and compare theoretical performance characteristics with measured results.

  5. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert (Oak Ridge National Laboratory, Oak Ridge, TN); McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  6. Enhanced human performance of utility maintenance programs

    International Nuclear Information System (INIS)

    Fresco, A.; Haber, S.; O'Brien, J.

    1993-01-01

    Assuring the safe operation of a nuclear power plant depends, to a large extent, on how effectively one understands and manages the aging-related degradation that occurs in structures, systems, and components (SSCs). Aging-related degradation is typically managed through a nuclear plant's maintenance program. A review of 44 Maintenance Team Inspection (MTI) Reports indicated that while some plant organizations appeared to assume a proactive mode in preventing aging-related failures of their SSCs important to safety, others seemed to be taking a passive or reactive mode. Across all plants, what is clearly needed is a strong recognition of the importance of aging-related degradation and the use of existing organizational assets to effectively detect and mitigate its effects. Many of those assets can be enhanced by the consideration of the organizational and management factors necessary for the implementation of an effective aging management program. This report provides a discussion of this program

  7. Independent verification: operational phase liquid metal breeder reactors

    International Nuclear Information System (INIS)

    Bourne, P.B.

    1981-01-01

    The Fast Flux Test Facility (FFTF) recently achieved 100-percent power and now is in the initial stages of operation as a test reactor. An independent verification program has been established to assist in maintaining stable plant conditions, and to assure the safe operation of the reactor. Independent verification begins with the development of administrative procedures to control all other procedures and changes to the plant configurations. The technical content of the controlling procedures is subject to independent verification. The actual accomplishment of test procedures and operational maneuvers is witnessed by personnel not responsible for operating the plant. Off-normal events are analyzed, problem reports from other operating reactors are evaluated, and these results are used to improve on-line performance. Audits are used to confirm compliance with established practices and to identify areas where individual performance can be improved

  8. 45 CFR 1183.40 - Monitoring and reporting program performance.

    Science.gov (United States)

    2010-10-01

    ... FOUNDATION ON THE ARTS AND THE HUMANITIES INSTITUTE OF MUSEUM AND LIBRARY SERVICES UNIFORM ADMINISTRATIVE... must cover each program, function or activity. (b) Nonconstruction performance reports. The Federal...

  9. Performance improvement program: goals and experience

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, F. [Point Lepreau Generating Station, Maces Bay, New Brunswick (Canada)

    2015-07-01

    Following a long 54-month refurbishment outage at Point Lepreau Generating Station, operational performance had fallen below industry standards in a number of areas. Leadership development and succession planning had stalled. Operational focus was low, primarily due to the construction focus during refurbishment. The condition of the balance of plant was poor, including several long-standing deficiencies. In order to improve performance, the site implemented a framework based on INPO 12-011: focus on improving behaviours; set common goals and demonstrate results; align and engage the organization; drive to achieve high levels of performance and sustain performance.

  10. Performance improvement program: goals and experience

    International Nuclear Information System (INIS)

    Guglielmi, F.

    2015-01-01

    Following a long 54-month refurbishment outage at Point Lepreau Generating Station, operational performance had fallen below industry standards in a number of areas. Leadership development and succession planning had stalled. Operational focus was low, primarily due to the construction focus during refurbishment. The condition of the balance of plant was poor, including several long-standing deficiencies. In order to improve performance, the site implemented a framework based on INPO 12-011: focus on improving behaviours; set common goals and demonstrate results; align and engage the organization; drive to achieve high levels of performance and sustain performance.

  11. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform a bit-shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
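The replacement the abstract describes (a serial bit-shift load swapped for a direct "backdoor" register write) can be pictured with a toy simulation model. The class and method names below are illustrative assumptions, not the patent's actual implementation.

```python
# Toy sketch: a serial (bit-shift) register load vs. a direct register access
# in an IC simulation. Names (SimRegister, serial_load, direct_load) are
# illustrative assumptions.

class SimRegister:
    """A width-bit register in a simulated IC."""
    def __init__(self, width):
        self.width = width
        self.value = 0

    def serial_load(self, bits):
        # Serial operation: shift in one bit per simulated cycle (slow).
        cycles = 0
        for b in bits:  # MSB first
            self.value = ((self.value << 1) | b) & ((1 << self.width) - 1)
            cycles += 1
        return cycles

    def direct_load(self, value):
        # Direct register access: one backdoor write, no per-bit cycles.
        self.value = value & ((1 << self.width) - 1)
        return 1  # single simulated access

r1, r2 = SimRegister(8), SimRegister(8)
serial_cycles = r1.serial_load([1, 0, 1, 1, 0, 0, 1, 0])  # shifts in 0xB2
direct_cycles = r2.direct_load(0xB2)
assert r1.value == r2.value  # same final register state
print(serial_cycles, direct_cycles)
```

Both loads leave the register in the same state, which is why blocking the serial path and substituting the direct access preserves functional behavior while cutting simulated cycles.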

  12. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure take place is shown in Figure 1, but the phases may be performed in any order.

  13. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
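The likelihood-ratio rule described above can be sketched with a toy score model: multiply per-biometric likelihood ratios (assuming independence) and compare against accept/reject thresholds, sending inconclusive cases to a second acquisition stage. The Gaussian score distributions and thresholds are invented for illustration and are not the paper's fitted model.

```python
# Toy likelihood-ratio verification policy. The genuine/imposter score models
# and thresholds are assumptions, not India's actual calibrated model.
import math

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(scores, genuine=(0.8, 0.1), imposter=(0.3, 0.15)):
    # Product over acquired similarity scores (independence assumed).
    lr = 1.0
    for s in scores:
        lr *= gauss_pdf(s, *genuine) / gauss_pdf(s, *imposter)
    return lr

def decide(scores, accept_thresh=100.0, reject_thresh=0.01):
    # Two-stage flavor: inconclusive first-stage results trigger more acquisitions.
    lr = likelihood_ratio(scores)
    if lr >= accept_thresh:
        return "genuine"
    if lr <= reject_thresh:
        return "imposter"
    return "acquire more biometrics"

print(decide([0.82, 0.79]))  # high scores: accept as genuine
print(decide([0.28, 0.33]))  # low scores: reject as imposter
print(decide([0.55]))        # inconclusive: go to second stage
```

The individualized policies in the paper additionally choose *which* biometrics to acquire per resident so as to minimize FRR under FAR and delay constraints; this sketch shows only the core accept/reject/continue decision.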

  14. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

    Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so the designers can focus largely on the functional correctness of the HDL designs. Nuclear regulation authorities, however, require a more thorough demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has empirically accepted them as correct and safe. To provide this assurance, industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. Several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality') can verify the correctness of the translation from Verilog into EDIF programs, but they are expensive to use and hard to apply to the work of third-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, we can be assured that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system while preserving behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to demonstrate the efficiency of the proposed formal verification technique and the implemented translator.
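The behavioural-equivalence check at the heart of the proposed technique can be illustrated in miniature: compare a behavioural ("Verilog-level") model against a gate-level ("netlist-level") model over all inputs. Real tools such as VIS prove this symbolically rather than by enumeration; the full-adder models here are assumptions for illustration only.

```python
# Toy behavioural-equivalence check: a behavioural full adder vs. its
# gate-level (XOR/AND/OR) synthesis, compared exhaustively over all inputs.
from itertools import product

def verilog_model(a, b, cin):
    # Behavioural description: sum and carry as arithmetic.
    total = a + b + cin
    return total & 1, total >> 1

def netlist_model(a, b, cin):
    # Gate-level description synthesized into primitive gates.
    s1 = a ^ b
    carry = (a & b) | (s1 & cin)
    return s1 ^ cin, carry

equivalent = all(
    verilog_model(*bits) == netlist_model(*bits)
    for bits in product((0, 1), repeat=3)
)
print("behaviourally equivalent:", equivalent)
```

Exhaustive comparison only works for tiny circuits; the point of BDD/SAT-based tools like VIS is to establish the same equivalence without enumerating the input space.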

  15. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so the designers can focus largely on the functional correctness of the HDL designs. Nuclear regulation authorities, however, require a more thorough demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has empirically accepted them as correct and safe. To provide this assurance, industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. Several formal verification tools (e.g., 'FormalPro', 'Conformal', 'Formality') can verify the correctness of the translation from Verilog into EDIF programs, but they are expensive to use and hard to apply to the work of third-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, we can be assured that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV', which translates EDIF into BLIF-MV as an input front-end of the VIS system while preserving behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to demonstrate the efficiency of the proposed formal verification technique and the implemented translator.

  16. Advanced Certification Program for Computer Graphic Specialists. Final Performance Report.

    Science.gov (United States)

    Parkland Coll., Champaign, IL.

    A pioneer program in computer graphics was implemented at Parkland College (Illinois) to meet the demand for specialized technicians to visualize data generated on high performance computers. In summer 1989, 23 students were accepted into the pilot program. Courses included C programming, calculus and analytic geometry, computer graphics, and…

  17. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF TWO HYDROGEN SULFIDE ANALYZERS: HORIBA INSTRUMENTS, INC., APSA-360 AND TELEDYNE-API MODEL 101E

    Science.gov (United States)

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  18. Performance Verification of Production-Scalable Energy-Efficient Solutions: Winchester/Camberley Homes Mixed-Humid Climate

    Energy Technology Data Exchange (ETDEWEB)

    Mallay, D. [Partnership for Home Innovation, Upper Marlboro, MD (United States); Wiehagen, J. [Partnership for Home Innovation, Upper Marlboro, MD (United States)

    2014-07-01

    Winchester/Camberley Homes collaborated with the Building America team Partnership for Home Innovation to develop a new set of high performance home designs that could be applicable on a production scale. The new home designs are to be constructed in the mixed humid climate zone and could eventually apply to all of the builder's home designs to meet or exceed future energy codes or performance-based programs. However, the builder recognized that the combination of new wall framing designs and materials, higher levels of insulation in the wall cavity, and more detailed air sealing to achieve lower infiltration rates changes the moisture characteristics of the wall system. In order to ensure long term durability and repeatable successful implementation with few call-backs, the project team demonstrated through measured data that the wall system functions as a dynamic system, responding to changing interior and outdoor environmental conditions within recognized limits of the materials that make up the wall system. A similar investigation was made with respect to the complete redesign of the HVAC systems to significantly improve efficiency while maintaining indoor comfort. Recognizing the need to demonstrate the benefits of these efficiency features, the builder offered a new house model to serve as a test case to develop framing designs, evaluate material selections and installation requirements, changes to work scopes and contractor learning curves, as well as to compare theoretical performance characteristics with measured results.

  19. Verification of the HDR-test V44 using the computer program RALOC-MOD1/83

    International Nuclear Information System (INIS)

    Jahn, H.; Pham, T. v.; Weber, G.; Pham, B.T.

    1985-01-01

    RALOC-MOD1/83 was extended by a drainage and sump-level module and several component models to serve as a containment systems code for various LWR types. One such application is to simulate the blowdown in a full-pressure containment, which is important for the short- and long-term hydrogen distribution. The post-test calculation of the containment standard problem experiment HDR-V44 shows good agreement with the test data. The code may be used for short- and long-term predictions, but it was learned that, for double containments, the gap between the inner and outer shell must be represented by several zones to achieve a good long-term temperature prediction. The present work completes the development, verification and documentation of RALOC-MOD1. (orig.) [de

  20. Performance evaluation and dose verification of the low dose rate permanent prostate brachytherapy system at the Korle-Bu Teaching Hospital

    International Nuclear Information System (INIS)

    Asenso, Y.A.

    2015-07-01

    .55 % respectively. That of the physical and internal grid alignment yielded a maximum discrepancy of 2.67 ± 0.01 mm at position 6A on the template. The probe retraction test produced no discrepancies between the “clicks” and the corresponding distances. Meanwhile, for the depth-of-penetration and the axial and lateral resolution tests, no standard measurements were available for comparison at the time the tests were performed. The dose verification consisted of three tests: the calibration point test, the source strength verification and the TPS dose verification. The calibration point test indicated that the distance of maximum ionization chamber sensitivity is 3 cm, so seeds can be calibrated at this point. The source strength verification results were within the tolerances recommended by ICRU Report 38 (ICRU, 1985). The average source strength measured was 0.651450 U ± 0.001052 U, deviating from the manufacturer value of 0.64989 U by 0.242 % ± 0.164 %. The TPS dose verification test produced results with significant errors, which occurred due to post-irradiation development of the film with time, but the doses obtained by the TPS and the film followed the same pattern. The outcome of the performance evaluations indicates that, for patient work, the ultrasound system and prostate brachytherapy system can provide the mechanism for accurate positioning of the brachytherapy seeds, facilitating reliable identification of the target volume for accurate, effective treatment. (au)

  1. Development and verification of an excel program for calculation of monitor units for tangential breast irradiation with external photon beams

    International Nuclear Information System (INIS)

    Woldemariyam, M.G.

    2015-07-01

    The accuracy of MU calculations performed with the Prowess Panther TPS (for Co-60) and Oncentra (for 6 MV and 15 MV x-rays) for tangential breast irradiation was evaluated with measurements made in an anthropomorphic phantom using calibrated Gafchromic EBT2 films. An Excel program was developed which takes into account the external body-surface irregularity of an intact breast or chest wall (and hence the absence of full-scatter conditions) using Clarkson's sector-summation technique. A single surface contour of the patient, obtained in a transverse plane containing the MU calculation point, was required for effective implementation of the program. The outputs of the Excel program were validated against the respective outputs of the 3D treatment planning systems. The variations between the measured point doses and their calculated counterparts from the TPSs were within the range of -4.74% to 4.52% (mean of -1.33% and SD of 2.69) for the Prowess Panther TPS and -4.42% to 3.14% (mean of -1.47% and SD of -3.95) for the Oncentra TPS. The observed degree of deviation may be attributed to limitations of the dose calculation algorithm within the TPSs, set-up inaccuracies of the phantom during irradiation, and inherent uncertainties associated with radiochromic film dosimetry. The percentage deviations between MUs calculated with the two TPSs and the Excel program were within the range of -3.45% to 3.82% (mean of 0.83% and SD of 2.25). The observed percentage deviations are within the 4% action level recommended by TG-114. This indicates that the Excel program can be confidently employed for calculating MUs for 2D-planned tangential breast irradiations, or to independently verify MUs calculated with other methods. (au)
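Clarkson's sector-summation technique mentioned above can be sketched as follows: the scatter contribution at the calculation point is averaged over equal angular sectors whose radii follow the irregular body contour. The scatter-air-ratio function and contour radii below are illustrative assumptions, not clinical data.

```python
# Hedged sketch of Clarkson sector summation for an irregular contour.
# The scatter-air-ratio curve and radii are invented for illustration.
import math

def scatter_air_ratio(radius_cm):
    # Illustrative monotone SAR curve saturating at 0.30 (assumption).
    return 0.30 * (1.0 - math.exp(-0.25 * radius_cm))

def clarkson_sar(contour_radii):
    """Average SAR over equal angular sectors with contour-dependent radii."""
    return sum(scatter_air_ratio(r) for r in contour_radii) / len(contour_radii)

# Radii sampled from a single transverse contour (cm) - illustrative values;
# short radii model where the beam exits the curved breast surface early.
radii = [4.0, 5.5, 6.0, 5.0, 3.5, 2.0, 2.5, 3.0]
sar = clarkson_sar(radii)
print(f"mean scatter-air ratio: {sar:.3f}")
```

The per-sector radii are exactly what the single transverse contour in the abstract provides, which is why one contour through the calculation point suffices for the program.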

  2. Dynamic Performance Tuning Supported by Program Specification

    Directory of Open Access Journals (Sweden)

    Eduardo César

    2002-01-01

    Full Text Available Performance analysis and tuning of parallel/distributed applications are very difficult tasks for non-expert programmers, so it is necessary to provide tools that carry out these tasks automatically. These can be static tools that carry out the analysis in a post-mortem phase, or tools that tune the application on the fly. Both kinds of tools have their target applications: static automatic analysis tools are suitable for stable applications, while dynamic tuning tools are more appropriate for applications with dynamic behaviour. In this paper, we describe KappaPi as an example of a static automatic performance analysis tool, and also a general environment based on parallel patterns for developing and dynamically tuning parallel/distributed applications.

  3. Performance Demonstration Program Plan for the WIPP Experimental-Waste Characterization Program

    International Nuclear Information System (INIS)

    1991-02-01

    The Performance Demonstration Program is designed to ensure that compliance with the Quality Assurance Objective, identified in the Quality Assurance Program Plan for the WIPP Experimental-Waste Characterization Program (QAPP), is achieved. This Program Plan is intended for use by the WPO to assess the laboratory support provided for the characterization of WIPP TRU waste by the storage/generator sites. Phase 0 of the Performance Demonstration Program encompasses the analysis of headspace gas samples for inorganic and organic components. The WPO will ensure the implementation of this plan by designating an independent organization to coordinate and provide technical oversight for the program (Program Coordinator). Initial program support, regarding the technical oversight and coordination functions, shall be provided by the USEPA-ORP. This plan identifies the criteria that will be used for the evaluation of laboratory performance, the responsibilities of the Program Coordinator, and the responsibilities of the participating laboratories. 5 tabs

  4. Development of the Performance Confirmation Program at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    G.D. LeCain; D. Barr; D. Weaver; R. Snell; S.W. Goodin; F.D. Hansen

    2006-01-01

    The Yucca Mountain Performance Confirmation program consists of tests, monitoring activities, experiments, and analyses to evaluate the adequacy of the assumptions, data, and analyses that form the basis of the conceptual and numerical models of flow and transport associated with a proposed radioactive waste repository at Yucca Mountain, Nevada. The Performance Confirmation program uses an eight-stage, risk-informed, performance-based approach. Selection of the Performance Confirmation activities (a parameter and a test method) for inclusion in the program was done using a risk-informed, performance-based decision analysis. The result of this analysis and review was a Performance Confirmation base portfolio that consists of 20 activities. The 20 Performance Confirmation activities include geologic, hydrologic, and construction/engineering testing. Several of the activities were initiated during site characterization and are ongoing. Other activities will commence during construction and/or post-emplacement and will continue until repository closure.

  5. Development of advanced earthquake resistant performance verification on reinforced concrete underground structures. Pt. 3. Applicability of soil-structure Interaction analysis using nonlinear member model

    International Nuclear Information System (INIS)

    Matsui, Jun; Ohtomo, Keizo; Kawai, Tadashi; Kanatani, Mamoru; Matsuo, Toyofumi

    2003-01-01

    The objective of this study is to obtain verification data concerning the performance of RC duct-type underground structures subject to strong earthquakes. This paper presents the results of numerical simulations of shaking table tests conducted on box-type structure models at a scale of about 1/2. We proposed practical nonlinear member models, in which the mechanical properties of RC members and soil are defined as hysteresis models (RC: axial-force-dependent degrading tri-linear model; soil: modified Ramberg-Osgood model), and joint elements are used to evaluate the interaction along the interface between the soil and the RC structure, including slippage and separation. Consequently, the proposed models could simulate the test results on the deformation of the soil and the RC structure, as well as the damage to the RC structure, which is important in verifying seismic performance with practical accuracy. (author)
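The modified Ramberg-Osgood soil model named above can be sketched as a one-line stress-strain relation whose secant shear modulus degrades as stress grows. The parameter values below are illustrative assumptions, not the paper's calibration.

```python
# Minimal Ramberg-Osgood-type shear model sketch (parameters are assumptions):
# gamma = tau/G0 * (1 + alpha * |tau/tau_f| ** beta)

def ramberg_osgood_strain(tau, g0=80e6, tau_f=100e3, alpha=1.6, beta=1.5):
    """Shear strain for shear stress tau (Pa); g0 is the small-strain modulus."""
    return (tau / g0) * (1.0 + alpha * abs(tau / tau_f) ** beta)

def secant_modulus(tau, **params):
    # Secant modulus G_sec = tau / gamma; degrades with increasing stress.
    return tau / ramberg_osgood_strain(tau, **params)

for tau in (10e3, 50e3, 90e3):
    print(f"tau={tau/1e3:.0f} kPa  G_sec/G0={secant_modulus(tau)/80e6:.3f}")
```

This captures the strain-dependent modulus reduction the R-O model provides; the elasto-plastic model compared in the companion paper additionally tracks mean-effective-stress changes and dilatancy, which a stress-only relation like this cannot.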

  6. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report. Daniel G. Gdula* and

  7. Performance evaluation of a distance learning program.

    OpenAIRE

    Dailey, D. J.; Eno, K. R.; Brinkley, J. F.

    1994-01-01

    This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macintosh on the other side of the country accessing the data over the Internet."

  8. Performance evaluation of a distance learning program.

    Science.gov (United States)

    Dailey, D J; Eno, K R; Brinkley, J F

    1994-01-01

    This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macintosh on the other side of the country accessing the data over the Internet." The methodology presented is applicable to other client-server applications that are rapidly appearing on the Internet.
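A single-number response-time metric of the kind described can be sketched as the frequency-weighted mean delay of a "typical" command mix; with plausible (invented) timings, a slow local machine and a fast remote one can indeed come out nearly equal. The command names, frequencies, and delays below are assumptions, not the Browser's measured data.

```python
# Sketch: expected delay of a typical command mix, weighting each command's
# measured response time by its observed frequency (all numbers invented).

def typical_delay(command_mix, response_times):
    """Expected delay (s) over the command mix; frequencies sum to 1."""
    return sum(freq * response_times[cmd] for cmd, freq in command_mix.items())

mix = {"load_image": 0.2, "pan": 0.5, "annotate": 0.3}          # observed frequencies
slow_local = {"load_image": 4.0, "pan": 0.5, "annotate": 1.0}   # slow Mac, local net
fast_remote = {"load_image": 3.9, "pan": 0.6, "annotate": 1.0}  # fast Mac, Internet

print(typical_delay(mix, slow_local), typical_delay(mix, fast_remote))
```

Collapsing a non-deterministic delay distribution to one expectation is exactly what makes the two configurations directly comparable in a single number.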

  9. Development of advanced earthquake resistant performance verification on reinforced concrete underground structure. Pt. 2. Verification of the ground modeling methods applied to non-linear soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Kawai, Tadashi; Kanatani, Mamoru; Ohtomo, Keizo; Matsui, Jun; Matsuo, Toyofumi

    2003-01-01

    In order to develop an advanced verification method for the earthquake-resistant performance of reinforced concrete underground structures, the applicability of two different soil modeling methods in numerical analysis was verified through non-linear dynamic simulations of large shaking table tests conducted on a model comprising free-field soil and a reinforced concrete two-box-culvert structure. In these simulations, the structure was modeled by beam-type elements with a tri-linear relation between curvature and flexural moment. The soil was modeled by the Ramberg-Osgood model as well as by an elasto-plastic constitutive model. The former only captures the non-linearity of the shear modulus with respect to strain and initial stress conditions, whereas the latter can also express the non-linearity of the shear modulus caused by changes in mean effective stress during ground excitation and by soil dilatancy. The elasto-plastic constitutive model could therefore precisely simulate the vertical acceleration and displacement response at the ground surface, which were produced by soil dilation during shaking with a horizontal base input in the model tests. In addition, this model can explain the distinctive dynamic earth pressure acting on the vertical walls of the structure, which was also confirmed to be related to soil dilation. However, since both modeling methods could express the shear force on the upper slab surface of the model structure, which plays the predominant role in structural deformation, both were equally applicable to evaluating the seismic performance of structures similar to the model structure of this study. (author)

  10. Performance Assessment Strategy Plan for the Geologic Repository Program

    International Nuclear Information System (INIS)

    1990-01-01

    Performance assessment is a major constituent of the program being conducted by the US Department of Energy (DOE) to develop a geologic repository. Performance assessment is the set of activities needed for quantitative evaluations to assess compliance with the performance requirements in the regulations for a geologic repository and to support the development of the repository. The strategy for these evaluations has been documented in the Performance Assessment Strategy Plan (DOE, 1989). The implementation of the performance assessment strategy is defined in this document. This paper discusses the scope and objectives of the implementation plan, the relationship of the plan to other program plans, summarizes the performance assessment areas and the integrated strategy of the performance assessment program. 1 fig., 3 tabs

  11. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods differ both in their process of operation and in the way the result is achieved. The article describes static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The relevant pros and cons of each method are emphasized, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), sizes of heap areas, lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, along with the kinds of tools that can be applied to software when using dynamic analysis methods.
Based on this work a conclusion is drawn describing the most relevant problems of the analysis techniques, methods of their solution and
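A minimal taste of symbolic execution, the method the survey highlights: keep the input symbolic, let each branch contribute a path constraint, and produce one concrete witness per feasible path. The path enumeration below is written out by hand for a toy function; real engines derive the constraints automatically with a constraint solver.

```python
# Hand-enumerated symbolic execution of a toy function: each feasible path is
# a constraint on the symbolic input x, checked with a concrete witness.

def analyzed(x):
    if x > 10:
        if x < 20:
            return "error"   # only reachable for 10 < x < 20
        return "big"
    return "small"

def explore():
    """Enumerate paths as (path constraint, witness, outcome) triples."""
    return [
        (constraint, witness, analyzed(witness))
        for constraint, witness in [
            ("x <= 10", 10),
            ("x > 10 and x < 20", 15),
            ("x >= 20", 20),
        ]
    ]

for constraint, witness, outcome in explore():
    print(f"{constraint:20s} x={witness:3d} -> {outcome}")
```

The value of the technique is that the "error" branch is discovered with its exact triggering condition (10 < x < 20), rather than by luck from random testing.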

  12. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during image acquisition. This non-linear deformation changes both the position and the orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework was developed, and a cost-sensitive classifier was found to produce the best results. The system was evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.
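The alignment-and-scoring stage described above can be sketched in toy form: after alignment, the matching score is the fraction of template minutiae that find a sufficiently close, similarly oriented counterpart in the input. The minutiae, tolerances, and distortion below are illustrative assumptions, not the paper's actual matcher.

```python
# Toy minutiae matcher: score = fraction of template minutiae matched within
# position/orientation tolerances (all values invented for illustration).
import math

def match_score(template, candidate, dist_tol=10.0, angle_tol=0.3):
    matched = 0
    for (tx, ty, ta) in template:
        for (cx, cy, ca) in candidate:
            if math.hypot(tx - cx, ty - cy) <= dist_tol and abs(ta - ca) <= angle_tol:
                matched += 1
                break
    return matched / len(template)

template = [(10, 20, 0.1), (40, 35, 1.2), (70, 60, -0.5), (25, 80, 2.0)]
# Candidate with small nonlinear distortion applied to three minutiae; the
# fourth minutia has no counterpart.
candidate = [(12, 23, 0.15), (43, 31, 1.25), (68, 64, -0.45), (90, 90, 0.0)]
print(f"matching score: {match_score(template, candidate):.2f}")
```

The tolerances are precisely where non-linear distortion bites: too tight and distorted genuine prints fail, too loose and imposters pass, which motivates the fuzzy clustering and classifier stages on top of the raw score.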

  13. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company
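A hedged sketch of the kind of consistency check a FORK-style passive measurement enables: for typical LWR fuel, passive neutron emission is dominated by 244Cm and grows roughly as the fourth power of burnup, so a measured rate can be compared against the rate predicted from the declared burnup. The calibration constant and tolerance below are illustrative assumptions, not FORK parameters.

```python
# Sketch: checking declared burnup against a passive neutron measurement using
# an empirical rate ~ k * burnup**4 power law (constant k is an assumption).

def predicted_neutron_rate(burnup_gwd_t, k=2.0e-3):
    return k * burnup_gwd_t ** 4  # arbitrary units

def burnup_consistent(measured_rate, declared_burnup, tolerance=0.15):
    """Accept if the measured rate is within tolerance of the prediction."""
    predicted = predicted_neutron_rate(declared_burnup)
    return abs(measured_rate - predicted) / predicted <= tolerance

rate = predicted_neutron_rate(35.0) * 1.05   # measurement 5% above prediction
print(burnup_consistent(rate, 35.0))         # consistent with 35 GWd/t
print(burnup_consistent(rate, 45.0))         # inconsistent with 45 GWd/t
```

The steep fourth-power dependence is what makes the check discriminating: a modest overstatement of burnup predicts a much larger neutron rate than is actually measured.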

  14. 75 FR 9544 - Inmate Work and Performance Pay Program

    Science.gov (United States)

    2010-03-03

    ... inmate may receive performance pay only for that portion of the month that the inmate was working... Inmate Work and Performance Pay Program AGENCY: Bureau of Prisons, Justice. ACTION: Proposed rule... work and performance pay by removing redundant language and provisions that relate solely to staff...

  15. Determination of Safety Performance Grade of NPP Using Integrated Safety Performance Assessment (ISPA) Program

    International Nuclear Information System (INIS)

    Chung, Dae Wook

    2011-01-01

Since the beginning of 2000, safety regulation of nuclear power plants (NPPs) has been challenged to become more reasonable, effective and efficient through the use of risk and performance information. In the United States, the USNRC established the Reactor Oversight Process (ROP) in 2000 to improve the effectiveness of safety regulation of operating NPPs. The main idea of the ROP is to classify NPPs into five categories based on the results of safety performance assessment and to conduct graded regulatory programs according to that categorization, which might be interpreted as 'graded regulation'. However, classifying safety performance is a highly comprehensive and sensitive process, so the safety performance assessment program should be prepared in an integrated, objective and quantitative manner. Furthermore, the results of the assessment should characterize and categorize the actual level of safety performance of a specific NPP, integrating all the substantial elements of safety performance. In consideration of the particular regulatory environment in Korea, the integrated safety performance assessment (ISPA) program is under development for use in determining the safety performance grade (SPG) of an NPP. The ISPA program consists of six individual assessment programs (four quantitative and two qualitative) which together cover the overall safety performance of an NPP. Some of the assessment programs, already implemented, are used directly or are modified to incorporate risk aspects; the others, which are not existing regulatory programs, are newly developed. Eventually, the results from all the individual assessment programs are integrated to determine the safety performance grade of a specific NPP.

  16. Camp Verde Adult Reading Program. Final Performance Report.

    Science.gov (United States)

    Maynard, David A.

    This document begins with a four-page performance report describing how the Camp Verde Adult Reading Program site was relocated to the Community Center Complex, and the Town Council contracted directly with the Friends of the Camp Verde Library to provide for the requirements of the program. The U.S. Department of Education grant allowed the…

  17. Cobra Strikes! High-Performance Car Inspires Students, Markets Program

    Science.gov (United States)

    Jenkins, Bonita

    2008-01-01

    Nestled in the Lower Piedmont region of upstate South Carolina, Piedmont Technical College (PTC) is one of 16 technical colleges in the state. Automotive technology is one of its most popular programs. The program features an instructive, motivating activity that the author describes in this article: building a high-performance car. The Cobra…

  18. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans...

  19. Multi-Language Programming Environments for High Performance Java Computing

    OpenAIRE

    Vladimir Getov; Paul Gray; Sava Mintchev; Vaidy Sunderam

    1999-01-01

    Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community’s reliance on established scientific packages. As a consequence, programmers of high‐performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java‐to‐C Interface (JCI) tool which provides ...

  20. Statistical and Machine Learning Models to Predict Programming Performance

    OpenAIRE

    Bergin, Susan

    2006-01-01

    This thesis details a longitudinal study on factors that influence introductory programming success and on the development of machine learning models to predict incoming student performance. Although numerous studies have developed models to predict programming success, the models struggled to achieve high accuracy in predicting the likely performance of incoming students. Our approach overcomes this by providing a machine learning technique, using a set of three significant...

  1. Effectiveness of Human Research Protection Program Performance Measurements.

    Science.gov (United States)

    Tsan, Min-Fu; Nguyen, Yen

    2017-10-01

We analyzed human research protection program performance metric data from all Department of Veterans Affairs research facilities obtained from 2010 to 2016. Among a total of 25 performance metrics, 21 (84%) showed improvement, four (16%) remained unchanged, and none deteriorated during the study period. The overall improvement across these 21 performance metrics was 81.1% ± 18.7% (mean ± SD), with a range of 30% to 100%. The four performance metrics that did not show improvement all had very low initial noncompliance/incidence rates. The initial noncompliance/incidence rates of the performance metrics that showed improvement ranged from 0.05% to 60%, although 10 of the 21 also had very low initial rates. These results indicate that performance measurement is an effective tool in improving the performance of human research protection programs.

  2. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code began with a check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  3. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR was also demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  4. The Performance Enhancement Group Program: Integrating Sport Psychology and Rehabilitation

    Science.gov (United States)

    Granito, Vincent J.; Hogan, Jeffery B.; Varnum, Lisa K.

    1995-01-01

In an effort to improve the psychological health of the athlete who has sustained an injury, the Performance Enhancement Group program for injured athletes was created. This paper will offer a model for the Performance Enhancement Group program as a way to: 1) support the athlete, both mentally and physically; 2) deal with the demands of rehabilitation; and 3) facilitate the adjustments the athlete has to make while being out of the competitive arena. The program consists of responsibilities for professionals in sport psychology (i.e., assessment/orientation, support, education, individual counseling, and evaluation) and athletic training (i.e., organization/administration, recruitment and screening, support, application of techniques, and program compliance). The paper will emphasize that the success of the program is dependent on collaboration between professionals at all levels. PMID:16558357

  5. Materials balance area Custodian Performance Evaluation Program at PNL

    International Nuclear Information System (INIS)

    Dickman, D.A.

    1991-07-01

The material balance area (MBA) custodian has primary responsibility for control and accountability of nuclear material within an MBA. In this role, the custodian operates as an extension of the facility material control and accountability (MC&A) organization. To effectively meet administrative requirements and protection needs, the custodian must be fully trained in all aspects of MC&A related to the MBA, and custodian performance must be periodically evaluated. DOE policy requires that each facility provide a program which assures that personnel performing MC&A functions are (1) trained and/or qualified to perform their duties and responsibilities and (2) knowledgeable of requirements and procedures related to their functions. The MBA Custodian Performance Evaluation Program at PNL uses a variety of assessment techniques to meet this goal, including internal and independent MBA audits, periodic custodian testing, limited-scope performance tests, daily monitoring of MC&A documentation, and reviews of custodian performance during physical inventories. The data collected from these sources are analyzed and incorporated into an annual custodian performance evaluation document given to each custodian and to line management. Development of this program has resulted in significantly improved custodian performance and a marked decrease in findings and observations identified during MBA audits.

  6. Accelerating Matlab performance 1001 tips to speed up Matlab programs

    CERN Document Server

    Altman, Yair M

    2014-01-01

… a very interesting new book on MATLAB® performance … covering basic tools and an appropriate range of specific programming techniques. The book seems to take a whole-system approach … helping readers understand the big picture of how to get better performance. -Michelle Hirsch, Ph.D., Head of MATLAB® Product Management, The MathWorks Inc.

  7. A concept for performance management for Federal science programs

    Science.gov (United States)

    Whalen, Kevin G.

    2017-11-06

    The demonstration of clear linkages between planning, funding, outcomes, and performance management has created unique challenges for U.S. Federal science programs. An approach is presented here that characterizes science program strategic objectives by one of five “activity types”: (1) knowledge discovery, (2) knowledge development and delivery, (3) science support, (4) inventory and monitoring, and (5) knowledge synthesis and assessment. The activity types relate to performance measurement tools for tracking outcomes of research funded under the objective. The result is a multi-time scale, integrated performance measure that tracks individual performance metrics synthetically while also measuring progress toward long-term outcomes. Tracking performance on individual metrics provides explicit linkages to root causes of potentially suboptimal performance and captures both internal and external program drivers, such as customer relations and science support for managers. Functionally connecting strategic planning objectives with performance measurement tools is a practical approach for publicly funded science agencies that links planning, outcomes, and performance management—an enterprise that has created unique challenges for public-sector research and development programs.

  8. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  9. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It is a living document that will track CASL's verification and validation progress for both the CASL codes (including MPACT, CTF, BISON, and MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done, and additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  10. Human performance for the success of equipment reliability programs

    International Nuclear Information System (INIS)

    Woodcock, J.

    2007-01-01

Human performance is a critical element of programs directed at equipment reliability. Reliable equipment performance requires broad support from all levels of plant management and from all plant departments. Experience at both nuclear power plants and fuel manufacturing plants shows that human performance must be addressed during all phases of program implementation, from initiation through the establishment of a living, ongoing process. At the beginning, certain organizational and management actions taken during program initiation set the stage for successful adoption by station personnel, leading to more rapid benefits. For the long term, equipment reliability is a living process needed throughout the lifetime of a station, a program which must be motivated and measured. Sustained acceptance and participation by plant personnel is a requirement, and culture is a key ingredient. This paper provides an overview of the key human performance issues to be considered, using the application of the INPO AP-913 Equipment Reliability Guideline as a basis, and gives some best practices for training, communication, and program implementation. The final section includes ways to tell whether the program is effective.

  11. Experimental Verification of a Pneumatic Transport System for the Rapid Evacuation of Tunnels, Part II - Test Program

    Science.gov (United States)

    1978-12-01

This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system comprised a muck pre...

  12. Study on the seismic verification test program on the experimental multi-purpose high-temperature gas cooled reactor core

    International Nuclear Information System (INIS)

    Taketani, K.; Aochi, T.; Yasuno, T.; Ikushima, T.; Shiraki, K.; Honma, T.; Kawamura, N.

    1978-01-01

The paper describes a program of experimental research necessary for the qualitative and quantitative determination of the vibration characteristics and aseismic safety of the reactor core structure of the Japan Atomic Energy Research Institute's multipurpose high-temperature gas-cooled experimental reactor (VHTR experimental reactor).

  13. 38 CFR 74.1 - What definitions are important for VetBiz Vendor Information Pages (VIP) Verification Program?

    Science.gov (United States)

    2010-07-01

    ... businesses eligible to participate in VA's Veteran-owned Small Business Program. The online database may be..., Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) VETERANS SMALL BUSINESS...'s Office of Small and Disadvantaged Business Utilization. The CVE helps veterans interested in...

  14. Computer-aided performance monitoring program at Diablo Canyon

    International Nuclear Information System (INIS)

    Nelson, T.; Glynn, R. III; Kessler, T.C.

    1992-01-01

This paper describes the thermal performance monitoring program at Pacific Gas & Electric Company's (PG&E's) Diablo Canyon Nuclear Power Plant. The plant performance monitoring program at Diablo Canyon uses the THERMAC performance monitoring and analysis computer software provided by Expert-EASE Systems. THERMAC is used to collect performance data from the plant process computers, condition that data to adjust for measurement errors and missing data points, evaluate cycle and component-level performance, archive the data for trend analysis, and generate performance reports. The current status of the program is that, after a fair amount of "tuning" of the basic "thermal kit" models provided with the initial THERMAC installation, we have successfully baselined both units to cycle isolation test data from previous reload cycles. Over the course of the past few months, we have accumulated enough data to generate meaningful performance trends and, as a result, have been able to use THERMAC to track a condenser fouling problem that was costing enough megawatts to attract corporate-level attention. Trends from THERMAC clearly related the megawatt loss to a steadily degrading condenser cleanliness factor and verified the subsequent gain in megawatts after the condenser was cleaned. In the future, we expect to rebaseline THERMAC to a beginning-of-cycle (BOC) data set and to use the program to help track feedwater nozzle fouling.

  15. BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR

    Science.gov (United States)

    The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...

  16. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    Science.gov (United States)

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  17. Strategies of high-performing paramedic educational programs.

    Science.gov (United States)

    Margolis, Gregg S; Romero, Gabriel A; Fernandez, Antonio R; Studnek, Jonathan R

    2009-01-01

    To identify the specific educational strategies used by paramedic educational programs that have attained consistently high success rates on the National Registry of Emergency Medical Technicians (NREMT) examination. NREMT data from 2003-2007 were analyzed to identify consistently high-performing paramedic educational programs. Representatives from 12 programs that have maintained a 75% first-attempt pass rate for at least four of five years and had more than 20 graduates per year were invited to participate in a focus group. Using the nominal group technique (NGT), participants were asked to answer the following question: "What are specific strategies that lead to a successful paramedic educational program?" All 12 emergency medical services (EMS) educational programs meeting the eligibility requirements participated. After completing the seven-step NGT process, 12 strategies were identified as leading to a successful paramedic educational program: 1) achieve and maintain national accreditation; 2) maintain high-level entry requirements and prerequisites; 3) provide students with a clear idea of expectations for student success; 4) establish a philosophy and foster a culture that values continuous review and improvement; 5) create your own examinations, lesson plans, presentations, and course materials using multiple current references; 6) emphasize emergency medical technician (EMT)-Basic concepts throughout the class; 7) use frequent case-based classroom scenarios; 8) expose students to as many prehospital advanced life support (ALS) patient contacts as possible, preferably where they are in charge; 9) create and administer valid examinations that have been through a review process (such as qualitative analysis); 10) provide students with frequent detailed feedback regarding their performance (such as formal examination reviews); 11) incorporate critical thinking and problem solving into all testing; and 12) deploy predictive testing with analysis prior to

  18. High Performance Object-Oriented Scientific Programming in Fortran 90

    Science.gov (United States)

    Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.

    1997-01-01

We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.

  19. Heat exchanger performance analysis programs for the personal computer

    International Nuclear Information System (INIS)

    Putman, R.E.

    1992-01-01

Numerous utility industry heat exchange calculations are repetitive and thus lend themselves to being performed on a personal computer. These programs may be regarded as engineering tools which, when put together, form a toolbox. However, the practicing Results Engineer in the utility industry desires programs that are robust and easy to use and that run on both desktop and laptop PCs. The latter also offer the opportunity to take the computer into the plant or control room and to process test or operating data right on the spot. Most programs evolve through needs which arise in the course of day-to-day work. This paper describes several of the more useful programs of this type and outlines some of the guidelines to be followed when designing personal computer programs for use by the practicing Results Engineer.

  20. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  1. Preparation of a program for the independent verification of the brachytherapy planning systems calculations; Confeccion de un programa para la verificacion independiente de los calculos de los sistemas de planificacion en braquiterapia

    Energy Technology Data Exchange (ETDEWEB)

Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-07-01

In this work a program is presented that independently checks, for each patient, the treatment planning system calculations in low dose rate, high dose rate and pulsed dose rate brachytherapy. The treatment planning system output text files are loaded automatically into the program in order to obtain the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are introduced by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).

  2. Chemistry technician performance evaluation program Palo Verde Nuclear Generating Station

    International Nuclear Information System (INIS)

    Shawver, J.M.

    1992-01-01

The Arizona Nuclear Power Project (ANPP), a three-reactor site located 50 miles west of Phoenix, Arizona, has developed and implemented a program for evaluating individual chemistry technician analytical performance on a routine basis. About 45 chemistry technicians are employed at the site, 15 at each operating unit. The technicians routinely perform trace-level analyses for impurities of concern to PWRs. Each month a set of blind samples is provided by an outside vendor. The blind samples contain 16 parameters which are matrixed to approximate the PWR's primary and secondary cycles. Nine technicians receive the samples, three from each operating unit, and perform the required analyses. Acceptance criteria for successful performance on the blind parameters are based on the values found in the Institute of Nuclear Power Operations (INPO) Document 83-016, Revision 2, August 1989, Chemistry Quality Control Program. The goal of the program is to have each technician demonstrate acceptable performance on each of the 16 analytical parameters. On completion of each monthly set, a summary report of all of the analytical results for the sample set is prepared. From the summary report, analytical bias can be detected, technician performance is documented, and overall laboratory performance can be evaluated. The program has been very successful at satisfying the INPO requirement that the analytical performance of each individual technician be checked on at least a six-month frequency for all important parameters measured. This paper describes the program as implemented at the Palo Verde Nuclear Generating Station and provides a summary report and trend and bias graphs for illustrative purposes.

  3. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

The report considers why verification of software products is necessary throughout the software life cycle. It then discusses concepts of verification, software verification planning, and some verification methodologies for the products generated throughout the software life cycle.

  4. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  5. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  6. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  7. Need of patient-specific quality assurance and pre-treatment verification program for special plans in radiotherapy

    International Nuclear Information System (INIS)

    Ravichandran, Ramamoorthy; Bhasi, Saju; Binukumar, J.P.; Davis, C.A.

    2011-01-01

    Accuracy in planned radiation dose delivery in cancer treatments becomes necessary with the advent of complex treatment delivery options using newer technology with medical linear accelerators, which makes patient management crucial. Treatment outcome in an individual patient therefore depends on the professional involvement of staff and the execution accuracy of the planned procedure. This article addresses an important problem. The International Atomic Energy Agency (IAEA) and the International Commission on Radiological Protection (ICRP) have reported mis-administrations of radiation dose, the nature of their occurrence, and the complexity of the situations. Lack of an adequate quality assurance (QA) program or failure in its routine application, complacency in attention, lack of knowledge, overconfidence, pressures of time, lack of resources and failures in communication are some of the general human causes of errors. A recent report, under the heading 'harming instead of healing', enumerated misadministrations of radiation dose; the delivery of wrong doses in small-field treatment plans with stereotactic equipment was mostly highlighted

  8. The Innovative Design and Prototype Verification of Wheelchair with One Degree of Freedom to Perform Lifting and Standing Functions

    Science.gov (United States)

    Hsieh, Long-Chang; Chen, Tzu-Hsia

    2017-12-01

    Traditionally, the mechanism of a wheelchair with lifting and standing functions has 2 degrees of freedom and uses 2 power sources to perform these 2 motion functions. The purpose of this paper is to invent a new wheelchair with 1 degree of freedom to perform these 2 motion functions. Hence, we can use only 1 power source to drive the mechanism to achieve the lifting and standing motion functions. The new design has the advantages of simple operation, more stability, and more safety. For a traditional standing wheelchair, the centre of gravity moves forward when standing up, and the chair needs 2 auxiliary wheels to prevent tipping. In this paper, by using the checklist method of Osborn, a wheelchair with 1 DOF is invented to perform the lifting and standing functions. The centre of gravity of this new wheelchair after standing up is still located between the front and rear wheels, so no auxiliary wheels are needed. Finally, the prototype was manufactured to verify the theoretical results.

  9. Prediction and experimental verification of performance of box type solar cooker. Part II: Cooking vessel with depressed lid

    International Nuclear Information System (INIS)

    Reddy, Avala Raji; Rao, A.V. Narasimha

    2008-01-01

    Our previous article (Part I) discussed the theoretical and experimental study of the performance boost obtained by a cooking vessel with a central cylindrical cavity on lugs when compared to that of a conventional cylindrical vessel on the floor/lugs. This article compares the performance of the cooking vessel with a depressed lid on lugs with that of the conventional vessel on lugs. A mathematical model is presented to understand the heat flow process to the cooking vessel and, thereby, to the food material. It is found from the experiments that the cooking vessel with the depressed lid results in a higher temperature of the thermic fluid loaded in the cooking vessel compared to that of the thermic fluid kept in the conventional vessel when both are placed on lugs. Similar results were obtained by modeling the process mathematically. The performance of the vessel with the depressed lid is found to be, on average, 8.4% better than that of the conventional cylindrical vessel

  10. Development of a computer program for the simulation of ice-bank system operation, part II: Verification

    Energy Technology Data Exchange (ETDEWEB)

    Grozdek, Marino; Halasz, Boris; Curko, Tonko [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Ivana Lucica 5, 10 000 Zagreb (Croatia)

    2010-12-15

    In order to verify the mathematical model of an ice bank system developed for the purpose of predicting system performance, experimental measurements on the ice bank system were performed. A static, indirect cool thermal storage system with external ice-on-coil building/melting was considered. Cooling energy stored in the form of ice by night is used for the rapid cooling of milk after the process of pasteurization by day. The ice bank system was tested under real operating conditions to determine parameters such as the time-varying heat load imposed by the consumer, refrigeration unit load, storage capacity, and supply water temperature to the load, and to find the charging and discharging characteristics of the storage. Experimentally obtained results were then compared to the computed ones. It was found that the calculated and experimentally obtained results are in good agreement as long as there is ice present in the silo. (author)

  11. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    International Nuclear Information System (INIS)

    Eyler, L.L.; Trent, D.S.; Budden, M.J.

    1983-09-01

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs

  12. Immediate Effects of Different Trunk Exercise Programs on Jump Performance.

    Science.gov (United States)

    Imai, A; Kaneoka, K; Okubo, Y; Shiraki, H

    2016-03-01

    The aim of this study was to investigate the immediate effects of trunk stabilization exercise (SE) and conventional trunk exercise (CE) programs on jump performance. 13 adolescent male soccer players performed 2 kinds of jump testing before and immediately after 3 experimental conditions: SE, CE, and non-exercise (NE). The SE program consisted of the elbow-toe, hand-knee, and back bridge exercises, and the CE program consisted of the sit-up, sit-up with trunk rotation, and back extension. Testing of a countermovement jump (CMJ) and rebound jump (RJ) was performed to assess jump performance. Jump height of the CMJ and the RJ-index, contact time, and jump height of the RJ were analyzed. The RJ-index improved significantly only after SE (p=0.017). However, contact time and jump height did not improve significantly in the SE condition. Moreover, no significant interaction or main effects of time or group were observed in the CMJ. Consequently, this study showed a different immediate effect on the RJ between SE and CE, and suggested the possibility that the SE used in this study is useful as a warm-up program to improve explosive movements. © Georg Thieme Verlag KG Stuttgart · New York.

  13. Counselor Competence, Performance Assessment, and Program Evaluation: Using Psychometric Instruments

    Science.gov (United States)

    Tate, Kevin A.; Bloom, Margaret L.; Tassara, Marcel H.; Caperton, William

    2014-01-01

    Psychometric instruments have been underutilized by counselor educators in performance assessment and program evaluation efforts. As such, we conducted a review of the literature that revealed 41 instruments fit for such efforts. We described and critiqued these instruments along four dimensions--"Target Domain," "Format,"…

  14. 22 CFR 226.51 - Monitoring and reporting program performance.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Monitoring and reporting program performance. 226.51 Section 226.51 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF... more frequently than quarterly or, less frequently than annually. Annual reports shall be due 90...

  15. Performance analysis and experimental verification of mid-range wireless energy transfer through non-resonant magnetic coupling

    DEFF Research Database (Denmark)

    Peng, Liang; Wang, Jingyu; Zhejiang University, Hangzhou, China, L.

    2011-01-01

    In this paper, the efficiency analysis of a mid-range wireless energy transfer system is performed through non-resonant magnetic coupling. It is shown that the self-resistance of the coils and the mutual inductance are critical in achieving a high efficiency, which is indicated by our theoretical...

  16. Parametric Study for MOV Performance Improvement Using PPM Program

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungho; Seon, Juhyoung; Han, Bongsub [SOOSAN INDUSTRIES, Seoul (Korea, Republic of)

    2016-10-15

    Nuclear power plants mainly use Air Operated Valves (hereinafter referred to as AOVs) and Motor Operated Valves (hereinafter referred to as MOVs) for protecting systems and for blocking and controlling flow. Field test (static and dynamic test) results and a performance prediction program are used to evaluate whether an MOV currently installed in a nuclear power plant delivers the required operational performance. The improvement of operating performance for a flexible gate valve was confirmed by changing input variables of the performance prediction program (PPM). There are several methods for improving the operating performance of MOVs generally installed in nuclear power plants, such as reviewing the design basis, changing operating procedures, and performing maintenance work on the stem (or packing, etc.). This study verified the changes in MOV operating performance through the improvement of the stem and hydraulic parts (seat, guide, etc.). In particular, MOV operating performance improved much more as the disk seat angle decreased. Generally, improvement work to minimize the friction of the seat, disk, and guide is limited, and dynamic diagnostic testing has to be performed with the change in valve factor for the improvement of hydraulic parts.

  17. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that ''quality is achieved and maintained by those who have been assigned the responsibility for performing the work.'' The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, services supporting Hanford environmental monitoring, environmental restoration, and waste management shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be completed by the use of several tools. This program will discuss the tools that will be utilized for laboratory performance evaluations. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA, or commercial sources. Discussion of project-specific PE materials and evaluations will be described in Section 9.0 and Appendix A

  18. Prediction and experimental verification of performance of box type solar cooker - Part I. Cooking vessel with central cylindrical cavity

    International Nuclear Information System (INIS)

    Reddy, Avala Raji; Rao, A.V. Narasimha

    2007-01-01

    The performance of conventional box type solar cookers can be improved by better designs of cooking vessels with proper understanding of the heat flow to the material to be cooked. An attempt has been made in this article to arrive at a mathematical model to understand the heat flow process to the cooking vessel and thereby to the food material. The mathematical model considers a double glazed hot box type solar cooker loaded with two different types of vessels, kept either on the floor of the cooker or on lugs. The performance of the cooking vessel with a central cylindrical cavity is compared with that of a conventional cylindrical cooking vessel. It is found from the experiments and modeling that the cooking vessel with a central cylindrical cavity on lugs results in a higher temperature of the thermic fluid than that of a conventional vessel on the floor or on lugs. The average improvement of performance of the vessel with a central cylindrical cavity kept on lugs is found to be 5.9% and 2.4% more than that of a conventional cylindrical vessel on the floor and on lugs, respectively

  19. Mathematical Verification for Transmission Performance of Centralized Lightwave WDM-RoF-PON with Quintuple Services Integrated in Each Wavelength Channel

    Directory of Open Access Journals (Sweden)

    Shuai Chen

    2015-01-01

    Full Text Available Wavelength-division-multiplexing passive-optical-network (WDM-PON) has been recognized as a promising solution for the “last mile” access as well as multibroadband data services access for end users, and WDM-RoF-PON, which employs the radio-over-fiber (RoF) technique in WDM-PON, is an even more attractive approach for future broadband fiber and wireless access owing to its strong capability for centralized multiservice transmission and its transparency to bandwidth and signal modulation formats. As for multiservice development in WDM-RoF-PON, various system designs have been reported and verified via simulation or experiment, and the scheme with multiple services transmitted in each single wavelength channel is believed to be the one with the highest bandwidth efficiency; however, the corresponding mathematical verification is still hard to find in state-of-the-art literature. In this paper, the system design and data transmission performance of a quintuple-services-integrated WDM-RoF-PON, which jointly employs carrier multiplexing and orthogonal modulation techniques, have been theoretically analyzed and verified in detail; moreover, the system design has been duplicated and verified experimentally, and the theory system of such a WDM-RoF-PON scheme has thus been formed.

  20. KNGR core protection calculator software verification and validation plan

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Cheon, Se Woo

    2001-05-01

    This document describes the Software Verification and Validation Plan (SVVP) guidance to be used in reviewing the Software Program Manual (SPM) in Korean Next Generation Reactor (KNGR) projects. This document is intended for a verifier or reviewer who is involved with performing software verification and validation task activities in KNGR projects. This document includes the basic philosophy, the performance of the V and V effort, software testing techniques, and criteria for review and audit of the safety software V and V activity. Major review topics on safety software address three kinds of characteristics based on Standard Review Plan (SRP) Chapter 7, Branch Technical Position (BTP)-14, when reviewing the SVVP: management characteristics, implementation characteristics, and resources characteristics. Based on the major topics of this document, we have produced a list of evaluation items, given as a checklist in Appendix A

  1. Verification and validation of decision support software: Expert Choice{trademark} and PCM{trademark}

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Q.H.; Martin, J.D.

    1994-11-04

    This report documents the verification and validation of two decision support programs: EXPERT CHOICE{trademark} and PCM{trademark}. Both programs use the Analytic Hierarchy Process (AHP) -- or pairwise comparison technique -- developed by Dr. Thomas L. Saaty. In order to provide an independent method for validating the two programs, the pairwise comparison algorithm was implemented in a standard mathematical program. A standard data set -- selecting a car to purchase -- was used with each of the three programs for validation. The results show that both commercial programs performed correctly.
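    As a rough illustration of the pairwise-comparison technique validated above (not the actual EXPERT CHOICE or PCM implementation), the AHP priority vector of a reciprocal comparison matrix can be approximated with the row geometric-mean method; the 3x3 matrix below is a hypothetical car-selection example, not the report's data set:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate the AHP priority vector of a reciprocal pairwise
    comparison matrix using the row geometric-mean method."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # geometric mean of each row
    return gm / gm.sum()                       # normalize weights to sum to 1

# Hypothetical car-selection criteria: cost vs. safety vs. comfort
A = [[1.0,       3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 0.5, 1.0]]
weights = ahp_priorities(A)  # largest weight goes to the dominant criterion
```

    Saaty's original formulation uses the principal eigenvector instead; the geometric-mean approximation agrees closely for consistent matrices and is easier to verify by hand.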

  2. Proposal of performance indicators/model for Operational Readiness Verification (ORV) at restart after a planned shutdown

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Nygren, Magnus

    2005-12-01

    The objectives of the study reported here were to propose a model that can be used in the analysis of possible future ORV-related events and to outline a set of performance indicators that can be used by the inspectorate to assess a utility's level of readiness should an ORV event take place. Together the two objectives serve to improve the inspectorate's ability to ensure that the utilities maintain an adequate capability to respond. The background for the current study is the nine ORV events that occurred in Sweden between 1995 and 1998, as well as the findings of a previous project on safety during outage and restart of nuclear power plants. This study found that the three levels or types of tests that occur in ORV were used according to need rather than according to a predefined arrangement or procedure, and that tasks were adapted relative to the different types of embedding and the degree of correspondence between nominal and actual ORV. The organisation's coping with the complexity of ORV was discussed in terms of the relation between expectations and surprises, how planning was used as control, attention to details, and the practices of shift changes. It is a truism that accidents are analysed and interpreted relative to a commonly accepted understanding of their nature. This understanding is, however, relative rather than absolute, and has changed significantly during the last decade. In the 1990s, accidents were analysed step by step, and explanations and recommendations therefore emphasised specific rather than generic solutions. The present study illustrates this by going through the responses to the nine ORV events. Following that, the nine events are analysed anew using a contemporary understanding of accidents (a systemic model), which emphasises that incidents more often arise from context-induced performance variability than from failures of people. The alternative interpretation provided by a systemic model is illustrated by a detailed analysis of

  3. The Waste Isolation Pilot Plant Performance Assessment Program

    International Nuclear Information System (INIS)

    Myers, J.; Coons, W.E.; Eastmond, R.; Morse, J.; Chakrabarti, S.; Zurkoff, J.; Colton, I.D.; Banz, I.

    1986-01-01

    The Waste Isolation Pilot Plant (WIPP) Performance Assessment Program involves a comprehensive analysis of the WIPP project with respect to the recently finalized Environmental Protection Agency regulations regarding the long-term geologic isolation of radioactive wastes. The performance assessment brings together the results of site characterization, underground experimental, and environmental studies into a rigorous determination of the performance of WIPP as a disposal system for transuranic radioactive waste. The Program consists of scenario development, geochemical, hydrologic, and thermomechanical support analyses and will address the specific containment and individual protection requirements specified in 40 CFR 191 sub-part B. Calculated releases from these interrelated analyses will be reported as an overall probability distribution of cumulative release resulting from all processes and events occurring over the 10,000 year post-closure period. In addition, results will include any doses to the public resulting from natural processes occurring over the 1,000 year post-closure period. The overall plan for the WIPP Performance Assessment Program is presented along with approaches to issues specific to the WIPP project

  4. Multi-Language Programming Environments for High Performance Java Computing

    Directory of Open Access Journals (Sweden)

    Vladimir Getov

    1999-01-01

    Full Text Available Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community's reliance on established scientific packages. As a consequence, programmers of high-performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java-to-C Interface (JCI) tool, which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed-language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool is complementing other ongoing projects such as IBM's High-Performance Compiler for Java (HPCJ) and IceT's metacomputing environment.

  5. Testing and Performance Verification of a High Bypass Ratio Turbofan Rotor in an Internal Flow Component Test Facility

    Science.gov (United States)

    VanZante, Dale E.; Podboy, Gary G.; Miller, Christopher J.; Thorp, Scott A.

    2009-01-01

    A 1/5 scale model rotor representative of a current technology, high bypass ratio, turbofan engine was installed and tested in the W8 single-stage, high-speed, compressor test facility at NASA Glenn Research Center (GRC). The same fan rotor was tested previously in the GRC 9x15 Low Speed Wind Tunnel as a fan module consisting of the rotor and outlet guide vanes mounted in a flight-like nacelle. The W8 test verified that the aerodynamic performance and detailed flow field of the rotor as installed in W8 were representative of the wind tunnel fan module installation. Modifications to W8 were necessary to ensure that this internal flow facility would have a flow field at the test package that is representative of flow conditions in the wind tunnel installation. Inlet flow conditioning was designed and installed in W8 to lower the fan face turbulence intensity to less than 1.0 percent in order to better match the wind tunnel operating environment. Also, inlet bleed was added to thin the casing boundary layer to be more representative of a flight nacelle boundary layer. On the 100 percent speed operating line the fan pressure rise and mass flow rate agreed with the wind tunnel data to within 1 percent. Detailed hot film surveys of the inlet flow, inlet boundary layer and fan exit flow were compared to results from the wind tunnel. The effect of inlet casing boundary layer thickness on fan performance was quantified. Challenges and lessons learned from testing this high flow, low static pressure rise fan in an internal flow facility are discussed.

  6. Performance demonstration program plan for analysis of simulated headspace gases

    International Nuclear Information System (INIS)

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP

  7. Material balance area custodian performance evaluation program at PNL

    International Nuclear Information System (INIS)

    Dickman, D.A.

    1991-01-01

    This paper reports that the material balance area (MBA) custodian has primary responsibility for the control and accountability of nuclear material within an MBA. In this role, the custodian operates as an extension of the facility material control and accountability (MC and A) organization. To effectively meet administrative requirements and protection needs, the custodian must be fully trained in all aspects of MC and A related to the MBA, and custodian performance must be periodically evaluated. U.S. Department of Energy (DOE) policy requires that each facility provide for a program which ensures that personnel performing MC and A functions are trained and/or qualified to perform their duties and responsibilities and are knowledgeable of requirements and procedures related to their functions. The MBA Custodian Performance Evaluation Program at Pacific Northwest Laboratory (PNL) uses a variety of assessment techniques to meet this goal, including internal and independent MBA audits, periodic custodian testing, limited-scope performance tests, daily monitoring of MC and A documentation, and reviewing custodian performance during physical inventories

  8. How to Use Linear Programming for Information System Performances Optimization

    Directory of Open Access Journals (Sweden)

    Hell Marko

    2014-09-01

    Full Text Available Background: Organisations nowadays operate in a very dynamic environment, and therefore their ability to continuously adjust the strategic plan to new conditions is a must for achieving their strategic objectives. BSC is a well-known methodology for measuring performance, enabling organizations to learn how well they are doing. In this paper, “BSC for IS” will be proposed in order to measure the IS impact on the achievement of organizations’ business goals. Objectives: The objective of this paper is to present an original procedure used to enhance the BSC methodology in planning the optimal target values of IS performance in order to maximize the organization's effectiveness. Methods/Approach: The method used in this paper is a quantitative one - linear programming. In the case study, linear programming is used for optimizing the organization's strategic performance. Results: Results are shown for the case study of a national park. An optimal performance value has been calculated for the strategic objective, as well as for each derived objective (DO). Results are calculated in Excel, using the Solver add-in. Conclusions: The presentation of the methodology through the case study of a national park shows that this methodology, though it requires a high level of formalisation, provides a very transparent performance calculation.
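    A linear program of the kind described above can be sketched with scipy.optimize.linprog rather than the Excel Solver add-in the paper used; the objective weights and resource constraints below are hypothetical, not the national-park case study:

```python
from scipy.optimize import linprog

# Hypothetical BSC-style problem: maximize a weighted sum of two IS
# performance indicators, 3*x1 + 2*x2, subject to resource constraints
# x1 + x2 <= 4 and x1 <= 2.  linprog minimizes, so negate the objective.
res = linprog(c=[-3.0, -2.0],
              A_ub=[[1.0, 1.0],
                    [1.0, 0.0]],
              b_ub=[4.0, 2.0],
              bounds=[(0.0, None), (0.0, None)],
              method="highs")
optimal_value = -res.fun   # best achievable weighted performance
x1, x2 = res.x             # optimal indicator target values
```

    The optimal targets sit at a vertex of the feasible region (here x1 = 2, x2 = 2), which is the transparency the paper highlights: every target value is traceable to an active constraint.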

  9. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  10. Overcoming urban GPS navigation challenges through the use of MEMS inertial sensors and proper verification of navigation system performance

    Science.gov (United States)

    Vinande, Eric T.

    This research proposes several means to overcome challenges in the urban environment to ground vehicle global positioning system (GPS) receiver navigation performance through the integration of external sensor information. The effects of narrowband radio frequency interference and signal attenuation, both common in the urban environment, are examined with respect to receiver signal tracking processes. Low-cost microelectromechanical systems (MEMS) inertial sensors, suitable for the consumer market, are the focus of receiver augmentation as they provide an independent measure of motion and are independent of vehicle systems. A method for estimating the mounting angles of an inertial sensor cluster utilizing typical urban driving maneuvers is developed and is able to provide angular measurements within two degrees of truth. The integration of GPS and MEMS inertial sensors is developed utilizing a full state navigation filter. Appropriate statistical methods are developed to evaluate the urban environment navigation improvement due to the addition of MEMS inertial sensors. A receiver evaluation metric that combines accuracy, availability, and maximum error measurements is presented and evaluated over several drive tests. Following a description of proper drive test techniques, record and playback systems are evaluated as the optimal way of testing multiple receivers and/or integrated navigation systems in the urban environment as they simplify vehicle testing requirements.

  11. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

    Modern model-based approaches to cardiac safety and efficacy assessment require an accurate drug concentration-effect relationship to be established. Knowledge of the active concentration of drugs in heart tissue is therefore desirable, along with an estimate of the influence of inter-subject variability. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The model was described with literature-derived parameters and implemented in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of the optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model's feasibility. The proposed structure can be tested with the goal of improving patient-specific, model-based cardiac safety assessment, and it offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  12. Structural Performance Optimization and Verification of an Improved Thin-Walled Storage Tank for a Pico-Satellite

    Directory of Open Access Journals (Sweden)

    Lai Teng

    2017-11-01

    Full Text Available This paper presents an improved mesh storage tank structure obtained using 3D metal printing. The storage tank structure is optimized using a multi-objective uniform design method. Each parameter influencing the storage tank is considered as an optimization factor, and the compression stress (σ), volume utilization ratio (v), and weight (m) are considered as the optimization objectives. Regression equations were established between the optimization factors and targets, the order of influence of the six factors on the three target values was analyzed, and the relative deviations between the regression equation and calculation results for σ, v, and m were 9.72%, 4.15%, and 2.94%, respectively. The optimization results showed that the regression equations can predict the structural performance of the improved storage tank and that the values of the influence factors obtained through the optimization are effective. In addition, the compression stress was improved by 24.98%, the volume utilization ratio was increased by 26.86%, and the weight was reduced by 26.83%. The optimized storage tank was manufactured through 3D metal printing, and the compressive stress was improved by 58.71%, the volume utilization ratio was increased by 24.52%, and the weight was reduced by 11.67%.

  13. Verification of the code ATHLET by post-test analysis of two experiments performed at the CCTF integral test facility

    International Nuclear Information System (INIS)

    Krepper, E.; Schaefer, F.

    2001-03-01

    In the framework of the external validation of the thermohydraulic code ATHLET Mod 1.2 Cycle C, which has been developed by the GRS, post-test analyses of two experiments performed at the Japanese test facility CCTF were carried out. The test facility CCTF is a 1:25 volume-scaled model of a 1000 MW pressurized water reactor. The tests simulate a double-ended break in the cold leg of the PWR with ECC injection into the cold leg and with combined ECC injection into the hot and cold legs. The evaluation of the calculated results shows that the main phenomena can be calculated in good agreement with the experiment. In particular, the behaviour of the quench front and the core cooling are calculated very well. Applying a two-channel representation of the reactor model, the radial behaviour of the quench front could be reproduced. Deviations between calculations and experiment can be observed when simulating the emergency injection at the beginning of the transient: very high condensation rates were calculated, and the pressure decrease in this phase of the transient is overestimated. Besides that, the pressurization due to evaporation in the refill phase is underestimated by ATHLET. (orig.) [de

  14. Performance verification and comparison of TianLong automatic hypersensitive hepatitis B virus DNA quantification system with Roche CAP/CTM system.

    Science.gov (United States)

    Li, Ming; Chen, Lin; Liu, Li-Ming; Li, Yong-Li; Li, Bo-An; Li, Bo; Mao, Yuan-Li; Xia, Li-Fang; Wang, Tong; Liu, Ya-Nan; Li, Zheng; Guo, Tong-Sheng

    2017-10-07

    To investigate and compare the analytical and clinical performance of TianLong automatic hypersensitive hepatitis B virus (HBV) DNA quantification system and Roche CAP/CTM system. Two hundred blood samples for HBV DNA testing, HBV-DNA negative samples and high-titer HBV-DNA mixture samples were collected and prepared. National standard materials for serum HBV and a worldwide HBV DNA panel were employed for performance verification. The analytical performance, such as limit of detection, limit of quantification, accuracy, precision, reproducibility, linearity, genotype coverage and cross-contamination, was determined using the TianLong automatic hypersensitive HBV DNA quantification system (TL system). Correlation and Bland-Altman plot analyses were carried out to compare the clinical performance of the TL system assay and the CAP/CTM system. The detection limit of the TL system was 10 IU/mL, and its limit of quantification was 30 IU/mL. The differences between the expected and tested concentrations of the national standards were less than ±0.4 log₁₀ IU/mL, which showed high accuracy of the system. Results of the precision, reproducibility and linearity tests showed that the multiple test coefficient of variation (CV) of the same sample was less than 5% for 10²-10⁶ IU/mL; and for 30-10⁸ IU/mL, the linear correlation coefficient r² = 0.99. The TL system detected HBV DNA (A-H) genotypes and there was no cross-contamination during the "checkerboard" test. When compared with the CAP/CTM assay, the two assays showed 100% consistency in both negative and positive sample results (15 negative samples and 185 positive samples). No statistical differences between the two assays in the HBV DNA quantification values were observed (P > 0.05). Correlation analysis indicated a significant correlation between the two assays, r² = 0.9774. The Bland-Altman plot analysis showed that 98.9% of the positive data were within the 95% acceptable range, and the maximum difference
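    The Bland-Altman comparison described in the record above can be sketched as follows. This is a minimal illustration with hypothetical paired log10 viral-load values, not the study's data: the bias is the mean of the paired differences, and the 95% limits of agreement are bias ± 1.96 standard deviations.

    ```python
    import math

    # Hypothetical paired log10(IU/mL) HBV DNA values; NOT the study's data
    tl  = [3.10, 4.25, 5.02, 2.88, 6.10, 3.75, 4.90, 5.55]   # TL system
    cap = [3.05, 4.30, 4.95, 3.00, 6.02, 3.80, 5.00, 5.48]   # CAP/CTM

    diffs = [a - b for a, b in zip(tl, cap)]
    n = len(diffs)
    bias = sum(diffs) / n                                     # mean difference between assays
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd    # 95% limits of agreement
    within = sum(loa_low <= d <= loa_high for d in diffs) / n # fraction inside the LoA
    ```

    A figure such as the one reported in the study would then plot each pair's mean against its difference, with horizontal lines at the bias and the two limits of agreement.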

  15. A program for performing angular integrations for transition operators

    International Nuclear Information System (INIS)

    Froese Fischer, C.; Godefroid, M.R.; Hibbert, A.

    1991-01-01

    The MCHF-MLTPOL program performs the angular integrations necessary for expressing the matrix elements of transition operators, E1, E2, ..., or M1, M2, ..., as linear combinations of radial integrals. All matrix elements for transitions between two lists of configuration states will be evaluated. A limited amount of non-orthogonality is allowed between orbitals of the initial and final state. (orig.)

  16. MHA admission criteria and program performance: do they predict career performance?

    Science.gov (United States)

    Porter, J; Galfano, V J

    1987-01-01

    The purpose of this study was to determine to what extent admission criteria predict graduate school and career performance. The study also analyzed which objective and subjective criteria served as the best predictors. MHA graduates of the University of Minnesota from 1974 to 1977 were surveyed to assess career performance. Student files served as the data base on admission criteria and program performance. Career performance was measured by four variables: total compensation, satisfaction, fiscal responsibility, and level of authority. High levels of MHA program performance were associated with women who had high undergraduate GPAs from highly selective undergraduate colleges, were undergraduate business majors, and participated in extracurricular activities. High levels of compensation were associated with relatively low undergraduate GPAs, high levels of participation in undergraduate extracurricular activities, and being single at admission to graduate school. Admission to MHA programs should be based upon both objective and subjective criteria. Emphasis should be placed upon the selection process for MHA students since admission criteria are shown to explain 30 percent of the variability in graduate program performance, and as much as 65 percent of the variance in level of position authority.

  17. Tuberculosis control program in the municipal context: performance evaluation

    Directory of Open Access Journals (Sweden)

    Tiemi Arakawa

    Full Text Available ABSTRACT OBJECTIVE The objective of this study is to evaluate the performance of the Tuberculosis Control Program in municipalities of the State of São Paulo. METHODS This is a program evaluation research, with ecological design, which uses three non-hierarchical groups of the municipalities of the State of São Paulo according to their performance in relation to operational indicators. We have selected 195 municipalities with at least five new cases of tuberculosis notified in the Notification System of the State of São Paulo and with 20,000 inhabitants or more in 2010. The multiple correspondence analysis was used to identify the association between the groups of different performances, the epidemiological and demographic characteristics, and the characteristics of the health systems of the municipalities. RESULTS The group with the worst performance showed the highest rates of abandonment (average [avg] = 10.4, standard deviation [sd] = 9.4) and the lowest rates of supervision of Directly Observed Treatment (avg = 6.1, sd = 12.9), and it was associated with low incidence of tuberculosis, high tuberculosis and HIV, small population, high coverage of the Family Health Strategy/Program of Community Health Agents, and being located on the countryside. The group with the best performance presented the highest cure rate (avg = 83.7, sd = 10.5) and the highest rate of cases in Directly Observed Treatment (avg = 83.0, sd = 12.7); the group of regular performance showed regular results for outcome (avg cure = 79.8, sd = 13.2; abandonment avg = 9.5, sd = 8.3) and supervision of the Directly Observed Treatment (avg = 42.8, sd = 18.8). Large population, low coverage of the Family Health Strategy/Program of Community Health Agents, high incidence of tuberculosis and AIDS, and being located on the coast and in metropolitan areas were associated with these groups. CONCLUSIONS The findings highlight the importance of the Directly Observed Treatment in relation

  18. Test Program for the Performance Analysis of DNS64 Servers

    Directory of Open Access Journals (Sweden)

    Gábor Lencse

    2015-09-01

    Full Text Available In our earlier research papers, bash shell scripts using the host Linux command were applied for testing the performance and stability of different DNS64 server implementations. Because of their inefficiency, a small multi-threaded C/C++ program (named dns64perf) was written which can directly send DNS AAAA record queries. After an introduction to the essential theoretical background about the structure of DNS messages and TCP/IP socket interface programming, the design decisions and implementation details of our DNS64 performance test program are disclosed. The efficiency of dns64perf is compared to that of the old method using bash shell scripts. The result is convincing: dns64perf can send at least 95 times more DNS AAAA record queries per second. The source code of dns64perf is published under the GNU GPLv3 license to support the work of other researchers in the field of testing the performance of DNS64 servers.
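    The shape of the DNS AAAA queries that such a tester sends can be illustrated with a short sketch. The packet layout follows RFC 1035; the function name and hostname below are illustrative assumptions, not taken from the dns64perf source:

    ```python
    import random
    import struct

    def build_aaaa_query(hostname, txid=None):
        """Build a minimal DNS query packet (RFC 1035) asking for an AAAA record."""
        if txid is None:
            txid = random.randint(0, 0xFFFF)
        # Header: ID, flags (RD=1), QDCOUNT=1, ANCOUNT=NSCOUNT=ARCOUNT=0
        header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
        # Question: QNAME as length-prefixed labels, then QTYPE=28 (AAAA), QCLASS=1 (IN)
        qname = b"".join(bytes([len(label)]) + label.encode("ascii")
                         for label in hostname.split(".")) + b"\x00"
        return header + qname + struct.pack(">HH", 28, 1)

    # A load generator in the spirit of dns64perf would send many such packets
    # over UDP, e.g.:
    #   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    #   sock.sendto(build_aaaa_query("example.com"), ("192.0.2.1", 53))
    pkt = build_aaaa_query("example.com", txid=0x1234)
    ```

    Sending at high rates and measuring the answered fraction, as the paper describes, is then a matter of looping this send over multiple threads and counting responses.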

  19. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  20. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    International Nuclear Information System (INIS)

    Weaver, Phyllis C.

    2012-01-01

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU indicated that both the alpha and alpha-plus-beta activity were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse

  1. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Contract number: FA9453-15-1-0315; program element number: 62601F. Author: Norman Fitz-Coy. Project number: 4846; task number: PPM00015968; work unit number: EF125135.

  2. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  3. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  4. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in the fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.

  5. Maximizing Energy Savings Reliability in BC Hydro Industrial Demand-side Management Programs: An Assessment of Performance Incentive Models

    Science.gov (United States)

    Gosman, Nathaniel

    of alternative performance incentive program models to manage DSM risk in BC. Three performance incentive program models were assessed and compared to BC Hydro's current large industrial DSM incentive program, Power Smart Partners -- Transmission Project Incentives, itself a performance incentive-based program. Together, the selected program models represent a continuum of program design and implementation in terms of the schedule and level of incentives provided, the duration and rigour of measurement and verification (M&V), energy efficiency measures targeted and involvement of the private sector. A multi-criteria assessment framework was developed to rank the capacity of each program model to manage BC large industrial DSM risk factors. DSM risk management rankings were then compared to program cost-effectiveness, targeted energy savings potential in BC, and survey results from BC industrial firms on the program models. The findings indicate that the reliability of DSM energy savings in the BC large industrial sector can be maximized through performance incentive program models that: (1) offer incentives jointly for capital and low-cost operations and maintenance (O&M) measures, (2) allow flexible lead times for project development, (3) utilize rigorous M&V methods capable of measuring variable load, process-based energy savings, (4) use moderate contract lengths that align with effective measure life, and (5) integrate energy management software tools capable of providing energy performance feedback to customers to maximize the persistence of energy savings. While this study focuses exclusively on the BC large industrial sector, the findings of this research have applicability to all energy utilities serving large, energy intensive industrial sectors.

  6. Independent verification of the delivered dose in High-Dose Rate (HDR) brachytherapy

    International Nuclear Information System (INIS)

    Portillo, P.; Feld, D.; Kessler, J.

    2009-01-01

    An important aspect of a Quality Assurance program in Clinical Dosimetry is an independent verification of the dosimetric calculation done by the Treatment Planning System for each radiation treatment. The present paper is aimed at creating a spreadsheet for the verification of the dose recorded at a point of an implant with radioactive sources and HDR in gynecological lesions. A ¹⁹²Ir automatic remote afterloading HDR unit (GammaMedplus model, Varian Medical Systems) installed at the Angel H. Roffo Oncology Institute has been used. The planning system implemented for obtaining the dose distribution is BraquiVision. The coordinates of the sources as well as those of the calculation point (rectum) are entered into the Excel-devised verification program by assuming the existence of a point source in each one of the applicators' positions. This calculation point has been selected because the rectum is an organ at risk and therefore determines the treatment planning. The dose verification is performed at points located at a distance from the sources of at least twice the active length of such sources, so they may be regarded as point sources. Most of the sources used in HDR brachytherapy with ¹⁹²Ir have a 5 mm active length for all equipment brands. Consequently, the dose verification distance must be at least 10 mm. (author)
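    The spreadsheet-style point-source check described in this record can be sketched as follows. This is a simplified inverse-square calculation (the anisotropy and radial dose functions of a full TG-43 calculation are deliberately ignored), and the air-kerma strength, dwell positions, dwell times, and calculation point are all hypothetical values chosen for illustration:

    ```python
    import math

    # Hypothetical point-source check:
    #   dose rate [cGy/h] = Sk [U] * Lambda [cGy/(h*U)] * (1 cm / r)^2
    SK = 40000.0        # assumed air-kerma strength, U
    LAMBDA = 1.11       # approximate Ir-192 dose rate constant, cGy/(h*U)

    def point_dose_rate(source_pos, calc_point):
        r = math.dist(source_pos, calc_point)   # source-to-point distance, cm
        return SK * LAMBDA / r ** 2

    # Assumed dwell positions along the applicator (cm) with 10 s dwell times
    dwells = [((0.0, 0.0, 0.0), 10.0), ((0.0, 0.0, 0.5), 10.0), ((0.0, 0.0, 1.0), 10.0)]
    rectum = (3.0, 0.0, 0.5)                    # calculation point (organ at risk)

    # Sum each dwell's dose-rate contribution, converting dwell time to hours
    total_dose = sum(point_dose_rate(pos, rectum) * t / 3600.0 for pos, t in dwells)
    ```

    Each dwell position contributes independently, so the verification value is simply the sum of the per-dwell doses at the rectum point, which can then be compared against the planning system's reported dose.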

  7. 38 CFR 74.2 - What are the eligibility requirements a concern must meet for VetBiz VIP Verification Program?

    Science.gov (United States)

    2010-07-01

    ... loans or other Federally assisted financing, is eligible for VetBiz VIP Verification. (e) U.S. Small Business Administration (SBA) Protest Decisions. Any firm registered in the VetBiz VIP database that is found to be ineligible due to an SBA protest decision or other negative finding will be immediately...

  8. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage, verification may lead to consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events, and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  9. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulation requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project will be described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  10. Monte Carlo simulations to replace film dosimetry in IMRT verification

    International Nuclear Information System (INIS)

    Goetzfried, Thomas; Trautwein, Marius; Koelbl, Oliver; Bogner, Ludwig; Rickhey, Mark

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam-based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference) with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for MC dose calculation. The total time needed for the two verification procedures is comparable; however, the MC-based procedure is by far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to increasingly replace film dosimetry in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended at least in the IMRT introduction phase. (orig.)
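    The gamma evaluation used in this record (3% dose difference, 3 mm distance-to-agreement) can be sketched in one dimension. The dose profiles below are hypothetical, and the brute-force implementation is a simplified illustration of the gamma concept, not the study's code:

    ```python
    import math

    def gamma_index(ref, eval_, positions, dose_tol=0.03, dist_tol=3.0):
        """Per-point 1D gamma with global dose normalization.

        For each reference point, search all evaluated points for the minimum
        combined (dose difference, distance) metric; gamma <= 1 is a pass.
        """
        dmax = max(ref)
        gammas = []
        for xr, dr in zip(positions, ref):
            best = math.inf
            for xe, de in zip(positions, eval_):
                dd = (de - dr) / (dose_tol * dmax)   # dose-difference term
                dx = (xe - xr) / dist_tol            # distance-to-agreement term
                best = min(best, math.hypot(dd, dx))
            gammas.append(best)
        return gammas

    # Hypothetical calculated vs. measured dose profiles (position in mm)
    xs   = [0.0, 1.0, 2.0, 3.0, 4.0]
    calc = [1.00, 0.98, 0.90, 0.60, 0.20]
    meas = [1.01, 0.97, 0.91, 0.58, 0.22]
    g = gamma_index(calc, meas, xs)
    pass_rate = sum(v <= 1.0 for v in g) / len(g)
    ```

    A clinical 2D gamma map is the same computation extended over a dose plane, with the mean or passing fraction of the gamma values serving as the summary statistic reported above.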

  11. Design verification for reactor head replacement

    International Nuclear Information System (INIS)

    Dwivedy, K.K.; Whitt, M.S.; Lee, R.

    2005-01-01

    This paper outlines the challenges of design verification for reactor head replacement in PWR plants and the program for qualification, from the perspective of the utility design engineering group. This paper is based on the experience with the design confirmation of four reactor head replacements for two plants, and their interfacing components, parts, appurtenances, and support structures. The reactor head replacement falls under the jurisdiction of the applicable edition of the ASME Section XI code, with particular reference to repair/replacement activities. Under any repair/replacement activity, demands may be encountered in the development of the program and plan for replacement due to the vintage of the original design/construction Code and the design reports governing the component qualifications. Because of the obvious importance of the reactor vessel, these challenges take on an added significance. Additional complexities are introduced to the project when the replacement components are fabricated by vendors different from the original vendor. Specific attention is needed with respect to compatibility with the original design and construction of the part and interfacing components. The program for reactor head replacement requires evaluation of welding procedures and the applicable examination, test, and acceptance criteria for material, welds, and the components. Also, the design needs to take into consideration the life of the replacement components with respect to the extended period of operation of the plant after license renewal and other plant improvements. Thus, the verification of acceptability of reactor head replacement provides challenges for the development and maintenance of a program and plan, design specification, design report, manufacturer's data report and material certification, and a report of reconciliation.
The technical need may also be compounded by other challenges such as widely scattered global activities and organizational barriers, which

  12. Predicting introductory programming performance: A multi-institutional multivariate study

    Science.gov (United States)

    Bergin, Susan; Reilly, Ronan

    2006-12-01

    A model for predicting student performance on introductory programming modules is presented. The model uses attributes identified in a study carried out at four third-level institutions in the Republic of Ireland. Four instruments were used to collect the data and over 25 attributes were examined. A data reduction technique was applied and a logistic regression model using 10-fold stratified cross validation was developed. The model used three attributes: Leaving Certificate Mathematics result (final mathematics examination at second level), number of hours playing computer games while taking the module and programming self-esteem. Prediction success was significant with 80% of students correctly classified. The model also works well on a per-institution level. A discussion on the implications of the model is provided and future work is outlined.
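    The 10-fold stratified cross validation mentioned in this record keeps the class balance (e.g. pass/fail) roughly equal in every fold. A minimal splitter can be sketched as follows; the function and the label distribution are illustrative assumptions, not the study's implementation:

    ```python
    import random
    from collections import defaultdict

    def stratified_kfold(labels, k=10, seed=0):
        """Yield (train_idx, test_idx) pairs with each class spread evenly over k folds."""
        rng = random.Random(seed)
        by_class = defaultdict(list)
        for i, y in enumerate(labels):
            by_class[y].append(i)
        folds = [[] for _ in range(k)]
        for idxs in by_class.values():
            rng.shuffle(idxs)
            for j, i in enumerate(idxs):
                folds[j % k].append(i)   # deal each class's indices round-robin
        for t in range(k):
            test = sorted(folds[t])
            train = sorted(i for f in range(k) if f != t for i in folds[f])
            yield train, test

    # Hypothetical pass/fail labels for 50 students (40 pass, 10 fail)
    labels = [1] * 40 + [0] * 10
    splits = list(stratified_kfold(labels, k=10))
    ```

    A classifier such as the logistic regression model described above would then be fitted on each training split and scored on the held-out fold, and the ten accuracies averaged.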

  13. Solar diffusers in Earth observation instruments with an illumination angle of up to 70°: design and verification of performance in BRDF

    NARCIS (Netherlands)

    Gür, B.; Bol, H.; Xu, P.; Li, B.

    2015-01-01

    The present paper describes the challenging diffuser design and verification activities of TNO under contract of a customer for an earth observation instrument with observation conditions that require feasible BRDF under large angles of incidence of up to 70° with respect to the surface normal. Not

  14. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2006-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  15. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2007-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  16. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or a Quality Assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use.
The functional validation can

  17. The US Acid Rain Program: design, performance, and assessment

    DEFF Research Database (Denmark)

    Svendsen, Gert Tinggaard

    1998-01-01

    The US Acid Rain Program (ARP) from 1990 allows 1,000 major electric utilities all over the US to trade SO2 permits. Historical emission rights have been grandfathered and the target level is 50% SO2 reduction. Market performance has been successful with much trade activity and unexpectedly low permit prices. Property rights to permits have been well-defined, strictly enforced, and sources have been allowed to trade freely without administrative approval of each trade. Ignoring source location in this way has kept transaction costs at a minimum. In conclusion, the policy design of the ARP...

  18. Enhancing Functional Performance using Sensorimotor Adaptability Training Programs

    Science.gov (United States)

    Bloomberg, J. J.; Mulavara, A. P.; Peters, B. T.; Brady, R.; Audas, C.; Ruttley, T. M.; Cohen, H. S.

    2009-01-01

    During the acute phase of adaptation to novel gravitational environments, sensorimotor disturbances have the potential to disrupt the ability of astronauts to perform functional tasks. The goal of this project is to develop a sensorimotor adaptability (SA) training program designed to facilitate recovery of functional capabilities when astronauts transition to different gravitational environments. The project conducted a series of studies that investigated the efficacy of treadmill training combined with a variety of sensory challenges designed to increase adaptability including alterations in visual flow, body loading, and support surface stability.

  19. Performance measures in the earth observations commercialization applications program

    Science.gov (United States)

    Macauley, Molly K.

    1996-03-01

    Performance measures in the Earth Observations Commercialization Application Program (EOCAP) are key to its success and include net profitability; enhancements to industry productivity through generic innovations in industry practices, standards, and protocols; and documented contributions to public policy governing the newly developing remote sensing industry. Because EOCAP requires company co-funding, both parties to the agreement (the government and the corporate partner) have incentives to pursue these goals. Further strengthening progress towards these goals are requirements for business plans in the company's EOCAP proposal, detailed scrutiny given these plans during proposal selection, and regularly documented progress reports during project implementation.

  20. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
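The margin-versus-uncertainty comparison described in this record can be sketched as a simple probabilistic check; the normal-distribution assumption and the function name below are illustrative, not the paper's actual procedure.

```python
from statistics import NormalDist

def mov_reliability(nominal_margin_pct, margin_sigma_pct):
    """Probability that the true MOV design margin is positive, treating
    the margin uncertainty as normally distributed (an illustrative
    assumption; the paper derives the uncertainty statistically from
    design requirements, parameters, hardware design, and setup)."""
    z = nominal_margin_pct / margin_sigma_pct
    return NormalDist().cdf(z)

# A nominal margin of 20% with a 10% (1-sigma) uncertainty gives an
# estimated reliability of roughly 97.7%.
reliability = mov_reliability(20.0, 10.0)
```

A valve whose nominal margin is small relative to its uncertainty would then be a candidate for more frequent periodic verification testing.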

  1. MOV reliability evaluation and periodic verification scheduling

    International Nuclear Information System (INIS)

    Bunte, B.D.

    1996-01-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs

  2. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
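The clock synchronization scheme analyzed in this line of work centers on a fault-tolerant convergence function; the sketch below shows the egocentric mean used by the interactive convergence algorithm of Lamport and Melliar-Smith, as an illustration of the idea rather than the verified EHDM specification itself.

```python
def egocentric_mean(own_reading, readings, delta):
    """Interactive-convergence-style averaging: any clock reading that
    differs from this node's own clock by more than delta is assumed
    faulty and replaced by the own reading before averaging. With
    n >= 3f + 1 clocks, up to f Byzantine faults can be tolerated."""
    adjusted = [r if abs(r - own_reading) <= delta else own_reading
                for r in readings]
    return sum(adjusted) / len(adjusted)

# One wildly faulty reading (1000.0) is clipped to the local clock value,
# so the corrected clock stays close to the non-faulty majority.
corrected = egocentric_mean(10.0, [9.0, 10.0, 11.0, 1000.0], 5.0)
```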

  3. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Full Text Available Current performance prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block, and analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, to use this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, implying experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
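The parameter-evaluation step described above (one proportionality constant per basic block) reduces, in the simplest case, to a least-squares fit of measured timings against the annotated complexity formula; this single-parameter sketch illustrates the idea and is not the tool's actual multi-parameter fitter.

```python
def fit_block_constant(sizes, times, complexity):
    """Least-squares estimate of c in  time ~= c * complexity(n)  for one
    instrumented block, from (problem size, measured time) pairs."""
    xs = [complexity(n) for n in sizes]
    num = sum(x * t for x, t in zip(xs, times))
    den = sum(x * x for x in xs)
    return num / den

# Synthetic timings for a block annotated with an n**2 complexity formula:
sizes = [10, 20, 40]
times = [3e-6 * n**2 for n in sizes]
c = fit_block_constant(sizes, times, lambda n: n**2)  # ~3e-6 s per unit work
```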

  4. Behavioral Health and Performance Operations During the Space Shuttle Program

    Science.gov (United States)

    Beven, G.; Holland, A.; Moomaw, R.; Sipes, W.; Vander Ark, S.

    2011-01-01

    Prior to the Columbia STS 107 disaster in 2003, the Johnson Space Center's Behavioral Health and Performance Group (BHP) became involved in Space Shuttle Operations on an as-needed basis, occasionally acting as a consultant and primarily addressing crew-crew personality conflicts. The BHP group also assisted with astronaut selection at every selection cycle beginning in 1991. Following STS 107, an event that spawned an increased need for behavioral health support to STS crew members and their dependents, BHP services to the Space Shuttle Program were enhanced beginning with the STS 114 Return to Flight mission in 2005. These services included the presence of BHP personnel at STS launches and landings for contingency support, a BHP briefing to the entire STS crew at L-11 months, a private preflight meeting with the STS Commander at L-9 months, and the presence of a BHP consultant at the L-1.5 month Family Support Office briefing to crew and family members. The later development of an annual behavioral health assessment of all active astronauts also augmented BHP's Space Shuttle Program specific services, allowing for private meetings with all STS crew members before and after each mission. The components of each facet of these BHP Space Shuttle Program support services will be presented, along with valuable lessons learned and recommendations for BHP involvement in future short duration space missions.

  5. Performance Demonstration Program Plan for Nondestructive Assay for the TRU Waste Characterization Program. Revision 1

    International Nuclear Information System (INIS)

    1997-01-01

    The Performance Demonstration Program (PDP) for Nondestructive Assay (NDA) consists of a series of tests conducted on a regular frequency to evaluate the capability for nondestructive assay of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each test is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed with TRU waste characterization systems. Measurement facility performance will be demonstrated by the successful analysis of blind audit samples according to the criteria set by this Program Plan. Intercomparison between measurement groups of the DOE complex will be achieved by comparing the results of measurements on similar or identical blind samples reported by the different measurement facilities. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess the performance of measurement groups regarding compliance with established Quality Assurance Objectives (QAOs). As defined for this program, a PDP sample consists of a 55-gallon matrix drum emplaced with radioactive standards and fabricated matrix inserts. These PDP sample components, once manufactured, will be secured and stored at each participating measurement facility designated and authorized by Carlsbad Area Office (CAO) under secure conditions to protect them from loss, tampering, or accidental damage
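The pass/fail idea behind a blind audit sample can be sketched as a percent-recovery check against acceptance limits; the 70-130% bounds below are hypothetical placeholders, since the actual limits are set by the program's Quality Assurance Objectives.

```python
def percent_recovery(measured, reference):
    """Reported value as a percentage of the known (blind) reference value."""
    return 100.0 * measured / reference

def passes_qao(measured, reference, low=70.0, high=130.0):
    """True if a facility's result on a PDP sample falls inside the
    acceptance window (the bounds here are hypothetical, not the real
    QAO limits)."""
    return low <= percent_recovery(measured, reference) <= high

# A facility reporting 95 units against a blind reference of 100 passes;
# a result of 50 against the same reference does not.
ok = passes_qao(95.0, 100.0)
bad = passes_qao(50.0, 100.0)
```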

  6. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  7. MCNP Progress & Performance Improvements

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bull, Jeffrey S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-04-14

    Twenty-eight slides give information about the work of the US DOE/NNSA Nuclear Criticality Safety Program on MCNP6 under the following headings: MCNP6.1.1 Release, with ENDF/B-VII.1; Verification/Validation; User Support & Training; Performance Improvements; and Work in Progress. Whisper methodology will be incorporated into the code, and run speed should be increased.

  8. Evaluating Program about Performance of Circular Sodium Heat Pipe

    International Nuclear Information System (INIS)

    Kwak, Jae Sik; Kim, Hee Reyoung

    2014-01-01

    Superior heat transfer capability, structural simplicity, relatively low cost, insensitivity to the gravitational field, silence, and reliability are some of the heat pipe's outstanding features. We study the heat transfer equations of the heat pipe and develop a program that predicts its performance while accounting for the geometrical shape of the heat pipe through the related heat transfer equations. The operating temperature is 450 °C to 950 °C, the working fluid is sodium, the container material is stainless steel, and the wick type is sintered metal. Using the MATLAB-based program for evaluating the performance of a circular sodium heat pipe, we express the correlations between radius and LHR, heat transfer length and LHR, wick and LHR, and operating temperature and LHR. Generally, the radius of the heat pipe is proportional to LHR because of the increase of mass flow, which is the main factor in heat flow. The heat transfer length of the heat pipe is inversely proportional to LHR and slightly inversely proportional to heat rate. Pore size is proportional to LHR: although an increase of pore size decreases the capillary pressure, it decreases the pressure drop in the liquid phase even more, so mass flow and heat rate increase. However, pore size and voidage require additional consideration with respect to safety and production technique
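The pore-size trade-off noted in this record (larger pores reduce the capillary pressure but reduce the liquid-phase pressure drop even more) can be sketched with the standard capillary-limit expression for a wicked heat pipe; the Blake-Kozeny permeability correlation and the sodium property values below are rough illustrative assumptions, not the authors' MATLAB model.

```python
import math

def blake_kozeny_permeability(r_p, porosity=0.6):
    """Permeability of a sintered wick (Blake-Kozeny correlation); note
    that K scales with the square of the pore radius r_p."""
    return (r_p**2 * porosity**3) / (37.5 * (1.0 - porosity)**2)

def capillary_limit(r_pipe, wick_thickness, l_eff, r_p,
                    rho=780.0, sigma=0.12, h_fg=3.9e6, mu=1.8e-4):
    """Capillary-limited axial heat transport [W] of a horizontal heat
    pipe: (rho*h_fg/mu) * (A_w*K/l_eff) * (2*sigma/r_p). Defaults are
    rough liquid-sodium properties near 700 C (illustrative only)."""
    a_w = math.pi * (r_pipe**2 - (r_pipe - wick_thickness)**2)  # wick area
    k = blake_kozeny_permeability(r_p)
    dp_cap = 2.0 * sigma / r_p  # maximum capillary pressure [Pa]
    return (rho * h_fg / mu) * (a_w * k / l_eff) * dp_cap

# Because K ~ r_p**2 while the capillary pressure ~ 1/r_p, the computed
# limit grows with pore radius, consistent with the correlation reported
# above; it falls as the effective transport length grows.
q = capillary_limit(r_pipe=0.01, wick_thickness=0.002, l_eff=1.0, r_p=5e-5)
```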

  9. Evaluating Program about Performance of Circular Sodium Heat Pipe

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Jae Sik; Kim, Hee Reyoung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2014-10-15

    Superior heat transfer capability, structural simplicity, relatively low cost, insensitivity to the gravitational field, silence, and reliability are some of the heat pipe's outstanding features. We study the heat transfer equations of the heat pipe and develop a program that predicts its performance while accounting for the geometrical shape of the heat pipe through the related heat transfer equations. The operating temperature is 450 °C to 950 °C, the working fluid is sodium, the container material is stainless steel, and the wick type is sintered metal. Using the MATLAB-based program for evaluating the performance of a circular sodium heat pipe, we express the correlations between radius and LHR, heat transfer length and LHR, wick and LHR, and operating temperature and LHR. Generally, the radius of the heat pipe is proportional to LHR because of the increase of mass flow, which is the main factor in heat flow. The heat transfer length of the heat pipe is inversely proportional to LHR and slightly inversely proportional to heat rate. Pore size is proportional to LHR: although an increase of pore size decreases the capillary pressure, it decreases the pressure drop in the liquid phase even more, so mass flow and heat rate increase. However, pore size and voidage require additional consideration with respect to safety and production technique.

  10. Pecan Street Grid Demonstration Program. Final technology performance report

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-02-10

    This document represents the final Regional Demonstration Project Technical Performance Report (TPR) for Pecan Street Inc.'s (Pecan Street) Smart Grid Demonstration Program, DE-OE-0000219. Pecan Street is a 501(c)(3) smart grid/clean energy research and development organization headquartered at The University of Texas at Austin (UT). Pecan Street worked in collaboration with Austin Energy, UT, Environmental Defense Fund (EDF), the City of Austin, the Austin Chamber of Commerce, and selected consultants, contractors, and vendors to take a more detailed look at the energy load of residential and small commercial properties while the power industry is undergoing modernization. The Pecan Street Smart Grid Demonstration Program signed up over 1,000 participants who are sharing their home's or business's electricity consumption data with the project via green button protocols, smart meters, and/or a home energy monitoring system (HEMS). Pecan Street completed the installation of HEMS in 750 homes and 25 commercial properties. The program provided incentives to increase the installed base of roof-top solar photovoltaic (PV) systems, plug-in electric vehicles with Level 2 charging, and smart appliances. Over 200 participants within a one square mile area took advantage of Austin Energy and Pecan Street's joint PV incentive program and installed roof-top PV as part of this project. Of these homes, 69 purchased or leased an electric vehicle through Pecan Street's PV rebate program and received a Level 2 charger from Pecan Street. Pecan Street studied the impacts of these technologies along with a variety of consumer behavior interventions, including pricing models, real-time feedback on energy use, incentive programs, and messaging, as well as the corresponding impacts on Austin Energy's distribution assets. The primary demonstration site was the Mueller community in Austin, Texas. The Mueller development, located less than three miles from the Texas State Capitol

  11. Verification of the cross-section and depletion chain processing module of DRAGON 3.06

    International Nuclear Information System (INIS)

    Chambon, R.; Marleau, G.; Zkiek, A.

    2008-01-01

    In this paper we present a verification of the module of the lattice code DRAGON 3.06 used for processing microscopic cross-section libraries, including their associated depletion chain. This verification is performed by reprogramming the capabilities of DRAGON in another language (MATLAB) and testing them on different problems typical of the CANDU reactor. The verification procedure consists in first programming MATLAB m-files to read the different cross section libraries in ASCII format and to compute the reference cross-sections and depletion chains. The same information is also recovered from the output files of DRAGON (using different m-files) and the resulting cross sections and depletion chain are compared with the reference library, the differences being evaluated and tabulated. The results show that the cross-section calculations and the depletion chains are correctly processed in version 3.06 of DRAGON. (author)
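The comparison step described in this record amounts to tabulating relative differences between the reference library values and those recovered from DRAGON's output; a minimal sketch (dictionary keys standing in for isotope/group pairs, with an arbitrarily chosen tolerance):

```python
def compare_cross_sections(reference, recovered, tol=1e-6):
    """Return the (isotope, group) pairs whose recovered cross-section
    differs from the reference library by more than tol (relative
    difference), mimicking the evaluate-and-tabulate verification step."""
    report = {}
    for key, ref in reference.items():
        rec = recovered[key]
        rel = abs(rec - ref) / abs(ref) if ref else abs(rec)
        if rel > tol:
            report[key] = rel
    return report

# Identical libraries produce an empty report; a perturbed value is flagged.
reference = {("U235", 1): 2.5, ("U238", 1): 0.30}
perturbed = {("U235", 1): 2.6, ("U238", 1): 0.30}
report = compare_cross_sections(reference, perturbed)
```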

  12. Performance Demonstration Program Plan for Nondestructive Assay of Drummed Wastes for the TRU Waste Characterization Program

    International Nuclear Information System (INIS)

    2009-01-01

    Each testing and analytical facility performing waste characterization activities for the Waste Isolation Pilot Plant (WIPP) participates in the Performance Demonstration Program (PDP) to comply with the Transuranic Waste Acceptance Criteria for the Waste Isolation Pilot Plant (WAC) (DOE/WIPP-02-3122) and the Quality Assurance Program Document (QAPD) (CBFO-94-1012). The PDP serves as a quality control check for data generated in the characterization of waste destined for WIPP. Single blind audit samples are prepared and distributed to each of the facilities participating in the PDP. The PDP evaluates analyses of simulated headspace gases, constituents of the Resource Conservation and Recovery Act (RCRA), and transuranic (TRU) radionuclides using nondestructive assay (NDA) techniques.

  13. A FRAMEWORK FOR PERFORMANCE EVALUATION AND MONITORING OF PUBLIC HEALTH PROGRAM USING COMPOSITE PERFORMANCE INDEX

    Directory of Open Access Journals (Sweden)

    Susanta Kumar Gauri

    2017-12-01

    Full Text Available A public health program (PHP) taken up by the government of a country refers to all organized measures to prevent disease and promote health among the population by providing different planned cares/services to the people. Usually, the target populations for different PHPs are different. The basic requirement for the success of a PHP is to ensure that all the planned cares/services reach each member of the target population. Therefore, the important performance measures for a PHP are the implementation statuses of all the planned cares/services under the PHP. However, management and monitoring of a PHP become quite difficult when the information contained in a large number of performance measures must be interpreted separately. Therefore, usually a metric called the composite performance index (CPI) is evaluated to understand the overall performance of a PHP. However, due to a scaling operation involved in the CPI computation procedure, the CPI value does not reveal the true overall implementation status of a PHP and consequently is not effective for management of a PHP. This paper presents a new approach for CPI computation in which scaling/normalization of the performance variables is not required; therefore, it can be used for monitoring the true overall implementation status of a PHP in a region. A systematic approach for monitoring a PHP using the CPI values is proposed and applied for monitoring the maternal and child healthcare (MCH) program. The results are found effective towards continuous improvement of implementation status.
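A scaling-free CPI of the kind advocated above can be illustrated as a weighted mean of per-service implementation fractions, so the index itself reads directly as an overall implementation status; the service names and weights are hypothetical, and this is not necessarily the paper's exact formula.

```python
def composite_performance_index(coverage, weights=None):
    """Weighted mean of per-service implementation fractions (each in
    [0, 1]); no normalization step, so a value of 0.75 literally means
    75% overall implementation."""
    if weights is None:
        weights = {k: 1.0 for k in coverage}
    total = sum(weights[k] for k in coverage)
    return sum(weights[k] * coverage[k] for k in coverage) / total

# Two hypothetical MCH services: antenatal-care visits 50% implemented,
# immunization fully implemented; equal weights give a CPI of 0.75.
cpi = composite_performance_index({"antenatal_care": 0.5, "immunization": 1.0})
```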

  14. A transformation of SDL specifications : a step towards the verification

    NARCIS (Netherlands)

    Ioustinova, N.; Sidorova, N.; Bjorner, D.; Broy, M.; Zamulin, A.

    2001-01-01

    Industrial-size specifications/models (whose state space is often infinite) cannot be model checked in a direct way; a verification model of the system is model checked instead. Program transformation is a way to build a finite-state verification model that can be submitted to a model checker.

  15. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of the FMCT verification provision. This paper will explore the general concerns about FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed.
Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  16. On the numerical verification of industrial codes

    International Nuclear Information System (INIS)

    Montan, Sethy Akpemado

    2013-01-01

    Numerical verification of industrial codes, such as those developed at EDF R and D, is required to estimate the precision and the quality of computed results, even more so for codes running in HPC environments where millions of instructions are performed each second. These programs usually use external libraries (MPI, BLACS, BLAS, LAPACK). In this context, it is necessary to have a tool that is as non-intrusive as possible, to avoid rewriting the original code. In this regard, the CADNA library, which implements Discrete Stochastic Arithmetic, appears to be a promising approach for industrial applications. In the first part of this work, we are interested in an efficient implementation of the BLAS routine DGEMM (General Matrix Multiply) using Discrete Stochastic Arithmetic. The implementation of a basic algorithm for matrix product using stochastic types leads to an overhead greater than 1000 for a 1024 x 1024 matrix compared to the standard version and commercial versions of xGEMM. Here, we detail different solutions to reduce this overhead and the results we have obtained. A new routine, DgemmCADNA, has been designed. This routine has allowed the overhead to be reduced from 1100 to 35 compared to optimized BLAS implementations (GotoBLAS). Then, we focus on the numerical verification of Telemac-2D computed results. Performing a numerical validation with the CADNA library shows that more than 30% of the numerical instabilities occurring during an execution come from the dot product function. A more accurate implementation of the dot product with compensated algorithms is presented in this work. We show that implementing these kinds of algorithms to improve the accuracy of computed results does not alter the code performance. (author)
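A compensated dot product of the kind referred to above can be sketched with the classic Ogita-Rump-Oishi Dot2 scheme, built from the error-free transformations TwoSum and Dekker's TwoProduct; this is the textbook algorithm, not necessarily the exact routine implemented for Telemac-2D.

```python
def two_sum(a, b):
    """Error-free transformation: s + e == a + b exactly."""
    s = a + b
    t = s - a
    e = (a - (s - t)) + (b - t)
    return s, e

def two_prod(a, b):
    """Error-free transformation: p + e == a * b exactly (Dekker split)."""
    p = a * b
    c = 134217729.0 * a          # 2**27 + 1 splits a 53-bit double
    ah = c - (c - a); al = a - ah
    c = 134217729.0 * b
    bh = c - (c - b); bl = b - bh
    e = ((ah * bh - p) + ah * bl + al * bh) + al * bl
    return p, e

def dot2(x, y):
    """Ogita-Rump-Oishi compensated dot product: result as accurate as if
    computed in twice the working precision, then rounded once."""
    p, s = two_prod(x[0], y[0])
    for xi, yi in zip(x[1:], y[1:]):
        h, r = two_prod(xi, yi)
        p, q = two_sum(p, h)
        s += q + r
    return p + s

# Ill-conditioned example: the naive sum cancels to 0.0, dot2 returns 1.0.
x, y = [1e16, 1.0, -1e16], [1.0, 1.0, 1.0]
```

The error-free transformations cost only a handful of extra floating-point operations per term, which is consistent with the abstract's observation that the more accurate dot product need not degrade performance.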

  17. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  18. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  19. NRC valve performance test program - check valve testing

    International Nuclear Information System (INIS)

    Jeanmougin, N.M.

    1987-01-01

    The Valve Performance Test Program addresses the current requirements for testing of pressure isolation valves (PIVs) in light water reactors. Leak rate monitoring is the current method used by operating commercial power plants to survey the condition of their PIVs. ETEC testing of three check valves (4-inch, 6-inch, and 12-inch nominal diameters) indicates that leak rate testing is not a reliable method for detecting impending valve failure. Acoustic emission monitoring of check valves shows promise as a method of detecting loosened-internals damage. Future efforts will focus on evaluation of acoustic emission monitoring as a technique for determining check valve condition. Three gate valves also will be tested to evaluate whether the check valve results are applicable to gate-type PIVs.

  20. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    Science.gov (United States)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard, using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-use search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and helps identify discrepancies. Additionally, the history and configuration subsystem stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Plugged-in tools therefore gain automatically from transparency and reproducibility. Furthermore, when the configuration of a newly started evaluation tool matches an earlier run, the system suggests using the results already produced
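
    The history/configuration behaviour described in the abstract - every run is recorded, and a matching configuration triggers a suggestion to reuse earlier results - can be sketched as follows. This is a hypothetical in-memory stand-in for the MySQL-backed subsystem; all names are invented for illustration:

```python
import hashlib
import json

class EvaluationHistory:
    """Stores each (tool, configuration) run; reuses results on a config match."""
    def __init__(self):
        self._runs = {}   # config fingerprint -> stored result

    @staticmethod
    def _fingerprint(tool: str, config: dict) -> str:
        # Canonical JSON so key order in the config dict does not matter.
        blob = json.dumps({"tool": tool, "config": config}, sort_keys=True)
        return hashlib.sha256(blob.encode()).hexdigest()

    def start(self, tool, config, run):
        """Run an analysis tool, reusing a stored result if the config matches."""
        key = self._fingerprint(tool, config)
        if key in self._runs:
            return {"reused": True, "result": self._runs[key]}
        result = run(config)
        self._runs[key] = result
        return {"reused": False, "result": result}

hist = EvaluationHistory()
bias = lambda cfg: f"bias map for {cfg['model']}"
first = hist.start("bias-tool", {"model": "MPI-ESM", "period": "1960-2000"}, bias)
again = hist.start("bias-tool", {"period": "1960-2000", "model": "MPI-ESM"}, bias)
print(first["reused"], again["reused"])   # False True
```

    Fingerprinting the canonicalized configuration is what makes reuse detection independent of the tool's implementation language, matching the system's language-agnostic API design.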

  1. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate the various quantities required in the back-end field. For verification of the code system, analyses were performed using the nuclide compositions measured in the Actinide Research in a Nuclear Element (ARIANE) program for fuel rods from assemblies irradiated in a commercial Netherlands BWR. The code system developed in this paper has been verified through analyses of MOX and UO2 fuel rods. This system makes it possible to reduce the large margin assumed in the present criticality analysis for LWR spent fuels. (J.P.N.)

  2. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study...... of a leader election protocol, modeling it in Ecdar as Timed Input/Output Automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  3. Performance Demonstration Program Plan for Nondestructive Assay of Drummed Wastes for the TRU Waste Characterization Program

    International Nuclear Information System (INIS)

    2005-01-01

    The Performance Demonstration Program (PDP) for Nondestructive Assay (NDA) is a test program designed to yield data on measurement system capability to characterize drummed transuranic (TRU) waste generated throughout the Department of Energy (DOE) complex. The tests are conducted periodically and provide a mechanism for the independent and objective assessment of NDA system performance and capability relative to the radiological characterization objectives and criteria of the Office of Characterization and Transportation (OCT). The primary documents requiring an NDA PDP are the Waste Acceptance Criteria for the Waste Isolation Pilot Plant (WAC), which requires annual characterization facility participation in the PDP, and the Quality Assurance Program Document (QAPD). This NDA PDP implements the general requirements of the QAPD and applicable requirements of the WAC. Measurement facilities must demonstrate acceptable radiological characterization performance through measurement of test samples comprised of pre-specified PDP matrix drum/radioactive source configurations. Measurement facilities are required to analyze the NDA PDP drum samples using the same procedures approved and implemented for routine operational waste characterization activities. The test samples provide an independent means to assess NDA measurement system performance and compliance per criteria delineated in the NDA PDP Plan. General inter-comparison of NDA measurement system performance among DOE measurement facilities and commercial NDA services can also be evaluated using measurement results on similar NDA PDP test samples. A PDP test sample consists of a 55-gallon matrix drum containing a waste matrix type representative of a particular category of the DOE waste inventory and nuclear material standards of known radionuclide and isotopic composition typical of DOE radioactive material. 
The PDP sample components are made available to participating measurement facilities as designated by the

  4. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  5. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  6. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    Energy Technology Data Exchange (ETDEWEB)

    Itano, M; Yamazaki, T [Inagi Municipal Hospital, Inagi, Tokyo (Japan); Tachibana, R; Uchida, Y [National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Yamashita, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Shimizu, H [Kitasato University Medical Center, Kitamoto, Saitama (Japan); Sugawara, Y; Kotabe, K [National Center for Global Health and Medicine, Shinjuku, Tokyo (Japan); Kamima, T [Cancer Institute Hospital Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Ishibashi, S [Sasebo City General Hospital, Sasebo, Nagasaki (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: In general, the beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using an individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements of the same linac model (Model-GBD) and of all linac models (All-GBD). The three different beam data sets were registered to the independent verification software program for each institute. Subsequently, patients' plans in eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to using the OBD, the variation using the Model-GBD based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
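
    The confidence limits quoted in the conclusion follow the common mean ± 2SD convention over the per-plan dose variations. A minimal sketch of that computation on hypothetical variation data (not the study's actual measurements):

```python
import statistics

def confidence_limit_2sd(variations_pct):
    """Mean and 2SD confidence limit for per-plan dose variations (in %)."""
    mean = statistics.mean(variations_pct)
    sd = statistics.pstdev(variations_pct)   # population SD of the data set
    return mean, 2 * sd

# Hypothetical per-plan variations (calculated vs. reference dose, %)
variations = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, -0.3, 0.0]
mean, cl = confidence_limit_2sd(variations)
print(f"mean = {mean:+.2f}%, confidence limit (2SD) = {cl:.2f}%")
# mean = +0.00%, confidence limit (2SD) = 0.37%
```

    A beam data set would then be judged usable for independent verification if this limit stays within the clinically agreed tolerance, as in the 0.6%/1.2% figures above.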

  7. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    International Nuclear Information System (INIS)

    Itano, M; Yamazaki, T; Tachibana, R; Uchida, Y; Yamashita, M; Shimizu, H; Sugawara, Y; Kotabe, K; Kamima, T; Takahashi, R; Ishibashi, S; Tachibana, H

    2016-01-01

    Purpose: In general, the beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using an individual linac's beam data. Methods: Six institutions participated and three different beam data sets were prepared. The first was individually measured data (Original Beam Data, OBD). The others were generated from all measurements of the same linac model (Model-GBD) and of all linac models (All-GBD). The three different beam data sets were registered to the independent verification software program for each institute. Subsequently, patients' plans in eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to using the OBD, the variation using the Model-GBD based calculation and the All-GBD was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. In the plans with variation over 1%, the reference points were located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)

  8. Non-performance of the Severance Pay Program in Slovenia

    Directory of Open Access Journals (Sweden)

    Milan Vodopivec

    2009-03-01

    Full Text Available Combining information from the Firm Survey of Labor Costs with the information about claims filed with the Guarantee Fund by workers whose employers defaulted on their severance pay obligations, the paper analyzes the so-called non-performance problem of severance pay - the fact that coverage, and thus legal entitlement, does not guarantee the actual receipt of the benefit - as experienced in Slovenia in 2000. The findings are threefold: (i) one-third of total obligations incurred by firms failed to be honored and only a small portion of defaulted severance pay claims was reimbursed by the Guarantee Fund; (ii) while both men and women seem to be equally affected, workers older than 40 were disproportionally represented among those whose severance pay claims failed to be honored; and (iii) among firms that incurred severance pay liabilities, larger and more productive firms were more likely to observe their fiduciary obligations and pay them out. These findings corroborate the weaknesses of severance pay as an income protection program, pointing to the large scale of the non-performance problem and the inequities created by it.

  9. Hanford Site performance summary -- EM funded programs, July 1995

    International Nuclear Information System (INIS)

    Schultz, E.A.

    1995-07-01

    Performance data for July 1995 reflect a 4% unfavorable schedule variance, an improvement over June 1995. The majority of the behind-schedule condition is attributed to EM-30 (Office of Waste Management). The majority of the EM-30 schedule variance is associated with the Tank Waste Remediation System (TWRS) Program. The TWRS schedule variance is attributed to the delay in obtaining key decision 0 (KD-0) for Project W-314, ''Tank Farm Restoration and Safe Operations,'' and to the Multi-Function Waste Tank Facility (MWTF) workscope still being a part of the baseline. Baseline Change Requests (BCRs) are in process to rebaseline Project W-314 and delete the MWTF from the TWRS baseline. Once the BCRs are approved and implemented, the overall schedule variance will be reduced to $15.0 million. Seventy-seven enforceable agreement milestones were scheduled FYTD. Seventy-one (92%) of the seventy-seven were completed on or ahead of schedule, two were completed late, and four are delinquent. Performance data reflect a continued significant favorable cost variance of $124.3 million (10%). The cost variance is attributed to process improvements/efficiencies, elimination of low-value work, and workforce reductions, and is expected to continue for the remainder of this fiscal year. A portion of the cost variance is attributed to a delay in billings, which should self-correct by fiscal year-end.

  10. Performance evaluation of scientific programs on advanced architecture computers

    International Nuclear Information System (INIS)

    Walker, D.W.; Messina, P.; Baille, C.F.

    1988-01-01

    Recently a number of advanced architecture machines have become commercially available. These new machines promise better cost-performance than traditional computers, and some of them have the potential of competing with current supercomputers, such as the Cray X-MP, in terms of maximum performance. This paper describes an on-going project to evaluate a broad range of advanced architecture computers using a number of complete scientific application programs. The computers to be evaluated include distributed-memory machines such as the NCUBE, INTEL and Caltech/JPL hypercubes and the MEIKO computing surface; shared-memory, bus architecture machines such as the Sequent Balance and the Alliant; very long instruction word machines such as the Multiflow Trace 7/200 computer; traditional supercomputers such as the Cray X-MP and Cray-2; and SIMD machines such as the Connection Machine. Currently 11 application codes from a number of scientific disciplines have been selected, although it is not intended to run all codes on all machines. Results are presented for two of the codes (QCD and missile tracking), and future work is proposed.

  11. Tuberculosis control program in the municipal context: performance evaluation.

    Science.gov (United States)

    Arakawa, Tiemi; Magnabosco, Gabriela Tavares; Andrade, Rubia Laine de Paula; Brunello, Maria Eugenia Firmino; Monroe, Aline Aparecida; Ruffino-Netto, Antonio; Scatena, Lucia Marina; Villa, Tereza Cristina Scatena

    2017-03-30

    The objective of this study is to evaluate the performance of the Tuberculosis Control Program in municipalities of the State of São Paulo. This is a program evaluation study, with an ecological design, which uses three non-hierarchical groups of the municipalities of the State of São Paulo according to their performance in relation to operational indicators. We selected 195 municipalities with at least five new cases of tuberculosis notified in the Notification System of the State of São Paulo and with 20,000 inhabitants or more in 2010. Multiple correspondence analysis was used to identify the association between the groups of different performances, the epidemiological and demographic characteristics, and the characteristics of the health systems of the municipalities. The group with the worst performance showed the highest rates of treatment abandonment (average [avg] = 10.4, standard deviation [sd] = 9.4) and the lowest rates of supervision of Directly Observed Treatment (avg = 6.1, sd = 12.9), and it was associated with low incidence of tuberculosis, high tuberculosis and HIV incidence, small population, high coverage of the Family Health Strategy/Program of Community Health Agents, and location in the countryside. The group with the best performance presented the highest cure rate (avg = 83.7, sd = 10.5) and the highest rate of cases in Directly Observed Treatment (avg = 83.0, sd = 12.7); the group with regular performance showed regular results for outcome (cure avg = 79.8, sd = 13.2; abandonment avg = 9.5, sd = 8.3) and supervision of the Directly Observed Treatment (avg = 42.8, sd = 18.8). Large population, low coverage of the Family Health Strategy/Program of Community Health Agents, high incidence of tuberculosis and AIDS, and location on the coast and in metropolitan areas were associated with these groups. The findings highlight the importance of the Directly Observed Treatment in relation to the treatment outcome and raise reflections on the

  12. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for the iterative reconstruction of nonnegative sparse signals using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
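
    The verification mechanism the abstract refers to can be illustrated with a minimal decoder for nonnegative signals over a binary measurement matrix: a zero measurement verifies all of its neighbors as zero, and a check with a single unverified neighbor pins down that neighbor's value. This is a simplified sketch of the plain verification algorithm, not the full Interval-Passing variant proposed in the paper:

```python
import numpy as np

def verification_decode(H, y, max_iters=50):
    """Recover a nonnegative sparse x from y = H @ x, H a binary (0/1) matrix."""
    m, n = H.shape
    x = np.full(n, np.nan)            # NaN marks an unverified entry
    r = y.astype(float).copy()        # residual after removing verified mass
    for _ in range(max_iters):
        progress = False
        for i in range(m):
            nbrs = [j for j in np.nonzero(H[i])[0] if np.isnan(x[j])]
            if not nbrs:
                continue
            if r[i] == 0:             # zero check: every neighbor must be zero
                for j in nbrs:
                    x[j] = 0.0
                progress = True
            elif len(nbrs) == 1:      # degree-one check: value is determined
                j, v = nbrs[0], r[i]
                x[j] = v
                r = r - v * H[:, j]   # peel the verified value off all checks
                progress = True
        if not progress:
            break
    return x

H = np.array([[1, 1, 0, 0, 0, 0],
              [0, 0, 1, 1, 0, 0],
              [0, 0, 0, 1, 1, 0],
              [0, 0, 0, 0, 1, 1],
              [1, 0, 1, 0, 0, 1]])
x_true = np.array([0, 0, 3, 0, 0, 2], dtype=float)
print(verification_decode(H, H @ x_true))   # [0. 0. 3. 0. 0. 2.]
```

    Each verified entry shrinks the residual measurements, which in turn creates new zero checks and degree-one checks, so verification propagates through the LDPC-like graph much as message passing does.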

  13. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  14. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ... scenario under which a similar error could occur. There are various other ...

  15. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  16. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  17. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  18. Evaluating computer program performance on the CRAY-1

    International Nuclear Information System (INIS)

    Rudsinski, L.; Pieper, G.W.

    1979-01-01

    The Advanced Scientific Computers Project of Argonne's Applied Mathematics Division has two objectives: to evaluate supercomputers and to determine their effect on Argonne's computing workload. Initial efforts have focused on the CRAY-1, which is the only advanced computer currently available. Users from seven Argonne divisions executed test programs on the CRAY and made performance comparisons with the IBM 370/195 at Argonne. This report describes these experiences and discusses various techniques for improving run times on the CRAY. Direct translations of code from scalar to vector processor reduced running times as much as two-fold, and this reduction will become more pronounced as the CRAY compiler is developed. Further improvement (two- to ten-fold) was realized by making minor code changes to facilitate compiler recognition of the parallel and vector structure within the programs. Finally, extensive rewriting of the FORTRAN code structure reduced execution times dramatically, in three cases by a factor of more than 20; and even greater reduction should be possible by changing algorithms within a production code. It is concluded that the CRAY-1 would be of great benefit to Argonne researchers. Existing codes could be modified with relative ease to run significantly faster than on the 370/195. More important, the CRAY would permit scientists to investigate complex problems currently deemed infeasible on traditional scalar machines. Finally, an interface between the CRAY-1 and IBM computers such as the 370/195, scheduled by Cray Research for the first quarter of 1979, would considerably facilitate the task of integrating the CRAY into Argonne's Central Computing Facility. 13 tables

  19. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
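
    One way to read the proposed method is as a loop that repeatedly spends examination effort on the subsystem with the largest marginal verification risk until the stop criterion is met. The sketch below uses a made-up risk model (failure probability × loss, discounted by the number of examinations); the paper's actual risk function and treatment of system topology are richer, and all numbers here are invented:

```python
def allocate_examinations(subsystems, stop_risk, max_steps=100):
    """Greedy allocation: each step examines the subsystem whose extra
    examination reduces total verification risk the most. Illustrative model:
    residual risk of a subsystem = p_fail * loss * miss_prob ** n_exams."""
    efforts = {name: 0 for name in subsystems}

    def risk(name, n):
        p_fail, loss, miss = subsystems[name]
        return p_fail * loss * miss ** n

    def total():
        return sum(risk(s, efforts[s]) for s in subsystems)

    sequence = []                      # proposed order of examinations
    for _ in range(max_steps):
        if total() <= stop_risk:       # examination stop criterion reached
            break
        best = max(subsystems,
                   key=lambda s: risk(s, efforts[s]) - risk(s, efforts[s] + 1))
        efforts[best] += 1
        sequence.append(best)
    return efforts, sequence, total()

subs = {  # name: (p_fail, loss, miss probability per examination) -- invented
    "ballast":    (0.10, 100.0, 0.5),
    "propulsion": (0.05, 400.0, 0.4),
    "navigation": (0.01,  50.0, 0.6),
}
efforts, seq, residual = allocate_examinations(subs, stop_risk=2.0)
print(efforts, round(residual, 3))
```

    The returned sequence plays the role of the proposed order of examinations, and the loop's exit condition plays the role of the examination stop criterion.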

  20. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
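
    A dose/distance-to-agreement criterion such as 3%/3 mm is commonly evaluated with a gamma-style index. Below is a minimal 1D sketch with global dose normalization and invented profiles; clinical implementations work on 2D/3D dose grids and are considerably more careful about interpolation:

```python
import numpy as np

def gamma_index_1d(ref_dose, ref_pos, eval_dose, eval_pos,
                   dose_tol=0.03, dta_mm=3.0):
    """Per-point gamma for a 1D profile: for each reference point, the minimum
    combined dose-difference / distance-to-agreement metric over all evaluated
    points. gamma <= 1 means the point passes the 3%/3 mm criterion."""
    d_norm = dose_tol * ref_dose.max()         # global dose normalization
    gammas = np.empty(ref_dose.size)
    for i in range(ref_dose.size):
        dd = (eval_dose - ref_dose[i]) / d_norm
        dx = (eval_pos - ref_pos[i]) / dta_mm
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas

pos = np.linspace(0.0, 40.0, 81)               # positions in mm
ref = np.exp(-((pos - 20.0) / 8.0) ** 2)       # idealized calculated profile
meas = 1.02 * ref                              # measured: uniform +2% error
g = gamma_index_1d(ref, pos, meas, pos)
print(f"pass rate at 3%/3 mm: {100.0 * np.mean(g <= 1.0):.1f}%")
```

    A quantitative parameter of this kind, reported alongside a display of where the failing points lie, is exactly the combination the paper recommends for deciding when verification can safely be reduced.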

  1. Ostomy Home Skills Program

    Medline Plus


  2. Ostomy Home Skills Program

    Medline Plus


  3. Ostomy Home Skills Program

    Medline Plus


  4. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  5. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  6. Super Energy Savings Performance Contracts: Federal Energy Management Program (FEMP) Program Overview (revision)

    International Nuclear Information System (INIS)

    Pitchford, P.

    2001-01-01

    This four-page publication describes the U.S. Department of Energy's (DOE's) streamlined energy savings performance contracting, or ''Super ESPC,'' process, which is managed by DOE's Federal Energy Management Program (FEMP). Under a Super ESPC, a qualifying energy service company (ESCO) from the private sector pays for energy efficiency improvements or advanced renewable energy technologies (e.g., photovoltaic systems, wind turbines, or geothermal heat pumps, among others) for a facility of a government agency. The ESCO is then repaid over time from the agency's resulting energy cost savings. Delivery orders under these contracts specify the level of performance (energy savings) and the repayment schedule; the contract term can be up to 25 years, although many Super ESPCs are for about 10 years or less

  7. A performance analysis for evaluation of programming languages ...

    African Journals Online (AJOL)

    In Nigeria, several programming Languages exist from general purpose to special purpose programming languages that are used in one application domain. People always find difficulties about which programming language should be learnt and adopt to develop particular software. In this paper, three (3) most commonly ...

  8. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
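
The limit-based criteria and delta check described in this abstract can be sketched as a small rule engine. The analyte limits, thresholds, and function names below are illustrative assumptions, not the rules of any surveyed laboratory:

```python
# Hedged sketch of rule-based autoverification with a delta check.
# All limits and thresholds are illustrative assumptions only.

VERIFICATION_LIMITS = {          # analyte -> (low, high), illustrative units
    "glucose": (40, 450),        # mg/dL
    "potassium": (2.5, 6.5),     # mmol/L
}
DELTA_LIMITS = {"glucose": 100, "potassium": 1.0}  # max change vs. prior result

def autoverify(analyte, value, previous=None):
    """Return (released, reason). Results outside verification limits or
    failing the delta check are held for technical/medical review."""
    low, high = VERIFICATION_LIMITS[analyte]
    if not (low <= value <= high):
        return False, "outside verification limits"
    if previous is not None and abs(value - previous) > DELTA_LIMITS[analyte]:
        return False, "delta check failed"
    return True, "autoverified"

print(autoverify("glucose", 95, previous=90))    # released
print(autoverify("potassium", 7.1))              # held: outside limits
print(autoverify("glucose", 300, previous=95))   # held: delta check failed
```

In a real laboratory information system, results held by such rules would be routed to the technical or medical verification queues the survey distinguishes.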

  9. Results of the independent radiological verification survey of the remedial action performed at the former Alba Craft Laboratory site, Oxford, Ohio, (OXO001)

    International Nuclear Information System (INIS)

    Kleinhans, K.R.; Murray, M.E.; Carrier, R.F.

    1996-04-01

    Between October 1952 and February 1957, National Lead of Ohio (NLO), a primary contractor for the Atomic Energy Commission (AEC), subcontracted certain uranium machining operations to Alba Craft Laboratory, Incorporated, located at 10-14 West Rose Avenue, Oxford, Ohio. In 1992, personnel from Oak Ridge National Laboratory (ORNL) confirmed the presence of residual radioactive materials from the AEC-related operations in and around the facility in amounts exceeding the applicable Department of Energy (DOE) guidelines. Although the amount of uranium found on the property posed little health hazard if left undisturbed, the levels were sufficient to require remediation to bring radiological conditions into compliance with current guidelines, thus ensuring that the public and the environment are protected. A team from ORNL conducted a radiological verification survey of the former Alba Craft Laboratory property between December 1994 and February 1995. The survey was conducted at the request of DOE and included directly measured radiation levels, the collection and analysis of soil samples to determine concentrations of uranium and certain other radionuclides, and comparison of these data to the guidelines. This document reports the findings of this survey. The results of the independent verification survey of the former Alba Craft Laboratory property demonstrate that all contaminated areas have been remediated to radionuclide concentrations and activity levels below the applicable guideline limits set by DOE

  10. Environmental Technology Verification Report: Taconic Energy, Inc. TEA Fuel Additive

    Science.gov (United States)

    The Greenhouse Gas Technology Center (GHG Center) is one of six verification organizations operating under EPA’s ETV program. One sector of significant interest to GHG Center stakeholders is transportation - particularly technologies that result in fuel economy improvements. Taco...

  11. Computer-Assisted Program Reasoning Based on a Relational Semantics of Programs

    Directory of Open Access Journals (Sweden)

    Wolfgang Schreiner

    2012-02-01

Full Text Available We present an approach to program reasoning which inserts between a program and its verification conditions an additional layer, the denotation of the program expressed in a declarative form. The program is first translated into its denotation, from which the verification conditions are subsequently generated. However, even before (and independently of) any verification attempt, one may investigate the denotation itself to get insight into the "semantic essence" of the program, in particular to see whether the denotation indeed gives reason to believe that the program has the expected behavior. Errors in the program and in the meta-information may thus be detected and fixed prior to actually performing the formal verification. More concretely, following the relational approach to program semantics, we model the effect of a program as a binary relation on program states. A formal calculus is devised to derive from a program a logic formula that describes this relation and is subject to inspection and manipulation. We have implemented this idea in a comprehensive form in the RISC ProgramExplorer, a new program reasoning environment for educational purposes which encompasses the previously developed RISC ProofNavigator as an interactive proving assistant.
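
The relational idea, modeling a program's effect as a relation between pre-states and post-states and then checking a condition against it, can be illustrated with a toy example. This is a hypothetical sketch, not the RISC ProgramExplorer calculus; the program and property are invented:

```python
# Sketch: the effect of a program as a relation on states.
# Toy deterministic program (assumed for illustration):  x := x + 1; y := 2 * x
# States map variable names to integers.

def denotation(pre):
    """Denotation as a function from a pre-state to the unique post-state
    (for a deterministic program, the relation is a function)."""
    post = dict(pre)
    post["x"] = pre["x"] + 1
    post["y"] = 2 * post["x"]
    return post

def check(states):
    """Verification condition over the relation: every pre-state with
    x >= 0 must yield a post-state satisfying y = 2 * (x + 1)."""
    return all(
        denotation(s)["y"] == 2 * (s["x"] + 1)
        for s in states if s["x"] >= 0
    )

sample = [{"x": n, "y": 0} for n in range(10)]
print(check(sample))  # True: the denotation meets the postcondition
```

Inspecting `denotation` directly, before any proof attempt, mirrors the paper's point that the declarative layer itself reveals whether the program plausibly has the expected behavior.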

  12. Performance Demonstration Program Plan for Nondestructive Assay of Drummed Wastes for the TRU Waste Characterization Program

    International Nuclear Information System (INIS)

    DOE Carlsbad Field Office

    2001-01-01

The Performance Demonstration Program (PDP) for nondestructive assay (NDA) consists of a series of tests to evaluate the capability for NDA of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each test is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements obtained from NDA systems used to characterize the radiological constituents of TRU waste. The primary documents governing the conduct of the PDP are the Waste Acceptance Criteria for the Waste Isolation Pilot Plant (WAC; DOE 1999a) and the Quality Assurance Program Document (QAPD; DOE 1999b). The WAC requires participation in the PDP; the PDP must comply with the QAPD and the WAC. The WAC contains technical and quality requirements for acceptable NDA. This plan implements the general requirements of the QAPD and applicable requirements of the WAC for the NDA PDP. Measurement facilities demonstrate acceptable performance by the successful testing of simulated waste containers according to the criteria set by this PDP Plan. Comparison among DOE measurement groups and commercial assay services is achieved by comparing the results of measurements on similar simulated waste containers reported by the different measurement facilities. These tests are used as an independent means to assess the performance of measurement groups regarding compliance with established quality assurance objectives (QAOs). Measurement facilities must analyze the simulated waste containers using the same procedures used for normal waste characterization activities. For the drummed waste PDP, a simulated waste container consists of a 55-gallon matrix drum emplaced with radioactive standards and fabricated matrix inserts. These PDP sample components are distributed to the participating measurement facilities that have been designated and authorized by the Carlsbad Field Office (CBFO).
The NDA Drum PDP materials are stored at these sites under secure conditions to

  13. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  14. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  15. Identifying Enterprise Leverage Points in Defense Acquisition Program Performance

    Science.gov (United States)

    2009-09-01

    differentiated. [108] Table 1: Table of Validation and Approval Authority. Beyond the major categories used for programs as noted above, there is also a ... impossible to identify which "uber-portfolio" a system should belong to, as many "portfolios" claim a system as an integral part of the larger portfolio ... to differentiate between programs. DOD 5002, Enclosure E states "A technology project or acquisition program shall be categorized based on its

  16. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. The definition of automatic programming, or what is understood by it, has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  17. Darlington refurbishment - performance improvement programs goals and experience

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, N. [Ontario Power Generation, Toronto, ON (Canada)

    2015-07-01

    This paper discusses the refurbishment program at the Darlington site. The program focuses on safety, integrity, excellence and personnel. Worker safety and public safety are of the highest priority. Success resulted from collaborative engineering interface, collaborative front end planning, highly competent people and respectful relationship with partners and regulators.

  18. Analysis performed in cooperation with the SALE program, (1)

    International Nuclear Information System (INIS)

    Tuboya, Takao; Wada, Yukio; Suzuki, Takeshi

    1978-01-01

One of the objectives of the SALE (Safeguard Analytical Laboratory Evaluation) program is the development of techniques in safeguards and accountability. The SALE program was established by the United States Atomic Energy Commission's New Brunswick Laboratory in 1970. Six years later, the SALE program had grown into a worldwide quality control program, receiving analysis results from about 60 laboratories, including 19 non-U.S. laboratories. All laboratories participating at present or in the past in the SALE program are listed in Table 1. By 1973, the program was expanded to include six different materials: uranium dioxide (UO2), uranyl nitrate (U-NO3), plutonium dioxide (PuO2), plutonium nitrate (Pu-NO3), uranium-plutonium mixed oxides [(Pu,U)O2], and uranium-plutonium mixed nitrates (Pu-U-NO3). PNC joined this program in 1975 for the analysis of the samples shown in Table 2. SALE program participants analyze, on a bimonthly basis, materials supplied by the New Brunswick Laboratory (NBL) and report measurement results to NBL for evaluation and inclusion in the bimonthly reports. This paper describes analysis results and evaluations for these samples, which were measured in 1975-1976. (author)

  19. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42mm miniature replica step gauge developed for optical scanner verification. Errors quantification and optimization of CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  20. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  1. Military Personnel: Performance Measures Needed to Determine How Well DOD’s Credentialing Program Helps Servicemembers

    Science.gov (United States)

    2016-10-01

    What GAO Found: The Department of Defense (DOD) has taken steps to establish the statutorily required credentialing program, but it has not developed performance measures to gauge the program's effectiveness

  2. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  3. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  4. Taiwan Power Company's power distribution analysis and fuel thermal margin verification methods for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, P.H.

    1995-01-01

    Taiwan Power Company's (TPC's) power distribution analysis and fuel thermal margin verification methods for pressurized water reactors (PWRs) are examined. The TPC and the Institute of Nuclear Energy Research started a joint 5-yr project in 1989 to establish independent capabilities to perform reload design and transient analysis utilizing state-of-the-art computer programs. As part of the effort, these methods were developed to allow TPC to independently perform verifications of the local power density and departure from nucleate boiling design bases, which are required by the reload safety evaluation for the Maanshan PWR plant. The computer codes utilized were extensively validated for the intended applications. Sample calculations were performed for up to six reload cycles of the Maanshan plant, and the results were found to be quite consistent with the vendor's calculational results

  5. An unattended verification station for UF6 cylinders: Field trial findings

    Science.gov (United States)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
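
The "NDA Fingerprint" concept, periodically comparing a cylinder's measured signature against a stored baseline, can be sketched with a simple per-channel deviation score. The statistic, channel values, and threshold below are assumptions for illustration, not the field-trial algorithm:

```python
# Illustrative sketch of an "NDA fingerprint" comparison (all numbers and
# the chi-square-like statistic are assumptions, not the UCVS method).

def fingerprint_score(baseline, measured, sigma):
    """Sum of squared, uncertainty-normalized deviations per channel."""
    return sum(((m - b) / s) ** 2 for b, m, s in zip(baseline, measured, sigma))

baseline  = [120.0, 340.0, 85.0]   # e.g. stored count rates in three channels
sigma     = [3.0, 6.0, 2.0]        # per-channel measurement uncertainty
threshold = 11.3                   # illustrative acceptance limit

same    = [121.0, 338.0, 86.0]     # consistent with the baseline
changed = [150.0, 300.0, 60.0]     # would suggest the contents changed

print(fingerprint_score(baseline, same, sigma) < threshold)     # True
print(fingerprint_score(baseline, changed, sigma) < threshold)  # False
```

A score exceeding the acceptance limit would flag the cylinder for follow-up rather than prove diversion; the trial's point is that the fingerprint gives a high-fidelity periodic check.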

  6. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
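
The manufactured-solution approach to code verification recommended in this abstract can be shown in miniature: choose an exact solution, derive the quantity the code should reproduce, and confirm the observed order of accuracy. The scheme and test point below are illustrative, not from the paper:

```python
import math

# Code-verification sketch via a manufactured solution: take u(x) = sin(x),
# so the exact second derivative is -sin(x), then check that a
# central-difference "code" converges at the expected second-order rate.

def second_derivative(u, x, h):
    """Central-difference approximation of u''(x) with spacing h."""
    return (u(x - h) - 2 * u(x) + u(x + h)) / (h * h)

u = math.sin
exact = -math.sin(1.0)                 # manufactured u'' at x = 1

err_h  = abs(second_derivative(u, 1.0, 1e-2) - exact)   # error at h
err_h2 = abs(second_derivative(u, 1.0, 5e-3) - exact)   # error at h/2

order = math.log(err_h / err_h2, 2)    # observed order of accuracy
print(round(order, 1))                 # 2.0: second-order convergence
```

Observing the theoretical convergence rate is the pass criterion; a rate below the scheme's formal order would indicate a coding error, which is exactly what such benchmarks are designed to expose.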

  7. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  8. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  9. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  10. a performance analysis for evaluation of programming languages ...

    African Journals Online (AJOL)

    Mohammed et al.

    PROGRAMMING LANGUAGES BASED ON MOBILE COMPUTING. FOR NIGERIA ... Finally, Vb.net is suitable for data Transfer using upload scheme. Keywords: ... INTRODUCTION .... java, Julia, python, matlab, mathematica and Ruby by.

  11. Contrasting the capabilities of building energy performance simulation programs

    Energy Technology Data Exchange (ETDEWEB)

    Crawley, Drury B. [US Department of Energy, Washington, DC (United States); Hand, Jon W. [University of Strathclyde, Glasgow, Scotland (United Kingdom). Energy Systems Research Unit; Kummert, Michael [University of Wisconsin-Madison (United States). Solar Energy Laboratory; Griffith, Brent T. [National Renewable Energy Laboratory, Golden, CO (United States)

    2008-04-15

    For the past 50 years, a wide variety of building energy simulation programs have been developed, enhanced and are in use throughout the building energy community. This paper is an overview of a report, which provides up-to-date comparison of the features and capabilities of twenty major building energy simulation programs. The comparison is based on information provided by the program developers in the following categories: general modeling features; zone loads; building envelope and daylighting and solar; infiltration, ventilation and multizone airflow; renewable energy systems; electrical systems and equipment; HVAC systems; HVAC equipment; environmental emissions; economic evaluation; climate data availability, results reporting; validation; and user interface, links to other programs, and availability. (author)

  12. Performance Demonstration Program Plan for Nondestructive Assay of Boxed Wastes for the TRU Waste Characterization Program

    International Nuclear Information System (INIS)

    2001-01-01

The Performance Demonstration Program (PDP) for nondestructive assay (NDA) consists of a series of tests to evaluate the capability for NDA of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each test is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements obtained from NDA systems used to characterize the radiological constituents of TRU waste. The primary documents governing the conduct of the PDP are the Waste Acceptance Criteria for the Waste Isolation Pilot Plant (WAC; DOE 1999a) and the Quality Assurance Program Document (QAPD; DOE 1999b). The WAC requires participation in the PDP; the PDP must comply with the QAPD and the WAC. The WAC contains technical and quality requirements for acceptable NDA. This plan implements the general requirements of the QAPD and applicable requirements of the WAC for the NDA PDP for boxed waste assay systems. Measurement facilities demonstrate acceptable performance by the successful testing of simulated waste containers according to the criteria set by this PDP Plan. Comparison among DOE measurement groups and commercial assay services is achieved by comparing the results of measurements on similar simulated waste containers reported by the different measurement facilities. These tests are used as an independent means to assess the performance of measurement groups regarding compliance with established quality assurance objectives (QAOs). Measurement facilities must analyze the simulated waste containers using the same procedures used for normal waste characterization activities. For the boxed waste PDP, a simulated waste container consists of a modified standard waste box (SWB) emplaced with radioactive standards and fabricated matrix inserts. An SWB is a waste box with ends designed specifically to fit the TRUPACT-II shipping container. SWBs will be used to package a substantial volume of the TRU waste for disposal. 
These PDP sample components
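The blind-testing scheme described above, in which a facility's reported result is compared against the known loading of emplaced radioactive standards, can be sketched as follows. The container IDs, loadings, and the acceptance tolerance below are illustrative assumptions, not values from the PDP Plan.

```python
# Hypothetical sketch of scoring one PDP cycle for an NDA measurement
# facility: compare reported activities against the known loadings of
# the standards emplaced in each simulated waste container.
# All numbers here are illustrative assumptions.

KNOWN_LOADINGS_CI = {"SWB-001": 12.5, "SWB-002": 3.8}  # emplaced standards

def score_cycle(reported: dict, tolerance: float = 0.30) -> dict:
    """Return (relative bias, pass/fail) per container."""
    results = {}
    for container, known in KNOWN_LOADINGS_CI.items():
        measured = reported[container]
        rel_bias = (measured - known) / known
        results[container] = (rel_bias, abs(rel_bias) <= tolerance)
    return results

scores = score_cycle({"SWB-001": 13.1, "SWB-002": 3.2})
for container, (bias, passed) in scores.items():
    print(f"{container}: bias {bias:+.1%}, {'PASS' if passed else 'FAIL'}")
```

Because every facility measures similar simulated containers, the same scoring function applied across facilities yields the inter-group comparison the abstract describes.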

  13. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio

    In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperation among tasks is specified by task cooperation patterns. Adopting unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. To prevent repetitive work caused by inadequate task cooperation performance, task cooperation patterns must be verified at an early software development stage. However, such verification is very difficult at a stage where the task program codes are not yet complete. Therefore, we propose a verification method that uses task skeleton program codes together with a real-time kernel capable of recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. To evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.
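The core idea of the method, skeleton task bodies whose recorded events are checked against the intended cooperation pattern, can be sketched in a few lines. The task names, placeholder "system calls", and the ordering check below are illustrative assumptions, not the paper's actual kernel interface.

```python
# Minimal sketch: skeleton task bodies issue placeholder "system calls"
# that are appended to a global event log, and the expected task
# cooperation pattern is then checked against that log.
# Task and call names are illustrative assumptions.

event_log = []

def syscall(task: str, call: str) -> None:
    """Record a system call issued by a task skeleton."""
    event_log.append((task, call))

def producer_skeleton() -> None:
    syscall("producer", "acquire_buffer")
    syscall("producer", "send_message")

def consumer_skeleton() -> None:
    syscall("consumer", "receive_message")
    syscall("consumer", "release_buffer")

def verify_pattern(log, before, after) -> bool:
    """Check that event `before` occurs earlier in the log than `after`."""
    return log.index(before) < log.index(after)

producer_skeleton()
consumer_skeleton()
ok = verify_pattern(event_log,
                    ("producer", "send_message"),
                    ("consumer", "receive_message"))
print("cooperation pattern OK" if ok else "pattern violated")
```

In the paper's setting the log would be produced by the real-time kernel at run time rather than by an in-process list, but the verification step, checking event order against the chosen pattern, is the same.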

  14. Designing PV Incentive Programs to Promote Performance: A Review of Current Practice

    Energy Technology Data Exchange (ETDEWEB)

    Barbose, Galen; Wiser, Ryan; Bolinger, Mark

    2007-06-01

    Increasing levels of financial support for customer-sited photovoltaic (PV) systems, provided through publicly funded incentive programs, have heightened concerns about the long-term performance of these systems. Given the barriers that customers face to ensuring that their PV systems perform well, and the responsibility that PV incentive programs bear to ensure that public funds are prudently spent, these programs should, and often do, play a critical role in ensuring that PV systems receiving incentives perform well. To provide a point of reference for assessing the current state of the art, and to inform program design efforts going forward, we examine the approaches to encouraging PV system performance used by 32 prominent PV incentive programs in the U.S. We identify eight general strategies or groups of related strategies that these programs have used to address performance issues, and highlight important differences in the implementation of these strategies among programs.

  15. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiency, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES--PUREM NORTH AMERICA LLC, PMF GREENTEC 1004205.00.0 DIESEL PARTICULATE FILTER

    Science.gov (United States)

    The U.S. EPA has created the Environmental Technology Verification (ETV) program to provide high quality, peer reviewed data on technology performance to those involved in the design, distribution, financing, permitting, purchase, and use of environmental technologies. The Air Po...

  17. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process covers several items, among others: precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
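Two of the verification items listed above, carryover and precision, reduce to simple calculations. A common carryover protocol runs a high sample three times followed by a low sample three times, and within-run precision is usually expressed as a coefficient of variation (CV). The sample values below are illustrative assumptions, not data or limits from the paper.

```python
# Illustrative sketch of two routine verification checks for a
# hematology analyzer: carryover (high sample x3 then low sample x3)
# and within-run precision as the coefficient of variation.
# All sample values are assumptions for illustration.

from statistics import mean, stdev

def carryover_percent(high: list, low: list) -> float:
    """%carryover = (low1 - low3) / (high3 - low3) * 100."""
    return (low[0] - low[2]) / (high[2] - low[2]) * 100

def cv_percent(replicates: list) -> float:
    """Within-run precision as coefficient of variation in percent."""
    return stdev(replicates) / mean(replicates) * 100

high_wbc = [85.2, 85.0, 84.9]   # three consecutive high-sample results
low_wbc = [1.02, 1.01, 1.00]    # three consecutive low-sample results
print(f"carryover: {carryover_percent(high_wbc, low_wbc):.3f}%")
print(f"CV: {cv_percent([5.1, 5.0, 5.2, 5.1, 5.0]):.2f}%")
```

Whether a given carryover or CV result is acceptable depends on the verification limits chosen, which, as the abstract notes, remain at the discretion of the laboratory specialist.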

  18. 14 CFR 152.319 - Monitoring and reporting of program performance.

    Science.gov (United States)

    2010-01-01

    ... performance. 152.319 Section 152.319 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRPORTS AIRPORT AID PROGRAM Accounting and Reporting Requirements § 152.319 Monitoring and reporting of program performance. (a) The sponsor or planning agency shall monitor performance...

  19. River Protection Project waste feed delivery program technical performance measurement assessment plan

    International Nuclear Information System (INIS)

    O'TOOLE, S.M.

    1999-01-01

    This plan establishes a formal technical performance-monitoring program. Technical performance is assessed by establishing requirements-based performance goals at the beginning of a program and routinely evaluating progress toward these goals at predetermined milestones throughout the project life cycle.
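The goal-versus-milestone assessment described above amounts to comparing achieved values against planned goals at each completed milestone. The parameter values and milestone names below are illustrative assumptions, not figures from the plan.

```python
# Hedged sketch of requirements-based technical performance monitoring:
# at each predetermined milestone, compare the measured value of a
# technical performance parameter against its planned goal and report
# the percent variance. All values are illustrative assumptions.

planned = {"M1": 100.0, "M2": 150.0, "M3": 200.0}   # goal per milestone
achieved = {"M1": 95.0, "M2": 155.0}                # progress to date

def assess(planned: dict, achieved: dict) -> dict:
    """Percent variance (achieved vs. plan) for completed milestones."""
    return {m: (achieved[m] - goal) / goal * 100
            for m, goal in planned.items() if m in achieved}

for milestone, variance in assess(planned, achieved).items():
    print(f"{milestone}: {variance:+.1f}% vs plan")
```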

  20. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks, from knowledge acquisition to knowledge verification, consistently.