WorldWideScience

Sample records for final verification success

  1. Final Verification Success Story Using the Triad Approach at the Oak Ridge National Laboratory's Melton Valley Soils and Sediment Project

    International Nuclear Information System (INIS)

    King, D.A.; Haas, D.A.; Cange, J.B.

    2006-01-01

    …back to the 1980s, and it contained no radiation measurement data. The result of this verification effort is a dataset of sufficient quantity and quality to demonstrate compliance with Project criteria and one that withstands Core Team scrutiny. (authors)

  2. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities for the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as smooth as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial viewpoints include which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  3. Preparation of the accounting entity for verification of the final accounts

    OpenAIRE

    Kučerová, Monika

    2009-01-01

    This Bachelor's thesis deals with the preparation of the accounting entity for verification of the final accounts. The work includes the definition of the accounting entity, provides information about the preparation of the final accounts, and deals with the auditor's report.

  4. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope-fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test, and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, and an endurance test of 2000 h was performed, while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique, and an analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total hours of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance.

  5. Technology verification phase. Dynamic isotope power system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope-fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test, and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, and an endurance test of 2000 h was performed, while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique, and an analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total hours of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  6. Determinants of Business Success – Theoretical Model and Empirical Verification

    Directory of Open Access Journals (Sweden)

    Kozielski Robert

    2016-12-01

    Market knowledge, market orientation, learning competencies, and business performance were the key issues of the research project conducted in the 2006 study. The main findings identified significant relationships between the independent variables (market knowledge, market orientation, learning competencies) and the dependent variable (business success). A partial correlation analysis indicated that business success primarily relies on organisational learning competencies. Organisational learning competencies, to a large extent (almost 60%), may be explained by the level of corporate market knowledge and market orientation. The aim of the paper is to evaluate to what extent the relationships between the variables are still valid. The research was based on primary and secondary data sources. The major field research was carried out in the form of quantitative studies. The results of the 2014 study are consistent with the previous (2006) results.

  7. Simulated coal gas MCFC power plant system verification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-30

    The objective of the main project is to identify the current developmental status of MCFC systems and address those technical issues that need to be resolved to move the technology from its current status to the demonstration stage in the shortest possible time. The specific objectives are separated into five major tasks as follows: stack research; power plant development; test facilities development; manufacturing facilities development; and commercialization. This final report discusses the M-C Power Corporation effort, which is part of a general program for the development of commercial MCFC systems, and covers the entire subject of the Unocal 250-cell stack. Certain project activities have been funded by organizations other than DOE and are included in this report to provide a comprehensive overview of the work accomplished.

  8. Metering and energy audit: the key to success

    Energy Technology Data Exchange (ETDEWEB)

    Milot, J. [Econoler, Quebec, PQ (Canada)

    2010-01-01

    The most widely used measurement and verification (M&V) procedure for energy performance contracting (EPC) projects is the International Performance Measurement and Verification Protocol (IPMVP). This article discusses the IPMVP as a tool for measuring the success of energy efficiency projects. The IPMVP provides an overview of current best-practice techniques for verifying the results of energy efficiency, water efficiency, and renewable energy projects in commercial and industrial facilities. Facility operators can use the IPMVP to evaluate and improve facility performance. Energy conservation measures (ECMs) covered in the protocol include fuel-saving measures, water efficiency measures, load shifting, and energy reductions through the installation or retrofit of equipment or the modification of operating procedures. 2 figs.
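
    For reference, the IPMVP's general savings relation can be stated compactly (a minimal sketch; the adjustment terms are written generically here, and their exact treatment depends on the IPMVP option chosen):

    ```latex
    % General IPMVP savings relation: savings cannot be measured directly,
    % so they are computed as a difference, with adjustments restating both
    % periods to a common set of conditions (e.g., weather, occupancy).
    \[
      E_{\text{saved}}
        = \bigl(E_{\text{baseline}} - E_{\text{reporting}}\bigr)
          \pm A_{\text{routine}} \pm A_{\text{non-routine}}
    \]
    ```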

  9. Getting ready for final disposal in Finland - Independent verification of spent fuel

    International Nuclear Information System (INIS)

    Tarvainen, Matti; Honkamaa, Tapani; Martikka, Elina; Varjoranta, Tero; Hautamaeki, Johanna; Tiitta, Antero

    2001-01-01

    Final disposal of spent nuclear fuel has long been known to be the solution for the back end of the fuel cycle in Finland. This has allowed the State system for accounting and control (SSAC) to prepare for the safeguards requirements in time. The Finnish SSAC includes the operator, the State authority STUK and the parties above them, e.g., the Ministry of Trade and Industry. Undisputed responsibility for the safe disposal of spent fuel rests with the operator. The role of the safety authority STUK is to set up detailed requirements, to inspect the operator's plans and, using different tools of a quality audit approach, to verify that the requirements will be complied with in practice. Responsibility for safeguards issues is similar, with the addition of the role of the regional and international verification organizations represented by Euratom and the IAEA. As the competent safeguards authority, STUK has decided to maintain its active role also in the future. This will be reflected in increasing cooperation between the SSAC and the IAEA in the new safeguards activities related to the Additional Protocol. The role of Euratom will remain the same concerning the implementation of conventional safeguards. Based on its SSAC role, STUK has continued carrying out safeguards inspections, including independent verification measurements on spent fuel, also after joining the EU and Euratom safeguards in 1995. Verification of operator-declared data is the key verification element of safeguards. This will remain the case under Integrated Safeguards (IS) in the future. It is believed that the importance of high-quality measurements will increase rather than decrease as the frequency of interim inspections decreases. Maintaining the continuity of knowledge makes sense only when the knowledge is reliable and independently verified. One of the cornerstones of the high quality of the Finnish SSAC activities is …

  10. Megavoltage conebeam CT cine as final verification of treatment plan in lung stereotactic body radiotherapy

    International Nuclear Information System (INIS)

    Kudithipudi, Vijay; Gayou, Olivier; Colonias, Athanasios

    2016-01-01

    To analyse the clinical impact of megavoltage conebeam computed tomography (MV-CBCT) cine on internal target volume (ITV) coverage in lung stereotactic body radiotherapy (SBRT). One hundred and six patients received lung SBRT. All underwent 4D computed tomography simulation followed by treatment via image guided 3D conformal or intensity modulated radiation. Prior to SBRT, all patients underwent MV-CBCT cine, in which raw projections are displayed as beam's-eye-view fluoroscopic series with the planning target volume (PTV) projected onto each image, enabling verification of tumour motion relative to the PTV and assessment of adequacy of treatment margin. Megavoltage conebeam computed tomography cine was completed 1–2 days prior to SBRT. Four patients (3.8%) had insufficient ITV coverage inferiorly at cine review. All four plans were changed by adding 5 mm on the PTV margin inferiorly. The mean change in PTV volumes was 3.9 cubic centimetres (cc) (range 1.85–6.32 cc). Repeat cine was performed after plan modification to ensure adequate PTV coverage in the modified plans. PTV margin was adequate in the majority of patients with this technique. MV-CBCT cine did show insufficient coverage in a small subset of patients. Insufficient PTV margins may be a function of 4D CT simulation inadequacies or deficiencies in visualizing the ITV inferior border in the full-inhale phase. MV-CBCT cine is a valuable tool for final verification of PTV margins.

  11. Megavoltage conebeam CT cine as final verification of treatment plan in lung stereotactic body radiotherapy.

    Science.gov (United States)

    Kudithipudi, Vijay; Gayou, Olivier; Colonias, Athanasios

    2016-06-01

    To analyse the clinical impact of megavoltage conebeam computed tomography (MV-CBCT) cine on internal target volume (ITV) coverage in lung stereotactic body radiotherapy (SBRT). One hundred and six patients received lung SBRT. All underwent 4D computed tomography simulation followed by treatment via image guided 3D conformal or intensity modulated radiation. Prior to SBRT, all patients underwent MV-CBCT cine, in which raw projections are displayed as beam's-eye-view fluoroscopic series with the planning target volume (PTV) projected onto each image, enabling verification of tumour motion relative to the PTV and assessment of adequacy of treatment margin. Megavoltage conebeam computed tomography cine was completed 1-2 days prior to SBRT. Four patients (3.8%) had insufficient ITV coverage inferiorly at cine review. All four plans were changed by adding 5 mm on the PTV margin inferiorly. The mean change in PTV volumes was 3.9 cubic centimetres (cc) (range 1.85-6.32 cc). Repeat cine was performed after plan modification to ensure adequate PTV coverage in the modified plans. PTV margin was adequate in the majority of patients with this technique. MV-CBCT cine did show insufficient coverage in a small subset of patients. Insufficient PTV margins may be a function of 4D CT simulation inadequacies or deficiencies in visualizing the ITV inferior border in the full-inhale phase. MV-CBCT cine is a valuable tool for final verification of PTV margins. © 2016 The Royal Australian and New Zealand College of Radiologists.

  12. Fundamentals of successful monitoring, reporting, and verification under a cap-and-trade program

    Energy Technology Data Exchange (ETDEWEB)

    John Schakenbach; Robert Vollaro; Reynaldo Forte [U.S. Environmental Protection Agency, Office of Atmospheric Programs, Washington, DC (United States)

    2006-11-15

    The U.S. Environmental Protection Agency (EPA) developed and implemented the Acid Rain Program (ARP) and the NOx Budget Trading Program (NBTP) using several fundamental monitoring, reporting, and verification (MRV) elements: (1) compliance assurance through incentives and automatic penalties; (2) strong quality assurance (QA); (3) a collaborative approach with a petition process; (4) standardized electronic reporting; (5) compliance flexibility for low-emitting sources; (6) a complete emissions data record requirement; (7) centralized administration; (8) a level playing field; (9) publicly available data; (10) a performance-based approach; and (11) reduced conflicts of interest. Each of these elements is discussed in the context of the authors' experience under two U.S. cap-and-trade programs and their potential application to other cap-and-trade programs. The U.S. Office of Management and Budget found that the Acid Rain Program has accounted for the largest quantified human health benefits of any federal regulatory program implemented in the last 10 yr, with annual benefits exceeding costs by more than 40 to 1. The authors believe that the elements described in this paper greatly contributed to this success. EPA has used the ARP fundamental elements as a model for other cap-and-trade programs, including the NBTP, which went into effect in 2003, and the recently published Clean Air Interstate Rule and Clean Air Mercury Rule. The authors believe that using these fundamental elements to develop and implement the MRV portion of their cap-and-trade programs has resulted in public confidence in the programs, highly accurate and complete emissions data, and a high compliance rate. 2 refs.

  13. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    P.C. Weaver

    2009-04-29

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  14. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    International Nuclear Information System (INIS)

    Weaver, P.C.

    2009-01-01

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified 'hot spot' cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: (1) performing radiological walkover surveys, and (2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one foot layer of soil on the site was removed in its entirety.

  15. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, as committed, by the end of FY 2000.

  16. Final Report for 'Verification and Validation of Radiation Hydrodynamics for Astrophysical Applications'

    International Nuclear Information System (INIS)

    Zingale, M.; Howell, L.H.

    2010-01-01

    The motivation for this work is to gain experience in the methodology of verification and validation (V&V) of astrophysical radiation hydrodynamics codes. In the first period of this work, we focused on building the infrastructure to test a single astrophysical application code, Castro, developed in collaboration between Lawrence Livermore National Laboratory (LLNL) and Lawrence Berkeley Laboratory (LBL). We delivered several hydrodynamic test problems, in the form of coded initial conditions and documentation for verification, routines to perform data analysis, and a generalized regression test suite to allow for continued automated testing. Astrophysical simulation codes aim to model phenomena that elude direct experimentation. Our only direct information about these systems comes from what we observe, and may be transient. Simulation can help further our understanding by allowing virtual experimentation of these systems. However, to have confidence in our simulations requires us to have confidence in the tools we use. Verification and validation is a process by which we work to build confidence that a simulation code is accurately representing reality. V&V is a multistep process, and is never really complete. Once a single test problem is working as desired (i.e. that problem is verified), one wants to ensure that subsequent code changes do not break that test. At the same time, one must also search for new verification problems that test the code in a new way. It can be rather tedious to manually retest each of the problems, so before going too far with V&V, it is desirable to have an automated test suite. Our project aims to provide these basic tools for astrophysical radiation hydrodynamics codes.
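
    As a flavor of the automated regression testing described above (a hedged sketch: the file names and the .npz storage format are assumptions, not the project's actual test harness), a fresh simulation output can be compared field by field against a stored "gold" benchmark:

    ```python
    # Hypothetical regression check: compare a new run against a stored
    # benchmark and report any field that drifts beyond a tolerance.
    import numpy as np

    def regression_check(new_file: str, gold_file: str, rtol: float = 1e-10) -> bool:
        """Return True when every stored field matches the benchmark."""
        new = np.load(new_file)    # assumed .npz archive of solution arrays
        gold = np.load(gold_file)
        ok = True
        for name in gold.files:
            if not np.allclose(new[name], gold[name], rtol=rtol):
                print(f"FAIL: field '{name}' differs from the benchmark")
                ok = False
        return ok

    if __name__ == "__main__":
        status = regression_check("castro_run.npz", "castro_gold.npz")
        print("verified" if status else "regression detected")
    ```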

  17. Development of an expert system for success path generation and operator's action guides in NPP: Verification and validation of COSMOS

    International Nuclear Information System (INIS)

    Yang, Jun Un; Jung, Kwang Sup; Park, Chang Gyu

    1992-08-01

    For the support of emergency operation, an expert system named COSMOS (COmputerized Success-path MOnitoring System) is being developed at the Korea Atomic Energy Research Institute (KAERI). COSMOS identifies the status of the critical safety functions (CSFs) and suggests an overall response strategy with a set of success paths that restore the challenged CSFs. The status of a CSF is identified by rule-based reasoning. The overall response strategy is inferred according to the identified CSF status. The success paths are generated from the given structure descriptions of systems by a general generation algorithm. For an efficient man-machine interface, a color graphic display is utilized. COSMOS is being built on a workstation. The major tasks in building an expert system such as COSMOS are the construction of the knowledge base and the inference engine. In COSMOS, the knowledge is derived from the Emergency Operating Procedures (EOPs), and forward chaining is adopted as the inference strategy. While the knowledge base and inference engine are the most common and essential elements of an expert system, they are not the only ones. The evaluation of expert systems can not only lessen the risk of using faulty software, but also enhance the acceptability of the expert systems by both users and regulators. The evaluation of expert systems consists of system verification, validation and user acceptance testing. Among them, in this report, we have focused our attention on verification and validation (V&V) of expert systems. We have assessed the general V&V procedures and tried to develop a specific V&V procedure for COSMOS. (Author)
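
    A minimal illustration of the forward-chaining inference COSMOS adopts (the facts and rules below are invented placeholders; the real knowledge base is derived from the EOPs): rules whose conditions are satisfied by the known facts fire repeatedly until no new facts can be derived.

    ```python
    # Forward chaining over a tiny, invented rule base.
    facts = {"subcriticality_challenged"}
    rules = [
        ({"subcriticality_challenged"}, "csf_status_red"),
        ({"csf_status_red"}, "generate_success_paths"),
    ]

    derived = True
    while derived:                      # keep firing until nothing new appears
        derived = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)   # rule fires; conclusion becomes a fact
                derived = True

    print(facts)
    ```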

  18. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    International Nuclear Information System (INIS)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.; Branney, Sean; McDonald, Benjamin S.; Webster, Jennifer B.; Zalavadia, Mital A.; Todd, Lindsay C.; Kulisek, Jonathan A.; Nordquist, Heather; Deshmukh, Nikhil S.; Stewart, Scott

    2016-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument …

  19. Methodology and applicability of a safety and demonstration concept for a HAW final repository on clays. Safety concept and verification strategy

    International Nuclear Information System (INIS)

    Ruebel, Andre; Meleshyn, Artur

    2014-08-01

    The report describes the site-independent framework for a safety concept and verification strategy for a final repository for heat-generating wastes in clay rock. The safety concept summarizes the planning specifications and technical measures that are intended to allow the safe enclosure of the radionuclides in the host rock. The verification strategy defines the systematic procedure for developing the fundamentals and scenarios that form the basis for demonstrating the safety case and allow a prognosis of suitability. The report covers the boundary conditions, the safety concept for the post-closure phase, and the verification strategy for the post-closure phase.

  20. Improvement and verification of fast-reactor safety-analysis techniques. Final report

    International Nuclear Information System (INIS)

    Barker, D.H.

    1981-12-01

    The work on this project took place between March 1, 1975 and December 31, 1981, and resulted in two PhD theses and one Master's thesis. Part I comprised the Verification and Applicability Studies for the VENUS-II LMFBR Disassembly Code. These tests showed that the VENUS-II code closely predicted the energy release in all three tests chosen for analysis. Part II involved the chemical simulation of pool dispersion in the transition phase of an HCDA. Part III involved the reaction of an internally heated fluid and the vessel walls.

  1. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Harpeneau, Evan M. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities, including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as-left radiological conditions.

  2. Integration of KESS III models in ATHLET-CD and contributions to program verification. Final report

    International Nuclear Information System (INIS)

    Bruder, M.; Schatz, A.

    1994-07-01

    The development of the computer code ATHLET-CD is a contribution to reactor safety research. ATHLET-CD is an extension of the system code ATHLET with core degradation models, especially from the modular software package KESS. The aim of the ATHLET-CD development is the simulation of severe accident sequences from their initiation to severe core degradation in a continuous manner. In the framework of this project, the ATHLET-CD development has been focused on the integration of KESS models such as the control rod model, as well as the models describing chemical interactions, material relocation along a rod, and fission product release. The present ATHLET-CD version is able to describe severe accidents in a PWR up to early core degradation (relocation of material along a rod surface in the axial direction). Contributions to the verification of ATHLET-CD comprised calculations of the experiments PHEBUS AIC and PBF SFD 1-4. The PHEBUS AIC calculation was focused on the examination of the control rod model, whereas the PBF SFD 1-4 calculation served to check the models describing melting, material relocation and fission product release. (orig.)

  3. Analytical three-dimensional neutron transport benchmarks for verification of nuclear engineering codes. Final report

    International Nuclear Information System (INIS)

    Ganapol, B.D.; Kornreich, D.E.

    1997-01-01

    Because of the requirement of accountability and quality control in the scientific world, a demand for high-quality analytical benchmark calculations has arisen in the neutron transport community. The intent of these benchmarks is to provide a numerical standard to which production neutron transport codes may be compared in order to verify proper operation. The overall investigation, as modified in the second-year renewal application, includes the following three primary tasks. Task 1, on two-dimensional neutron transport, is divided into (a) the single-medium searchlight problem (SLP) and (b) the two-adjacent-half-space SLP. Task 2, on three-dimensional neutron transport, covers (a) a point source in arbitrary geometry, (b) the single-medium SLP, and (c) the two-adjacent-half-space SLP. Task 3, on code verification, includes deterministic and probabilistic codes. The primary aim of the proposed investigation was to provide a suite of comprehensive two- and three-dimensional analytical benchmarks for neutron transport theory applications. This objective has been achieved. The suite of benchmarks in infinite media and the three-dimensional SLP are a relatively comprehensive set of one-group benchmarks for isotropically scattering media. Because of time and resource limitations, the extensions of the benchmarks to include multi-group and anisotropic scattering are not included here. Presently, however, enormous advances in the solution for the planar Green's function in an anisotropically scattering medium have been made and will eventually be implemented in the two- and three-dimensional solutions considered under this grant. Of particular note in this work are the numerical results for the three-dimensional SLP, which have never before been presented. The results presented were made possible only because of the tremendous advances in computing power that have occurred during the past decade.
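
    As an example of the closed-form standards such suites build on (this kernel is textbook transport theory and is not claimed to be one of the grant's specific benchmark solutions), the uncollided scalar flux from an isotropic point source in an infinite homogeneous medium is:

    ```latex
    % Uncollided scalar flux from an isotropic point source of strength S
    % (neutrons/s) in an infinite homogeneous medium with total macroscopic
    % cross section \Sigma_t -- the attenuated inverse-square kernel on which
    % the full infinite-medium Green's-function solutions are built.
    \[
      \phi_u(r) \;=\; \frac{S\, e^{-\Sigma_t r}}{4\pi r^{2}}
    \]
    ```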

  4. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    Science.gov (United States)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limitation of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, to cross-confirm the contour image against the target layout, post-OPC verification solutions continue to be developed: methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors by excluding false errors is the most important requirement for an accurate and fast verification process, to save not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause OPC engineers to miss real defects and, at the least, delay time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows various biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through the optimization of the biasing rule for different pitches and shapes of metal lines, we could obtain more accurate and efficient verification results and decrease the review time needed to find real errors. In this paper, we suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model …
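
    The following sketch illustrates the idea under invented rules (the bias table, rectangle geometry and pitch classification are assumptions for illustration, not the authors' rule deck): shrink or grow each metal shape by a pitch-dependent final-CD bias, then check that each contact/via is still covered by the biased shape.

    ```python
    # Approximate the post-etch metal CD with a pitch-dependent bias, then
    # run the contact/via coverage check against the biased shape.
    from dataclasses import dataclass

    @dataclass
    class Rect:
        x0: float
        y0: float
        x1: float
        y1: float

    # Illustrative rule: denser metal etches smaller, so it shrinks more (nm).
    BIAS_BY_MAX_PITCH_NM = [(100, -8.0), (200, -4.0), (400, -2.0)]

    def final_cd_bias(pitch_nm: float) -> float:
        for max_pitch, bias in BIAS_BY_MAX_PITCH_NM:
            if pitch_nm <= max_pitch:
                return bias
        return 0.0  # isolated lines: drawn shape is close to final shape

    def biased(metal: Rect, pitch_nm: float) -> Rect:
        half = final_cd_bias(pitch_nm) / 2.0   # half the bias on each edge
        return Rect(metal.x0 - half, metal.y0 - half,
                    metal.x1 + half, metal.y1 + half)

    def covers(metal: Rect, via: Rect) -> bool:
        return (metal.x0 <= via.x0 and metal.y0 <= via.y0 and
                metal.x1 >= via.x1 and metal.y1 >= via.y1)

    # A via that passes on the drawn metal may fail on the biased (final) shape.
    drawn = Rect(0.0, 0.0, 60.0, 200.0)
    via = Rect(53.0, 90.0, 59.0, 110.0)
    print(covers(drawn, via))                        # True  (drawn metal)
    print(covers(biased(drawn, pitch_nm=120.0), via))  # False (post-etch estimate)
    ```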

  5. Minergie-ECO system verification - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lenel, S.; Ruehle, T.; Schinabeck, J. [Intep - Integrale Planung GmbH, Zuerich (Switzerland); Foradini, F. [E4tech Sarl, Lausanne (Switzerland); Citherlet, S. [Haute Ecole d'Ingenierie et de Gestion du Canton de Vaud HEIG-VD, Yverdon-les-Bains (Switzerland)

    2008-07-01

    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) takes a look at the development of methods and software that has made it possible to collect data and evaluate operational energy consumption and the environmental impact connected with the materials used in 'Minergie-ECO' buildings. Such buildings meet the stringent 'Minergie' low energy consumption standards and also use ecologically compatible building materials. The standard is examined and its requirements are discussed, as are the appropriate SIA standards. The methods and tools used in the evaluation are introduced and discussed. Four work packages are defined which cover both energy and well-being/health aspects. Thirteen cases of various types of building are discussed. Also, aspects are noted with respect to refurbishment projects. The report is completed with a comprehensive appendix which, amongst other things, defines the questions posed during the project and the methods used for the evaluation of the results obtained.

  6. Agreements on climatic protection - the verification problem. IKARUS. Instrumente fuer Klimagas-Reduktionsstrategien. Final report. Subproject 9: International Greenhouse Gas Verification

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, W.; Hoffmann, H.J.; Katscher, W.; Kotte, U.; Lauppe, W.D.; Stein, G.

    1995-12-31

    The sustained reduction of climate gas emissions associated with the generation, conversion and utilization of energy is clearly becoming an ever more important task in energy and environmental policy. Different strategies are conceivable in order to fulfil this mission. The aim of the IKARUS Project (Instrumente fuer Klimagas-Reduktionsstrategien - Instruments for Greenhouse Gas Reduction Strategies) was to provide a set of tools with which strategies can be developed and reconstructed (making conceptual use of the various technologies) and also tested with respect to their internal consistency and examined with regard to their economic impacts. Corresponding to the great complexity of the task in hand and the technological range of energy applications, the set of instruments is very extensive. It consists of two parts: a database with a comprehensive data collection and several computer models of various types. The "Verification" project was integrated into IKARUS as a link between the national project and the international environment, enabling the examination of technologies and methods for verifying the compliance of states party to the Framework Convention on Climate Change (FCCC). (orig./KW)

  7. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, jointly funded by the US NRC and EPRI, toward formulating guidelines for V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, User's Manual. Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, which together define three classes). A V&V guideline package is provided for each combination of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  8. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Miller, Karen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garner, James R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Branney, Sean [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McDonald, Benjamin S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zalavadia, Mital A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Todd, Lindsay C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kulisek, Jonathan A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nordquist, Heather [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Deshmukh, Nikhil S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stewart, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-31

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field …
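
    Purely as an illustration of what an "NDA Fingerprint" consistency test could look like (this is not the algorithm evaluated in the study; the data layout and the statistical test are invented), a re-scan could be compared channel by channel against a stored per-detector signature:

    ```python
    # Invented sketch: flag a cylinder whose re-scan drifts beyond n-sigma
    # of its enrollment fingerprint on any detector channel.
    import numpy as np

    def fingerprint_consistent(stored: np.ndarray,
                               rescan: np.ndarray,
                               sigma: np.ndarray,
                               n_sigma: float = 3.0) -> bool:
        """True when every detector channel agrees within n_sigma."""
        return bool(np.all(np.abs(rescan - stored) <= n_sigma * sigma))

    stored = np.array([120.4, 98.7, 143.2])   # counts/s per detector (invented)
    sigma = np.sqrt(stored)                    # counting-statistics estimate
    print(fingerprint_consistent(stored, np.array([118.9, 101.0, 140.8]), sigma))
    ```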

  9. APPLICATION OF MULTICRITERIA DECISION MAKING THROUGH FINANCIAL, HUMAN RESOURCES AND BUSINESS PROCESS ASPECT IN VERIFICATION OF COMPANIES’ SUCCESS

    Directory of Open Access Journals (Sweden)

    Ivana Tadić

    2013-02-01

    Striving in a volatile and competitive business environment, companies have to find the ideal path to survive and provide sustainable success, which can be validated using objective and subjective criteria. In order to fulfil stakeholders' demands, many companies use different types of non-financial indicators, characterised as subjective. Authors have lately argued about the use of subjective criteria, validating them equally with objective ones and confirming a positive relationship between subjective and objective criteria. The main aim of this paper is to research whether the most successful Croatian companies in terms of financial ratios show similar results on the other groups of criteria, namely human resource management evaluation and evaluation of business process success. In order to evaluate the success of Croatian public companies, they are ranked by three groups of criteria, using the Simple Additive Weighting (SAW) method for the subjective criteria and the PROMETHEE II method for the objective criteria. The weighted least squares (WLS) method was used to define the weight of each criterion.
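
    A minimal sketch of the SAW step (the company data, weights and benefit/cost flags below are invented for illustration): each criterion column is normalized, weighted, and summed into a score used for ranking.

    ```python
    # Simple Additive Weighting: normalize, weight, sum, rank.
    import numpy as np

    def saw_scores(matrix: np.ndarray, weights: np.ndarray,
                   benefit: np.ndarray) -> np.ndarray:
        """matrix: companies x criteria; benefit[j] True when higher is better."""
        norm = np.empty_like(matrix, dtype=float)
        for j in range(matrix.shape[1]):
            col = matrix[:, j].astype(float)
            # benefit criteria: x / x_max; cost criteria: x_min / x
            norm[:, j] = col / col.max() if benefit[j] else col.min() / col
        return norm @ weights            # weighted sum per company

    matrix = np.array([[4.2, 0.8, 120.0],    # e.g., ROA, HR score, cost index
                       [3.9, 0.6,  90.0],
                       [4.8, 0.9, 150.0]])
    scores = saw_scores(matrix, np.array([0.5, 0.3, 0.2]),
                        np.array([True, True, False]))
    print(np.argsort(-scores))           # ranking, best company first
    ```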

  10. Formal Verification

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  11. Guidelines for the verification and validation of expert system software and conventional software. Volume 1: Project summary. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all components of the expert system with the exception of the knowledge base, which requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally.

  12. R&D for computational cognitive and social models: foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high-consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model …

  13. Initial Application of the FEMP Measurement and Verification Guidelines in Super ESPC Delivery Orders: Final Report; May 2000

    Energy Technology Data Exchange (ETDEWEB)

    Jump, D.; Stetz, M.

    2000-09-05

    Schiller Associates examined the measurement and verification (M&V) plans and activities for seven Western Region Super Energy Savings Performance Contract (ESPC) projects to learn how federal agencies are implementing M&V and what factors influence M&V plan development. This report describes the method used to examine the M&V plans and presents the findings. The goals were to find common factors that influenced M&V plan development and implementation, assess risks to the agency as a result of particular M&V plans, and develop recommendations for improving M&V plan development and implementation. Participating agencies and sites were: (1) National Park Service, Yosemite National Park, CA; (2) Veterans Affairs, VA Medical Center, San Francisco, CA; (3) US Forest Service, USFS Laboratory, Corvallis, OR; (4) Federal Aviation Administration, ATRCC, Auburn, WA; (5) US Department of Defense, Defense Manpower Data Center, Monterey, CA; (6) US Coast Guard, Coast Guard Station, Alameda, CA; and (7) US Navy, Pt. Mugu, Oxnard, CA.

  14. Guidelines for the verification and validation of expert system software and conventional software. Volume 7, User's manual: Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    Reliable software is required for nuclear power industry applications. Verification and validation techniques applied during the software development process can help eliminate errors that could inhibit the proper operation of digital systems and cause availability and safety problems. Most of the techniques described in this report are valid for conventional software systems as well as for expert systems. The project resulted in a set of 16 V&V guideline packages and 11 sets of procedures based on the class, development phase, and system component being tested. These guideline packages and procedures help a utility define the level of V&V, which involves evaluating the complexity and type of software component along with the consequences of failure. In all, the project identified 153 V&V techniques for conventional software systems and demonstrated their application to all aspects of expert systems except for the knowledge base, which requires specially developed tools. Each of these conventional techniques covers anywhere from 2 to 52 types of conventional software defects, and each defect is covered by 21 to 50 V&V techniques. The project also identified automated tools to support V&V activities.

  15. Self-Verification Strivings in Children Holding Negative Self-Views: The Mitigating Effects of a Preceding Success Experience.

    Science.gov (United States)

    Reijntjes, Albert; Thomaes, Sander; Kamphuis, Jan Henk; de Castro, Bram Orobio; Telch, Michael J

    2010-12-01

    Research among adults has consistently shown that people holding negative self-views prefer negative over positive feedback. The present study tested the hypothesis that this preference is less robust among pre-adolescents, such that it will be mitigated by a preceding positive event. Pre-adolescents (n = 75) holding positive or negative global self-esteem were randomized to a favorable or unfavorable peer evaluation outcome. Next, preferences for positive versus negative feedback were assessed using an unobtrusive behavioral viewing time measure. As expected, results showed that after being faced with the success outcome children holding negative self-views were as likely as their peers holding positive self-views to display a significant preference for positive feedback. In contrast, children holding negative self-views displayed a stronger preference for negative feedback after being faced with the unfavorable outcome that matched their pre-existing self-views.

  16. Success Avoidant Motivation and Behavior; Its Development Correlates and Situational Determinants. Final Report.

    Science.gov (United States)

    Horner, Matina S.

    This paper reports on a successful attempt to understand success avoidant motivation and behavior by the development of an empirically sophisticated scoring system of success avoidant motivation and the observation of its behavioral correlates and situational determinants. Like most of the work on achievement motivation, the study was carried out…

  17. FINAL INTERIM REPORT VERIFICATION SURVEY ACTIVITIES IN FINAL STATUS SURVEY UNITS 7, 8, 9, 10, 11, 13 and 14 AT THE SEPARATIONS PROCESS RESEARCH UNIT, NISKAYUNA, NEW YORK

    International Nuclear Information System (INIS)

    Jadick, M.G.

    2010-01-01

    The Separations Process Research Unit (SPRU) facilities were constructed in the late 1940s to research the chemical separation of plutonium and uranium. SPRU operated between February 1950 and October 1953. The research activities ceased following the successful development of the reduction/oxidation and plutonium/uranium extraction processes that were subsequently used by the Hanford and the Savannah River sites.

  18. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J

    2005-12-21

    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta, Georgia, area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency was measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR …

  19. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    for states that have traditionally had 'less transparency' in their military sectors. As case studies, we first investigate how to apply verification measures, including remote sensing, off-site environmental sampling and on-site inspections, to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find the most effective verification measure to monitor the status of the reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information, as long as inspectors are not able to measure total quantities of Cs-137 and Sr-90 from HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shutdown gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery, as one telltale operational signature of a GDP would be the water-vapor plume coming from the cooling tower, which should be easy to detect with satellite images. Furthermore, the hot roof of the enrichment building could be detectable using satellite thermal-infrared images. In addition, some on-site verification measures should be allowed, such as visual observation, surveillance and tamper-indicating seals. Finally, an FMCT verification regime would have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include something like special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  20. Early Tracking or Finally Leaving? Determinants of Early Study Success in First-Year University Students

    Science.gov (United States)

    Brouwer, Jasperina; Jansen, Ellen; Hofman, Adriaan; Flache, Andreas

    2016-01-01

    Two theoretical approaches underlie this investigation of the determinants of early study success among first-year university students. Specifically, to extend Walberg's educational productivity model, this study draws on the expectancy-value theory of achievement motivation in a contemporary university context. The survey data came from 407…

  1. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands a thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily with regard to how effectively and by whom these four stages are carried out.

  2. Improving regulatory effectiveness in Federal/State siting actions. Success factor evaluation panel. Final report

    International Nuclear Information System (INIS)

    Haggard, J.

    1977-06-01

    This report presents an independent appraisal of the factors that determine efficiency in reaching environmental decisions with respect to nuclear facilities. The Panel recommended substituting 'effectiveness' for 'efficiency.' Thus, an effective decision is: 'A timely final decision, that provides for necessary change, consistent with societal objectives and law, and which is equitable and practical, and is based upon fully and candidly expressed premises utilizing a commonly available data base.' The measurement criteria for evaluating the effectiveness of the environmental decision-making process are: timely decision, final decision, provision for change, consistency with societal goals and law, equitable, practical, fully and candidly expressed premises, commonly available data base, and public confidence. The Panel evaluated the eight policies proposed by NRC staff as essential to licensing reform: national fuels policy, regional review, early disclosure, State role, technical assistance to States, role of utilities, radiation health and safety, and modification of the Atomic Energy Act. The five NRC scenarios were evaluated in terms of regulatory effectiveness.

  3. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  4. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-1A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  5. Information literacy: are final-year medical radiation science students on the pathway to success?

    Science.gov (United States)

    Thompson, Nadine; Lewis, Sarah; Brennan, Patrick; Robinson, John

    2010-01-01

    It is necessary for Medical Radiation Science (MRS) students to become information literate in order to interact with and thrive in the professional health care arena. All health care professionals require information literacy (IL) skills to be independent learners and critical thinkers. To achieve this, effective search and evaluation methods must be cultivated in students. Twenty-eight final-year MRS students participated in a 30-minute digitally recorded interview regarding their knowledge of information sources, where they locate information, and how they evaluate these sources. Constant comparative analysis via grounded theory was used to thematise the data. A conceptual framework was developed demonstrating the link between the key concepts of convenience, confidence and competence. The impact of the internet on the IL skills of students has been profound, due mainly to convenience. Most students had little confidence in their IL skills; however, some students who were confident and competent still preferred to access information sources that were convenient, because there was nothing preventing them from doing so. By identifying problem areas, educators can redesign curricula around the strengths and weaknesses of students' IL skills, thus promoting lifelong learning and using electronic-based learning to its full potential.

  6. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Phillip A.; O' Hagan, Ryan; Shumaker, Brent; Hashemian, H. M.

    2017-03-01

    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off; cover more portions of the system; and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources toward equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also help with the I&C aging management goals of the Department of Energy and the long-term operation of ATR.

  7. Guidelines for the verification and validation of expert system software and conventional software: Volume 2, Survey and assessment of conventional software verification and validation methods Revision 1, Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.H.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two: static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as for the system as a whole.
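
    The abstract names the two metrics but not their formulas, so the sketch below only illustrates the idea: combine the four ease-of-use ratings and four defect-detection-power ratings into a cost-benefit score and a stringency-scaled effectiveness score, then rank-order the methods. The method names, rating values, scales and combining formulas are all hypothetical.

```python
# Hypothetical sketch of rank-ordering V&V methods from rating factors;
# the report's actual metric definitions are not given in this abstract.
from statistics import mean

methods = {
    # name: (four ease-of-use ratings, four defect-detection-power ratings), 1-5 scale (assumed)
    "static code analysis": ((4, 4, 3, 4), (3, 3, 2, 3)),
    "formal proof":         ((1, 2, 1, 2), (5, 5, 4, 5)),
    "dynamic unit testing": ((4, 3, 4, 4), (3, 4, 3, 3)),
}

def cost_benefit(ease, power):
    # Detection power weighted by ease of use: cheap, powerful methods score high.
    return mean(power) * mean(ease)

def effectiveness(power, stringency=1.0):
    # Stringency scales the weight given to power for high-integrity,
    # high-complexity systems (three classes in the report).
    return mean(power) * stringency

ranked = sorted(methods.items(), key=lambda kv: cost_benefit(*kv[1]), reverse=True)
for name, (ease, power) in ranked:
    print(f"{name:22s} cost-benefit={cost_benefit(ease, power):5.2f} "
          f"effectiveness={effectiveness(power):4.2f}")
```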

  8. FY2017 Final Report: Power of the People: A technical ethical and experimental examination of the use of crowdsourcing to support international nuclear safeguards verification.

    Energy Technology Data Exchange (ETDEWEB)

    Gastelum, Zoe Nellie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sentz, Kari [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Swanson, Meili Claire [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rinaudo, Cristina [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-01

    Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the “power of the people” harnessed via online games, communities of interest, and other platforms to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.

  9. Development and verification of a three-dimensional core model for WWR type reactors and its coupling with the accident code ATHLET. Final report

    International Nuclear Information System (INIS)

    Grundmann, U.; Lucas, D.; Mittag, S.; Rohde, U.

    1995-04-01

    The main goal of the project was the coupling of the 3D core model DYN3D for Russian VVER-type reactors, which has been developed in the RCR, with the thermohydraulic code ATHLET. The coupling has been realized in two basically different ways: - the implementation of only the neutron kinetics model of DYN3D into ATHLET (internal coupling), - the connection of the complete DYN3D core model, including neutron kinetics, thermohydraulics and the fuel rod model, via data interfaces at the core top and bottom (external coupling). To test the coupling, comparative calculations between the internal and external coupling versions have been carried out for a LOCA and a reactivity transient. Complementary goals of the project were: - the development of a DYN3D version for burn-up calculations, - the verification of DYN3D on benchmark tasks and experimental data on fuel rod behaviour, - a study on the extension of the neutron-physical data base. The project contributed to the development of advanced tools for the safety analysis of VVER-type reactors. Future work is aimed at the verification of the coupled code complex DYN3D-ATHLET. (orig.)

  10. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed; a cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show a verification rate of 96%. This system plays an important role in forensic and civilian applications.
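
    A minimal sketch of the three-stage pipeline described above (alignment-based matching, fuzzy clustering, cost-sensitive classification). The internals are stand-ins rather than the authors' algorithms: the matching rule, cluster centers, cost values and thresholds are all assumed for illustration.

```python
# Toy three-stage fingerprint-verification pipeline (illustrative only).
import numpy as np

def matching_score(template, probe, tol=10.0):
    # Stage 1 stand-in: fraction of template minutiae with a nearby probe minutia
    # (a real matcher first aligns the images to undo rotation/translation).
    hits = sum(1 for t in template
               if min(np.linalg.norm(t - p) for p in probe) < tol)
    return hits / len(template)

def fuzzy_membership(score, genuine_center=0.8, impostor_center=0.3):
    # Stage 2 stand-in: distance-based membership in genuine vs. impostor clusters.
    d_gen, d_imp = abs(score - genuine_center), abs(score - impostor_center)
    total = d_gen + d_imp
    return 1 - d_gen / total, 1 - d_imp / total

def classify(score, cost_false_accept=2.0, cost_false_reject=1.0):
    # Stage 3 stand-in: cost-sensitive decision; a costly false accept
    # pushes the acceptance threshold up.
    m_gen, m_imp = fuzzy_membership(score)
    threshold = cost_false_accept / (cost_false_accept + cost_false_reject)
    return "match" if m_gen / (m_gen + m_imp) > threshold else "no match"

rng = np.random.default_rng(0)
template = rng.uniform(0, 100, size=(20, 2))
probe = template + rng.normal(0, 3.0, size=template.shape)  # distorted impression
print(classify(matching_score(template, probe)))
```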

  11. Verification of analysis methods for predicting the behaviour of seismically isolated nuclear structures. Final report of a co-ordinated research project 1996-1999

    International Nuclear Information System (INIS)

    2002-06-01

    This report is a summary of the work performed under a co-ordinated research project (CRP) entitled Verification of Analysis Methods for Predicting the Behaviour of Seismically isolated Nuclear Structures. The project was organized by the IAEA on the recommendation of the IAEA's Technical Working Group on Fast Reactors (TWGFR) and carried out from 1996 to 1999. One of the primary requirements for nuclear power plants and facilities is to ensure safety and the absence of damage under strong external dynamic loading from, for example, earthquakes. The designs of liquid metal cooled fast reactors (LMFRs) include systems which operate at low pressure and include components which are thin-walled and flexible. These systems and components could be considerably affected by earthquakes in seismic zones. Therefore, the IAEA through its advanced reactor technology development programme supports the activities of Member States to apply seismic isolation technology to LMFRs. The application of this technology to LMFRs and other nuclear plants and related facilities would offer the advantage that standard designs may be safely used in areas with a seismic risk. The technology may also provide a means of seismically upgrading nuclear facilities. Design analyses applied to such critical structures need to be firmly established, and the CRP provided a valuable tool in assessing their reliability. Ten organizations from India, Italy, Japan, the Republic of Korea, the Russian Federation, the United Kingdom, the United States of America and the European Commission co-operated in this CRP. This report documents the CRP activities, provides the main results and recommendations and includes the work carried out by the research groups at the participating institutes within the CRP on verification of their analysis methods for predicting the behaviour of seismically isolated nuclear structures

  12. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1992-93. OER Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Educational Research.

    Student Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its third year of operation. Project SUCCESS served 460 students of limited English proficiency at two high schools in Brooklyn and one high school in Manhattan (New York City).…

  13. The Students Upgrading through Computer and Career Education Systems Services (Project SUCCESS). 1990-91 Final Evaluation Profile. OREA Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.

    An evaluation was done of the New York City Public Schools' Student Upgrading through Computer and Career Education Systems Services Program (Project SUCCESS). Project SUCCESS operated at 3 high schools in Brooklyn and Manhattan (Murry Bergtraum High School, Edward R. Murrow High School, and John Dewey High School). It enrolled limited English…

  14. Mechanical Thrombectomy using a Solitaire stent in acute ischemic stroke: The relationship between the visible antegrade flow on first device deployment and final success in revascularization

    International Nuclear Information System (INIS)

    Lee, Sung Ho; Lee, Byung Hon; Hwang, Yoon Joon; Kim, Su Young; Lee, Ji Young; Hong, Keun Sik; Cho, Yong Jin

    2015-01-01

    The purpose of the study was to evaluate the relationship between the successful revascularization on the first Solitaire stent deployment and the successful revascularization on the final angiography in acute ischemic stroke. From February 2012 to April 2014, 24 patients who underwent Solitaire stent thrombectomy as the first thrombectomy method for treatment of acute ischemic strokes were retrospectively reviewed. When the first Solitaire stent was deployed, 9 patients showed revascularization (Group 1) and 15 patients did not show revascularization (Group 2). Revascularization immediately after the first Solitaire stent removal and on the final angiography were comparatively assessed between the 2 groups. Statistical analysis was performed by the Fisher exact test and Student's t-test. The rates of revascularization maintenance immediately after the first Solitaire stent removal were 89% in Group 1 and 27% in Group 2, respectively (p = 0.009), and the rates of final successful revascularization were 100% in Group 1 and 47% in Group 2, respectively (p = 0.009). There was a statistically significant difference between the 2 groups. Revascularization on the first Solitaire stent deployment can be a useful predictor in evaluating the success of final revascularization in the treatment of acute ischemic stroke.
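
    The two comparisons can be re-checked from the reported group sizes and percentages (Group 1: 8/9 and 9/9; Group 2: 4/15 and 7/15 - counts reconstructed from the percentages, so treat them as approximate). A quick sketch using SciPy's Fisher exact test:

```python
# Re-checking the reported 2x2 comparisons with Fisher's exact test.
from scipy.stats import fisher_exact

# Revascularization maintained immediately after first stent removal: 89% vs 27%
_, p_immediate = fisher_exact([[8, 1], [4, 11]])
# Final successful revascularization: 100% vs 47%
_, p_final = fisher_exact([[9, 0], [7, 8]])

print(f"immediate: p = {p_immediate:.3f}")  # abstract reports p = 0.009
print(f"final:     p = {p_final:.3f}")      # abstract reports p = 0.009
```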

  15. Programs of Study as a State Policy Mandate: A Longitudinal Study of the South Carolina Personal Pathways to Success Initiative. Final Technical Report: Major Findings and Implications

    Science.gov (United States)

    Hammond, Cathy; Drew, Sam F.; Withington, Cairen; Griffith, Cathy; Swiger, Caroline M.; Mobley, Catherine; Sharp, Julia L.; Stringfield, Samuel C.; Stipanovic, Natalie; Daugherty, Lindsay

    2013-01-01

    This is the final technical report from the National Research Center for Career and Technical Education's (NRCCTE's) five-year longitudinal study of South Carolina's Personal Pathway to Success initiative, which was authorized by the state's Education and Economic Development Act (EEDA) in 2005. NRCCTE-affiliated researchers at the National…

  16. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio-frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR was also demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  17. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  18. Uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. Final report, February 16, 1990--December 31, 1994

    International Nuclear Information System (INIS)

    Busch, R.D.

    1995-01-01

    Dr. Robert Busch of the Department of Chemical and Nuclear Engineering was the principal investigator on this project, with technical direction provided by the staff in the Nuclear Criticality Safety Group at Los Alamos. During the period of the contract, he had a number of graduate and undergraduate students working on subtasks. The objective of this work was to develop information on uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. During the first year of this project, most of the work was focused on setting up the SUN SPARC-1 Workstation and acquiring the literature describing the critical experiments. By August 1990, the Workstation was operational, with the current version of TWODANT loaded on the system. The MCNP version 4 tape was made available from Los Alamos late in 1990. Various documents were acquired which provide the initial descriptions of the critical experiments under consideration as benchmarks. The next four years were spent working on various benchmark projects. A number of publications and presentations were made on this material. These are briefly discussed in this report.

  19. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
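
    A toy illustration of the swarm idea, assuming nothing about the authors' actual implementation: instead of one exhaustive search, many independent, differently seeded, resource-bounded searches run in parallel over the same state space, and any one member finding the "needle" suffices. The transition system, the violating state and all bounds below are invented.

```python
# Swarm-style verification sketch: diversified, bounded, parallel searches.
import random
from multiprocessing import Pool

def successors(state):
    # Toy transition system over integers mod 4096.
    return [(state * 2) % 4096, (state + 7) % 4096, state ^ 1]

def violates(state):
    return state == 2501  # the "needle": some assertion failure

def bounded_search(seed, max_steps=20000):
    # Each swarm member uses its own seed, so members diversify their
    # exploration order and collectively cover more of the space.
    rng = random.Random(seed)
    state, trail = 1, [1]
    for _ in range(max_steps):
        if violates(state):
            return trail
        state = rng.choice(successors(state))
        trail.append(state)
    return None

if __name__ == "__main__":
    with Pool() as pool:
        hits = [t for t in pool.map(bounded_search, range(32)) if t]
    if hits:
        print(f"{len(hits)}/32 swarm members found a counterexample; "
              f"shortest trail: {min(map(len, hits))} steps")
    else:
        print("no member reached the failure state within its budget")
```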

  20. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1993-94. OER Report.

    Science.gov (United States)

    Greene, Judy

    Students Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation. The project operated at two high schools in Brooklyn and one in Manhattan (New York). In the 1993-94 school year, the project served 393 students of…

  1. From Wireless Sensor Networks to Wireless Body Area Networks: Formal Modeling and Verification on Security Using PAT

    Directory of Open Access Journals (Sweden)

    Tieming Chen

    2016-01-01

    Model checking has successfully been applied to the verification of security protocols, but the modeling process is always tedious, and proficient knowledge of formal methods is needed, although the final verification can be automatic depending on the specific tool. At the same time, due to the appearance of novel kinds of networks, such as wireless sensor networks (WSN) and wireless body area networks (WBAN), formal modeling and verification for these domain-specific systems are quite challenging. In this paper, a specific and novel formal modeling and verification method is proposed and implemented using an expandable tool called PAT to do WSN-specific security verification. First, an abstract modeling data structure for CSP#, which is built into PAT, is developed to support the node-mobility-related specification for modeling location-based node activity. Then, the traditional Dolev-Yao model is redefined to facilitate modeling of location-specific attack behaviors on the security mechanism. A thorough formal verification application on a location-based security protocol in WSN is described in detail to show the usability and effectiveness of the proposed methodology. Furthermore, a novel location-based authentication security protocol in WBAN can also be successfully modeled and verified directly using our method, which is, to the best of our knowledge, the first effort at employing model checking for automatic analysis of an authentication protocol for WBAN.

  2. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York DCN: 5098-SR-06-0

    International Nuclear Information System (INIS)

    Harpeneau, Evan

    2011-01-01

    The Separations Process Research Unit (SPRU) complex located on the Knolls Atomic Power Laboratory (KAPL) site in Niskayuna, New York, was constructed in the late 1940s to research the chemical separation of plutonium and uranium (Figure A-1). SPRU operated as a laboratory scale research facility between February 1950 and October 1953. The research activities ceased following the successful development of the reduction oxidation and plutonium/uranium extraction processes. The oxidation and extraction processes were subsequently developed for large scale use by the Hanford and Savannah River sites (aRc 2008a). Decommissioning of the SPRU facilities began in October 1953 and continued through the 1990s.

  3. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Evan Harpeneau

    2011-06-24

    The Separations Process Research Unit (SPRU) complex located on the Knolls Atomic Power Laboratory (KAPL) site in Niskayuna, New York, was constructed in the late 1940s to research the chemical separation of plutonium and uranium (Figure A-1). SPRU operated as a laboratory scale research facility between February 1950 and October 1953. The research activities ceased following the successful development of the reduction oxidation and plutonium/uranium extraction processes. The oxidation and extraction processes were subsequently developed for large scale use by the Hanford and Savannah River sites (aRc 2008a). Decommissioning of the SPRU facilities began in October 1953 and continued through the 1990s.

  4. Design and verification of distributed logic controllers with application of Petri nets

    Energy Technology Data Exchange (ETDEWEB)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika [University of Zielona Góra, Licealna 9, 65-417 Zielona Góra (Poland)

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.
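
    The flow described above - model the controller as a Petri net, exhaustively check a behavioral property, then decompose - can be illustrated with a toy reachability check. The real work uses temporal-logic model checking; the net, the property and the marking encoding below are invented for illustration.

```python
# Toy Petri-net reachability check: verify a mutual-exclusion property
# over all reachable markings before decomposing the controller.
from collections import deque

# Places indexed 0..2; each transition = (consume vector, produce vector).
TRANSITIONS = [
    ((1, 0, 0), (0, 1, 0)),   # start job A
    ((1, 0, 0), (0, 0, 1)),   # start job B
    ((0, 1, 0), (1, 0, 0)),   # finish job A
    ((0, 0, 1), (1, 0, 0)),   # finish job B
]

def enabled(marking, consume):
    return all(m >= c for m, c in zip(marking, consume))

def fire(marking, consume, produce):
    return tuple(m - c + p for m, c, p in zip(marking, consume, produce))

def check(initial, bad):
    # Breadth-first exploration of the reachability graph.
    seen, frontier = {initial}, deque([initial])
    while frontier:
        m = frontier.popleft()
        if bad(m):
            return False
        for consume, produce in TRANSITIONS:
            if enabled(m, consume):
                nxt = fire(m, consume, produce)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return True

# Property: the two jobs (places 1 and 2) are never active simultaneously.
print(check((1, 0, 0), bad=lambda m: m[1] + m[2] > 1))  # True: mutual exclusion holds
```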

  5. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The paper considers why verification of software products is necessary throughout the software life cycle. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  6. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  7. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  8. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to the IAEA for the implementation of the NPT, the Treaty of Tlatelolco and the Treaty of Rarotonga reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with the IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on the IAEA's overall performance. The additional task given to the Director General of the IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage, verification may lead to consideration of how to respond to non-compliance. Monitoring is perceived as the first level of the verification system. It is one generic form of collecting information on objects, activities or events, and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure.

  9. Effect of the Operation of Kerr and Hungry Horse Dams on the Reproductive Success of Kokanee in the Flathead System; Technical Addendum to the Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Will; Tohtz, Joel

    1990-03-01

    This addendum to the Final Report presents results of research on the zooplankton and fish communities of Flathead Lake. The intent of the Study has been to identify the impacts of hydroelectric operations at Kerr and Hungry Horse Dams on the reproductive success of kokanee and to propose mitigation for these impacts. Recent changes in the trophic ecology of the lake have reduced the survival of kokanee. In the last three years the Study has been redirected to identify, if possible, the biological mechanisms which now limit kokanee survival, and to test methods of enhancing the kokanee fishery by artificial supplementation. These studies were necessary for the formulation of mitigation plans. The possibility of successfully rehabilitating the kokanee population is in doubt because of changes in the trophic ecology of the system. This report first presents the results of studies of the population dynamics of crustacean zooplankton, upon which planktivorous fish depend. A modest effort was directed to measuring the spawning escapement of kokanee in 1988. Because of its relevance to the study, we also report assessments of the 1989 kokanee spawning escapement. A hydroacoustic assessment of the abundance of all fish species in Flathead Lake was conducted in November 1988. A summary of the continued efforts to document the growth rates and food habits of kokanee and lake whitefish is also included in this report. Revised kokanee spawning and harvest estimates, and the management implications of the altered ecology of Flathead Lake, comprise the final sections of this addendum. 83 refs., 20 figs., 25 tabs.

  10. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
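
    The abstract does not restate the algorithm, so the sketch below shows the flavor of one classic Byzantine fault-tolerant scheme from that literature, the interactive convergence algorithm: each clock averages its measured differences to all clocks, zeroing any difference larger than a trust bound delta, which caps the influence a faulty clock can exert. Values are illustrative.

```python
# Interactive-convergence-style clock adjustment (illustrative sketch).
def ica_adjustment(readings, delta):
    """Correction for one clock, given its readings of all clocks.

    readings[q] = estimated (clock q) - (own clock); the self-reading is 0.
    A faulty clock may report anything, but its influence is capped by delta.
    """
    trusted = [d if abs(d) <= delta else 0.0 for d in readings]
    return sum(trusted) / len(trusted)

# Four clocks, one Byzantine: the absurd 250.0 reading is zeroed out.
print(ica_adjustment([0.0, 0.4, -0.3, 250.0], delta=1.0))  # 0.025: fault masked
```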

  11. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  12. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy and was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation from taking place. To be credible, they should be technically qualified, with substantial experience in industry or in research and development, before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
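
    Code verification by "rigorous convergence analysis" usually comes down to checking an observed order of accuracy against the theoretical one. A minimal, generic sketch (not an ASC tool; the numbers are invented):

```python
# Estimate the observed order of accuracy from three solutions computed
# on systematically refined grids with refinement ratio r.
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    # For an order-p method, (f_coarse - f_medium)/(f_medium - f_fine) ~ r**p.
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Example: a quantity computed on grids h, h/2, h/4 by a nominally
# second-order scheme (values are illustrative).
print(observed_order(1.0400, 1.0100, 1.0025))  # ~2.0, consistent with p = 2
```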

  14. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
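
    A minimal sketch of the remeasurement comparison the passage describes: paired operator and inspector values for the same items, with the mean difference estimating systematic error and the spread estimating random error. All numbers and the 3-sigma follow-up rule are illustrative assumptions.

```python
# Paired operator/inspector remeasurement comparison (illustrative values).
from statistics import mean, stdev

operator  = [10.02, 9.97, 10.11, 10.05, 9.94]   # facility declared values (kg)
inspector = [10.00, 9.99, 10.07, 10.01, 9.96]   # independent remeasurements (kg)

diffs = [o - i for o, i in zip(operator, inspector)]
bias, spread = mean(diffs), stdev(diffs)

# Systematic error shows up as a nonzero mean difference; random error as spread.
print(f"mean difference (bias estimate): {bias:+.3f} kg")
print(f"std dev of differences:          {spread:.3f} kg")

sigma_stated = 0.05  # assumed combined measurement uncertainty (kg)
flagged = [k for k, d in enumerate(diffs) if abs(d) > 3 * sigma_stated]
print("items needing follow-up:", flagged or "none")
```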

  15. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  16. Hot-cell verification facility

    International Nuclear Information System (INIS)

    Eschenbaum, R.A.

    1981-01-01

    The Hot Cell Verification Facility (HCVF) was established as the test facility for the Fuels and Materials Examination Facility (FMEF) examination equipment. HCVF provides a prototypic hot cell environment to check the equipment for functional and remote operation. It also provides actual hands-on training for future FMEF Operators. In its two years of operation, HCVF has already provided data to make significant changes in items prior to final fabrication. It will also shorten the startup time in FMEF since the examination equipment will have been debugged and operated in HCVF

  17. Implementation Practices of Finland in Facilitating IAEA Verification Activities

    International Nuclear Information System (INIS)

    Martikka, E.; Ansaranta, T.; Honkamaa, T.; Hamalainen, M.

    2015-01-01

    Member States provide information to the IAEA according to their Safeguards Agreements and Additional Protocols. The requirements to provide reports and declarations are, however, very general, with no explanation of what the IAEA is looking for in that information. It is important for States to understand how their efforts to collect and provide information, and to facilitate IAEA verification activities, contribute to the achievement of safeguards objectives and, finally, to drawing conclusions on the exclusively peaceful use of nuclear materials in a State. The IAEA is producing a new series of guidance, the Safeguards Implementation Practices (SIP) guides, which shed light on the requirements and share the good practices of States. It is hoped that the SIP Guides will create a better understanding of the needs of the IAEA and the important role of States and facility operators in achieving safeguards objectives. The guides are also important for States to share their lessons learned and good practices for the benefit of other States that might be developing their capabilities or enhancing their processes and procedures. The path is long when a State decides to start up a new nuclear programme: first come legislation, a regulatory body, a contact point and international agreements, and then, finally, the practical implementation of safeguards in the nuclear facilities. Many issues must be prepared in advance for the IAEA's verification activities to be facilitated successfully, effectively and with good quality. Using the structure of the IAEA's draft SIP Guide on Facilitating Verification Activities as a framework, this paper describes the most relevant implementation practices and experiences in Finland. (author)

  18. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    Science.gov (United States)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  19. INF and IAEA: A comparative analysis of verification strategy

    International Nuclear Information System (INIS)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities

  20. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager that is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  1. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P-wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P-wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P-wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)
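
    Explosion yield determination of the kind referred to above typically inverts an empirical magnitude-yield relation of the form mb = a + b*log10(Y). The constants below are illustrative placeholders - in practice they are strongly path-dependent, which is exactly why the P-wave attenuation differences reported here matter.

```python
# Invert an assumed magnitude-yield relation mb = a + b*log10(Y).
def yield_from_mb(mb, a=4.45, b=0.75):
    """Estimated yield in kilotons from body-wave magnitude mb (illustrative constants)."""
    return 10 ** ((mb - a) / b)

for mb in (5.0, 5.5, 6.0):
    print(f"mb = {mb}: ~{yield_from_mb(mb):.0f} kt")
```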

  2. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  3. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  4. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  5. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  6. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  7. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  8. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...
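
    Real UVM testbenches are written in SystemVerilog; the following language-neutral sketch only mirrors the CDV control flow the abstract describes - constrained-random stimulus, a self-checking scoreboard against a reference model, and coverage bins that decide when the goals are met. The toy DUT and the bin definitions are invented.

```python
# CDV control-flow sketch: randomize, check, collect coverage, repeat
# until the coverage goals close.
import random

def dut_alu(op, a, b):          # stand-in for the device under test
    return {"add": a + b, "sub": a - b, "and": a & b}[op]

def reference_model(op, a, b):  # golden model used by the scoreboard
    return {"add": a + b, "sub": a - b, "and": a & b}[op]

def sign_bin(x):
    return "zero" if x == 0 else ("pos" if x > 0 else "neg")

coverage = {op: set() for op in ("add", "sub", "and")}  # op x operand-sign bins
GOAL = {"pos", "neg", "zero"}

random.seed(1)
tests = 0
while any(coverage[op] != GOAL for op in coverage):
    op = random.choice(list(coverage))                      # constrained-random stimulus
    a, b = random.randint(-8, 8), random.randint(-8, 8)
    assert dut_alu(op, a, b) == reference_model(op, a, b)   # self-checking scoreboard
    coverage[op].add(sign_bin(a))                           # coverage collection
    tests += 1

print(f"coverage closed after {tests} randomized tests")
```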

  9. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.
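
    One concrete example of the quantitative fitness measures proposed here is the discounted sum, which scores a behavior by geometrically down-weighting later observations, so two runs with the same boolean verdict can receive different fitness. A sketch with invented reward sequences:

```python
# Discounted-sum fitness of a (finite prefix of an) infinite behavior.
def discounted_value(rewards, discount=0.9):
    """Score a run; rewards[i] in [0, 1], later steps weighted less."""
    return sum(r * discount**i for i, r in enumerate(rewards))

# Two runs of a reactive system: one fails early, one fails late. Both are
# "incorrect" in the boolean sense, but quantitatively very different.
early_failure = [0.0] + [1.0] * 19
late_failure  = [1.0] * 19 + [0.0]
print(discounted_value(early_failure))  # lower score
print(discounted_value(late_failure))   # higher score
```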

  10. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence, or ''AI'', concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  11. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  12. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
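
    For reference, the Weibull machinery mentioned above has a simple closed form in the idealized case of a uniform uniaxial stress; a sketch (the guideline itself works with the full stress-volume integral):

```latex
% Probability of failure of a ceramic part of volume V under stress \sigma,
% given Weibull modulus m and characteristic strength \sigma_0 measured on
% specimens of reference volume V_0:
P_f(\sigma, V) = 1 - \exp\!\left[-\frac{V}{V_0}
                 \left(\frac{\sigma}{\sigma_0}\right)^{m}\right]
% Equating failure probabilities of two stressed volumes gives the
% size-scaling rule used to transfer elementary test data to a
% full-scale structure:
\frac{\sigma_1}{\sigma_2} = \left(\frac{V_2}{V_1}\right)^{1/m}
```

    This is the sense in which coupon data are transferred to a full-scale part: the larger the stressed volume, the lower the allowable stress, with the penalty governed by the scatter parameter m.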

  13. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
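
    The manufactured-solutions recommendation can be made concrete with a short sketch (ours, not the authors'): choose an analytic solution, derive the matching source term, and confirm that the observed order of convergence agrees with the formal order of the scheme.

```python
import numpy as np

# Manufactured solution u(x) = sin(pi x)  =>  -u'' = pi^2 sin(pi x) = f(x).
u_exact = lambda x: np.sin(np.pi * x)
f = lambda x: np.pi**2 * np.sin(np.pi * x)

def solve(n):
    """Second-order finite-difference solve of -u'' = f, u(0) = u(1) = 0."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f(x[1:-1]))
    return np.max(np.abs(u - u_exact(x)))     # discretization error

e1, e2 = solve(32), solve(64)
print("observed order:", np.log2(e1 / e2))    # ~2.0 verifies the coding
```

    An observed order that falls short of the scheme's formal order (here, two) is the signature of a coding error, which is exactly what a code-verification benchmark is designed to expose.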

  14. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  15. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  16. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of the experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between the safety evaluation and the measured values (Authors)

  17. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of individual sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE database computer system

  18. Experience in non-proliferation verification: The Treaty of Raratonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Raratonga are subdivided into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty of Raratonga is relevant to IAEA safeguards in that it supports the continued effectiveness of the international non-proliferation system based on the Non-Proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well

  19. Tree dimension in verification of constrained Horn clauses

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick; Ganty, Pierre

    2018-01-01

    In this paper, we show how the notion of tree dimension can be used in the verification of constrained Horn clauses (CHCs). The dimension of a tree is a numerical measure of its branching complexity and the concept here applies to Horn clause derivation trees. Derivation trees of dimension zero c...... algorithms using these constructions to decompose a CHC verification problem. One variation of this decomposition considers derivations of successively increasing dimension. The paper includes descriptions of implementations and experimental results....
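
    For reference, the dimension of a tree can be computed recursively; a sketch of the usual definition (essentially the Strahler number): a leaf has dimension zero, and an internal node takes the maximum dimension of its children, plus one when that maximum is attained by two or more children.

```python
def tree_dimension(tree):
    """Dimension (Strahler number) of a tree given as nested lists."""
    children = [tree_dimension(c) for c in tree]
    if not children:                 # a leaf has dimension 0
        return 0
    best = max(children)
    # +1 only if the maximum is achieved by at least two children
    return best + 1 if children.count(best) >= 2 else best

# A linear chain has dimension 0; a full binary tree of height h has
# dimension h, so dimension measures branching complexity.
chain = [[[]]]
binary = [[[], []], [[], []]]
print(tree_dimension(chain), tree_dimension(binary))   # 0 2
```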

  20. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms such as the MLP and, more recently, the SVM has steadily improved over the past few years. In this paper, we compare two successful discriminant machine learning algorithms applied to the problem of face verification: MLP and SVM. These two algorithms are tested on a benchmark database, namely XM2VTS. Results show that an MLP is better than an SVM on this particular task.
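
    The comparison can be reproduced in spirit on synthetic data (XM2VTS itself is a licensed database); a minimal sketch assuming scikit-learn, with verification framed as two-class discrimination between client and impostor feature vectors:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Stand-in for client/impostor face features (XM2VTS is not redistributable).
X, y = make_classification(n_samples=2000, n_features=40, n_informative=20,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("MLP", MLPClassifier(hidden_layer_sizes=(64,),
                                        max_iter=500, random_state=0)),
                  ("SVM", SVC(kernel="rbf", gamma="scale"))]:
    clf.fit(Xtr, ytr)                       # train the discriminant model
    print(name, "accuracy:", clf.score(Xte, yte))
```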

  1. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
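
    The best-known accountancy statistic of this kind is material unaccounted for (MUF); a minimal sketch of the material-balance test for a single balance period (numbers, uncertainty, and alarm threshold are illustrative):

```python
# Material balance for one period: MUF = (PB + R) - (S + PE), where
# PB/PE are beginning/ending physical inventories and R/S are
# receipts/shipments (all in kg of nuclear material).
def muf(pb, receipts, shipments, pe):
    return (pb + receipts) - (shipments + pe)

sigma_muf = 0.4     # propagated measurement uncertainty (kg), assumed
m = muf(pb=102.1, receipts=50.3, shipments=49.8, pe=101.7)
print(f"MUF = {m:.2f} kg; alarm: {abs(m) > 3 * sigma_muf}")
```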

  2. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
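
    The data-flow step described here reduces to joining abstract types at control-flow merge points until a fixed point is reached; a toy sketch of such a join (the real verifier works over the JVM's full type hierarchy):

```python
# Toy type join for bytecode verification: when two execution paths
# reach the same program point, each register's type becomes the least
# upper bound of the types flowing in.
def join(t1, t2):
    if t1 == t2:
        return t1
    if t1 == "BOTTOM":
        return t2
    if t2 == "BOTTOM":
        return t1
    return "TOP"     # incompatible types merge to TOP (unusable value)

# Two paths reach the same program point with different register contents:
path_a = {"r0": "INT", "r1": "REF"}
path_b = {"r0": "INT", "r1": "INT"}
merged = {r: join(path_a[r], path_b[r]) for r in path_a}
print(merged)   # {'r0': 'INT', 'r1': 'TOP'} -- r1 may not be dereferenced
```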

  3. EVALUATING THE SUCCESS OF PEACE OPERATIONS

    African Journals Online (AJOL)

    Garb, Maja

    Peacekeepers are lightly armed and do not fire except in self-defence; .... communication theories, he assumes that the success of peace support operations is ..... UNAVEM III – United Nations Angola Verification Mission III; UNIFIL –.

  4. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  5. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  6. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  7. Final Report Integrated DM1200 Melter Testing Using AZ-102 And C-106/AY-102 HLW Simulants: HLW Simulant Verification VSL-05R5800-1, Rev. 0, 6/27/05

    International Nuclear Information System (INIS)

    Kruger, A.A.; Matlack, K.S.; Gong, W.; Bardakci, T.; D'Angelo, N.A.; Brandys, M.; Kot, W.K.; Pegg, I.L.

    2011-01-01

    The principal objectives of the DM1200 melter tests were to determine the effects of feed rheology, feed solids content, and bubbler configuration on glass production rate and off-gas system performance while processing the HLW AZ-101 and C-106/AY-102 feed compositions; characterize melter off-gas emissions; characterize the performance of the prototypical off-gas system components, as well as their integrated performance; characterize the feed, glass product, and off-gas effluents; and perform pre- and post-test inspections of system components. The specific objectives (including test success criteria) of this testing, along with how each objective was met, are outlined in a table. The data provided in this Final Report address the impacts of HLW melter feed rheology on melter throughput and the validation of the simulated HLW melter feeds. The primary purpose of this testing is to further validate/verify the HLW melter simulants that have been used for previous melter testing and to support their continued use in developing melter and off-gas related processing information for the Project. The primary simulant property in question is rheology. Simulants and melter feeds used in all previous melter tests were produced by direct addition of chemicals; these feeds tend to be less viscous than the rheological upper-bound feeds made from actual wastes. Data provided here compare melter processing for the melter feed used in all previous DM100 and DM1200 tests (nominal melter feed) with feed adjusted by the feed vendor (NOAH Technologies) to be more viscous, thereby simulating more closely the upper-bounding feed produced from actual waste. This report provides results of tests that are described in the Test Plan for this work. The Test Plan is responsive to one of several test objectives covered in the WTP Test Specification for this work; consequently, only part of the scope described in the Test Specification was addressed in this particular Test Plan. For the purpose of

  8. FINAL REPORT INTEGRATED DM1200 MELTER TESTING USING AZ 102 AND C 106/AY-102 HLW SIMULANTS: HLW SIMULANT VERIFICATION VSL-05R5800-1 REV 0 6/27/05

    Energy Technology Data Exchange (ETDEWEB)

    KRUGER AA; MATLACK KS; GONG W; BARDAKCI T; D' ANGELO NA; BRANDYS M; KOT WK; PEGG IL

    2011-12-29

    The principal objectives of the DM1200 melter tests were to determine the effects of feed rheology, feed solids content, and bubbler configuration on glass production rate and off-gas system performance while processing the HLW AZ-101 and C-106/AY-102 feed compositions; characterize melter off-gas emissions; characterize the performance of the prototypical off-gas system components, as well as their integrated performance; characterize the feed, glass product, and off-gas effluents; and perform pre- and post-test inspections of system components. The specific objectives (including test success criteria) of this testing, along with how each objective was met, are outlined in a table. The data provided in this Final Report address the impacts of HLW melter feed rheology on melter throughput and the validation of the simulated HLW melter feeds. The primary purpose of this testing is to further validate/verify the HLW melter simulants that have been used for previous melter testing and to support their continued use in developing melter and off-gas related processing information for the Project. The primary simulant property in question is rheology. Simulants and melter feeds used in all previous melter tests were produced by direct addition of chemicals; these feeds tend to be less viscous than the rheological upper-bound feeds made from actual wastes. Data provided here compare melter processing for the melter feed used in all previous DM100 and DM1200 tests (nominal melter feed) with feed adjusted by the feed vendor (NOAH Technologies) to be more viscous, thereby simulating more closely the upper-bounding feed produced from actual waste. This report provides results of tests that are described in the Test Plan for this work. The Test Plan is responsive to one of several test objectives covered in the WTP Test Specification for this work; consequently, only part of the scope described in the Test Specification was addressed in this particular Test Plan. For the purpose of

  9. Time to Act: An Agenda for Advancing Adolescent Literacy for College and Career Success. Final Report from Carnegie Corporation of New York's Council on Advancing Adolescent Literacy

    Science.gov (United States)

    Carnegie Corporation of New York, 2011

    2011-01-01

    Our nation's educational system has scored many extraordinary successes in raising the level of reading and writing skills in younger children. Yet the pace of literacy improvement in our schools has not kept up with the accelerating demands of the global knowledge economy. In state after state, the testing data mandated by No Child Left Behind…

  10. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
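
    The style of check described, recomputing the code's intermediate quantities externally, is illustrated below with a generic decay integral: a closed-form hand value is cross-checked by an independent numerical evaluation (the nuclide, numbers, and scheme are illustrative, not taken from the RESRAD-BUILD manual):

```python
import math

# Hand check of a time-integrated activity, the kind of intermediate
# quantity verified externally in a spreadsheet.
A0, half_life, t = 1.0, 5.27, 10.0    # Ci, years (Co-60-like), years
lam = math.log(2) / half_life

analytic = A0 / lam * (1 - math.exp(-lam * t))     # closed form
# Independent numerical integration (midpoint rule) as a cross-check.
n, h = 100000, t / 100000
numeric = sum(A0 * math.exp(-lam * (i + 0.5) * h) * h for i in range(n))

assert abs(analytic - numeric) / analytic < 1e-6
print(f"integrated activity: {analytic:.4f} Ci-yr (hand) "
      f"vs {numeric:.4f} (independent check)")
```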

  11. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their result. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are addressed, and a classification of test techniques for each method is considered. The paper presents and analyzes the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article also discusses the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, together with some of the tools that can be applied to software when using dynamic analysis methods. Based on this work, a conclusion is drawn which describes the most relevant problems of the analysis techniques, methods for their solution and
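
    Symbolic execution, singled out above, can be sketched in a few lines: inputs become symbols, each branch forks the state, and every terminating path carries a path condition describing the inputs that reach it (a toy executor over nested guards; real tools discharge the conditions with an SMT solver):

```python
# Toy symbolic executor for:  if x > 10: if x < 20: <body>
# The program is a list of nested guards over the symbolic input x.
def explore(path_cond, program):
    if not program:                  # path terminated: report its condition
        print("path:", " and ".join(path_cond) or "true")
        return
    guard, body = program[0], program[1:]
    explore(path_cond + [guard], body)            # guard holds: enter body
    explore(path_cond + [f"not({guard})"], [])    # guard fails: exit

explore([], ["x > 10", "x < 20"])
# Prints three path conditions; a constraint solver would then decide
# which are satisfiable and produce a concrete test input (e.g. x = 15)
# covering each feasible path.
```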

  12. BWR SFAT, gross-defect verification of spent BWR fuel. Final report on Task FIN A563 on the Finnish Support Programme to IAEA Safeguards including BWR SFAT User Manual

    International Nuclear Information System (INIS)

    Tarvainen, M.; Paakkunainen, M.; Tiitta, A.; Sarparanta, K.

    1994-04-01

    A measurement instrument called the Spent Fuel Attribute Tester, SFAT, has been designed, fabricated and taken into use by the IAEA for gross-defect verification of spent BWR fuel assemblies. The equipment consists of an underwater measurement head connected by cables to a control unit on the bridge of the fuel handling machine, as well as to a PMCA for measurement of the gamma spectra. The BWR SFAT is optimized for the AFR interim storage, TVO KPA-STORE, of the TVO Power Company in Olkiluoto, Finland. It has the shape of a fuel assembly and is moved like one using the fuel handling machine. No fuel movements are needed. Spent-fuel-specific radiation from the fission product 137Cs at the gamma-ray energy of 662 keV is detected above the assemblies in the storage rack using a NaI(Tl) detector. In the design and licensing, the requirements of the IAEA, the operator and the safety authority have been taken into account. The BWR SFAT can be modified for other LWR fuel types with minor changes. The work has been carried out under task FIN A 563 of the Finnish Support Programme to IAEA Safeguards. (orig.) (9 refs., 22 figs.)

  13. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and of their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with DOF and MEEF as quality metrics is examined.

  14. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  15. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  16. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  17. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues

  18. Helios1A EoL: A Success. For the first Time a Long Final Thrust Scenario, Respecting the French Law on Space Operations

    Science.gov (United States)

    Guerry, Agnes; Moussi, Aurelie; Sartine, Christian; Beaumet, Gregory

    2013-09-01

    HELIOS1A End Of Life (EOL) operations occurred in early 2012. Through this EOL operation, CNES wanted to set an example of French Space Act compliance. Because the satellite wasn't natively designed for such an EOL phase, the operation was delicate and risky. It was organized as a full project in order to assess every detail of the scenario with a dedicated Mission Analysis, to secure the operations through a detailed risk analysis at system level, and to consider the major failures that could occur during the EOL. A short scenario allowing several objectives to be reached with benefits was eventually selected. The main objective of this project was to preserve the space environment. The operations were conducted on a "best effort" basis. The French Space Operations Act (FSOA) requirements were met: HELIOS-1A EOL operations were conducted successfully.

  19. Transparencies used in describing the CTBT verification regime and its four monitoring technologies

    International Nuclear Information System (INIS)

    Basham, P.

    1999-01-01

    This presentation includes a description of the CTBT verification regime and its four monitoring technologies (namely seismic, hydroacoustic, infrasound and radionuclide monitoring), the CTBT global verification system, and the sequence of steps needed to install an International Monitoring System station: site survey, site preparation and construction, equipment procurement and installation, and final tests and certification

  20. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  1. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  2. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher level behavioral models.
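
    The translation idea, turning a structural netlist into logic definitions in which each component is a relation over its ports, can be sketched as follows (a hand-rolled stand-in for the HOL output; the circuit and names are invented):

```python
# A structural half-adder netlist translated into a relational
# "definition", then checked exhaustively against its behavioural spec.
def xor_gate(a, b, out):
    return out == (a != b)

def and_gate(a, b, out):
    return out == (a and b)

def half_adder_structure(a, b, s, c):
    # Conjunction of component relations = the translated HDL structure.
    return xor_gate(a, b, s) and and_gate(a, b, c)

def half_adder_spec(a, b, s, c):
    # Behavioural specification: sum + 2*carry equals a + b.
    return int(s) + 2 * int(c) == int(a) + int(b)

# "Verification": the structure implies the spec for all port valuations.
bits = [False, True]
assert all(half_adder_spec(a, b, s, c)
           for a in bits for b in bits for s in bits for c in bits
           if half_adder_structure(a, b, s, c))
print("structure correctly implements the specification")
```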

  3. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process cycle.

  4. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Verification is the task of checking whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not make the assumption that independent and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.
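
    A standard fidelity witness in this spirit (a sketch, not necessarily the authors' exact protocol) bounds the fidelity from stabilizer expectation values alone, F >= 1 - sum_i (1 - <S_i>)/2, where each S_i is measurable with single-qubit Paulis; a numpy illustration for a noisy 3-qubit GHZ state:

```python
import numpy as np
from functools import reduce

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
kron = lambda ops: reduce(np.kron, ops)

n = 3
ghz = np.zeros(2**n)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)
rho = 0.9 * np.outer(ghz, ghz) + 0.1 * np.eye(2**n) / 2**n  # white noise

# Generating stabilizers of GHZ_3: XXX, ZZI, IZZ -- all products of
# single-qubit Pauli measurements.
stabilizers = [kron([X, X, X]), kron([Z, Z, I]), kron([I, Z, Z])]

expvals = [np.trace(S @ rho).real for S in stabilizers]
bound = 1 - sum((1 - e) / 2 for e in expvals)   # fidelity witness
exact = (ghz @ rho @ ghz).real
print(f"fidelity lower bound {bound:.3f} <= exact fidelity {exact:.3f}")
```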

  5. Effect of the Operation of Kerr and Hungry Horse Dams on the Reproductive Success of Kokanee in the Flathead System, 1987 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Will; Zubik, Raymond; Clancey, Patrick

    1988-05-01

    Studies of kokanee reproductive success in the Flathead system from 1981 to 1987 have assessed the losses in fish production attributable to hydroelectric operations. We estimated that the Flathead Lake shoreline spawning stock has lost at least 50,000 fish annually since Kerr Dam was completed in 1938. The Flathead River spawning stock has lost 95,000 spawners annually because of the operations of Hungry Horse Dam. Lakeshore spawning has been adversely affected because Flathead Lake has been drafted to minimum pool during the winter, when kokanee eggs are incubating in shallow shoreline redds. Egg mortality from exposure and desiccation of kokanee redds has increased since the mid 1970's, when the lake was drafted more quickly and held longer at minimum pool. Escapement surveys in the early 1950's and a creel survey in the early 1960's have provided a baseline against which present escapement levels can be compared and losses estimated. Main stem Flathead River spawning has also declined since the mid 1970's, when fluctuating discharge from Hungry Horse Dam during the spawning and incubation season exposed redds at the river margin and increased mortality. This decline followed an increase in main stem spawning in the late 1950's through the mid 1960's attributable to higher winter water temperature and relatively stable discharge from Hungry Horse Dam. Spawning escapement in the main stem exceeded 300,000 kokanee in the early 1970's as a result. Spawning in spring-influenced sites has comprised 35 percent of the main stem escapement from 1979 to 1986. We took that proportion of the early 1970's escapement (105,000) as the baseline against which to measure historic loss. Agricultural and suburban development has contributed less significantly to the degradation of kokanee spawning habitat in the river system and on the Flathead Lake shoreline. Their influence on groundwater quality and substrate composition has limited

  6. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  7. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  8. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on: - compliance with new standards and regulations, - lessons learned from operating experience. This goal can be more effectively achieved on the basis of a valid methodology of analysis and a consistent process for the collection, storage and retrieval of operating data. The general backfitting problem, the verification process and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. The focus is on management problems more than hardware problems. Some general conclusions are then presented as the final results of the whole work

  9. Hood River Steelhead Genetics Study; Relative Reproductive Success of Hatchery and Wild Steelhead in the Hood River, Final Report 2002-2003.

    Energy Technology Data Exchange (ETDEWEB)

    Blouin, Michael

    2003-05-01

    microsatellite-based pedigree analysis, the relative total reproductive success (adult-to-adult production) of hatchery (H_old or H_new) and wild (W) fish for two populations, over multiple brood years. Our analyses of samples from fish that bred in the early to mid 1990's show that fish of 'old' hatchery stocks have much lower total fitness than wild fish (17% to 54% of wild fitness), but that 'new' stocks have fitness that is similar to that of wild fish (ranging from 85% to 108% of wild fitness, depending on parental gender and run year). Therefore, our results show that the decision to phase out the old, out-of-basin stocks and replace them with new, conservation hatchery stocks was well founded. We also conclude that the H_new fish are leaving behind substantial numbers of wild-born offspring. The similar fitness of H_new and W fish suggests that wild-born offspring of H_new fish are unlikely to have negative genetic effects on the population when they in turn spawn in the wild. We will test this hypothesis once enough F2 offspring have returned. Another interesting result is that we were unable to match a large fraction of the unclipped, returning fish with parents from their brood year. Furthermore, we were missing more fathers than mothers. Because we sampled almost every possible anadromous parent, these results suggest that nonanadromous trout or precocious parr may be obtaining a substantial number of matings. Substantial reproduction by precocious parr could be one unintended consequence of the hatchery program.

  10. Customized Nudging to Improve FAFSA Completion and Income Verification

    Science.gov (United States)

    Page, Lindsay; Castleman, Benjamin L.

    2016-01-01

    For most students from low- or moderate-income families, successfully completing the Free Application for Federal Student Aid (FAFSA) is a crucial gateway on the path to college access. However, FAFSA filing and income verification tasks pose substantial barriers to college access for low-income students. In this paper, the authors report on a…

  11. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    Awadalla, N.G.; Eaton, S.C.F.

    1996-01-01

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  12. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  13. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
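
    The filter-bank representation referred to here captures local ridge structure through the response energy of oriented bandpass (Gabor) filters rather than through minutiae; a minimal sketch of such a feature vector (parameters are illustrative, loosely after FingerCode-style schemes):

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(theta, freq=0.1, sigma=4.0, size=15):
    """Oriented Gabor filter tuned to a nominal ridge frequency."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * freq * xr))

def filterbank_features(image, n_orientations=8):
    """Feature vector: filter-response energy per orientation band."""
    feats = []
    for k in range(n_orientations):
        resp = convolve2d(image, gabor_kernel(np.pi * k / n_orientations),
                          mode="same", boundary="symm")
        feats.append(np.mean(np.abs(resp)))
    return np.array(feats)

# Verification = distance between enrolled and query feature vectors
# (random images stand in for real fingerprints here).
rng = np.random.default_rng(0)
enrolled, query = rng.random((64, 64)), rng.random((64, 64))
print(np.linalg.norm(filterbank_features(enrolled)
                     - filterbank_features(query)))
```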

  14. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  15. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)
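
    The gamma-based verifications in these two records rest on the classical enrichment-meter principle: for a sufficiently thick UF6 deposit, the net 185.7 keV count rate is proportional to the U-235 enrichment. A minimal sketch under that assumption; the calibration constant and count rate are invented:

        def enrichment_percent(net_rate_186: float, cal_constant: float) -> float:
            """Infer U-235 enrichment (wt%) from the net 185.7 keV count rate."""
            return cal_constant * net_rate_186

        K = 0.025     # hypothetical calibration constant (% per count/s)
        rate = 178.0  # hypothetical net 185.7 keV rate (counts/s)
        print(f"estimated enrichment: {enrichment_percent(rate, K):.2f} %")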

  16. Partner verification: restoring shattered images of our intimates.

    Science.gov (United States)

    De La Ronde, C; Swann, W B

    1998-08-01

    When spouses received feedback that disconfirmed their impressions of their partners, they attempted to undermine that feedback during subsequent interactions with these partners. Such partner verification activities occurred whether partners construed the feedback as overly favorable or overly unfavorable. Furthermore, because spouses tended to see their partners as their partners saw themselves, their efforts to restore their impressions of partners often worked hand-in-hand with partners' efforts to verify their own views. Finally, support for self-verification theory emerged in that participants were more intimate with spouses who verified their self-views, whether their self-views happened to be positive or negative.

  17. Successful verification of subcontracted work in the construction industry

    NARCIS (Netherlands)

    Makkinga, Rick; de Graaf, Robin; Voordijk, Hans

    2018-01-01

    Due to the introduction of new types of contracts, such as Design, Build, Finance, & Maintain (DBFM), a major shift in tasks and responsibilities from client to contractor can be seen in the construction industry. To manage these new contracts and corresponding shifts in responsibilities, systems

  18. Fuzzy Verification of Lower Dimensional Information in a Numerical Simulation of Sea Ice

    Science.gov (United States)

    Sulsky, D.; Levy, G.

    2010-12-01

    Ideally, a verification and validation scheme should be able to evaluate and incorporate lower dimensional features (e.g., discontinuities) contained within a bulk simulation even when not directly observed or represented by model variables. Nonetheless, lower dimensional features are often ignored. Conversely, models that resolve such features and the associated physics well, yet imprecisely, are penalized by traditional validation schemes. This can lead to (perceived or real) poor model performance and predictability and can be deleterious to model improvement when observations are sparse, fuzzy, or irregular. We present novel algorithms and a general framework for using information from available satellite data through fuzzy verification that efficiently and effectively remedy the known problems mentioned above. As a proof of concept, we use a sea-ice model with remotely sensed observations of leads in a one-step initialization cycle. Using the new scheme in a sixteen-day simulation experiment introduces model skill (against persistence) several days earlier than in the control run, improves the overall model skill, and delays its drop-off at later stages of the simulation. Although sea-ice models are currently a weak link in climate models, the appropriate choice of data to use, and the fuzzy verification and evaluation of a system’s skill in reproducing lower dimensional features are important beyond the initial application to sea ice. Our strategy and framework for fuzzy verification, selective use of information, and feature extraction could be extended globally and to other disciplines. It can be incorporated in and complement existing verification and validation schemes, increasing their computational efficiency and the information they use. It can be used for model development and improvements, upscaling/downscaling models, and for modeling processes not directly represented by model variables or direct observations. Finally, if successful, it can
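
    One common way to quantify "skill (against persistence)" as used above is a mean-squared-error skill score, where positive values mean the model beats a "no change" forecast. A minimal sketch with invented ice-concentration numbers:

        import numpy as np

        def skill_vs_persistence(obs, model, persistence):
            """1 - MSE(model)/MSE(persistence); > 0 means the model adds skill."""
            mse_m = np.mean((np.asarray(model) - np.asarray(obs)) ** 2)
            mse_p = np.mean((np.asarray(persistence) - np.asarray(obs)) ** 2)
            return 1.0 - mse_m / mse_p

        obs = np.array([0.80, 0.75, 0.70, 0.66])   # observed ice concentration
        model = np.array([0.79, 0.74, 0.71, 0.68])
        persistence = np.full_like(obs, obs[0])    # "no change" forecast
        print(f"skill score: {skill_vs_persistence(obs, model, persistence):.2f}")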

  19. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  20. Specification and Verification of Web Applications in Rewriting Logic

    Science.gov (United States)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.

  1. Expose : procedure and results of the joint experiment verification tests

    Science.gov (United States)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005, and the facility is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The either vented or sealed experiment pockets will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a profound ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The results are essential for the success of the EXPOSE mission and have been obtained in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of organic chemistry processes in space, biological adaptation strategies to extreme conditions, e.g. on early Earth and Mars, and the distribution of life beyond its planet of origin.

  2. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  3. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
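
    As a language-neutral illustration of the plan structure this record describes (Verification Requirement, Success Criteria, Method, Level and Owner per requirement, with Activities grouped into Events), a minimal Python sketch; every field value is hypothetical, not LSST data:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class VerificationPlan:
            requirement: str
            verification_requirement: str
            success_criteria: str
            methods: List[str]   # e.g. Test, Analysis, Inspection, Demonstration
            level: str           # e.g. Component, Subsystem, System
            owner: str

        @dataclass
        class VerificationEvent:
            name: str
            activities: List[str] = field(default_factory=list)  # run concurrently

        plan = VerificationPlan(
            requirement="REQ-101 delivered image quality",
            verification_requirement="Verify image quality is within budget",
            success_criteria="Measured PSF within budget in 3 consecutive exposures",
            methods=["Test"],
            level="System",
            owner="Systems Engineering",
        )
        event = VerificationEvent("Commissioning run 1", ["REQ-101 Test"])
        print(plan.owner, "->", event.name)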

  4. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation of WVSNs from theoretical research to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, and selects AODV as the routing protocol, to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve the functions of image acquisition, coding and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.

  5. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; on the other hand, they wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system guards personnel by reducing the exposures associated with physical inventories. The EIVSystem continually monitors the vault, providing proof of changed or unchanged status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help define the target area of an inventory when change has been shown to occur
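
    The change-detection core of such a camera-driven system can be as simple as thresholded frame differencing. A minimal sketch; thresholds and image sizes are invented, not EIVSystem parameters:

        import numpy as np

        def changed(frame: np.ndarray, reference: np.ndarray,
                    diff_threshold: int = 25, pixel_fraction: float = 0.01) -> bool:
            """Flag the vault view as changed when enough pixels differ from
            the reference image by more than diff_threshold grey levels."""
            delta = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
            return (delta > diff_threshold).mean() > pixel_fraction

        reference = np.zeros((480, 640), dtype=np.uint8)  # stand-in vault image
        frame = reference.copy()
        frame[100:160, 200:300] = 200                     # simulated moved object
        print(changed(frame, reference))                  # -> True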

  6. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational/inertial Wheeler-Feynman radiation-reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his former graduate student T. Mahood, Woodward pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  7. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.

  8. Verification of the computer code ATHLET in the framework of the external verification group ATHLET BETHSY test 9.3 - steam generator U-tube rupture with failure of the high pressure injection. Final report; Verifikation des ATHLET-Rechenprogramms im Rahmen der externen Verifikationsgruppe ATHLET BETHSY Test 9.3 - Heizrohrbruch mit Versagen der Hochdruck-Noteinspeisung. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Krepper, E.; Schaefer, F. [Forschungszentrum Rossendorf e.V. (FZR) (Germany). Inst. fuer Sicherheitsforschung

    1998-08-01

    In the framework of the external validation of the thermal-hydraulic code ATHLET MOD 1.1 CYCLE D, which is being developed by GRS, post-test analyses of two experiments performed at the French integral test facility BETHSY were done. In test 9.3 the consequences of a steam generator U-tube rupture with failure of the high pressure injection and of the auxiliary feedwater supply were investigated. As accident management measures, the depressurization of the secondary sides, first of the two intact steam generators, then of the damaged steam generator, and finally the primary depressurization by opening of the pressurizer valve were performed. The results show that the code ATHLET is able to describe this complex scenario in good accordance with the experiment. The safety-relevant statements could be reproduced. Deviations that did not affect the general results occurred in the break mass flow during the depressurization of the damaged steam generator and in the description of the failure of the heat transfer to the damaged steam generator. The causes are difficult to pinpoint because these processes are highly complex. (orig.)

  9. Tomotherapy: IMRT and tomographic verification

    International Nuclear Information System (INIS)

    Mackie, T.R.

    2000-01-01

    Full text: External beam radiation therapy delivery began around the turn of the century with the use of one or a few kilovoltage beams directed to the presumed site of the tumor. Often the treatment lasted until erythema dose was reached. Delivering the beams rotationally allowed the dose to be focused on the tumor and the skin to be spared. With the advent of megavoltage radiation therapy in the 1950s, using Co-60 teletherapy and betatrons, the treatment could once again be delivered from only a few beam directions while the dose to the skin was kept below tolerance. Fields were shaped by lead blocks and later by custom-made blocks fabricated from low-melting-temperature heavy metal. Linear accelerators did not fundamentally change the way in which radiation was delivered. It is likely that this delivery paradigm would not have changed had it not been for the advent of computers. Brahme and Cormack showed in the late 1980s that highly conformal treatments could be delivered with non-uniform intensity beams. At that time the only way in which the intensity-modulated beams could be delivered was using custom-milled compensators. Fabricating and using compensators for multiple fields is time-consuming and labor-intensive. Serial tomotherapy was the first successful delivery method for IMRT and went back to the earlier practice of rotation therapy. The NOMOS Peacock system uses a binary (on-off) multileaf collimator (MLC) to modulate a fan beam of radiation. It uses an optimization system to determine when leaves should be opened and closed. The system delivers two beam slices at once, and the couch is then precisely translated to index to the next slices. This approach was first used in 1994 and to date has treated several thousand patients. Prior to the advent of IMRT, accelerator vendors introduced the MLC to provide field shaping without the need to fabricate custom blocking. Most new linear accelerator purchases today

  10. TET-1- A German Microsatellite for Technology On -Orbit Verification

    Science.gov (United States)

    Föckersperger, S.; Lattner, K.; Kaiser, C.; Eckert, S.; Bärwald, W.; Ritzmann, S.; Mühlbauer, P.; Turk, M.; Willemsen, P.

    2008-08-01

    There is significant confusion in the space industry today over the terms used to describe satellite bus architectures. Terms such as "standard bus" (or "common bus"), "modular bus" and "plug-and-play bus" are often used with little understanding of what the terms actually mean, and even less understanding of what the differences in these space architectures mean. It may seem that these terms are subtle differentiators, but in reality these terms describe radically different ways to design, build, test, and operate satellites. Furthermore, these terms imply very different business models for the acquisition, operation, and sustainment of space systems. This paper will define and describe the difference between "standard buses", "modular buses" and "plug-and-play buses", giving examples of each kind with a cost/benefit discussion of each type. [...] under Kayser-Threde responsibility provides the necessary interfaces to the experiments. The first TET mission is scheduled for mid-2010. TET will be launched as a piggy-back payload on any available launcher worldwide to reduce launch cost and provide maximum flexibility. Finally, TET will provide all services required by the experimenters for a one-year mission operation to perform a successful OOV mission with its technology experiments, leading to efficient access to space for German industry and institutions.

  11. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    The broad use of the handwritten signature for personal verification in financial institutions creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria used to judge a signature verification tool in banking and other financial institutions.
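
    A minimal sketch of the keypoint-matching step using OpenCV's SIFT implementation (SURF is patent-encumbered and absent from stock OpenCV builds, so only SIFT is shown); file names and the acceptance threshold are placeholders:

        import cv2

        def match_score(probe_path: str, reference_path: str) -> int:
            """Count ratio-test-filtered SIFT matches between two signature images."""
            probe = cv2.imread(probe_path, cv2.IMREAD_GRAYSCALE)
            reference = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
            sift = cv2.SIFT_create()
            _, desc_p = sift.detectAndCompute(probe, None)
            _, desc_r = sift.detectAndCompute(reference, None)
            matches = cv2.BFMatcher().knnMatch(desc_p, desc_r, k=2)
            return sum(1 for m, n in matches if m.distance < 0.75 * n.distance)

        # A claimed signature would be accepted when the score clears a tuned threshold:
        # accepted = match_score("probe.png", "reference.png") > THRESHOLD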

  12. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies, namely the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW), share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system for the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States (India, Pakistan and Israel) from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  13. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  14. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  15. DESIGN INFORMATION VERIFICATION FOR NUCLEAR SAFEGUARDS

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Bean; Richard R. M. Metcalf; Phillip C. Durst

    2009-07-01

    A critical aspect of international safeguards activities performed by the International Atomic Energy Agency (IAEA) is the verification that facility design and construction (including upgrades and modifications) do not create opportunities for nuclear proliferation. These Design Information Verification (DIV) activities require that IAEA inspectors compare current and past information about the facility to verify the operator's declaration of proper use. The actual practice of DIV presents challenges to the inspectors due to the large amount of data generated, concerns about sensitive or proprietary data, the overall complexity of the facility, and the effort required to extract just the safeguards-relevant information. Planned and anticipated facilities will (especially in the case of reprocessing plants) be ever larger and increasingly complex, thus exacerbating the challenges. This paper reports the results of a workshop held at the Idaho National Laboratory in March 2009, which considered technologies and methods to address these challenges. The use of 3D Laser Range Finding, the Outdoor Visualization System, Gamma-LIDAR, and virtual facility modeling, as well as methods to handle the facility data issues (quantity, sensitivity, and accessibility and portability for the inspector), were presented. The workshop attendees drew conclusions about the use of these techniques with respect to successfully employing them in an operating environment, using a Fuel Conditioning Facility walk-through as a baseline for discussion.

  16. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, and MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize the additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  17. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  18. TFE design package final report, TFE Verification Program

    International Nuclear Information System (INIS)

    1994-06-01

    The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. A TFE for a megawatt class system is described. Only six cells are considered for simplicity; a megawatt class TFE would have many more cells, the exact number dependent on optimization trade studies

  19. Sheath insulator final test report, TFE Verification Program

    International Nuclear Information System (INIS)

    1994-07-01

    The sheath insulator in a thermionic cell has two functions. First, the sheath insulator must electrically isolate the collector from the outer containment sheath tube that is in contact with the reactor liquid metal coolant. Second, the sheath insulator must provide high, uniform thermal conductance between the collector and the reactor coolant to carry away waste heat. The goals of the sheath insulator test program were to demonstrate that suitable ceramic materials and fabrication processes were available, and to validate the performance of the sheath insulator against TFE-VP requirements. This report discusses the objectives of the test program, fabrication development, the ex-reactor test program, the in-reactor test program, and the insulator seal specifications

  20. Sheath insulator final test report, TFE Verification Program

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The sheath insulator in a thermionic cell has two functions. First, the sheath insulator must electrically isolate the collector from the outer containment sheath tube that is in contact with the reactor liquid metal coolant. Second, the sheath insulator must provide high, uniform thermal conductance between the collector and the reactor coolant to carry away waste heat. The goals of the sheath insulator test program were to demonstrate that suitable ceramic materials and fabrication processes were available, and to validate the performance of the sheath insulator against TFE-VP requirements. This report discusses the objectives of the test program, fabrication development, the ex-reactor test program, the in-reactor test program, and the insulator seal specifications.

  1. FEFTRA™ verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme, which seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of the verification of the code. In 2006 a project was launched in which the objective was to reorganise all the material related to the existing verification cases and place it into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LaTeX document preparation system and added into the testing system in a way that the whole test documentation (this report) can easily be generated in PostScript or PDF format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to the analytical, semianalytical and/or other numerical solutions proves the capability of FEFTRA to simulate such problems.
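
    The automated testing system described above amounts to a regression harness: recompute each case and compare against the stored approved results within a tolerance. A minimal sketch in that spirit; the case name, field and values are invented:

        import numpy as np

        def check_case(computed, approved, rel_tol=1e-6):
            """Pass when freshly computed results match the approved results."""
            return np.allclose(computed, approved, rtol=rel_tol)

        approved_head = np.array([10.0, 9.5, 9.1])  # stored approved hydraulic head
        computed_head = np.array([10.0, 9.5, 9.1])  # result of a fresh test run
        status = "PASS" if check_case(computed_head, approved_head) else "FAIL"
        print(f"case head_steady_state: {status}")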

  2. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  3. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic losses. Furthermore, a malfunction in the control system of a surgical robot may cause the death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due

  4. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  5. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of an IMRT (intensity-modulated radiation therapy) plan, as used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (multilamellar intensity-modulated collimator), and the overall process of verifying the created plan. The aim of the verification is, in particular, good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  6. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  7. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    In this paper we present a novel rule-based approach for runtime verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward-chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for runtime verification: we present the technical details together with a working implementation.
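
    A toy forward-chaining monitor in the spirit of this record: rules are Horn clauses (body implies head), each observed event extends the fact set, and the rules are chained to a fixed point after every event. The rule set and trace are invented:

        RULES = [
            ({"request"}, "pending"),
            ({"pending", "grant"}, "served"),
            ({"pending", "deny"}, "violation"),
        ]

        def monitor(trace):
            facts = set()
            for event in trace:        # the finite trace expands one event at a time
                facts.add(event)
                changed = True
                while changed:         # forward-chain to a fixed point
                    changed = False
                    for body, head in RULES:
                        if body <= facts and head not in facts:
                            facts.add(head)
                            changed = True
                if "violation" in facts:
                    return "FAIL"
            return "OK so far"

        print(monitor(["request", "grant"]))  # -> OK so far
        print(monitor(["request", "deny"]))   # -> FAIL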

  8. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

    Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is to build the product correctly, and validation is to build the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each one with a proper degree of stringency or rigor. This paper shows a practical approach to estimate the appropriate level of V and V, and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'What to do', that is, the selection of the V and V class. The main factors considered here are: required integrity, functional complexity, defense in depth and development environment. A guideline to classify the particular system using these factors and to show how they lead to the selection of the V and V class is presented. The second step is to determine 'How to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'What to do', as well as for specialists interested in 'How to do it'. Finally
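
    A hypothetical sketch of the two-step selection described above: score the system on the stated factors to derive a V and V class ('What to do'), then map the class to a technique set ('How to do it'). The scales, cut-offs and method lists are invented for illustration:

        def vv_class(integrity, complexity, defense_in_depth, dev_environment):
            """Each factor scored 1 (benign) to 3 (demanding); class 1 is most rigorous."""
            score = integrity + complexity + defense_in_depth + dev_environment
            return 1 if score >= 10 else 2 if score >= 7 else 3

        METHODS = {
            1: ["formal inspections", "independent testing", "traceability analysis"],
            2: ["peer reviews", "integration testing"],
            3: ["code walkthrough", "functional testing"],
        }

        cls = vv_class(integrity=3, complexity=2, defense_in_depth=2, dev_environment=3)
        print(cls, METHODS[cls])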

  9. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    Science.gov (United States)

    2016-04-01

    seL4 security verification [18] avoids this issue in the same way. In that work, the authors frame their solution as a restriction that disallows ... identical: (σ, σ′₁) ∈ T_M ∧ (σ, σ′₂) ∈ T_M ⟹ O_l(σ′₁) = O_l(σ′₂). The successful security verifications of both seL4 and mCertiKOS provide reasonable evidence that this restriction on specifications is not a major hindrance for usability. Unlike the seL4 verification, however, our framework runs into a
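
    Read operationally, the displayed condition says that any two transitions out of the same state must be indistinguishable to a low-level observer. A toy check of that condition over an invented transition relation and observation function:

        from collections import defaultdict

        T = {("s0", "s1"), ("s0", "s2"), ("s1", "s3")}   # (state, successor) pairs
        O_low = {"s1": "low0", "s2": "low0", "s3": "low1"}

        def output_consistent(transitions, obs):
            successors = defaultdict(set)
            for s, s_next in transitions:
                successors[s].add(obs[s_next])           # low view of each successor
            return all(len(views) == 1 for views in successors.values())

        print(output_consistent(T, O_low))  # -> True: s1 and s2 look alike at low level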

  10. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  11. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures for reference, and 5 test signatures each from the original user, simple imposters and trained imposters. The final system was tested with 50 participants with 3 references. This test found that system accuracy without imposters is 90.44897959% at threshold 44, with rejection errors (FNMR) of 5.2% and acceptance errors (FMR) of 4.35102%; with imposters, system accuracy is 80.1361% at threshold 27, with rejection errors (FNMR) of 15.6% and acceptance errors (average FMR) of 4.263946%, with details as follows: acceptance errors of 0.391837%, acceptance errors for simple imposters of 3.2% and acceptance errors for trained imposters of 9.2%.
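
    Dynamic time warping, the matching method used above, aligns two feature sequences of different lengths by minimising cumulative cost. A minimal self-contained sketch; the sequences and acceptance threshold are invented:

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic-time-warping distance
            between two 1-D feature sequences."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        reference = np.array([0.0, 0.2, 0.9, 1.0, 0.4])    # stored reference series
        probe = np.array([0.0, 0.1, 0.3, 0.95, 1.0, 0.5])  # tested signature series
        print(dtw_distance(probe, reference) < 1.0)        # accept below a tuned threshold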

  12. Electric and hybrid vehicle self-certification and verification procedures: Market Demonstration Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-03-01

    The process by which a manufacturer of an electric or hybrid vehicle certifies that his vehicle meets the DOE Performance Standards for Demonstration is described. Such certification is required for any vehicles to be purchased under the Market Demonstration Program. The document also explains the verification testing process followed by DOE to verify compliance. Finally, the document outlines manufacturer responsibilities and presents procedures for recertification of vehicles that have failed verification testing.

  13. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  14. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
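
    A minimal numerical sketch of the likelihood-ratio test discussed in these two records, with invented Gaussian user and background (impostor) models over fixed-length feature vectors:

        import numpy as np
        from scipy.stats import multivariate_normal

        user_mean, user_cov = np.array([1.0, 2.0]), np.eye(2) * 0.2
        world_mean, world_cov = np.zeros(2), np.eye(2)

        def log_likelihood_ratio(x):
            """log p(x | genuine user) - log p(x | background population)."""
            return (multivariate_normal.logpdf(x, user_mean, user_cov)
                    - multivariate_normal.logpdf(x, world_mean, world_cov))

        x = np.array([0.9, 1.8])              # claimed user's feature vector
        print(log_likelihood_ratio(x) > 0.0)  # accept when the ratio clears a threshold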

  15. Arms control verification costs: the need for a comparative analysis

    International Nuclear Information System (INIS)

    MacLean, G.; Fergusson, J.

    1998-01-01

    The end of the Cold War era has presented practitioners and analysts of international non-proliferation, arms control and disarmament (NACD) with the opportunity to focus more intently on the range and scope of NACD treaties and their verification. Aside from obvious favorable and well-publicized developments in the field of nuclear non-proliferation, progress also has been made in a wide variety of arenas, ranging from chemical and biological weapons, fissile material, conventional forces and ballistic missiles to anti-personnel landmines. Indeed, breaking from the constraints imposed by the Cold War United States-Soviet adversarial zero-sum relationship that impeded the progress of arms control, particularly on a multilateral level, the post-Cold War period has witnessed significant developments in NACD commitments, initiatives, and implementation. The goals of this project - in its final iteration - will be fourfold. First, it will lead to the creation of a costing analysis model adjustable for use in several current and future arms control verification tasks. Second, the project will identify data accumulated in the cost categories outlined in Table 1 in each of the five cases. By comparing costs to overall effectiveness, the application of the model will demonstrate desirability in each of the cases (see Chart 1). Third, the project will identify and scrutinize 'political costs' as well as real expenditures and investment in the verification regimes (see Chart 2). And, finally, the project will offer some analysis on the relationship between national and multilateral forms of arms control verification, as well as the applicability of multilateralism as an effective tool in the verification of international non-proliferation, arms control, and disarmament agreements. (author)

  16. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    Science.gov (United States)

    2016-06-01

    The Defense Advanced Research Projects Agency (DARPA), Air Force Research Laboratory (AFRL), Charles River Analytics Inc., and TopCoder, Inc. will be holding a contest to reward ... [Final technical report, Charles River Analytics, Inc., June 2016; approved for public release.]

  17. Design verification testing for fuel element type CAREM

    International Nuclear Information System (INIS)

    Martin Ghiselli, A.; Bonifacio Pulido, K.; Villabrille, G.; Rozembaum, I.

    2013-01-01

    The hydraulic and hydrodynamic characterization tests are part of the design verification process for a nuclear fuel element prototype and its components. These tests are performed in a low pressure and temperature facility. The tests require the definition of simulation parameters to set the test conditions, evaluation of the results to feed back into the mathematical models, extrapolation of the results to reactor conditions and, finally, a decision on the acceptability of the tested prototype. (author)

  18. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway station Alingsås by using the decision procedure which ... express thoughts on what is needed in order to be able to successfully verify large real-life systems.

  19. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  20. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  1. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  2. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulation requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  3. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  4. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    Science.gov (United States)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout the project.
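
    The strength of brittle panes such as fused silica is commonly characterised statistically from sample test campaigns. As a minimal illustration of that general approach (not the WDMS project's actual method), the two-parameter Weibull model below estimates the probability of failure at a given stress; the modulus and scale values are assumed purely for illustration:

    import math

    # Two-parameter Weibull model commonly used for brittle (glass) strength:
    #   P_f(sigma) = 1 - exp(-(sigma / sigma_0)**m)
    # The modulus M and scale SIGMA_0 (MPa) are assumed illustrative values,
    # not results from the WDMS test campaign.
    M, SIGMA_0 = 7.0, 90.0

    def failure_probability(stress_mpa: float) -> float:
        return 1.0 - math.exp(-((stress_mpa / SIGMA_0) ** M))

    for s in (30, 60, 90, 120):
        print(f"{s} MPa -> P_f = {failure_probability(s):.3f}")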

  5. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    for the review and any actions that were taken when these items were missing are documented in Section 5 of this report. The availability and use of user experience were limited to extensive experience in performing RESRAD-BUILD calculations by the verification project manager and by participation in the RESRAD-BUILD workshop offered by the code developers on May 11, 2001. The level of a posteriori verification that was implemented is defined in Sections 2 through 4 of this report. In general, a rigorous verification review plan addresses program requirements, design, coding, documentation, test coverage, and evaluation of test results. The scope of the RESRAD-BUILD verification is to focus primarily on program requirements, documentation, testing and evaluation. Detailed program design and source code review would be warranted only in those cases when the evaluation of test results and user experience revealed possible problems in these areas. The verification tasks were conducted in three parts and were applied to version 3.1 of the RESRAD-BUILD code and the final version of the user's manual, issued in November 2001 (Yu and others, 2001). These parts include the verification of the deterministic models used in RESRAD-BUILD (Section 2), the verification of the uncertainty analysis model included in RESRAD-BUILD (Section 3), and recommendations for improvement of the RESRAD-BUILD user interface, including evaluations of the user's manual, code design, and calculation methodology (Section 4). Any verification issues that were identified were promptly communicated to the RESRAD-BUILD development team, in particular those that arose from the database and parameter verification tasks. This allowed the developers to start implementing necessary database or coding changes well before this final report was issued

  6. ACS Zero Point Verification

    Science.gov (United States)

    Dolphin, Andrew

    2005-07-01

    The uncertainties in the photometric zero points create a fundamental limit to the accuracy of photometry. The current state of the ACS calibration is surprisingly poor, with zero point uncertainties of 0.03 magnitudes. The reason for this is that the ACS calibrations are based primarily on semi-empirical synthetic zero points and observations of fields too crowded for accurate ground-based photometry. I propose to remedy this problem by obtaining ACS images of the omega Cen standard field with all nine broadband ACS/WFC filters. This will permit the direct determination of the ACS zero points by comparison with excellent ground-based photometry, and should reduce their uncertainties to less than 0.01 magnitudes. A second benefit is that it will facilitate the comparison of the WFPC2 and ACS photometric systems, which will be important as WFPC2 is phased out and ACS becomes HST's primary imager. Finally, three of the filters will be repeated from my Cycle 12 observations, allowing for a measurement of any change in sensitivity.
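
    The proposed determination amounts to solving m = -2.5 log10(count rate) + ZP for the zero point, star by star, against the ground-based magnitudes. A minimal Python sketch of that computation, with invented count rates and magnitudes standing in for real matched omega Cen stars:

    import numpy as np

    # Hypothetical matched stars: ACS instrumental count rates (e-/s) and
    # calibrated ground-based magnitudes; the numbers are illustrative only.
    count_rate = np.array([1520.0, 830.0, 4100.0, 260.0])
    ground_mag = np.array([18.21, 18.87, 17.14, 20.12])

    # m = -2.5*log10(count_rate) + ZP  =>  ZP = m + 2.5*log10(count_rate)
    zp_per_star = ground_mag + 2.5 * np.log10(count_rate)

    # The scatter of the per-star values reflects the zero-point uncertainty.
    zp = np.median(zp_per_star)
    err = zp_per_star.std() / np.sqrt(len(zp_per_star))
    print(f"ZP = {zp:.3f} +/- {err:.3f} mag")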

  7. Unified and Modular Modeling and Functional Verification Framework of Real-Time Image Signal Processors

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2016-01-01

    In the VLSI industry, image signal processing algorithms are developed and evaluated using software models before implementation of RTL and firmware. After the finalization of the algorithm, software models are used as a golden reference model for the image signal processor (ISP) RTL and firmware development. In this paper, we describe the unified and modular modeling framework of image signal processing algorithms used for different applications such as ISP algorithm development, reference for hardware (HW) implementation, reference for firmware (FW) implementation, and bit-true certification. The universal verification methodology (UVM) based functional verification framework of image signal processors using software reference models is described. Further, IP-XACT based tools for automatic generation of functional verification environment files and model map files are described. The proposed framework is developed both with a host interface and with a core using the virtual register interface (VRI) approach. This modeling and functional verification framework is used in real-time image signal processing applications including cellphones, smart cameras, and image compression. The main motivation behind this work is to propose an efficient, reusable, and automated framework for modeling and verification of image signal processor (ISP) designs. The proposed framework shows better results, and significant improvement is observed in product verification time, verification cost, and quality of the designs.
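
    The framework itself is built on UVM/SystemVerilog, but the core idea of bit-true certification against a golden reference model can be sketched in a few lines. The Python below is such a sketch, with a toy gain-and-clamp stage standing in for a real ISP block; all names are hypothetical:

    # Bit-true scoreboard sketch: compare DUT output against a golden model.

    def golden_isp(pixel: int) -> int:
        """Toy reference model: 1.5x gain followed by a 10-bit clamp."""
        return min((pixel * 3) >> 1, 1023)

    def dut_capture(pixel: int) -> int:
        """Stand-in for the output captured from the RTL/firmware DUT."""
        return golden_isp(pixel)  # identical by construction in this toy

    def scoreboard(stimulus) -> None:
        for p in stimulus:
            expected, actual = golden_isp(p), dut_capture(p)
            assert expected == actual, f"pixel {p}: {actual} != {expected}"
        print("all pixels bit-true")

    scoreboard(range(1024))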

  8. Assessment of the effectiveness of European air quality policies and measures. Final report on Task 3.3. Survey to assess successes and failures of the EU Air Quality Policies

    International Nuclear Information System (INIS)

    2004-01-01

    The main objective of Task 3.3 of the title project was to survey the views of European policy makers and other stakeholders directly involved in air quality policy development and implementation on the successes and failures of the present European air quality policies. The survey also included several decisionmakers from the USA, Japan and Switzerland to learn about these countries' experiences with specific air quality policies. A list of approximately 90 people to be surveyed during the project was developed. The list included representatives from the European Commission, the European Parliament, national-level representatives from the Member States, including those designated by the CAFE Steering Group, along with representatives of local authorities, NGOs, industry and academia. The survey was conducted through a questionnaire and follow-up interviews. The questionnaire consists of four major parts. Part 1 includes questions about the impact of EU legislation on air quality. Part 2 is designed to learn about stakeholder opinions on the adequacy of Community-level measures with respect to air quality protection. Part 3 asks for opinions about various measures used in Community-level legislation on air quality as well as ideas for new or modified measures that could be effective in achieving better air quality in the EU. Part 4 includes questions about stakeholder involvement and transparency and was designed to assist with the implementation of Task 3.4 (on public participation and transparency) of the project. The analysis of responses for this part of the questionnaire is presented in the parallel Report for Task 3.4. The final version of the questionnaire used to interview European stakeholders is attached as Appendix II. For the decision-makers from the USA, Switzerland, and Japan a separate questionnaire was developed, and is attached as Appendix III. In all, the team received 49 responses from the 90 enquiries.

  9. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  10. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  11. Mergers: Success versus failure

    International Nuclear Information System (INIS)

    Carley, G. R.

    1997-01-01

    Successful mergers in the context of long-term value creation, as measured by return realized on investor-provided capital, were discussed. In essence, a successful merger is characterized by being motivated by a sound business reason and strategy for the merger, a reasonable price and sound execution. The acquiror's pre-merger success in managing a company is a good indicator of future success. Poorly managed companies that acquire other companies generally continue to be poorly managed with no significant increase in shareholder value. Prior to the acquisition, identification of the potential target, assessment of the people involved on both sides of the transaction, thorough knowledge of the target's potential for value creation, financial implications (debt, equity, terms and demand, tax implications, the potential effect of the proposed acquisition on the acquiror's business plan) and finally the execution of the process itself, are the important determinants of successful mergers

  12. Design verification for reactor head replacement

    International Nuclear Information System (INIS)

    Dwivedy, K.K.; Whitt, M.S.; Lee, R.

    2005-01-01

    must be negotiated. This paper does not describe the massive efforts required by the NSSS and manufacturer's engineering groups, nor does it include the challenges of construction in developing the mechanical handling of heavy and large components, or the effort of providing adequate access for the head replacement and restoring the containment structure. The paper outlines the analysis and design efforts needed to support reactor head replacement. The paper concludes that the verification efforts performed by the utility design group not only provide increased assurance of design adequacy, but also make that group an important player on the strong team that is required for a successful head replacement. (authors)

  13. Portal verification for breast cancer radiotherapy

    International Nuclear Information System (INIS)

    Petkovska, Sonja; Pejkovikj, Sasho; Apostolovski, Nebojsha

    2013-01-01

    At the University Clinic in Skopje, breast cancer irradiation is planned and performed using a mono-isocentric method, which means that a unique isocenter (IC) is used for all irradiation fields. The goal of this paper is to present the patient's position in all coordinates before the first treatment session, relative to the position determined during the CT simulation. A deviation of up to 5 mm is allowed. The analysis was made using portal verification. Sixty randomly selected female patients were reviewed. The matching results show that for each patient a deviation exists on at least one axis. The largest deviations are in the longitudinal (head-feet) direction, up to 4 mm with a mean of 1.8 mm. In 60 out of the 85 analysed fields, the deviation is towards the head. In the lateral direction, the median deviation is 1.1 mm, and in 65% of the analysed portals these deviations are in the medial direction, towards the contralateral breast, which can increase the dose to the lung and to the contralateral breast. For the supraclavicular field, this deviation can increase the dose to the spinal cord. Although these doses are well below the limit, this fact should be taken into account when setting the treatment fields. The final conclusion of the research is that, despite the small size of the deviations, when positioning accuracy is verified with portal imaging, portal verification needs to be performed in the following weeks of the treatment as well, not only before the first treatment. This provides information on intra-fractional set-up deviation. (Author)

  14. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  15. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    International Nuclear Information System (INIS)

    Luke, S.J.

    2011-01-01

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  16. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    Energy Technology Data Exchange (ETDEWEB)

    Luke, S J

    2011-12-20

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  17. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin Fangfang; Gao Qinghuai; Xie Huchen; Nelson, Diana F.; Yu Yan; Kwok, W. Edmund; Totterman, Saara; Schell, Michael C.; Rubin, Philip

    1998-01-01

    Purpose: To investigate a method for the generation of digitally reconstructed radiographs directly from MR images (DRR-MRI) to guide a computerized portal verification procedure. Methods and Materials: Several major steps were developed to perform an MR image-guided portal verification procedure. Initially, a wavelet-based multiresolution adaptive thresholding method was used to segment the skin slice-by-slice in MR brain axial images. Selected anatomical structures, such as the target volume and critical organs, were then manually identified and reassigned to relatively higher intensities. Interslice information was interpolated with a directional method to achieve comparable display resolution in three dimensions. Next, a ray-tracing method was used to generate a DRR-MRI image at the planned treatment position; the ray tracing was performed simply by summation of voxels along the ray. The skin and its relative positions were also projected onto the DRR-MRI and were used to guide the search for similar features in the portal image. A Canny edge detector was used to enhance the brain contour in both portal and simulation images. The skin in the brain portal image was then extracted using a knowledge-based searching technique. Finally, a Chamfer matching technique was used to correlate features between the DRR-MRI and the portal image. Results: The MR image-guided portal verification method was evaluated using a brain phantom case and a clinical patient case. Both DRR-CT and DRR-MRI were generated using CT and MR phantom images with the same beam orientation and then compared. The matching result indicated that the maximum deviation of internal structures was less than 1 mm. The segmented results for brain MR slice images indicated that a wavelet-based image segmentation technique provided a reasonable estimation of the brain skin. For the clinical patient case with a given portal field, the MR image-guided verification method provided an excellent match between
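
    The Chamfer matching step can be sketched compactly: edges extracted from the DRR and from the portal image are compared through a distance transform, and the registration minimises the mean edge-to-edge distance. Below is a minimal Python sketch of that scoring function, assuming SciPy and scikit-image; the random images merely stand in for a DRR-MRI and a displaced portal image:

    import numpy as np
    from scipy.ndimage import distance_transform_edt
    from skimage.feature import canny

    def chamfer_score(reference, portal, sigma=2.0):
        """Mean distance from portal edge pixels to the nearest reference
        edge; minimising this over candidate shifts/rotations of `portal`
        yields the registration, as in classic Chamfer matching."""
        ref_edges = canny(reference, sigma=sigma)
        por_edges = canny(portal, sigma=sigma)
        # Distance (in pixels) from every pixel to the nearest reference edge.
        dist_to_ref = distance_transform_edt(~ref_edges)
        return dist_to_ref[por_edges].mean()

    rng = np.random.default_rng(0)
    drr = rng.random((128, 128))
    portal = np.roll(drr, shift=2, axis=0)  # portal displaced by 2 pixels
    print(chamfer_score(drr, portal))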

  18. Constructive Verification, Empirical Induction, and Falibilist Deduction: A Threefold Contrast

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-10-01

    This article explores some open questions related to the problem of verification of theories in the context of empirical sciences by contrasting three epistemological frameworks. Each of these epistemological frameworks is based on a corresponding central metaphor, namely: (a) neo-empiricism and the gambling metaphor; (b) Popperian falsificationism and the scientific tribunal metaphor; (c) cognitive constructivism and the object as eigen-solution metaphor. Each of these epistemological frameworks has also historically co-evolved with a certain statistical theory and method for testing scientific hypotheses, respectively: (a) decision-theoretic Bayesian statistics and Bayes factors; (b) frequentist statistics and p-values; (c) constructive Bayesian statistics and e-values. This article examines with special care the Zero Probability Paradox (ZPP), related to the verification of sharp or precise hypotheses. Finally, this article makes some remarks on Lakatos' view of mathematics as a quasi-empirical science.

  19. Specification and Verification of Context-dependent Services

    Directory of Open Access Journals (Sweden)

    Naseem Ibrahim

    2011-08-01

    Current approaches for the discovery, specification, and provision of services ignore the relationship between the service contract and the conditions in which the service can guarantee its contract. Moreover, they do not use formal methods for specifying services, contracts, and compositions. Without a formal basis it is not possible to justify through formal verification the correctness conditions for service compositions and the satisfaction of contractual obligations in service provisions. We remedy this situation in this paper. We present a formal definition of services with context-dependent contracts. We define a composition theory of services with context-dependent contracts taking into consideration functional, nonfunctional, legal and contextual information. Finally, we present a formal verification approach that transforms the formal specification of service composition into extended timed automata that can be verified using the model checking tool UPPAAL.

  20. Compositional Verification of Multi-Station Interlocking Systems

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth

    2016-01-01

    pose a big challenge to current verification methodologies, due to the explosion of state space size as soon as large, if not medium-sized, multi-station systems have to be controlled. For these reasons, verification techniques that exploit locality principles related to the topological layout...... of the controlled system to split in different ways the state space have been investigated. In particular, compositional approaches divide the controlled track network into regions that can be verified separately, once proper assumptions are considered on the way the pieces are glued together. Building on a successful...... method to verify the size of rather large networks, we propose a compositional approach that is particularly suitable to address multi-station interlocking systems which control a whole line composed of stations linked by mainline tracks. Indeed, it turns out that for such networks, and for the adopted...

  1. The Healy Clean Coal Project: Design verification tests

    International Nuclear Information System (INIS)

    Guidetti, R.H.; Sheppard, D.B.; Ubhayakar, S.K.; Weede, J.J.; McCrohan, D.V.; Rosendahl, S.M.

    1993-01-01

    As part of the Healy Clean Coal Project, TRW Inc., the supplier of the advanced slagging coal combustors, has successfully completed design verification tests on the major components of the combustion system at its Southern California test facility. These tests, which included the firing of a full-scale precombustor with a new non-storage direct coal feed system, supported the design of the Healy combustion system and its auxiliaries performed under Phase 1 of the project. Two 350 million BTU/hr combustion systems have been designed and are now ready for fabrication and erection, as part of Phase 2 of the project. These systems, along with a back-end Spray Dryer Absorber system, designed and supplied by Joy Technologies, will be integrated with a Foster Wheeler boiler for the 50 MWe power plant at Healy, Alaska. This paper describes the design verification tests and the current status of the project

  2. Successful ageing

    DEFF Research Database (Denmark)

    Bülow, Morten Hillgaard; Söderqvist, Thomas

    2014-01-01

    Since the late 1980s, the concept of ‘successful ageing’ has set the frame for discourse about contemporary ageing research. Through an analysis of the reception to John W. Rowe and Robert L. Kahn's launch of the concept of ‘successful ageing’ in 1987, this article maps out the important themes...... and discussions that have emerged from the interdisciplinary field of ageing research. These include an emphasis on interdisciplinarity; the interaction between biology, psycho-social contexts and lifestyle choices; the experiences of elderly people; life-course perspectives; optimisation and prevention...... strategies; and the importance of individual, societal and scientific conceptualisations and understandings of ageing. By presenting an account of the recent historical uses, interpretations and critiques of the concept, the article unfolds the practical and normative complexities of ‘successful ageing’....

  3. Citation Success

    DEFF Research Database (Denmark)

    Vaio, Gianfranco Di; Waldenström, Daniel; Weisdorf, Jacob Louis

    2012-01-01

    This study examines the determinants of citation success among authors who have recently published their work in economic history journals. Besides offering clues about how to improve one's scientific impact, our citation analysis also sheds light on the state of the field of economic history...... find similar patterns when assessing the same authors' citation success in economics journals. As a novel feature, we demonstrate that the diffusion of research — publication of working papers, as well as conference and workshop presentations — has a first-order positive impact on the citation rate........ Consistent with our expectations, we find that full professors, authors appointed at economics and history departments, and authors working in Anglo-Saxon and German countries are more likely to receive citations than other scholars. Long and co-authored articles are also a factor for citation success. We...

  4. Citation Success

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Waldenström, Daniel; Weisdorf, Jacob Louis

    affects citations. In regard to author-specific characteristics, male authors, full professors, authors working in economics or history departments, and authors employed in Anglo-Saxon countries are more likely to get cited than others. As a ‘shortcut' to citation success, we find that research diffusion...

  5. Successful modeling?

    Science.gov (United States)

    Lomnitz, Cinna

    Tichelaar and Ruff [1989] propose to “estimate model variance in complicated geophysical problems,” including the determination of focal depth in earthquakes, by means of unconventional statistical methods such as bootstrapping. They are successful insofar as they are able to duplicate the results from more conventional procedures.

  6. Successful ageing

    DEFF Research Database (Denmark)

    Kusumastuti, Sasmita; Derks, Marloes G. M.; Tellier, Siri

    2016-01-01

    BACKGROUND: Ageing is accompanied by an increased risk of disease and a loss of functioning in several bodily and mental domains, and some argue that maintaining health and functioning is essential for a successful old age. Paradoxically, studies have shown that overall wellbeing follows a curvili...

  7. Canister Storage Building Receiving Pit Modification Informal Design Verification

    International Nuclear Information System (INIS)

    KRIEG, S.A.

    2000-01-01

    The design for modifications to the CSB cask receiving pit guides was verified by the informal design verification (meeting) method on August 9, 2000. The invited list of attendees and the meeting attendance sheet are included in Attachment 1. The design modifications that were reviewed are documented in ECN 654484 (Attachment 2). The requirement that the design is to be verified against is to ''center the transportation cask sufficiently to allow installation of the guide funnel on the cask (± 0.25 inches or less)''. The alternatives considered are detailed in Attachment 3. Alternative 4, ''Modify the Pit Guides'', was determined to be the preferred alternative, primarily due to considerations of simplicity, reliability, and low cost. Alternative 1, ''Rotate the Impact Absorber 180°'', was successfully performed but was considered a temporary fix that was not acceptable for a long-term operational mode. The requirement to position the receiving crane accurately enough to lower the transportation cask into the pit with the redesigned guides was discussed and considered achievable without undue effort from the operator. The tolerance on the OD of the transfer cask was discussed (± 1/8 inch) relative to the clearance with the guides. As-built dimensions for the cask OD will be examined to verify that sufficient clearance exists with the maximum cask OD. The final design thickness of the shims under the guides will be based on the as-built cask OD dimensions and field measurements between the pit guides. The need for a ''plastic'' cover for the guides was discussed and deemed unnecessary. Thermal growth of the cask OD was calculated at 3-5 mils and considered insignificant. The possibility of reducing the OD of the guide funnel was reviewed, but this was considered impractical due to the requirement for the MCO to miss the edge of the funnel in case of an MCO drop. One of the transportation casks has the lift trunnions installed 3/8 inch off center. This is

  8. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs), due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for a wide range of applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs, as it will check both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends based on SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the considerable time of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The results from the measurements of the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating certain parameters in the model.
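
    The paper's own closed-form models are empirical, but the physics they approximate is the standard coupled-mode picture: the power crossing a directional coupler of length L is sin^2(pi L / (2 Lc)), with coupling length Lc = lambda / (2 delta_n) set by the effective-index split delta_n between the even and odd supermodes. A hedged numerical sketch of that textbook relation, not of the paper's actual model (the delta_n value is assumed):

    import math

    def coupled_power_fraction(length_um, wavelength_um, delta_n):
        """Textbook coupled-mode estimate: P_cross = sin^2(pi*L / (2*Lc)),
        with coupling length Lc = wavelength / (2*delta_n)."""
        lc = wavelength_um / (2.0 * delta_n)
        return math.sin(math.pi * length_um / (2.0 * lc)) ** 2

    # Illustrative SOI-like numbers: 1550 nm light, assumed delta_n = 0.01,
    # giving Lc = 77.5 um; a 38.75 um coupler then transfers half the power.
    for length in (10.0, 20.0, 38.75):  # coupler lengths in micrometres
        print(length, round(coupled_power_fraction(length, 1.55, 0.01), 3))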

  9. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  10. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  11. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  12. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples......: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...

  13. SU-E-J-115: Graticule for Verification of Treatment Position in Neutron Therapy.

    Science.gov (United States)

    Halford, R; Snyder, M

    2012-06-01

    Until recently, treatment verification for patients undergoing fast neutron therapy at our facility was accomplished through a combination of neutron beam portal films aligned with a graticule mounted on an orthogonal x-ray tube. To eliminate uncertainty with respect to the relative positions of the x-ray graticule and the therapy beam, we have developed a graticule which is placed in the neutron beam itself. For a graticule to be visible on the portal film, the attenuation of the neutron beam by the graticule landmarks must be significantly greater than that of the material in which the landmarks are mounted. Various materials, thicknesses, and mounting points were tried to obtain the largest contrast between the graticule landmarks and the mounting material. The final design involved 2-inch steel pins of 0.125-inch diameter captured between two parallel plates of 0.25-inch-thick clear acrylic plastic. The distance between the two acrylic plates was 1.625 inches, held together at the perimeter with acrylic sidewall spacers. This allowed the majority of the length of the steel pins to be surrounded by air. The pins were set 1 cm apart and mounted at angles parallel to the divergence of the beam, dependent on their position within the array. The entire steel pin and acrylic plate assembly was mounted on an acrylic accessory tray to allow for graticule alignment. Despite the inherent difficulties in attenuating fast neutrons, our simple graticule design produces the required difference in attenuation between the array of landmarks and the mounting material. The graticule successfully provides an in-beam frame of reference for patient portal verification. © 2012 American Association of Physicists in Medicine.
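
    The design requirement can be checked with simple narrow-beam attenuation arithmetic, I/I0 = exp(-Sigma * t). The Python sketch below does this for the pin and gap paths; the macroscopic removal cross-sections are rough assumed values for illustration, not measurements from this work:

    import math

    SIGMA_STEEL = 0.16    # fast-neutron removal cross-section, cm^-1 (assumed)
    SIGMA_ACRYLIC = 0.11  # fast-neutron removal cross-section, cm^-1 (assumed)

    def transmission(sigma_cm, thickness_cm):
        """Narrow-beam transmission I/I0 = exp(-Sigma * t)."""
        return math.exp(-sigma_cm * thickness_cm)

    pin_path = 5.08         # 2 in of steel along the beam, in cm
    plate_path = 2 * 0.635  # two 0.25-in acrylic plates, in cm

    through_pin = (transmission(SIGMA_STEEL, pin_path)
                   * transmission(SIGMA_ACRYLIC, plate_path))
    between_pins = transmission(SIGMA_ACRYLIC, plate_path)  # acrylic + air only
    print(f"through pin: {through_pin:.2f}, between pins: {between_pins:.2f}")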

  14. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification, and the differences from software testing are identified. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and influence factors on verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  15. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> questionable decisions to deploy. Availability -> inability to conceive critical tests. Representativeness -> overinterpretation of results. Positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  16. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared against the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models

  17. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared against the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  18. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    Energy Technology Data Exchange (ETDEWEB)

    Matloch, L.; Vaccaro, S.; Couland, M.; De Baere, P.; Schwalbach, P. [Euratom, Communaute europeenne de l' energie atomique - CEEA (European Commission (EC))

    2015-07-01

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  19. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  20. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  1. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  2. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  3. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  4. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  5. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  6. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation group and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
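
    The published criteria translate directly into a three-predictor logistic model. As a hedged reconstruction of the model's form only (the data and fitted coefficients below are simulated, not the study's), a scikit-learn sketch:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 200
    # Binary predictors: [dystrophy area >= 25%, long horizontal lines,
    # long vertical lines] -- feature names taken from the abstract.
    X = rng.integers(0, 2, size=(n, 3))
    # Simulated outcome with failure risk dominated by the major criterion.
    logit = -2.0 + 4.0 * X[:, 0] + 1.2 * X[:, 1] + 1.0 * X[:, 2]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression().fit(X, y)
    print("coefficients:", model.coef_[0])
    # Failure probability for: major only; both minors; no criteria met.
    print(model.predict_proba([[1, 0, 0], [0, 1, 1], [0, 0, 0]])[:, 1])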

  7. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  8. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  9. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  10. Role of IGRT in patient positioning and verification

    International Nuclear Information System (INIS)

    Mijnheer, Ben

    2008-01-01

    Image-guided radiation therapy (IGRT) is 'frequent imaging in the treatment room during a course of radiotherapy to guide the treatment process'. Instrumentation related to IGRT is highlighted. The focus of the lecture was on clinical experience gained at NKI-AVL, such as the use of EPIDs (electronic portal imaging devices) and CBCT (cone beam computed tomography) and their comparison: good results were obtained for head-and-neck and prostate/bladder patients, and portal imaging was replaced by CBCT. After further investigation, convincing results for lung patients were obtained, and portal imaging was replaced by CBCT there as well. Scan protocols were developed for these patient groups. Since February 2004, CBCT-based decision rules have been developed for: head and neck (bony anatomy); prostate (bony anatomy; soft tissue registration); lung (bony anatomy, soft tissue registration); brain (bony anatomy); and breast, bladder and liver (in progress). The final remarks are as follows: the introduction of various IGRT techniques allowed 3D verification of the position of target volumes and organs at risk just before or during treatment. Because the information is in 3D, or sometimes even in 4D, these IGRT approaches in principle provide more information than 2D verification methods (e.g., EPIDs). Clinical data are becoming available to assess quantitatively for which treatment techniques IGRT approaches are advantageous compared to conventional verification methods, taking the additional resources (time, money, manpower) into account. (P.A.)

  11. A Formal Verification Method of Function Block Diagram

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun; Jee, Eun Kyoung; Jeon, Seung Jae; Park, Gee Yong; Kwon, Kee Choon

    2007-01-01

    A programmable logic controller (PLC), an industrial computer specialized for real-time applications, is widely used in diverse control systems in chemical processing plants, nuclear power plants and traffic control systems. As a PLC is often used to implement safety-critical embedded software, rigorous safety demonstration of PLC code is necessary. Function block diagram (FBD) is a standard application programming language for the PLC and is currently being used in the development of a fully digitalized reactor protection system (RPS), called the IDiPS, under the KNICS project. The verification of FBD programs is therefore a pressing problem of great importance. In this paper, we propose a formal verification method for FBD programs: we define FBD programs formally in compliance with IEC 61131-3, then translate the programs into a Verilog model, and finally verify the model using the model checker SMV. To demonstrate the feasibility and effectiveness of this approach, we applied it to the IDiPS, which is currently being developed under the KNICS project. The remainder of this paper is organized as follows. Section 2 briefly describes Verilog and Cadence SMV. In Section 3, we introduce FBD2V, a tool implemented to support the proposed FBD verification framework. A summary and conclusion are provided in Section 4
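
    The paper's tool chain goes FBD -> Verilog -> SMV, but the essence of the final step, exhaustively checking a property over the program's state space, can be shown with a toy example. The Python sketch below checks a safety property of a hypothetical two-cycle trip block (a delay element feeding an AND gate), enumerating every state/input combination the way a model checker would; the block and property are invented for illustration:

    from itertools import product

    def step(prev_high: bool, sensor_high: bool):
        """One scan cycle of a toy FBD block: trip when the sensor has been
        high for two consecutive cycles."""
        trip = prev_high and sensor_high   # AND of delayed and current value
        return sensor_high, trip           # (next state, output)

    # Analogue of an SMV "AG" property: no trip unless the sensor is
    # currently high. Checked over the entire (tiny) state space.
    for state, sensor in product([False, True], repeat=2):
        _, trip = step(state, sensor)
        assert not (trip and not sensor), "safety property violated"
    print("property holds in all 4 transitions")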

  12. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  13. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  14. Investigation of novel spent fuel verification system for safeguard application

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Haneol; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository, calculating the dose to the public after repository closure under their respective scenarios. Because most spent fuel repositories are non-retrievable, research on the security and safeguards of spent fuel repositories also has to be performed, and design-based security and safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguards requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. The requirements emphasize surveillance and verification. The surveillance and verification of spent fuel are currently accomplished using a Cerenkov radiation detector while the spent fuel is stored in a fuel pool. This research investigated an advanced spent fuel verification system based on a device which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source.

  15. Investigation of novel spent fuel verification system for safeguard application

    International Nuclear Information System (INIS)

    Lee, Haneol; Yim, Man-Sung

    2016-01-01

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of a repository, calculating the dose to the public after repository closure under their respective scenarios. Because most spent fuel repositories are non-retrievable, research on the security and safeguards of spent fuel repositories also has to be performed, and design-based security and safeguards have to be developed for future repository designs. This study summarizes the requirements of future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguards requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. The requirements emphasize surveillance and verification. The surveillance and verification of spent fuel are currently accomplished using a Cerenkov radiation detector while the spent fuel is stored in a fuel pool. This research investigated an advanced spent fuel verification system based on a device which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source.

  16. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    International Nuclear Information System (INIS)

    Doyle, James E.; Meek, Elizabeth

    2009-01-01

    near-term (1-4 year) and longer-term (5-10 year) planning horizons. Some final observations include acknowledging the enduring nature of several key objectives on the Obama Administration's arms control and nonproliferation agenda. The CTBT, FMCT, bilateral nuclear arms reductions and strengthening the NPT have been sought by successive U.S. Administrations for nearly thirty years. Efforts towards negotiated arms control, although de-emphasized by the G.W. Bush Administration, have remained a pillar of U.S. national security strategy for decades and are likely to be of enduring if not increasing importance for decades to come. Therefore revitalization and expansion of USG capabilities in this area can be a positive legacy no matter what near-term arms control goals are achieved over the next four years. This is why it is important to reconstruct integrated bureaucratic, legislative, budgetary and diplomatic strategies to sustain the arms control and nonproliferation agenda. In this endeavor some past lessons must be taken to heart to avoid bureaucratic overkill and keep interagency policy-making and implementation structures lean and effective. On the technical side, a serious, sustained multilateral program to develop, down-select and performance-test nuclear weapons dismantlement verification technologies and procedures should be immediately initiated. In order to make this happen, the United States and Russia should join with the UK and other interested states in creating a sustained, full-scale research and development program for verification at their respective nuclear weapons and defense establishments. The goals include development of effective technologies and procedures for: (1) Attribute measurement systems to certify nuclear warheads and military fissile materials; (2) Chain-of-custody methods to track items after they are authenticated and enter accountability; (3) Transportation monitoring; (4) Storage monitoring; (5) Fissile materials conversion

  17. Nuclear power plant C and I design verification by simulation

    International Nuclear Information System (INIS)

    Storm, Joachim; Yu, Kim; Lee, D.Y

    2003-01-01

    An important part of the Advanced Boiling Water Reactor (ABWR) in the Taiwan NPP Lungmen Units no.1 and no.2 is the Full Scope Simulator (FSS). The simulator was to be built according to design data; therefore, apart from the training aspect, a major part of the development was to apply a simulation-based test bed for the verification, validation and improvement of plant design in the control and instrumentation (C and I) areas of unit control room equipment, operator Man Machine Interface (MMI), process computer functions and plant procedures. The Full Scope Simulator will afterwards be used to allow proper training of the plant operators two years before Unit no.1 fuel load. The article describes the scope, methods and results of the advanced verification and validation process and highlights the advantages of test bed simulation for real power plant design and implementation. Subsequent application of advanced simulation software tools, such as instrumentation and control translators, graphical model builders, process models, graphical on-line test tools and screen-based or projected soft panels, allowed a team to fulfil the task of C and I verification in time, before the implementation of the Distributed Control and Information System (DCIS) started. An additional area of activity was Human Factors Engineering (HFE) for the operator MMI. Because the ABWR design incorporates display-based operation of most of the plant components, a dedicated verification and validation process is required by NUREG-0711. In order to support this activity, an engineering test system had been installed for all the necessary HFE investigations. All detected improvements were properly documented and used to update the plant design documentation through a defined process. The Full Scope Simulator (FSS), with hard panels and a stimulated digital control and information system, is in the final acceptance test process with the end customer, Taiwan Power Company

  18. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents the DRPS V and V experience according to the software development life cycle. The main activities of the DRPS V and V process are the preparation of software planning documentation, the verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and the testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. The SRS V and V of the DRPS consists of technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated system test plan, software safety analysis, and software configuration management. Likewise, the SDS V and V of the DRPS consists of technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated software test plan, software safety analysis, and software configuration management. The code V and V of the DRPS consists of traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of the software integration and system integration phases. Software safety analysis uses the Hazard and Operability (HAZOP) method at the SRS phase, HAZOP and Fault Tree Analysis (FTA) at the SDS phase, and FTA at the implementation phase. Finally, software configuration management is performed using Nu-SCM (Nuclear Software Configuration Management), a tool developed by the KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V

  19. Nonlinear harmonic generation and proposed experimental verification in SASE FELs

    CERN Document Server

    Freund, H P; Milton, S V

    2000-01-01

    Recently, a 3D, polychromatic, nonlinear simulation code was developed to study the growth of nonlinear harmonics in self-amplified spontaneous emission (SASE) free-electron lasers (FELs). The simulation was applied to the parameters for each stage of the Advanced Photon Source (APS) SASE FEL, intended for operation in the visible, UV, and short UV wavelength regimes, respectively, to study the presence of nonlinear harmonic generation. Significant nonlinear harmonic growth is seen. Here, a discussion of the code development, the APS SASE FEL, the simulations and results, and, finally, the proposed experimental procedure for verification of such nonlinear harmonic generation at the APS SASE FEL will be given.

  20. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Science.gov (United States)

    2010-07-22

    ... Electronic Signature and Storage of Form I-9, Employment Eligibility Verification AGENCY: U.S. Immigration... published an interim final rule to permit electronic signature and storage of the Form I-9. 71 FR 34510... because electronic signature and storage technologies are optional, DHS expects that small entities will...

  1. 78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems

    Science.gov (United States)

    2013-01-25

    ... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...

  2. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
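
    The kind of independent numerical check this record calls for can be sketched generically: given a candidate equilibrium from any Gibbs-energy minimizer, verify non-negativity, elemental mass balance, and consistency of the chemical potentials. The interface below (stoichiometry matrix A, element totals b, species amounts n, potentials mu) is an assumed minimal one, not the tool described in the record.

    ```python
    # Minimal sketch of verifying a candidate chemical equilibrium against
    # its necessary conditions, independently of the solver that produced it.
    import numpy as np

    def verify_equilibrium(A, b, n, mu, tol=1e-8):
        """A: (elements x species) stoichiometry, b: element totals,
        n: species amounts from the solver, mu: chemical potentials (RT units)."""
        assert np.all(n >= -tol), "negative species amount"
        assert np.allclose(A @ n, b, atol=tol), "elemental mass balance violated"
        # Gibbs criterion: mu = A^T lambda for species present at equilibrium;
        # solve for element potentials lambda in least squares, check residual.
        present = n > tol
        lam, *_ = np.linalg.lstsq(A[:, present].T, mu[present], rcond=None)
        assert np.allclose(A[:, present].T @ lam, mu[present], atol=1e-6), \
            "chemical potentials inconsistent with equilibrium"
        return lam  # element potentials, a useful by-product of the check
    ```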

  3. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 250 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  4. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
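
    The equal error rates quoted above are the standard operating point at which the false acceptance and false rejection rates coincide. A minimal sketch of that computation follows, assuming score arrays where a higher score means more likely genuine; the score distributions are synthetic, not the IBM data.

    ```python
    # Equal error rate (EER) from genuine and forgery verification scores.
    import numpy as np

    def equal_error_rate(genuine_scores, forgery_scores):
        thresholds = np.sort(np.concatenate([genuine_scores, forgery_scores]))
        best = (1.0, None)
        for t in thresholds:
            frr = np.mean(genuine_scores < t)   # false rejection rate
            far = np.mean(forgery_scores >= t)  # false acceptance rate
            if abs(far - frr) < best[0]:
                best = (abs(far - frr), (far + frr) / 2)
        return best[1]

    rng = np.random.default_rng(0)
    eer = equal_error_rate(rng.normal(2, 1, 600), rng.normal(0, 1, 600))
    print(f"EER ~ {eer:.1%}")
    ```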

  5. The verification of ethnographic data.

    Science.gov (United States)

    Pool, Robert

    2017-09-01

    Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.

  6. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  7. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a one-year period to create optimal design specifications, including the creation of 3D models, using both Monte Carlo and deterministic codes, to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons-grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
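
    The trade-off between drive-by speed and detection confidence described above can be illustrated with simple Poisson counting arithmetic. All rates and geometry below are hypothetical placeholders, not values from the MPVS study.

    ```python
    # Detection significance of a drive-by measurement as a function of speed:
    # slower passes mean longer dwell time and more accumulated counts.
    from math import sqrt

    def detection_significance(net_rate_cps, bkg_rate_cps, field_of_view_m, speed_mph):
        """Currie-style significance of the net signal accumulated while the
        canister is within the detector's field of view."""
        dwell_s = field_of_view_m / (speed_mph * 0.44704)
        signal = net_rate_cps * dwell_s          # net counts from the canister
        background = bkg_rate_cps * dwell_s      # background counts, same window
        return signal / sqrt(signal + 2 * background)

    for mph in (1, 2, 5, 10):
        print(f"{mph:>2} mph -> {detection_significance(50.0, 200.0, 1.0, mph):.1f} sigma")
    ```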

  8. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs, and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  9. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  10. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb, which was released in 2001. Since then, various improvements have been implemented in the code, including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected, and present our verification results. (author)

  11. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  12. Successful Aging

    Directory of Open Access Journals (Sweden)

    Taufiqurrahman Nasihun

    2015-06-01

    Full Text Available The emerging concept of successful aging is based on evidence that, in healthy individuals, there is considerable variation in the alteration of physiological functions with age: some people exhibit great, others very few or no, age-related alterations. The first is called the poor pattern of aging and the latter the successful pattern of aging (Lambert SW, 2008). Thus, in simple words, successful aging is defined as the opportunity for old people to stay active and productive despite aging chronologically. Aging itself might be defined as the progressive accumulation of changes with time, associated with or responsible for the ever-increasing susceptibility to disease and death which accompanies advancing age (Harman D, 1981). The time needed to accumulate changes is attributable to the aging process. The salient emerging questions are: how does aging happen, and where does aging start? To answer these questions, and because of the complexity of the aging process, more than 300 aging theories have been proposed to explain how and where aging occurs and starts, respectively. The theories and classifications of the aging process are too many to enumerate. In summary, all of these aging theories can be grouped into three clusters: 1. Genetic program theory, which suggests that aging results from a program directed by the genes; 2. Epigenetic theory, in which aging results from random environmental events not determined by the genes; 3. Evolutionary theory, which proposes that aging is a means of disposing of the mortal soma in order to avoid competition between organisms and their progeny for food and space; it does not try to explain how aging occurs, but possibly answers why it occurs (De la Fuente, 2009). Among the three groups of aging theories, the epigenetic theory is useful for explaining, and trying to solve, the enigma of aging, which is prominently caused by internal and external environmental influences.

  13. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  14. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  15. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  16. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Support functions are also provided, such as database maintenance, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge of these operating systems or of INTERLISP is assumed. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  17. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  18. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  19. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  20. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    International audience; Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...
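
    The mechanical construction of a model from ladder logic can be sketched in miniature: a rung compiles to a boolean next-state function that can then be checked exhaustively against a requirement. The seal-in rung and the two requirements below are invented for illustration, not taken from the paper.

    ```python
    # |--[ start ]--+--[/ stop ]--( motor )--|   classic seal-in rung, compiled to:
    # |--[ motor ]--+                            motor' = (start OR motor) AND NOT stop
    from itertools import product

    def scan(motor, start, stop):
        """One PLC scan of the rung: returns the next state of the motor coil."""
        return (start or motor) and not stop

    # Requirement 1: an active stop always de-energizes the motor, whatever the state.
    assert all(not scan(m, s, stop=True) for m, s in product([False, True], repeat=2))
    # Requirement 2: once running with no stop, the motor seals in (stays energized).
    assert all(scan(True, s, stop=False) for s in [False, True])
    print("both requirements verified exhaustively")
    ```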

  1. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to be able to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.
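
    Code-to-code and code-to-analytic comparisons of the kind reported here follow a common pattern, sketched below for a 1D slab with uniform heat generation, where a finite-difference temperature solution can be checked against a closed-form profile. The geometry and material numbers are arbitrary stand-ins, not FRAPCON or ABAQUS inputs.

    ```python
    # Verify a finite-difference solve of -k T'' = q, T(0) = T(L) = T_surf,
    # against the analytic solution T(x) = T_surf + q x (L - x) / (2 k).
    import numpy as np

    L, k, q, T_surf, N = 0.01, 3.0, 2e8, 600.0, 201   # m, W/m-K, W/m^3, K, nodes
    x = np.linspace(0.0, L, N)
    h = x[1] - x[0]

    # Tridiagonal system: T_{i-1} - 2 T_i + T_{i+1} = -q h^2 / k in the interior
    A = np.diag(np.full(N, -2.0)) + np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
    rhs = np.full(N, -q * h**2 / k)
    A[0, :], A[-1, :] = 0.0, 0.0
    A[0, 0] = A[-1, -1] = 1.0                          # Dirichlet boundaries
    rhs[0] = rhs[-1] = T_surf
    T_fd = np.linalg.solve(A, rhs)

    T_exact = T_surf + q * x * (L - x) / (2 * k)
    print(f"max |T_fd - T_exact| = {np.max(np.abs(T_fd - T_exact)):.2e} K")
    ```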

  2. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  3. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  4. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Gurney, Kevin R. [Arizona Univ., Mesa, AZ (United States)

    2015-01-12

    This document constitutes the final report under DOE grant DE-FG-08ER64649. The organization of this document is as follows: first, I review the original scope of the proposed research. Second, I present the current draft of a paper nearing submission to Nature Climate Change on the initial results of this funded effort. Finally, I present the last phase of the research under this grant, which has supported a Ph.D. student. To that end, I present the graduate student's proposed research, a portion of which is completed and reflected in the paper nearing submission. This final work phase will be completed in the next 12 months and will likely result in 1-2 additional publications; we consider the results (as exemplified by the current paper) to be of high quality. The resulting publications will acknowledge the funding provided by DOE grant DE-FG-08ER64649.

  5. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    DeTar, Carleton [P.I.

    2012-12-10

    This document constitutes the Final Report for award DE-FC02-06ER41446, as required by the Office of Science. It summarizes accomplishments and provides copies of scientific publications with significant contributions from this award.

  6. Systemverilog for verification a guide to learning the testbench language features

    CERN Document Server

    Spear, Chris

    2012-01-01

    Based on the highly successful second edition, this extended edition of SystemVerilog for Verification: A Guide to Learning the Testbench Language Features teaches all verification features of the SystemVerilog language, providing hundreds of examples to clearly explain the concepts and basic fundamentals. It contains materials for both the full-time verification engineer and the student learning this valuable skill. In the third edition, authors Chris Spear and Greg Tumbush start with how to verify a design, and then use that context to demonstrate the language features,  including the advantages and disadvantages of different styles, allowing readers to choose between alternatives. This textbook contains end-of-chapter exercises designed to enhance students’ understanding of the material. Other features of this revision include: New sections on static variables, print specifiers, and DPI from the 2009 IEEE language standard Descriptions of UVM features such as factories, the test registry, and the config...

  7. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    International Nuclear Information System (INIS)

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).

  8. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Crowell, Michael W [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
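
    The automation pattern described in these two records, running a suite of benchmark models and comparing key results against stored references within a tolerance, can be sketched generically. The harness below does not use the actual COMSOL or LiveLink API; the `solve` callable and the benchmark file layout are assumptions for illustration.

    ```python
    # Generic installation-verification harness: run each benchmark model
    # through a user-supplied solver and compare scalar results to references.
    import json

    def verify_installation(solve, benchmarks_path="benchmarks.json", rtol=1e-6):
        """solve(model_file) -> dict of scalar results for that model."""
        with open(benchmarks_path) as f:
            benchmarks = json.load(f)   # {model_file: {quantity: reference_value}}
        failures = []
        for model, references in benchmarks.items():
            results = solve(model)
            for quantity, ref in references.items():
                got = results[quantity]
                if abs(got - ref) > rtol * abs(ref):
                    failures.append((model, quantity, ref, got))
        for model, quantity, ref, got in failures:
            print(f"FAIL {model}:{quantity} expected {ref}, got {got}")
        return not failures   # True means the installation passed every check
    ```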

  9. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
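
    The allocation step described above can be illustrated with a greedy scheme that repeatedly spends examination effort where the marginal verification-risk reduction is largest. The risk model here (independent subsystems, a fixed fractional risk reduction per unit effort) is an assumption for illustration, not the paper's verification risk function.

    ```python
    # Greedy allocation of examination effort by marginal risk reduction.
    def allocate_effort(subsystems, total_effort, step=1.0):
        """subsystems: {name: (initial_risk, fractional_reduction_per_unit_effort)}"""
        effort = {name: 0.0 for name in subsystems}
        risk = {name: r0 for name, (r0, _) in subsystems.items()}
        spent = 0.0
        while spent < total_effort:
            # marginal gain of one more step of examination on each subsystem
            gains = {name: risk[name] * (1 - (1 - subsystems[name][1]) ** step)
                     for name in subsystems}
            best = max(gains, key=gains.get)
            risk[best] -= gains[best]
            effort[best] += step
            spent += step
        return effort, sum(risk.values())

    subs = {"propulsion": (0.10, 0.30), "ballast": (0.04, 0.50), "nav": (0.02, 0.20)}
    effort, residual = allocate_effort(subs, total_effort=10)
    print(effort, f"residual risk = {residual:.4f}")
    ```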

  10. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
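
    The detection power of the statistical sampling component described above can be illustrated with the standard hypergeometric argument: the probability that a random sample of n items out of N catches at least one of d defects. The example numbers are arbitrary, not from the report.

    ```python
    # Probability of detecting at least one defective item by random sampling.
    from math import comb

    def detection_probability(N, d, n):
        """Sample n of N items without replacement; d of them are defective."""
        if n > N - d:          # sample too large to miss every defect
            return 1.0
        return 1.0 - comb(N - d, n) / comb(N, n)

    # e.g. vault of 400 items, 20 diverted, inspector measures 50 of them:
    print(f"{detection_probability(400, 20, 50):.1%}")
    ```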

  11. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-01-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials

  12. Dosimetric verification of IMRT plans

    International Nuclear Information System (INIS)

    Bulski, W.; Cheimicski, K.; Rostkowska, J.

    2012-01-01

    Intensity modulated radiotherapy (IMRT) is a complex procedure requiring proper dosimetric verification. IMRT dose distributions are characterized by steep dose gradients, which make it possible to spare organs at risk and allow for an escalation of the dose to the tumor. They require a large number of radiation beams (sometimes over 10). Fluence measurements for individual beams are not sufficient to evaluate the total dose distribution and to assure patient safety. The methods used at the Centre of Oncology in Warsaw are presented. In order to measure dose distributions in various cross-sections, film dosimeters were used (radiographic Kodak EDR2 films and radiochromic Gafchromic EBT films). The film characteristics were carefully examined. Several types of tissue-equivalent phantoms were developed. A methodology for comparing measured dose distributions against the distributions calculated by treatment planning systems (TPS) was developed and tested. The tolerance level for this comparison was set at 3% difference in dose and 3 mm in distance to agreement. The so-called gamma formalism was used. The results of these comparisons for a group of over 600 patients are presented. Agreement was found in 87% of cases. This film dosimetry methodology was used as a benchmark to test and validate the performance of commercially available 2D and 3D matrices of detectors (ionization chambers or diodes). The results of these validations are also presented. (authors)
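
    The gamma formalism referred to above combines a dose-difference criterion and a distance-to-agreement criterion into a single per-point index, with gamma <= 1 counted as a pass. A minimal 1D version with the 3%/3 mm tolerance is sketched below; the dose profiles are synthetic, not the Warsaw data.

    ```python
    # 1D gamma index: for each reference point, the minimum combined
    # dose-difference / distance-to-agreement metric over the evaluated profile.
    import numpy as np

    def gamma_index(dose_ref, dose_eval, x, dose_tol=0.03, dta_mm=3.0):
        """Per-point gamma of an evaluated profile against a reference profile
        defined on the same positions x (in mm); global dose normalization."""
        d_norm = dose_tol * dose_ref.max()
        gam = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dd = (dose_eval - di) / d_norm     # dose difference term
            dx = (x - xi) / dta_mm             # distance term
            gam[i] = np.sqrt(dx**2 + dd**2).min()
        return gam

    x = np.linspace(-30, 30, 121)              # mm
    ref = np.exp(-x**2 / 200.0)                # reference profile
    ev = np.exp(-(x - 1.0)**2 / 200.0)         # evaluated profile, 1 mm shifted
    g = gamma_index(ref, ev, x)
    print(f"pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")
    ```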

  13. Study of applicable methods on safety verification of disposal facilities and waste packages

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    Three subjects related to the safety verification of low level radioactive waste disposal were investigated in FY 2012. For radioactive waste disposal facilities, the specifications and construction techniques of soil covers intended to prevent possible destruction caused by natural events (e.g. earthquakes) were studied in order to consider verification methods for those specifications. For waste packages subject to near-surface pit disposal, the setting of the scaling factor and average radioactivity concentration (hereafter referred to as 'SF') for container-filled and solidified waste packages generated from Kashiwazaki-Kariwa Nuclear Power Station Units 1-5, the setting of the cesium residual ratio of molten solidified waste generated from the Tokai and Tokai No.2 Power Stations, etc. were studied. These results were finalized in consideration of the opinions of the advisory panel, and published as JNES-EV reports. In FY 2012, five JNES reports were published, and these have been used as standards for the safety verification of waste packages. Verification methods for radioactive wastes subject to near-surface trench disposal and intermediate-depth disposal were also studied. For radioactive wastes that will be returned from overseas, determination methods for the radioactivity concentration, heat rate and hydrogen generation rate of CSD-C were established. Determination methods for the radioactivity concentration and heat rate of CSD-B were also established. These results will be referred to in verification manuals. (author)
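
    The scaling-factor method at the heart of such studies estimates a difficult-to-measure nuclide from an easy-to-measure key nuclide via a factor derived from paired sample data, commonly taken as a geometric mean of activity ratios. The sketch below uses invented measurements; the actual JNES nuclide pairs and values are not reproduced here.

    ```python
    # Geometric-mean scaling factor (SF) from paired activity measurements.
    import numpy as np

    def scaling_factor(dtm_activities, key_activities):
        """SF from paired samples: difficult-to-measure vs key nuclide (Bq/g)."""
        ratios = np.asarray(dtm_activities) / np.asarray(key_activities)
        return float(np.exp(np.mean(np.log(ratios))))

    # hypothetical paired measurements (e.g. a Ni-63 vs Co-60 style pairing):
    sf = scaling_factor([4.1e2, 3.2e2, 5.8e2], [2.0e2, 1.5e2, 2.6e2])
    estimated_dtm = sf * 1.8e2   # estimate from a measured key-nuclide activity
    print(f"SF = {sf:.2f}, estimated DTM activity = {estimated_dtm:.1f} Bq/g")
    ```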

  14. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

    Full Text Available Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. Because multimedia conferencing involves complex asynchronous communications and must handle large and dynamically concurrent processes, achieving sufficient correctness guarantees is a relevant challenge, and supporting effective verification methods for multimedia conferencing service orchestration is an extremely difficult problem. In this paper, we first present the Business Process Execution Language (BPEL) based conferencing service orchestration, and mainly focus on the service net based correctness verification approach for multimedia conferencing service orchestration, which automatically translates the BPEL-based service orchestration into a corresponding Petri net model expressed in the Petri Net Markup Language (PNML); we also present the BPEL service net reduction rules and the correctness verification algorithms for multimedia conferencing service orchestration. We perform the correctness analysis and verification using service net properties such as safeness, reachability and deadlocks, and also provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give comparisons and evaluations.
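
    The reachability and deadlock checks named above can be illustrated on a toy safe (1-bounded) Petri net explored with explicit-state search. The net below is invented, not a BPEL-derived service net, and markings are represented simply as sets of marked places.

    ```python
    # Explicit-state reachability and deadlock detection for a small safe net.
    from collections import deque

    # transitions: name -> (input places, output places), all arcs weight 1
    TRANSITIONS = {
        "invite":  ({"idle"}, {"ringing"}),
        "accept":  ({"ringing"}, {"in_conf"}),
        "decline": ({"ringing"}, {"idle"}),
        "leave":   ({"in_conf"}, {"idle"}),
    }

    def enabled(marking, t):
        ins, _ = TRANSITIONS[t]
        return ins <= marking

    def fire(marking, t):
        ins, outs = TRANSITIONS[t]
        return frozenset((marking - ins) | outs)

    def explore(initial):
        seen, frontier, deadlocks = {initial}, deque([initial]), []
        while frontier:
            m = frontier.popleft()
            succs = [fire(m, t) for t in TRANSITIONS if enabled(m, t)]
            if not succs:
                deadlocks.append(m)       # no transition enabled: deadlock
            for s in succs:
                if s not in seen:
                    seen.add(s)
                    frontier.append(s)
        return seen, deadlocks

    states, deadlocks = explore(frozenset({"idle"}))
    print(f"{len(states)} reachable markings, deadlocks: {deadlocks or 'none'}")
    ```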

  15. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Directory of Open Access Journals (Sweden)

    Tomás Grimm

    2018-05-01

    Full Text Available The high degree of miniaturization in the electronics industry has been, for several years, a driver pushing embedded systems into different fields and applications. One example is safety-critical systems, where compactness in the form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a great example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard, with the goal of guaranteeing a high level of dependability in designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines to make sure that designs reach the desired dependability level, without explicit instructions. In the end, the success of the design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as standards conformity, but as these are complex hardware designs, complete functional verification is a difficult task. Of the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  16. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of their knowledge bases takes an important position. The conventional Petri net approach, which has been studied recently for knowledge base verification, is found to be inadequate for verifying the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, we found that it can successfully check most of the anomalies that can occur in a knowledge base

  17. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    Full Text Available The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used to validate a design in industry. Formal verification overcomes the infeasibility of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses upon a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty of this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioning decompose a large design into sub-components and feed the sub-components to different verification and/or simulation tools based upon the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of the design validation and improves the coverage by further partitioning. The various simulation and verification tools comprising IDV are evaluated and an example is used to illustrate the overall validation process. The overall process successfully validates the example to a high coverage rate within a short time. The experimental results show that our approach is a very promising design validation method.

  18. Leaf trajectory verification during dynamic intensity modulated radiotherapy using an amorphous silicon flat panel imager

    International Nuclear Information System (INIS)

    Sonke, Jan-Jakob; Ploeger, Lennert S.; Brand, Bob; Smitsmans, Monique H.P.; Herk, Marcel van

    2004-01-01

    An independent verification of the leaf trajectories during each treatment fraction improves the safety of IMRT delivery. In order to verify dynamic IMRT with an electronic portal imaging device (EPID), the EPID response should be accurate and fast such that the effect of motion blurring on the detected moving field edge position is limited. In the past, it was shown that the errors in the detected position of a moving field edge determined by a scanning liquid-filled ionization chamber (SLIC) EPID are negligible in clinical practice. Furthermore, a method for leaf trajectory verification during dynamic IMRT was successfully applied using such an EPID. EPIDs based on amorphous silicon (a-Si) arrays are now widely available. Such a-Si flat panel imagers (FPIs) produce portal images with superior image quality compared to other portal imaging systems, but they have not yet been used for leaf trajectory verification during dynamic IMRT. The aim of this study is to quantify the effect of motion distortion and motion blurring on the detection accuracy of a moving field edge for an Elekta iViewGT a-Si FPI and to investigate its applicability for the leaf trajectory verification during dynamic IMRT. We found that the detection error for a moving field edge to be smaller than 0.025 cm at a speed of 0.8 cm/s. Hence, the effect of motion blurring on the detection accuracy of a moving field edge is negligible in clinical practice. Furthermore, the a-Si FPI was successfully applied for the verification of dynamic IMRT. The verification method revealed a delay in the control system of the experimental DMLC that was also found using a SLIC EPID, resulting in leaf positional errors of 0.7 cm at a leaf speed of 0.8 cm/s

  19. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    VandV for all safety-related nuclear facility design, analyses, and operations. In fact, DNFSB 2002-1 recommends to the DOE and National Nuclear Security Administration (NNSA) that a VandV process be performed for all safety related software and analysis. Model verification and validation are the primary processes for quantifying and building credibility in numerical models. Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and its solution. Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, VandV cannot prove that a model is correct and accurate for all possible scenarios, but, rather, it can provide evidence that the model is sufficiently accurate for its intended use. Model VandV is fundamentally different from software VandV. Code developers developing computer programs perform software VandV to ensure code correctness, reliability, and robustness. In model VandV, the end product is a predictive model based on fundamental physics of the problem being solved. In all applications of practical interest, the calculations involved in obtaining solutions with the model require a computer code, e.g., finite element or finite difference analysis. Therefore, engineers seeking to develop credible predictive models critically need model VandV guidelines and procedures. The expected outcome of the model VandV process is the quantified level of agreement between experimental data and model prediction, as well as the predictive accuracy of the model. This report attempts to describe the general philosophy, definitions, concepts, and processes for conducting a successful VandV program. This objective is motivated by the need for

  20. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin, F.-F.; Gao, Q.H.; Xie, H.; Nelson, D.F.; Yu, Y.; Kwok, W.E.; Totterman, S.; Schell, M.C.; Rubin, P.

    1996-01-01

    and marrow information within the skull. Next, a ray-tracing method is used to generate a projection (pseudo-portal) image at the planned treatment position. In this situation, the ray-tracing is simply performed on pixels rather than attenuation coefficients. The skull and its relative positions are also projected onto the pseudo-portal image and are used as a 'hint' for the search for similar features in the portal images. A Canny edge detector is applied to the region of the treatment field and is used to enhance the brain contour and skull. The skull in the brain is then identified using a snake technique which is guided by the 'hint', the projected features from the MR images. Finally, a Chamfer matching technique is used to correlate features between the MR projection and portal images. Results: The MR image-guided portal verification technique is evaluated using a clinical patient case with an astrocytoma brain tumor treated by radiation therapy. The segmentation results for the brain MR slice images indicate that a wavelet-based image segmentation technique provides a reasonable estimation of the brain skull. Compared to the brain portal image, the method developed in this study for the generation of brain projection images locates the skull structure to within about 3 mm. Overall matching results are within 2 mm compared to the results between portal and simulation images. In addition, with this approach the tumor volume can be accurately visualized in the projection image and mapped over to portal images for treatment verification. Conclusions: A method for MR image-guided portal verification of brain treatment fields is being developed. Although the projection image from MR images does not have a similar radiographic appearance to portal images, it provides certain essential anatomical features (landmarks and gross tumor) as well as their relative locations, to be used as references for computerized portal verification
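
    The Chamfer matching step named in this record scores edge features from one image against the distance transform of the edges of the other, searching for the transformation that minimizes the mean edge-to-edge distance. The translation-only sketch below uses synthetic edge maps; the search range and images are stand-ins, not the clinical data.

    ```python
    # Chamfer matching over integer translations of a binary edge map.
    import numpy as np
    from scipy.ndimage import distance_transform_edt, shift

    def chamfer_score(edge_ref, edge_moving):
        """Mean distance from moving-edge pixels to the nearest reference edge."""
        dist = distance_transform_edt(~edge_ref)        # 0 on reference edges
        return dist[edge_moving].mean()

    def best_translation(edge_ref, edge_moving, search=5):
        best = (np.inf, (0, 0))
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                moved = shift(edge_moving.astype(float), (dy, dx), order=0) > 0.5
                s = chamfer_score(edge_ref, moved)
                if s < best[0]:
                    best = (s, (dy, dx))
        return best   # (score, (dy, dx))

    # toy demo: a shifted ring of edge pixels is recovered at its true offset
    yy, xx = np.mgrid[0:64, 0:64]
    ring = np.abs(np.hypot(yy - 32, xx - 32) - 15) < 0.7
    score, (dy, dx) = best_translation(ring, np.roll(ring, (2, -3), axis=(0, 1)))
    print(f"best offset dy={dy}, dx={dx}, mean distance {score:.2f}")
    ```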

  1. Narrative Finality

    Directory of Open Access Journals (Sweden)

    Armine Kotin Mortimer

    1981-01-01

    Full Text Available The closural device of narration as salvation represents the lack of finality in three novels. In De Beauvoir's Tous les hommes sont mortels an immortal character turns his story to account, but the novel makes a mockery of the historical sense by which men define themselves. In the closing pages of Butor's La Modification, the hero plans to write a book to save himself. Through the thrice-considered portrayal of the Paris-Rome relationship, the ending shows the reader how to bring about closure, but this collective critique written by readers will always be a future book. Simon's La Bataille de Pharsale, the most radical attempt to destroy finality, is an infinite text. No new text can be written. This extreme of perversion guarantees bliss (jouissance). If the ending of De Beauvoir's novel transfers the burden of a non-final world onto a new victim, Butor's non-finality lies in the deferral to a future writing, while Simon's writer is stuck in a writing loop, in which writing has become its own end and hence can have no end. The deconstructive and tragic form of contemporary novels proclaims the loss of belief in a finality inherent in the written text, to the profit of writing itself.

  2. M3 version 3.0: Verification and validation

    International Nuclear Information System (INIS)

    Gomez, Javier B.; Laaksoharju, Marcus; Skaarman, Erik; Gurban, Ioana

    2009-01-01

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater, and to calculate the mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model. Deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater or uncertainties associated with thermodynamic constants do not affect the modelling because the calculations are solely based on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models. In M3, mixing is evaluated and calculated first. The constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and the precision of the results, including the inherent uncertainties and the errors that can be made
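
    As a hedged illustration of the mixing-then-reactions logic described above, the sketch below solves for mixing proportions by least squares with a sum-to-one constraint and attributes the remaining deviation to reactions. The end-member ("reference water") compositions and the sample are invented, and the principal component analysis step of M3 is omitted for brevity.

    ```python
    import numpy as np

    # Hypothetical end-member compositions (mg/L) for, e.g., Cl, SO4, HCO3.
    end_members = np.array([[5.0,    1.0,  90.0],   # meteoric water
                            [6500.0, 400.0, 20.0],  # brine
                            [3000.0, 250.0, 40.0]]) # marine-type water

    sample = np.array([650.0, 45.0, 80.0])  # measured groundwater, mg/L

    # Mixing step: proportions x (summing to 1) via least squares.
    A = np.vstack([end_members.T, np.ones(len(end_members))])
    b = np.append(sample, 1.0)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)

    # Mass-balance step: what mixing alone cannot explain is attributed
    # to reactions (sources > 0, sinks < 0), reported in mg/L.
    deviation = sample - end_members.T @ x
    print("mixing proportions:", np.round(x, 3))
    print("reaction term (mg/L):", np.round(deviation, 1))
    ```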

  3. Monitoring and verification R&D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing that we will face all of these challenges even if disarmament is not achieved, this paper explores possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  4. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    Full Text Available System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures. On the other hand, Prolog's non-determinism and backtracking make it easy to test different variations of the program flow. A rule-based approach with Prolog allows the verification goals to be characterized in a concise and declarative way. In this paper, we describe our approach to verify the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then tested against verification goals. The different program-flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study, where we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain specific language (DSL) in Prolog to express the verification goals.
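
    The paper itself works on C++ ASTs with Prolog and CTL; as a hedged, language-neutral analogue, the sketch below uses Python's ast module to walk function bodies and apply a much weaker semaphore rule (every acquire() matched by a release()). The source snippet, function names, and rule are hypothetical and only illustrate the AST-fact-checking idea.

    ```python
    import ast

    # Hypothetical rule in the spirit of the paper's semaphore check:
    # acquire() and release() calls should balance within each function.
    SOURCE = '''
    def task_ok(sem):
        sem.acquire()
        do_work()
        sem.release()

    def task_bad(sem):
        sem.acquire()
        do_work()
    '''

    def method_calls(fn):
        # Collect attribute-style calls such as sem.acquire(), in AST order.
        return [n.func.attr for n in ast.walk(fn)
                if isinstance(n, ast.Call) and isinstance(n.func, ast.Attribute)]

    tree = ast.parse(SOURCE)
    for fn in [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]:
        calls = method_calls(fn)
        ok = calls.count("acquire") == calls.count("release")
        print(fn.name, "OK" if ok else "unbalanced semaphore use")
    ```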

  5. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
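
    For readers unfamiliar with active subspaces, the following is a minimal sketch of the standard construction: an eigendecomposition of the Monte Carlo estimate of the gradient covariance matrix, whose dominant eigenvectors span the influential input directions. The quadratic test function is invented, not the HIV model from the thesis.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def grad_f(x):
        # Hypothetical model-response gradient: f depends strongly on the
        # combination x0 + 2*x1 and only weakly on x2.
        g = np.array([1.0, 2.0, 0.01])
        return 2.0 * (g @ x) * g

    # Active subspace: eigendecomposition of C = E[grad f grad f^T],
    # estimated by Monte Carlo sampling over the input space.
    samples = rng.uniform(-1, 1, size=(2000, 3))
    C = np.mean([np.outer(g, g) for g in map(grad_f, samples)], axis=0)
    eigvals, eigvecs = np.linalg.eigh(C)

    # A large gap after the first eigenvalue indicates a one-dimensional
    # active subspace spanned by the dominant eigenvector.
    print("eigenvalues (descending):", eigvals[::-1])
    print("active direction:", eigvecs[:, -1])
    ```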

  6. Commitment to COT verification improves patient outcomes and financial performance.

    Science.gov (United States)

    Maggio, Paul M; Brundage, Susan I; Hernandez-Boussard, Tina; Spain, David A

    2009-07-01

    After an unsuccessful American College of Surgeons Committee on Trauma visit, our level I trauma center initiated an improvement program that included (1) hiring new personnel (trauma director and surgeons, nurse coordinator, orthopedic trauma surgeon, and registry staff), (2) correcting deficiencies in the trauma quality assurance and process improvement programs, and (3) development of an outreach program. Subsequently, our trauma center had two successful verifications. We examined the longitudinal effects of these efforts on volume, patient outcomes, and finances. The Trauma Registry was used to derive data for all trauma patients evaluated in the emergency department from 2001 to 2007. Clinical data analyzed included number of admissions, interfacility transfers, injury severity scores (ISS), length of stay, and mortality for 2001 to 2007. Financial performance was assessed for fiscal years 2001 to 2007. Data were divided into patients discharged from the emergency department and those admitted to the hospital. Admissions increased 30%, representing a 7.6% annual increase (p = 0.004), mostly due to a nearly fivefold increase in interfacility transfers. Severe trauma patients (ISS >24) increased 106%, and the mortality rate for ISS >24 decreased by 47%, to almost half the average of the National Trauma Database. There was a 78% increase in revenue and a sustained increase in hospital profitability. A major hospital commitment to Committee on Trauma verification had several salient outcomes: increased admissions, interfacility transfers, and acuity. Despite more seriously injured patients, there has been a major, sustained reduction in mortality and a trend toward decreased intensive care unit length of stay. This resulted in a substantial increase in contribution to margin (CTM), net profit, and revenues. With a high level of commitment and a favorable payer mix, trauma center verification improves outcomes for both patients and the hospital.

  7. Final Report

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik; Nielsen, Peter V.

    This final report for the Hybrid Ventilation Centre at Aalborg University describes the activities and research achievements in the project period from August 2001 to August 2006. The report summarises the work performed and the results achieved with reference to articles and reports published...

  8. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that transforms the bytecode into static single assignment form.

  9. Co-verification of hardware and software for ARM SoC design

    CERN Document Server

    Andrews, Jason

    2004-01-01

    Hardware/software co-verification is how to make sure that embedded system software works correctly with the hardware, and that the hardware has been properly designed to run the software successfully, before large sums are spent on prototypes or manufacturing. This is the first book to apply this verification technique to the rapidly growing field of embedded systems-on-a-chip (SoC). As traditional embedded system design evolves into single-chip design, embedded engineers must be armed with the necessary information to make educated decisions about which tools and methodology to deploy. SoC verification requires a mix of expertise from the disciplines of microprocessor and computer architecture, logic design and simulation, and C and Assembly language embedded software. Until now, the relevant information on how it all fits together has not been available. Andrews, a recognized expert, provides in-depth information about how co-verification really works, how to be successful using it, and pitfalls to avoid.

  10. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of {sup 240}Pu to {sup 239}Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime
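
    A toy sketch of the template idea follows: region sums of a revisit spectrum are compared against the stored template within counting-statistics limits. The region-of-interest bounds, count rates, and 3-sigma test are invented for illustration and are not the authors' procedure.

    ```python
    import numpy as np

    def template_consistent(spectrum, template, regions, n_sigma=3.0):
        """Compare region sums of a measured low-resolution spectrum with a
        stored template, allowing Poisson counting-statistics fluctuations."""
        for lo, hi in regions:
            s, t = spectrum[lo:hi].sum(), template[lo:hi].sum()
            # Counting error on the difference of two Poisson region sums.
            if abs(s - t) > n_sigma * np.sqrt(s + t):
                return False
        return True

    rng = np.random.default_rng(1)
    template = rng.poisson(lam=200.0, size=256).astype(float)
    revisit = rng.poisson(lam=template)          # same item, new statistics
    regions = [(10, 40), (60, 110), (150, 220)]  # illustrative ROI bounds
    print("continuity of knowledge:", template_consistent(revisit, template, regions))
    ```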

  11. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    , including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.

  12. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    , including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.

  13. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
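
    Dose verification results of this kind are commonly scored with a gamma analysis (combined dose difference and distance to agreement); the records above do not name that metric, so the following 1D simplification with invented data is only a hedged illustration of how such a comparison is quantified, not the authors' 3D EPID method.

    ```python
    import numpy as np

    def gamma_index_1d(measured, planned, spacing_mm, dd=0.03, dta_mm=3.0):
        """Simplified 1D gamma analysis: global dose-difference criterion
        (dd, fraction of max dose) plus distance to agreement (dta_mm)."""
        dmax = planned.max()
        positions = np.arange(len(planned)) * spacing_mm
        gammas = np.empty(len(measured))
        for i, (pos, dm) in enumerate(zip(positions, measured)):
            dose_term = (dm - planned) / (dd * dmax)
            dist_term = (pos - positions) / dta_mm
            gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
        return gammas

    planned = np.exp(-0.5 * ((np.arange(100) - 50) / 12.0) ** 2)
    measured = 1.02 * planned                # hypothetical 2% global overdose
    g = gamma_index_1d(measured, planned, spacing_mm=1.0)
    print("gamma pass rate:", np.mean(g <= 1.0))  # fraction of points passing
    ```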

  14. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb{sub 3}Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb{sub 3}Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb{sub 3}Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  15. Novel Verification Method for Timing Optimization Based on DPSO

    Directory of Open Access Journals (Sweden)

    Chuandong Chen

    2018-01-01

    Full Text Available Timing optimization for logic circuits is one of the key steps in logic synthesis. Extant research results are mainly based on various intelligence algorithms. Hence, they are neither comparable with timing optimization data collected by the mainstream electronic design automation (EDA) tool nor able to verify the superiority of intelligence algorithms over the EDA tool in terms of optimization ability. To address these shortcomings, a novel verification method is proposed in this study. First, a discrete particle swarm optimization (DPSO) algorithm was applied to optimize the timing of the mixed polarity Reed-Muller (MPRM) logic circuit. Second, the Design Compiler (DC) tool was used to optimize the timing of the same MPRM logic circuit through special settings and constraints. Finally, the timing optimization results of the two algorithms were compared based on MCNC benchmark circuits. The timing optimization results obtained using DPSO are compared with those obtained from DC, and DPSO demonstrates an average reduction of 9.7% in the timing delays of critical paths for a number of MCNC benchmark circuits. The proposed verification method directly ascertains whether the intelligence algorithm has a better timing optimization ability than DC.
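
    For orientation, here is a minimal binary DPSO sketch of the kind of search the paper applies to polarity assignments. The cost function is a stand-in (real use would call a timing evaluator on the MPRM circuit), and all constants are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def cost(bits):
        # Stand-in for the critical-path delay of an MPRM circuit under a
        # given polarity assignment; a real run would invoke a timing tool.
        target = np.array([1, 0, 1, 1, 0, 0, 1, 0])
        return np.sum(bits != target)

    n_particles, n_bits, iters = 20, 8, 50
    x = rng.integers(0, 2, size=(n_particles, n_bits))
    v = np.zeros((n_particles, n_bits))
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random(v.shape), rng.random(v.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        # Discrete update: a sigmoid of the velocity gives each bit's
        # probability of being 1 in the next position.
        x = (rng.random(v.shape) < 1.0 / (1.0 + np.exp(-v))).astype(int)
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()

    print("best polarity assignment:", gbest, "delay proxy:", pbest_cost.min())
    ```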

  16. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    International Nuclear Information System (INIS)

    Mueller, Christina; Oeberg, Tomas

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters with Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experiences from other fields of the materials sciences suggest that both of these approaches are useful. The initial verification experiments of the two welding technologies considered are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and optimisation of the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a data base suitable for a baseline estimate of future performance. This estimate can
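
    As a hedged illustration of the peaks-over-threshold approach mentioned above, the sketch below fits a generalised Pareto distribution to exceedances over a high threshold and extrapolates a tail probability. The "defect size" data are synthetic; real use would draw on NDT measurements from trial welds.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(3)

    # Hypothetical weld-defect sizes (mm); invented, for illustration only.
    defects = rng.gamma(shape=2.0, scale=0.4, size=500)
    u = np.quantile(defects, 0.90)          # high threshold
    excesses = defects[defects > u] - u

    # Peaks over threshold: fit a generalised Pareto to the exceedances.
    shape, loc, scale = genpareto.fit(excesses, floc=0.0)

    # Extrapolation beyond the data, the rationale the extreme value
    # paradigm provides: P(X > x) = P(X > u) * (1 - GPD(x - u)).
    x_crit = 3.0
    p_exceed = (len(excesses) / len(defects)) * genpareto.sf(x_crit - u, shape, 0.0, scale)
    print(f"threshold u = {u:.2f} mm, P(defect > {x_crit} mm) = {p_exceed:.2e}")
    ```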

  17. Strategy for verification and demonstration of the sealing process for canisters for spent fuel

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Christina [Bundesanstalt fuer Materialforschung und -pruefung (BAM), Berlin (Germany); Oeberg, Tomas [Tomas Oeberg Konsult AB, Lyckeby (Sweden)

    2004-08-01

    Electron beam welding and friction stir welding are the two processes now being considered for sealing copper canisters with Sweden's radioactive waste. This report outlines a strategy for verification and demonstration of the encapsulation process, which here is considered to consist of the sealing of the canister by welding followed by quality control of the weld by non-destructive testing. Statistical methodology provides a firm basis for modern quality technology, and design of experiments has been a successful part of it. Factorial and fractional factorial designs can be used to evaluate main process factors and their interactions. Response surface methodology with multilevel designs enables further optimisation. Empirical polynomial models can, through Taylor series expansions, approximate the true underlying relationships sufficiently well. The fitting of response measurements is based on ordinary least squares regression or generalised linear methods. Unusual events, like failures in the lid welds, are best described with extreme value statistics, and the extreme value paradigm gives a rationale for extrapolation. Models based on block maxima (the generalised extreme value distribution) and peaks over threshold (the generalised Pareto distribution) are considered. Experiences from other fields of the materials sciences suggest that both of these approaches are useful. The initial verification experiments of the two welding technologies considered are suggested to proceed by experimental plans that can be accomplished with only four complete lid welds each. Similar experimental arrangements can be used to evaluate process 'robustness' and optimisation of the process window. Two series of twenty demonstration trials each, mimicking assembly-line production, are suggested as a final evaluation before the selection of welding technology. This demonstration is also expected to provide a data base suitable for a baseline estimate of future performance. This estimate can

  18. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Stinis, Panos [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-08-07

    This is the final report for the work conducted at the University of Minnesota (during the period 12/01/12-09/18/14) by PI Panos Stinis as part of the "Collaboratory on Mathematics for Mesoscopic Modeling of Materials" (CM4). CM4 is a multi-institution DOE-funded project whose aim is to conduct basic and applied research in the emerging field of mesoscopic modeling of materials.

  19. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper reports on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The different measurement methods and their limits of accuracy are analysed and discussed in the paper

  1. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that the LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  2. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  3. Simulation-based MDP verification for leading-edge masks

    Science.gov (United States)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimura, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification

  4. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
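
    The product-of-likelihood-ratio fusion mentioned at the end can be sketched as follows; the Gaussian score models, score values, and threshold are invented stand-ins, not the authors' trained models.

    ```python
    import numpy as np

    def likelihood_ratio(score, genuine_scores, impostor_scores):
        """Estimate P(score | genuine) / P(score | impostor) from Gaussian
        fits to development-set score distributions."""
        def gauss(x, data):
            mu, sd = np.mean(data), np.std(data)
            return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        return gauss(score, genuine_scores) / gauss(score, impostor_scores)

    rng = np.random.default_rng(5)
    gen_face, imp_face = rng.normal(0.8, 0.1, 500), rng.normal(0.3, 0.1, 500)
    gen_kin,  imp_kin  = rng.normal(0.7, 0.15, 500), rng.normal(0.4, 0.15, 500)

    # Fusion by product of likelihood ratios: kinship acts as a soft
    # biometric that boosts (or tempers) the face-verification decision.
    face_score, kin_score = 0.65, 0.75
    lr = likelihood_ratio(face_score, gen_face, imp_face) * \
         likelihood_ratio(kin_score, gen_kin, imp_kin)
    print("accept" if lr > 1.0 else "reject", f"(fused LR = {lr:.2f})")
    ```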

  5. Nuclear Nonproliferation Ontology Assessment Team Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Strasburg, Jana D.; Hohimer, Ryan E.

    2012-01-01

    Final Report for the NA22 Simulations, Algorithm and Modeling (SAM) Ontology Assessment Team's efforts from FY09-FY11. The Ontology Assessment Team began in May 2009 and concluded in September 2011. During this two-year time frame, the Ontology Assessment Team had two objectives: (1) assessing the utility of knowledge representation and semantic technologies for addressing nuclear nonproliferation challenges; and (2) developing ontological support tools that would provide a framework for integrating across the Simulation, Algorithm and Modeling (SAM) program. The SAM Program was going through a large assessment and strategic planning effort during this time, and as a result, the relative importance of these two objectives changed, altering the focus of the Ontology Assessment Team. In the end, the team conducted an assessment of the state of the art, created an annotated bibliography, and developed a series of ontological support tools, demonstrations and presentations. A total of more than 35 individuals from 12 different research institutions participated in the Ontology Assessment Team. These included subject matter experts in several nuclear nonproliferation-related domains as well as experts in semantic technologies. Despite the diverse backgrounds and perspectives, the Ontology Assessment Team functioned very well together, and aspects of its operation could serve as a model for future inter-laboratory collaborations and working groups. While the team encountered several challenges and learned many lessons along the way, the Ontology Assessment effort was ultimately a success that led to several multi-lab research projects and opened up a new area of scientific exploration within the Office of Nuclear Nonproliferation and Verification.

  6. Entanglement verification and its applications in quantum communication

    International Nuclear Information System (INIS)

    Haeseler, Hauke

    2010-01-01

    coherent storage of light, we focus on the storage of squeezed light. This situation requires an extension of our verification procedure to sources of mixed input states. We propose such an extension, and give a detailed analysis of its application to squeezed thermal states, displaced thermal states and mixed qubit states. This is supplemented by finding the optimal entanglement-breaking channels for each of these situations, which provides us with an indication of the strength of the extension to our entanglement criterion. The subject of Chapter 6 is also the benchmarking of quantum memory or teleportation experiments. Considering a number of recently published benchmark criteria, we investigate the question which one is most useful to actual experiments. We first compare the different criteria for typical settings and sort them according to their resilience to excess noise. Then, we introduce a further improvement to the Expectation Value Matrix method, which results in the desired optimal benchmark criterion. Finally, we investigate naturally occurring phase fluctuations and find them to further simplify the implementation of our criterion. Thus, we formulate the first truly useful way of validating experiments for the quantum storage or transmission of light. (orig.)

  7. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  8. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and the 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field-verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that facility management directs to be so updated. Any drawings revised by this work plan will be issued in AutoCAD format

  9. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  10. Core power capability verification for PWR NPP

    International Nuclear Information System (INIS)

    Xian Chunyu; Liu Changwen; Zhang Hong; Liang Wei

    2002-01-01

    The principle and methodology of core power capability verification for reloads of pressurized water reactor nuclear power plants are introduced. The radial and axial power distributions in normal operation (category I or condition I) and abnormal operation (category II or condition II) are simulated using a neutronics calculation code. The linear power density margin and DNBR margin for both categories, which reflect core safety, are analyzed from the point of view of reactor physics and T/H, and thus the category I operating domain and the category II protection set points are verified. The verification results for a reference NPP are also given

  11. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  12. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  13. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  14. The cumulative verification image analysis tool for offline evaluation of portal images

    International Nuclear Information System (INIS)

    Wong, John; Yan Di; Michalski, Jeff; Graham, Mary; Halverson, Karen; Harms, William; Purdy, James

    1995-01-01

    Purpose: Daily portal images acquired using electronic portal imaging devices contain important information about the setup variation of the individual patient. The data can be used to evaluate the treatment and to derive corrections for the individual patient. The large volume of images also requires software tools for efficient analysis. This article describes the cumulative verification image analysis (CVIA) approach, specifically designed as an offline tool to extract quantitative information from daily portal images. Methods and Materials: The user interface, image and graphics display, and algorithms of the CVIA tool have been implemented in ANSI C using the X Window graphics standards. The tool consists of three major components: (a) definition of treatment geometry and anatomical information; (b) registration of portal images with a reference image to determine setup variation; and (c) quantitative analysis of all setup variation measurements. The CVIA tool is not automated. User interaction is required and preferred. Successful alignment of anatomies on portal images at present remains mostly dependent on clinical judgment. Predefined templates of block shapes and anatomies are used for image registration to enhance efficiency, taking advantage of the fact that much of the tool's operation is repeated in the analysis of daily portal images. Results: The CVIA tool is portable and has been implemented on workstations with different operating systems. Analysis of 20 sequential daily portal images can be completed in less than 1 h. The temporal information is used to characterize setup variation in terms of its systematic, random and time-dependent components. The cumulative information is used to derive block overlap isofrequency distributions (BOIDs), which quantify the effective coverage of the prescribed treatment area throughout the course of treatment. Finally, a set of software utilities is available to facilitate feedback of the information for
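
    A common way to characterize setup variation in terms of systematic and random components, in the spirit of the analysis described above, is sketched below; the error values are invented, and this is not necessarily the authors' exact estimator.

    ```python
    import numpy as np

    # Hypothetical daily setup errors (mm) from portal-image registration:
    # rows = patients, columns = treatment fractions.
    errors = np.array([[ 1.2,  0.8,  1.5,  1.1,  0.9],
                       [-0.4,  0.1, -0.6, -0.2, -0.5],
                       [ 0.3,  0.7,  0.2,  0.9,  0.4]])

    patient_means = errors.mean(axis=1)
    # Systematic component: spread of the per-patient mean errors.
    systematic = patient_means.std(ddof=1)
    # Random component: pooled day-to-day spread about each patient's mean.
    random_ = np.sqrt(np.mean(errors.var(axis=1, ddof=1)))

    print(f"systematic = {systematic:.2f} mm, random = {random_:.2f} mm")
    ```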

  15. Ensuring a successful family business management succession

    OpenAIRE

    Desbois, Joris

    2016-01-01

    Succession is the biggest long-term challenge that most family businesses face. Indeed, leaders' disposition to plan for their succession is frequently the key factor determining whether their family business survives or ceases. The research seeks to find out how to successfully manage the business management succession according to its main principles. This work project aims at researching the key points relevant to almost all family firms, to have a viable succession transition and positioni...

  16. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  17. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  18. Isotope correlation verification of analytical measurements for dissolver materials

    International Nuclear Information System (INIS)

    Satkowski, J.

    1988-01-01

    An independent verification of analytical results for accountability measurements of dissolver materials can be performed using the Isotope Correlation Technique (ICT). ICT is based on the relationships that exist between the initial and final elemental concentrations and isotopic abundances of the nuclear fuel. Linear correlation functions between isotopic ratios and plutonium/uranium ratios have been developed for specific reactor fuels. The application of these correlations to already existing analytical data provides a laboratory with additional confidence in the reported results. Confirmation is done by a test of consistency with historical data. ICT is being utilized with dissolver accountability measurements at the Savannah River Plant Laboratory. The application, implementation, and operating experience of this technique are presented
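
    A minimal sketch of the consistency test follows: a linear correlation is fitted to historical batches, and a new measurement is flagged if it falls off the line by more than a few residual standard deviations. All numbers are invented and are not actual fuel data.

    ```python
    import numpy as np

    # Hypothetical historical dissolver batches: a fuel-specific linear
    # correlation between an isotopic ratio and the Pu/U ratio.
    iso_ratio = np.array([0.21, 0.24, 0.27, 0.30, 0.33])    # invented
    pu_u      = np.array([0.0071, 0.0078, 0.0086, 0.0093, 0.0101])

    slope, intercept = np.polyfit(iso_ratio, pu_u, 1)
    resid_sd = np.std(pu_u - (slope * iso_ratio + intercept), ddof=2)

    def consistent(iso_new, pu_u_new, n_sigma=3.0):
        """Flag a new accountability measurement that falls off the
        historical correlation line by more than n_sigma residual SDs."""
        return abs(pu_u_new - (slope * iso_new + intercept)) <= n_sigma * resid_sd

    print(consistent(0.25, 0.0081))   # on the line: consistent
    print(consistent(0.25, 0.0110))   # inconsistent with history
    ```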

  19. Cassini's Test Methodology for Flight Software Verification and Operations

    Science.gov (United States)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft comprises various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort, from design and development through to operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  20. Verification of Opacity and Diagnosability for Pushdown Systems

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    Full Text Available In control theory of discrete event systems (DESs), one of the challenging topics is the extension of the theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems). First, we discuss opacity of pushdown systems and prove that opacity of pushdown systems is in general undecidable. In addition, a decidable class is clarified. Next, for diagnosability, we prove that under a certain assumption, which is different from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language). The obtained result is useful for developing control theory of infinite-state DESs.

  1. Technique for unit testing of safety software verification and validation

    International Nuclear Information System (INIS)

    Li Duo; Zhang Liangju; Feng Junting

    2008-01-01

    The key issue arising from digitalization of the reactor protection system for nuclear power plants is how to carry out verification and validation (V&V) to demonstrate and confirm that the software that performs reactor safety functions is safe and reliable. One of the most important processes in software V&V is unit testing, which verifies and validates the software coding against the concept design for consistency, correctness and completeness during software development. The paper presents a preliminary study on the technique for unit testing in safety software V&V, focusing on such aspects as how to confirm test completeness, how to establish a test platform, how to develop test cases and how to carry out unit testing. The technique discussed here was successfully used in the unit testing of safety software for a digital reactor protection system. (authors)
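
    To give the flavor of boundary-value unit tests against a concept design, here is a minimal sketch; the trip function, setpoint, and test cases are hypothetical and only illustrate the style of test-case development discussed above.

    ```python
    import unittest

    def overtemperature_trip(temp_c: float, setpoint_c: float = 350.0) -> bool:
        """Toy stand-in for a coded safety function: demand a reactor trip
        when the measured temperature reaches the setpoint."""
        return temp_c >= setpoint_c

    class TestOvertemperatureTrip(unittest.TestCase):
        # Cases chosen for completeness: below, at, and above the setpoint,
        # i.e. the boundary values implied by the concept design.
        def test_below_setpoint_no_trip(self):
            self.assertFalse(overtemperature_trip(349.9))

        def test_at_setpoint_trips(self):
            self.assertTrue(overtemperature_trip(350.0))

        def test_above_setpoint_trips(self):
            self.assertTrue(overtemperature_trip(420.0))

    if __name__ == "__main__":
        unittest.main()
    ```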

  2. Dose concentration and dose verification for radiotherapy of cancer

    International Nuclear Information System (INIS)

    Maruyama, Koichi

    2005-01-01

    The number of cancer treatments using radiation therapy is increasing. Behind this increase is the accumulated evidence that, owing to improvements in irradiation technology and radiation planning technology, the number of successful cases is comparable to or even better than that of surgery for some types of cancer. This review describes the principles and technology of radiation therapy, its characteristics, particle therapy that improves the dose concentration, its historical background, the importance of dose concentration, the present situation and future possibilities. There are serious problems that hinder the superior dose concentration of particle therapy. Recent programs and our efforts to solve these problems are described. A new concept is required to satisfy the notion of evidence-based medicine, i.e., one has to develop a method of dose verification, which is not yet available. This review is for researchers, medical doctors and radiation technologists who are developing this field. (author)

  3. European Train Control System: A Case Study in Formal Verification

    Science.gov (United States)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
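
    As a hedged illustration of the kind of parameter constraint verified here, the sketch below checks a simplified kinematic braking condition: the train must be able to stop before the end of its movement authority. The constants and the simple model are invented analogues, not the verified ETCS constraint from the paper.

    ```python
    def safe_to_proceed(v_mps, dist_to_eoa_m, b_mps2=0.7, react_s=2.0):
        """Braking-distance constraint: the train must stop before the end
        of its movement authority, allowing for a reaction delay before
        the brakes bite (illustrative parameter values)."""
        braking_dist = v_mps * react_s + v_mps**2 / (2.0 * b_mps2)
        return braking_dist <= dist_to_eoa_m

    print(safe_to_proceed(v_mps=40.0, dist_to_eoa_m=1500.0))  # True
    print(safe_to_proceed(v_mps=40.0, dist_to_eoa_m=1000.0))  # False
    ```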

  4. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  5. Combining Multiple Features for Text-Independent Writer Identification and Verification

    OpenAIRE

    Bulacu , Marius; Schomaker , Lambert

    2006-01-01

    http://www.suvisoft.com; In recent years, we proposed a number of new and very effective features for automatic writer identification and verification. They are probability distribution functions (PDFs) extracted from the handwriting images and characterize writer individuality independently of the textual content of the written samples. In this paper, we perform an extensive analysis of feature combinations. In our fusion scheme, the final unique distance between two handwritten samples is c...

  6. Simulated physical inventory verification exercise at a mixed-oxide fuel fabrication facility

    International Nuclear Information System (INIS)

    Reilly, D.; Augustson, R.

    1985-01-01

    A physical inventory verification (PIV) was simulated at a mixed-oxide fuel fabrication facility. Safeguards inspectors from the International Atomic Energy Agency (IAEA) conducted the PIV exercise to test inspection procedures under ''realistic but relaxed'' conditions. Nondestructive assay instrumentation was used to verify the plutonium content of samples covering the range of material types from input powders to final fuel assemblies. This paper describes the activities included in the exercise and discusses the results obtained. 5 refs., 1 fig., 6 tabs

  7. Online Signature Verification using Recurrent Neural Network and Length-normalized Path Signature

    OpenAIRE

    Lai, Songxuan; Jin, Lianwen; Yang, Weixin

    2017-01-01

    Inspired by the great success of recurrent neural networks (RNNs) in sequential modeling, we introduce a novel RNN system to improve the performance of online signature verification. The training objective is to directly minimize intra-class variations and to push the distances between skilled forgeries and genuine samples above a given threshold. By back-propagating the training signals, our network produces discriminative features under the desired metrics. Additionally, we propose a novel d...
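
    The stated training objective is contrastive in form; the following numpy stand-in sketches it (pull genuine pairs together, push forgeries beyond a margin). It is illustrative only and is not the authors' RNN implementation; all feature values are invented.

    ```python
    import numpy as np

    def verification_loss(anchor, other, genuine, margin=1.0):
        """Contrastive-style objective matching the stated training goal:
        minimize intra-class distance, push forgeries beyond a margin."""
        d = np.linalg.norm(anchor - other)
        return d**2 if genuine else max(0.0, margin - d) ** 2

    rng = np.random.default_rng(4)
    feat_genuine = rng.normal(0.0, 0.05, size=16)   # close to the anchor
    feat_forgery = rng.normal(0.8, 0.05, size=16)   # far from the anchor
    anchor = np.zeros(16)
    print("genuine pair loss:", verification_loss(anchor, feat_genuine, True))
    print("forgery pair loss:", verification_loss(anchor, feat_forgery, False))
    ```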

  8. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. They are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  9. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  10. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process algebraic language for the specification and verification of distributed systems. µCRL makes it possible to describe temporal properties of distributed systems, but it has no explicit reference to time. In this work we propose a way of introducing discrete time without extending the language.

  11. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  12. Verification of Software Components: Addressing Unbounded Parallelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Roč. 8, č. 2 (2007), s. 300-309 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords: software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware; Software

  13. A Comparison of Modular Verification Techniques

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Staunstrup, Jørgen; Maretti, Niels

    1997-01-01

    This paper presents and compares three techniques for mechanized verification of state oriented design descriptions. One is a traditional forward generation of a fixed point characterizing the reachable states. The two others can utilize a modular structure provided by the designer. One requires...

  14. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    The problem of validation and verification of correctness of present day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  15. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  16. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01


  17. Behaviour Protocols Verification: Fighting State Explosion

    Czech Academy of Sciences Publication Activity Database

    Mach, M.; Plášil, František; Kofroň, Jan

    2005-01-01

    Roč. 6, č. 2 (2005), s. 22-30 ISSN 1525-9293 R&D Projects: GA ČR(CZ) GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords: formal verification * software components * state explosion * behavior protocols * parse trees Subject RIV: JC - Computer Hardware; Software

  18. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...

  19. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  20. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  1. A Typical Verification Challenge for the GRID

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Bal, H. E.; Brim, L.; Leucker, M.

    2008-01-01

    A typical verification challenge for the GRID community is presented. The concrete challenge is to implement a simple recursive algorithm for finding the strongly connected components in a graph. The graph is typically stored in the collective memory of a number of computers, so a distributed

  2. Zero leakage quantization scheme for biometric verification

    NARCIS (Netherlands)

    Groot, de J.A.; Linnartz, J.P.M.G.

    2011-01-01

    Biometrics gain increasing interest as a solution for many security issues, but privacy risks exist in case we do not protect the stored templates well. This paper presents a new verification scheme, which protects the secrets of the enrolled users. We will show that zero leakage is achieved if

  3. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    Science.gov (United States)

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on the sheet of paper, which encloses the cylinder completely or partially filled with water from below, where the hydrostatic pressure of the water column acts against the atmospheric pressure. First of all this paper solves a theoretical…

  4. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and we carried out verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Tokyo Electric Power Company's Fukushima Daiichi Nuclear Power Plant. In the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking block, and dense-graded asphalt pavement when applied to roads. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off the decontaminated objects, and the sludge was separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  5. Using timing information in speaker verification

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2005-11-01

    This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker’s utterances is investigated, both with regard to improving the robustness of modern...

  6. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models where also a combination of models is taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.) [de
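    The balance being verified rests on the standard materials-balance identity; a minimal sketch under assumed notation (the MUF formula is standard, but sigma_MUF, the threshold multiplier, and all numbers are illustrative, not taken from the report):

```python
# Sketch of the materials-balance test that verification sampling supports.
# MUF (material unaccounted for) = beginning inventory + receipts
#                                  - shipments - ending inventory.
def muf(begin_inv, receipts, shipments, end_inv):
    return begin_inv + receipts - shipments - end_inv

# Flag possible diversion when |MUF| exceeds a multiple of its measurement
# uncertainty; the multiplier sets the false-alarm rate and is illustrative.
m = muf(begin_inv=120.0, receipts=40.0, shipments=35.0, end_inv=124.2)
sigma_muf = 0.5  # kg, propagated measurement uncertainty (assumed)
print("investigate" if abs(m) > 3.0 * sigma_muf else "consistent")
```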

  7. Decommissioning. Success with preparation

    International Nuclear Information System (INIS)

    Klasen, Joerg; Schulz, Rolf; Wilhelm, Oliver

    2017-01-01

    The decommissioning of a nuclear power plant poses a significant challenge for the operating company. The business model is turned upside down, and a working culture developed for power operation has to be adapted while the necessary know-how for the upcoming tasks is built up. The trauma induced in the employees by the final plant shut-down has to be considered and respected. The change of working culture in the enterprise has to be managed, and the organization has to be prepared for the future. Here the methods of Change Management offer a systematic and effective approach. Confidence in the employees' competencies is one of the key success factors for the change into the future.

  8. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Jarillo-Herrero, Pablo [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-02-07

    This is the final report of our research program on electronic transport experiments on Topological Insulator (TI) devices, funded by the DOE Office of Basic Energy Sciences. TI-based electronic devices are attractive as platforms for spintronic applications, and for detection of emergent properties such as Majorana excitations, electron-hole condensates, and the topological magneto-electric effect. Most theoretical proposals envision geometries consisting of a planar TI device integrated with materials of distinctly different physical phases (such as ferromagnets and superconductors). Experimental realization of physics tied to the surface states is a challenge due to the ubiquitous presence of bulk carriers in most TI compounds as well as degradation during device fabrication.

  9. On the Verification of a WiMax Design Using Symbolic Simulation

    Directory of Open Access Journals (Sweden)

    Gabriela Nicolescu

    2013-07-01

    In top-down multi-level design methodologies, design descriptions at higher levels of abstraction are incrementally refined to the final realizations. Simulation-based techniques have traditionally been used to verify that such model refinements do not change the design functionality. Unfortunately, with computer simulations it is not possible to completely check that a design transformation is correct in a reasonable amount of time, as the number of test patterns required to do so increases exponentially with the number of system state variables. In this paper, we propose a methodology for the verification of conformance of models generated at higher levels of abstraction in the design process to the design specifications. We model the system behavior using sequences of recurrence equations. We then use symbolic simulation together with equivalence checking and property checking techniques for design verification. Using our proposed method, we have verified the equivalence of three WiMax system models at different levels of design abstraction, and the correctness of various system properties on those models. Our symbolic modeling and verification experiments show that the proposed verification methodology provides a performance advantage over its numerical counterpart.

  10. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Brantley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent® and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability. Acknowledgements: A special thanks goes to Dean Dobranich for his guidance and expertise through all stages of this effort; his advice and feedback were instrumental to its completion. Thanks also goes to Sam Subia and Tolu Okusanya for helping to plan many of the verification activities performed in this document. Thank you to Sam, Justin Lamb, and Victor Brunini for their assistance in resolving issues encountered with running the advective bar element model. Finally, thanks goes to Dean, Sam, and Adam Hetzler for reviewing the document and providing very valuable comments.

  11. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  12. Verification of international environmental agreements

    International Nuclear Information System (INIS)

    Ausubel, J.H.; Victor, D.G.

    1992-01-01

    Problems and opportunities frequently cross national borders. Informal and formal international arrangements-loosely termed regimes, defined in this paper as systems of rule or government that have widespread influence-arise for the collective management of such transboundary issues. Regimes are pervasive; their number and extent have grown markedly in the 20th century, especially since the Second World War. Students of the international system study the conditions under which regimes are formed and the factors that contribute to their success. These include distribution of power among states, the nature of the issue, its linkages to other issues, the roles and functions of international organizations, the processes of bargaining and rule-making, and the influence of domestic politics. Scholars also theorize how regimes are maintained and changed. In the past two decades students of international cooperation have increasingly applied their tools to issues of the environment and natural resources

  13. Final integration and alignment of LINC-NIRVANA

    Science.gov (United States)

    Moreno-Ventas, Javier; Bizenberger, Peter; Bertram, Thomas; Radhakrishnan, Kalyan K.; Kittmann, Frank; Baumeister, Harald; Marafatto, Luca; Mohr, Lars; Herbst, Tom

    2016-08-01

    The LBT (Large Binocular Telescope), located at about 3200 m on Mount Graham (Tucson, Arizona), is an innovative project undertaken by institutions from Europe and the USA. LINC-NIRVANA is an instrument which provides MCAO (Multi-Conjugate Adaptive Optics) and interferometry, combining the light from the two 8.4m telescopes coherently. This configuration offers 23m-baseline optical resolution and the sensitivity of a 12m mirror, with a 2 arc-minute diffraction-limited field of view. The integration, alignment, and testing of such a big instrument require a well-organized choreography and AIV planning, which has been developed in a hierarchical way. The instrument is divided into largely independent systems, and all of them consist of various subsystems. Every subsystem integration ends with a verification test and an acceptance procedure. When a certain number of systems are finished and accepted, the instrument AIV phase starts. This hierarchical approach allows testing at early stages with simple setups. The philosophy is to have internally aligned subsystems to be integrated in the instrument optical path, and extrapolate to finally align the instrument to the Gregorian bent foci of the telescope. The alignment plan was successfully executed in Heidelberg at MPIA facilities, and now the instrument is being re-integrated at the LBT over a series of 11 campaigns throughout 2016. After its commissioning, the instrument will offer MCAO sensing with the LBT telescope. The interferometric mode will be implemented in a future update of the instrument. This paper focuses on the alignment done in the clean room at the LBT facilities for the collimator, camera, and High-layer Wavefront Sensor (HWS) during March and April 2016. It also summarizes the previous work done in preparation for shipping and arrival of the instrument at the telescope. Results are presented for every step, and a final section outlines the future work to be done in the next runs leading up to final commissioning.

  14. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Webb, Robert C. [Texas A& M University; Kamon, Teruki [Texas A& M University; Toback, David [Texas A& M University; Safonov, Alexei [Texas A& M University; Dutta, Bhaskar [Texas A& M University; Dimitri, Nanopoulos [Texas A& M University; Pope, Christopher [Texas A& M University; White, James [Texas A& M University

    2013-11-18

    Overview The High Energy Physics Group at Texas A&M University is submitting this final report for our grant number DE-FG02-95ER40917. This grant has supported our wide range of research activities for over a decade. The reports contained here summarize the latest work done by our research team. Task A (Collider Physics Program): CMS & CDF Profs. T. Kamon, A. Safonov, and D. Toback co-lead the Texas A&M (TAMU) collider program focusing on CDF and CMS experiments. Task D: Particle Physics Theory Our particle physics theory task is the combined effort of Profs. B. Dutta, D. Nanopoulos, and C. Pope. Task E (Underground Physics): LUX & NEXT Profs. R. Webb and J. White (deceased) lead the Xenon-based underground research program consisting of two main thrusts: the first, participation in the LUX two-phase xenon dark matter search experiment and the second, detector R&D primarily aimed at developing future detectors for underground physics (e.g. NEXT and LZ).

  15. EDITORIAL: International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification

    Science.gov (United States)

    Verhaegen, Frank; Seuntjens, Jan

    2008-03-01

    PhysicsWeb. At McGill we thank the following departments for support: the Cancer Axis of the Research Institute of the McGill University Health Center, the Faculties of Medicine and Science, the Departments of Oncology and Physics and the Medical Physics Unit. The following companies are thanked: TomoTherapy and Standard Imaging. The American Association of Physicists in Medicine and the International Atomic Energy Agency are gratefully acknowledged for endorsing the meeting. A final word of thanks goes out to all of those who contributed to the successful Workshop: first of all our administrative assistant Ms Margery Knewstubb, the website developer Dr François DeBlois, the two heads of the logistics team, Ms Emily Poon and Ms Emily Heath, our local medical physics students and staff, the IOP staff and the authors who shared their new and exciting work with us. Editors: Frank Verhaegen and Jan Seuntjens (McGill University) Associate editors: Luc Beaulieu, Iwan Kawrakow, Tony Popescu and David Rogers

  16. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  17. The use of the hybrid K-edge densitometer for routine analysis of safeguards verification samples of reprocessing input liquor

    International Nuclear Information System (INIS)

    Ottmar, H.; Eberle, H.

    1991-01-01

    Following successful tests of a hybrid K-edge instrument at TUI Karlsruhe and the routine use of a K-edge densitometer for safeguards verification at the same laboratory, the Euratom Safeguards Directorate of the Commission of the European Communities decided to install the first such instrument into a large industrial reprocessing plant for the routine verification of samples taken from the input accountancy tanks. This paper reports on the installation, calibration, sample handling procedure and the performance of this instrument after one year of routine operation

  18. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
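    As a small worked example of the code-verification strategy described above (comparing a computed solution against an analytical solution on successively refined grids; the error values below are invented), the observed order of accuracy can be estimated as follows:

```python
# Sketch of a standard code-verification calculation: estimate the observed
# order of accuracy by comparing discretization errors (norms of the
# difference between the numerical and an analytical solution) on two
# successively refined grids. The error values below are invented.
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Return p such that error ~ C * h**p, from two grid levels."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# L2 errors on grids with spacing h and h/2 (illustrative numbers):
p = observed_order(4.0e-3, 1.0e-3)
print(f"observed order of accuracy: {p:.2f}")  # ~2.00 -> 2nd-order scheme
```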

  19. Mapping and simulating systematics due to spatially varying observing conditions in DES science verification data

    International Nuclear Information System (INIS)

    Leistedt, B.; Peiris, H. V.; Elsner, F.; Benoit-Lévy, A.; Amara, A.

    2016-01-01

    Spatially varying depth and the characteristics of observing conditions, such as seeing, airmass, or sky background, are major sources of systematic uncertainties in modern galaxy survey analyses, particularly in deep multi-epoch surveys. We present a framework to extract and project these sources of systematics onto the sky, and apply it to the Dark Energy Survey (DES) to map the observing conditions of the Science Verification (SV) data. The resulting distributions and maps of sources of systematics are used in several analyses of DES–SV to perform detailed null tests with the data, and also to incorporate systematics in survey simulations. We illustrate the complementary nature of these two approaches by comparing the SV data with BCC-UFig, a synthetic sky catalog generated by forward-modeling of the DES–SV images. We analyze the BCC-UFig simulation to construct galaxy samples mimicking those used in SV galaxy clustering studies. We show that the spatially varying survey depth imprinted in the observed galaxy densities and the redshift distributions of the SV data are successfully reproduced by the simulation and are well-captured by the maps of observing conditions. The combined use of the maps, the SV data, and the BCC-UFig simulation allows us to quantify the impact of spatial systematics on N(z), the redshift distributions inferred using photometric redshifts. We conclude that spatial systematics in the SV data are mainly due to seeing fluctuations and are under control in current clustering and weak-lensing analyses. However, they will need to be carefully characterized in upcoming phases of DES in order to avoid biasing the inferred cosmological results. Finally, the framework presented here is relevant to all multi-epoch surveys and will be essential for exploiting future surveys such as the Large Synoptic Survey Telescope, which will require detailed null tests and realistic end-to-end image simulations to correctly interpret the deep, high

  20. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is complete

  1. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  2. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  3. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  4. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  5. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling and planning problems, response time optimization, etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized, search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
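    A toy sketch of the cooperative swarm idea (this is not Uppaal code; the weighted graph, the threads-as-workers setup, and the shared bound are all invented for illustration): several randomized searches run in parallel and prune with the best cost any of them has found so far:

```python
# Toy sketch of cooperative swarm search (not Uppaal code): several
# randomized workers explore an invented weighted graph in parallel and
# share the best accumulated cost found so far to prune the search,
# branch-and-bound style. The graph, costs, and goal are illustrative.
import random
import threading

EDGES = {0: [(1, 3), (2, 1)], 1: [(3, 2)], 2: [(3, 5), (1, 1)], 3: []}
GOAL = 3
best = [10**9]                        # shared upper bound on the cost
lock = threading.Lock()

def search(seed):
    rng = random.Random(seed)
    stack = [(0, 0)]                  # (state, accumulated cost)
    while stack:
        state, cost = stack.pop()
        if cost >= best[0]:           # prune with the shared bound
            continue
        if state == GOAL:
            with lock:
                best[0] = min(best[0], cost)
            continue
        successors = EDGES[state][:]
        rng.shuffle(successors)       # each worker randomizes its order
        stack.extend((s, cost + w) for s, w in successors)

workers = [threading.Thread(target=search, args=(i,)) for i in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print("minimal accumulated cost:", best[0])
```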

  6. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  7. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code are an integral part of this process. This document identifies the work performed and documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals that the validation and verification effort for the GRIMHX code is complete

  8. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  9. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models, which represent the system, and the formalization of requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) to different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
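    To make the model-checking step concrete, here is a hedged, minimal sketch (the scan-cycle model, the interlock logic, and the invariant are invented; PLCverif itself translates IEC 61131-3 programs and delegates to industrial-strength model checkers): exhaustively explore the reachable states of a toy PLC model and check a safety invariant:

```python
# Hedged, minimal illustration of the model-checking step (the scan-cycle
# model, interlock logic, and invariant are invented; PLCverif itself
# translates IEC 61131-3 programs for industrial model checkers):
# exhaustively explore reachable states and check a safety invariant.
from itertools import product

def step(state, inputs):
    """One PLC scan cycle of a toy pump controller."""
    tank_empty, _ = state
    start, stop = inputs
    pump_on = start and not stop and not tank_empty   # interlock logic
    return (tank_empty, pump_on)

def check_invariant():
    """AG !(pump_on & tank_empty), checked by explicit-state search."""
    seen = set()
    frontier = {(True, False), (False, False)}        # initial states
    while frontier:
        state = frontier.pop()
        if state in seen:
            continue
        seen.add(state)
        tank_empty, pump_on = state
        if pump_on and tank_empty:                    # violation found
            return False, state
        for inputs in product([False, True], repeat=2):
            frontier.add(step(state, inputs))
    return True, None                                 # invariant holds

print(check_invariant())
```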

  10. Verification tests for CANDU advanced fuel

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1997-07-01

    For the development of a CANDU advanced fuel, the CANFLEX-NU fuel bundles were tested under reactor operating conditions at the CANDU-Hot test loop. This report describes test results and test methods in the performance verification tests for the CANFLEX-NU bundle design. The main items described in the report are as follows. - Fuel bundle cross-flow test - Endurance fretting/vibration test - Freon CHF test - Production of technical document. (author). 25 refs., 45 tabs., 46 figs

  11. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify the accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for drawings being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by "smart" drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user data base application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  12. Verification of the SLC wake potentials

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-01-01

    The accurate knowledge of the monopole, dipole, and quadrupole wake potentials is essential for SLC. These wake potentials were previously computed by the modal method. The time domain code TBCI allows independent verification of these results. This comparison shows that the two methods agree to within 10% for bunch lengths down to 1 mm. TBCI results also indicate that rounding the irises gives at least a 10% reduction in the wake potentials

  13. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  14. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps ... and copied stamps. Sensitivity and specificity of up to 95% could be obtained on a data set that is publicly available...

  15. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  16. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  17. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
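    A hedged sketch of the likelihood-ratio decision rule at the heart of this approach (the Gaussian score models, the independence across biometrics, and all parameter values are assumptions for illustration, not the paper's fitted joint model):

```python
# Hedged sketch of the likelihood-ratio rule: accept a resident as genuine
# if the likelihood of the observed similarity scores under the "genuine"
# model exceeds that under the "imposter" model. Gaussian score models,
# independence across biometrics, and all parameters are illustrative
# assumptions, not the paper's fitted joint distribution.
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def log_likelihood_ratio(scores, genuine=(0.8, 0.10), imposter=(0.3, 0.15)):
    """Sum of per-biometric log-LRs under an independence assumption."""
    return sum(math.log(gaussian_pdf(s, *genuine) / gaussian_pdf(s, *imposter))
               for s in scores)

scores = [0.75, 0.82, 0.67]   # similarity scores for the acquired images
# The 0.0 threshold trades FRR against FAR; it would be tuned on held-out data.
print("accept" if log_likelihood_ratio(scores) > 0.0 else "reject")
```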

  18. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  19. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating-point computations integrated within the core, designed to reduce cost and complexity. The designed 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single-precision floating-point multiplier, a floating-point adder/subtractor for floating-point operations, and a 32 x 32 Booth's multiplier added to the integer core of the ARM7. The binary representati...

  20. VERA-CS Verification & Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Downar, Thomas [Univ. of Michigan, Ann Arbor, MI (United States)

    2017-02-01

    This report summarizes the current status of VERA-CS Verification and Validation for PWR Core Follow operation and proposes a multi-phase plan for continuing VERA-CS V&V in FY17 and FY18. The proposed plan recognizes the hierarchical nature of a multi-physics code system such as VERA-CS and the importance of first achieving an acceptable level of V&V on each of the single physics codes before focusing on the V&V of the coupled physics solution. The report summarizes the V&V of each of the single physics code systems currently used for core follow analysis (i.e., MPACT, CTF, Multigroup Cross Section Generation, and BISON/Fuel Temperature Tables) and proposes specific actions to achieve a uniformly acceptable level of V&V in FY17. The report also recognizes the ongoing development of other codes important for PWR Core Follow (e.g. TIAMAT, MAMBA3D) and proposes Phase II (FY18) VERA-CS V&V activities in which those codes will also reach an acceptable level of V&V. The report then summarizes the current status of VERA-CS multi-physics V&V for PWR Core Follow and the ongoing PWR Core Follow V&V activities for FY17. An automated procedure and output data format are proposed for standardizing the output for core follow calculations and automatically generating tables and figures for the VERA-CS LaTeX file. A set of acceptance metrics is also proposed for the evaluation and assessment of core follow results that would be used within the script to automatically flag any results which require further analysis or more detailed explanation prior to being added to the VERA-CS validation base. After the Automation Scripts have been completed and tested using BEAVRS, the VERA-CS plan proposes that the Watts Bar cycle depletion cases be performed with the new cross section library and be included in the first draft of the new VERA-CS manual for release at the end of PoR15. Also, within the constraints imposed by the proprietary nature of plant data, as many as possible of the FY17

  1. Advancing the Fork detector for quantitative spent nuclear fuel verification

    Science.gov (United States)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms
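    The go/no-go idea reduces to a consistency test between measured and code-predicted signals; a minimal sketch (the 10% acceptance band and the numbers are assumptions, not a published safeguards criterion):

```python
# Minimal sketch of a go/no-go consistency test of the kind aimed at above:
# compare the measured Fork neutron count rate with the value predicted
# from the operator declaration, and flag large discrepancies. The 10%
# acceptance band is an assumption, not a published safeguards criterion.
def go_no_go(measured, predicted, rel_tolerance=0.10):
    ratio = measured / predicted
    return "go" if abs(ratio - 1.0) <= rel_tolerance else "no-go"

print(go_no_go(measured=1.05e4, predicted=1.00e4))  # 'go': 5% high
```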

  2. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
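    A simplified, one-dimensional sketch of a combined dose-difference/distance-to-agreement test of the 3%/3 mm kind advocated above (this follows the spirit of the gamma evaluation of Low et al.; the implementation and data are illustrative, not clinical code):

```python
# Simplified 1-D dose/distance-to-agreement (gamma-style) test in the
# spirit of the 3%/3 mm criterion discussed above; illustrative only,
# not clinical code. A point passes when gamma <= 1.
import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm, dd=0.03, dta_mm=3.0):
    """Global gamma for ref/eval doses sampled on the same 1-D grid."""
    x = np.arange(len(ref_dose)) * spacing_mm
    dose_tol = dd * ref_dose.max()            # global 3% dose tolerance
    gammas = []
    for xi, di in zip(x, ref_dose):
        # Minimum combined metric over all evaluated points.
        g = np.sqrt(((x - xi) / dta_mm) ** 2 + ((eval_dose - di) / dose_tol) ** 2)
        gammas.append(g.min())
    return np.array(gammas)

ref = np.array([1.00, 1.80, 2.00, 1.80, 1.00])    # planned dose (Gy)
meas = np.array([1.00, 1.75, 2.04, 1.82, 1.02])   # measured dose (Gy)
print("pass rate:", (gamma_1d(ref, meas, spacing_mm=2.0) <= 1.0).mean())
```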

  3. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  4. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  5. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, the MODUS toolset performs model verification by producing the inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  6. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single-field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%; however, the spiral phantom QA technique provides a more complete dosimetric verification while being less time-consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan.

  7. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines to be decommissioned in Canada, and experience gained here may be applied to other mines decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA, as required by CNSC guidance. A paper review of the company report was conducted to determine whether protocols were followed and whether the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site, and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed, as are the data management and analysis methods required for the large amount of data collected during these surveys. Recommendations were made for implementing future surveys and for reporting the data from those surveys in order to ensure that remediation is complete. (authors)

  8. Radiological verification survey results at 14 Peck Ave., Pequannock, New Jersey (PJ001V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The U.S. Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W. R. Grace facility. The property at 14 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 14 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  9. Radiological verification survey results at 7 Peck Ave., Pequannock, New Jersey (PJ003V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 7 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 7 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  10. Radiological verification survey results at 13 Peck Ave., Pequannock, New Jersey (PJ004V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 13 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 13 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  11. Radiological verification survey results at 898 Black Oak Ridge Rd., Wayne, New Jersey (WJ004V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 898 Black Oak Ridge Road, Wayne, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at one meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 898 Black Oak Ridge Road were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  12. Radiological verification survey results at the Pompton Plains Railroad Spur, Pequannock, New Jersey (PJ008V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains railroad spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at the Pompton Plains Railroad Spur, Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at the Pompton Plains railroad spur were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  13. Radiological verification survey results at 3 Peck Ave., Pequannock, New Jersey (PJ002V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 3 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 3 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  14. Radiological verification survey results at 15 Peck Ave., Pequannock, New Jersey (PJ005V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 15 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 15 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  15. Radiological verification survey results at 17 Peck Ave., Pequannock, New Jersey (PJ006V)

    International Nuclear Information System (INIS)

    Rodriguez, R.E.; Johnson, C.A.

    1995-05-01

    The US Department of Energy (DOE) conducted remedial action during 1993 at the Pompton Plains Railroad Spur and eight vicinity properties in the Wayne and Pequannock Townships in New Jersey as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). These properties are in the vicinity of the DOE-owned Wayne Interim Storage Site (WISS), formerly the W.R. Grace facility. The property at 17 Peck Ave., Pequannock, New Jersey is one of these vicinity properties. At the request of DOE, a team from Oak Ridge National Laboratory conducted an independent radiological verification survey at this property. The purpose of the survey, conducted between September and December 1993, was to confirm the success of the remedial actions performed to remove any radioactive materials in excess of the identified guidelines. The verification survey included surface gamma scans and gamma readings at 1 meter, beta-gamma scans, and the collection of soil and debris samples for radionuclide analysis. Results of the survey demonstrated that all radiological measurements on the property at 17 Peck Ave. were within applicable DOE guidelines. Based on the results of the remedial action data and confirmed by the verification survey data, the portions of the site that had been remediated during this action successfully meet the DOE remedial action objectives

  16. Success in geothermal development

    International Nuclear Information System (INIS)

    Stefansson, V.

    1992-01-01

    Success in geothermal development can be defined as the ability to produce geothermal energy at energy prices compatible with other energy sources. Drilling usually comprises the largest cost in geothermal development, and the results of drilling largely influence the final price of geothermal energy. For 20 geothermal fields with operating power plants, the ratio between installed capacity and the total number of wells in the field is 1.9 MWe/well. The drilling histories of 30 geothermal fields are analyzed by plotting the average cumulative well output as a function of the number of wells drilled in the field. The range of the average well output is 1-10 MWe/well, with a mean value of 4.2 MWe/well for the 30 geothermal fields studied. A learning curve is defined as the number of wells drilled in each field before the average output per well reaches a fairly constant value, which is characteristic of the geothermal reservoir. The range for this learning time is 4-36 wells and the average is 13 wells. In general, the average well output in a given field is fairly constant after some 10-20 wells have been drilled in the field. The asymptotic average well output is considered to be a reservoir parameter when it is normalized to the average drilling depth. On average, this reservoir parameter can be expressed as 3.3 MWe per drilled km for the 30 geothermal fields studied. The lifetime of the resource, or the depletion time of the geothermal reservoir, should also be considered a parameter influencing the success of geothermal development. Stepwise development, where the reservoir response to utilization of the first step is used to determine the timing of installing the next step, is considered an appropriate method to minimize the risk of overinvestment in a geothermal field.
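
    The averaging described above is straightforward to reproduce. The sketch below computes the cumulative average well output and the point at which it settles; the tolerance and the synthetic outputs are hypothetical and do not come from the study.

```python
# Sketch of the "learning curve" idea from the abstract: the running average
# well output (MWe/well) as wells are drilled, and the number of wells after
# which it stays roughly constant. Data and threshold are hypothetical.
import numpy as np

def learning_curve_length(well_outputs_mwe, rel_tol=0.05):
    """Number of wells after which the running average of output per well
    changes by less than rel_tol between successive wells."""
    n_wells = len(well_outputs_mwe)
    running_avg = np.cumsum(well_outputs_mwe) / np.arange(1, n_wells + 1)
    rel_change = np.abs(np.diff(running_avg)) / running_avg[1:]
    for n in range(len(rel_change)):
        if np.all(rel_change[n:] < rel_tol):
            return n + 1, running_avg[-1]
    return n_wells, running_avg[-1]

rng = np.random.default_rng(0)
# Early wells erratic, later wells settle around ~4 MWe/well.
outputs = np.concatenate([rng.uniform(1, 9, 8), rng.normal(4.2, 0.5, 22)])
n_learn, asymptotic = learning_curve_length(outputs)
print(f"learning time ~{n_learn} wells, "
      f"asymptotic average {asymptotic:.1f} MWe/well")
```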

  17. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in a mobile multicloud computing environment, a MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of the verification can be obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on an HVR (Homomorphic Verifiable Response) with random masking and an sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified by lightweight computing and low data transmission. The scheme addresses the limitation that mobile devices have restricted communication and computing power, it supports dynamic data operations in a mobile multicloud environment, and data integrity can be verified without using the direct source file block. Experimental results also demonstrate that this scheme can achieve a lower cost of computing and communications.
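
    The sMHT construction referenced above builds on the standard Merkle hash tree. The following sketch shows only the generic mechanism, recomputing a root from a block and its sibling hashes; it is not the paper's exact sequence-enforced scheme, and all names are illustrative.

```python
# Minimal Merkle-hash-tree sketch of block-level integrity verification.
# Generic mechanism only; not the paper's sMHT scheme.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes needed to recompute the root for leaves[index]."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1                    # sibling shares all bits but the last
        proof.append((level[sib], sib < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_block(block, proof, root):
    node = h(block)
    for sib_hash, sib_is_left in proof:
        node = h(sib_hash + node) if sib_is_left else h(node + sib_hash)
    return node == root

blocks = [b"block-%d" % i for i in range(8)]
root = merkle_root(blocks)
proof = merkle_proof(blocks, 5)
print(verify_block(b"block-5", proof, root))    # True
print(verify_block(b"tampered", proof, root))   # False
```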

  18. Educational Attainment: Success to the Successful

    Science.gov (United States)

    Anthony, Peter; Gould, David; Smith, Gina

    2013-01-01

    Systems archetypes are patterns of structure found in systems that are helpful in understanding some of the dynamics within them. The intent of this study was to examine educational attainment data using the success-to-the-successful archetype as a model to see if it helps to explain the inequality observed in the data. Data covering 1990 to 2009…

  19. College Success Courses: Success for All

    Science.gov (United States)

    Coleman, Sandra Lee; Skidmore, Susan Troncoso; Weller, Carol Thornton

    2018-01-01

    College success courses (CSCs), or orientation courses, are offered by community colleges and universities to facilitate the success of first-time-in-college students. Primarily, these courses are designed to address students' nonacademic deficiencies, such as weak study habits and poor organizational skills, and to familiarize students with…

  20. Numerical verification of composite rods theory on multi-story buildings analysis

    Science.gov (United States)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

    In this article, a verification proposal of the composite rods theory is applied to the structural analysis of skeletons of high-rise buildings. A test design model was formed in which the horizontal elements are represented by a multilayer cantilever beam operating in transverse bending, whose slabs are connected by moment-non-transferring connections, and the vertical elements are represented by multilayer columns. These connections are sufficient to develop a shearing action that can be approximated by a certain shear-force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the operating mechanism of the multilayer rods and is solved numerically by the method of successive approximations. The proposed methodology is intended for preliminary calculations, when the rigidity characteristics of the structure need to be determined, and for a qualitative assessment of the results obtained by other methods when performing calculations for verification purposes.
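
    The method of successive approximations named above can be illustrated on a simple test equation. The sketch below applies the generic Picard iteration to y' = y with y(0) = 1; the multilayer-rod equations themselves are not reproduced here.

```python
# Generic successive-approximations (Picard iteration) sketch, the numerical
# idea named in the abstract, on the test ODE y' = y, y(0) = 1.
import numpy as np

def picard_iterate(f, y0, t, n_iter=20):
    """Iterate y_{k+1}(t) = y0 + integral_0^t f(s, y_k(s)) ds on a grid."""
    y = np.full_like(t, y0, dtype=float)     # initial guess y_0(t) = y0
    for _ in range(n_iter):
        integrand = f(t, y)
        # cumulative trapezoidal integral from t[0] to each grid point
        integral = np.concatenate(([0.0], np.cumsum(
            0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))))
        y = y0 + integral
    return y

t = np.linspace(0.0, 1.0, 101)
y = picard_iterate(lambda s, y: y, 1.0, t)
print(f"max error vs exp(t): {np.max(np.abs(y - np.exp(t))):.2e}")
```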

  1. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.

  2. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model-Driven Engineering to automate the verification and validation of software on board satellites, and it is applied to the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, the verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any of the constraints is not satisfied by the on-target evidence, the scheduling analysis is not valid.
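
    As a rough illustration of feeding on-target evidence into timing constraints, the sketch below checks hypothetical per-task period and deadline constraints against instrumented activation and completion times. The constraint form and all names are simplified assumptions, not the authors' automata-based tooling.

```python
# Minimal sketch of checking timing constraints against on-target evidence.
# The constraint form (per-task period/deadline/jitter) is a hypothetical
# simplification of the timed-automata checks described in the abstract.
from dataclasses import dataclass

@dataclass
class TimingConstraint:
    task: str
    period_ms: float      # expected activation period
    deadline_ms: float    # max allowed response time
    jitter_ms: float      # allowed deviation of the period

def check_constraint(c, activations_ms, completions_ms):
    """Return (ok, message) for the observed activation/completion times."""
    for i, (start, end) in enumerate(zip(activations_ms, completions_ms)):
        if end - start > c.deadline_ms:
            return False, f"{c.task}: deadline miss at activation {i}"
        if i > 0:
            period = start - activations_ms[i - 1]
            if abs(period - c.period_ms) > c.jitter_ms:
                return False, f"{c.task}: period violation at activation {i}"
    return True, f"{c.task}: all constraints satisfied"

c = TimingConstraint("acquire_particle_counts", period_ms=100.0,
                     deadline_ms=20.0, jitter_ms=1.0)
starts = [0.0, 100.2, 199.9, 300.1]     # instrumented activation times
ends = [12.0, 113.5, 214.0, 311.0]      # instrumented completion times
print(check_constraint(c, starts, ends))
```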

  3. Complex-Wide Waste Flow Analysis V1.0 verification and validation report

    International Nuclear Information System (INIS)

    Hsu, K.M.; Lundeen, A.S.; Oswald, K.B.; Shropshire, D.E.; Robinson, J.M.; West, W.H.

    1997-01-01

    The complex-wide waste flow analysis model (CWWFA) was developed to assist the Department of Energy (DOE) Environmental Management (EM) Office of Science and Technology (EM-50) in evaluating waste management scenarios, with emphasis on identifying and prioritizing technology development opportunities to reduce waste flows and public risk. In addition, the model was intended to support the needs of the Complex-Wide Environmental Integration (EMI) team supporting the DOE's Accelerating Cleanup: 2006 Plan. CWWFA represents an integrated environmental modeling system that covers the life cycle of waste management activities, including waste generation, interim process storage, retrieval, characterization and sorting, waste preparation and processing, packaging, final interim storage, transport, and disposal at a final repository. The CWWFA shows waste flows through actual site-specific and facility-specific conditions. The system requirements for CWWFA are documented in the Technical Requirements Document (TRD). The TRD is intended to be a living document that will be modified over the course of the execution of CWWFA development. Thus, it is anticipated that CWWFA will continue to evolve as new requirements are identified (i.e., transportation, small sites, new streams, etc.). This report provides a documented basis for system verification of CWWFA requirements. System verification is accomplished through formal testing and evaluation to ensure that all performance requirements as specified in the TRD have been satisfied. A Requirement Verification Matrix (RVM) was used to map the technical requirements to the test procedures; the RVM is attached as Appendix A. Since February of 1997, substantial progress has been made toward development of the CWWFA to meet the system requirements. This system verification activity provides a baseline on system compliance with requirements and also an opportunity to reevaluate which requirements need to be satisfied in FY-98.

  4. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for the iterative reconstruction of nonnegative sparse signals, using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm alone. Simulation resul...
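
    For orientation, the sketch below implements only the basic verification rules for nonnegative signals with a 0/1 measurement matrix: a zero measurement verifies all of its neighbours as zero, and a check with a single unverified neighbour solves it directly. The interval-passing refinements of the paper are not included, and recovery is not guaranteed for every random matrix.

```python
# Basic verification decoding for nonnegative sparse signals with a 0/1
# (LDPC-style) measurement matrix. Simplified illustration only.
import numpy as np

def verification_decode(A, y, max_rounds=50, tol=1e-9):
    m, n = A.shape
    x = np.zeros(n)
    verified = np.zeros(n, dtype=bool)
    for _ in range(max_rounds):
        progress = False
        for j in range(m):
            nbrs = np.flatnonzero(A[j])
            unv = nbrs[~verified[nbrs]]
            if len(unv) == 0:
                continue
            residual = y[j] - x[nbrs[verified[nbrs]]].sum()
            if abs(residual) < tol:      # nonnegativity: all unverified are 0
                verified[unv] = True
                progress = True
            elif len(unv) == 1:          # single unknown: solve it exactly
                x[unv[0]] = residual
                verified[unv[0]] = True
                progress = True
        if not progress:
            break
    return x, verified

rng = np.random.default_rng(1)
n, m, k = 40, 24, 4
A = (rng.random((m, n)) < 0.15).astype(float)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1, 5, k)
x_hat, ok = verification_decode(A, A @ x_true)
print("fully recovered:", bool(ok.all()) and np.allclose(x_hat, x_true))
```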

  5. First Images from VLT Science Verification Programme

    Science.gov (United States)

    1998-09-01

    Two Weeks of Intensive Observations Successfully Concluded. After a period of technical commissioning tests, the first 8.2-m telescope of the ESO VLT (UT1) has successfully performed an extensive series of "real science" observations, yielding nearly 100 hours of precious data. They concern all possible types of astronomical objects, from distant galaxies and quasars to pulsars, star clusters and solar system objects. This intensive Science Verification (SV) Programme took place as planned from August 17 to September 1, 1998, and was conducted by the ESO SV Team at the VLT Observatory on Paranal (Chile) and at the ESO Headquarters in Garching (Germany). The new giant telescope lived fully up to the high expectations and worked with spectacular efficiency and performance through the entire period. All data will be released by September 30 via the VLT archive and the web (with some access restrictions - see below). The Science Verification period. Just before the beginning of the SV period, the 8.2-m primary mirror in its cell was temporarily removed in order to install the "M3 tower" with the tertiary mirror [1]. The reassembly began on August 15 and included re-installation at the Cassegrain focus of the VLT Test Camera that was also used for the "First Light" images in May 1998. After careful optical alignment and various system tests, the UT1 was handed over to the SV Team on August 17 at midnight local time. The first SV observations began immediately thereafter and the SV Team was active 24 hours a day throughout the two-week period. Video-conferences between Garching and Paranal took place every day at about noon Garching time (6 o'clock in the morning on Paranal). Then, while the Paranal observers were sleeping, data from the previous night were inspected and reduced in Garching, with feedback on what was best to do during the following night being emailed to Paranal several hours in advance of the beginning of the observations. The campaign ended in the

  6. Attitudes of Success.

    Science.gov (United States)

    Pendarvis, Faye

    This document investigates the attitudes of successful individuals, citing the achievement of established goals as the criterion for success. After offering various definitions of success, the paper focuses on the importance of self-esteem to success and considers ways by which the self-esteem of students can be improved. Theories of human behavior…

  7. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  8. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    ... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provide assistance to the subject...

  9. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps are identified and possible improvements are proposed.

  10. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...

  11. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of each verification criterion varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. The delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for the seven routine biochemical tests were obtained.
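
    Two of the surveyed criteria, verification limits and the delta check, reduce to simple rules. A minimal sketch follows; the limits and the delta fraction below are placeholders for illustration, not values from the survey.

```python
# Minimal autoverification sketch using two of the criteria named in the
# survey: verification limits and a delta check against the patient's
# previous result. Limits are hypothetical placeholders.
LIMITS = {
    "glucose": (2.0, 25.0),      # mmol/L, hypothetical
    "potassium": (2.5, 7.0),     # mmol/L, hypothetical
}

def autoverify(test, value, previous=None, delta_frac=0.5):
    lo, hi = LIMITS[test]
    if not (lo <= value <= hi):
        return "hold: outside verification limits"
    if previous is not None and abs(value - previous) > delta_frac * previous:
        return "hold: delta check failed"
    return "release automatically"

print(autoverify("potassium", 4.1, previous=3.9))   # release automatically
print(autoverify("potassium", 6.8, previous=3.9))   # hold: delta check failed
```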

  12. Final report on Weeks Island Monitoring Phase : 1999 through 2004.

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L.; Munson, Darrell Eugene

    2005-05-01

    This Final Report on the Monitoring Phase of the former Weeks Island Strategic Petroleum Reserve crude oil storage facility details the results of five years of monitoring of various surface-accessible quantities at the decommissioned facility. The Weeks Island mine was authorized by the State of Louisiana as a Strategic Petroleum Reserve oil storage facility from 1979 until decommissioning of the facility in 1999. Discovery of a sinkhole over the facility in 1992, with freshwater inflow to the facility, threatened the integrity of the oil storage and led to the decision to remove the oil, fill the chambers with brine, and decommission the facility. Thereafter, a monitoring phase, by agreement between the Department of Energy and the State, addressed facility stability and environmental concerns. Monitoring of the surface ground water and of the brine of the underground chambers from the East Fill Hole produced no evidence of hydrocarbon contamination, which suggests that any unrecovered oil remaining in the underground chambers has been contained. The ever-diminishing progression with time of the initial major sinkhole, and of a subsequent minor sinkhole, verified the response of sinkholes to filling of the facility with brine; brine filling ostensibly eliminates any further growth, or new sinkhole formation, from freshwater inflow. Continued monitoring of sinkhole response, together with continued surface surveillance for environmental problems, confirmed the intended results of brine pressurization. Surface subsidence measurements over the mine continued throughout the monitoring phase. Finally, the outward flow of brine was monitored as a measure of the creep closure of the mine chambers. Results of each of these monitoring activities are presented, with their correlation toward assuring the stability and environmental security of the decommissioned facility. The results suggest that the decommissioning was successful and no contamination of the

  13. Machine learning techniques for the verification of refueling activities in CANDU-type nuclear power plants (NPPs) with direct applications in nuclear safeguards

    International Nuclear Information System (INIS)

    Budzinski, J.

    2006-06-01

    This dissertation deals with the problem of automated classification of the signals obtained from certain radiation monitoring systems, specifically from the Core Discharge Monitor (CDM) systems that are successfully operated by the International Atomic Energy Agency (IAEA) at various CANDU-type nuclear power plants around the world. In order to significantly reduce the costly and error-prone manual evaluation of the large amounts of collected CDM signals, a reliable and efficient algorithm for automated data evaluation is necessary, one that can ensure real-time performance with at most a 0.01% misclassification ratio. This thesis describes the research behind a successful prototype implementation of such automated analysis software. The adopted methodology assumes a nonstationary data-generating process that has a finite number of states, or basic fueling activities, each of which can emit observable data patterns having particular stationary characteristics. To infer the underlying state sequences, a unified probabilistic approach known as the hidden Markov model (HMM) is used. Each possible fueling sequence is modeled by a distinct HMM having a left-right profile topology with explicit insert and delete states. Given an unknown fueling sequence, a dynamic programming algorithm akin to the Viterbi search is used to find the maximum-likelihood state path through each model, and the overall best-scoring path is picked as the recognition hypothesis. Machine learning techniques are applied to estimate the observation densities of the states, because the densities are not simply parameterizable. Unlike most present applications of continuous monitoring systems, which rely on heuristic approaches to the recognition of possibly risky events, this research focuses on techniques that make optimal use of prior knowledge and computer simulation in the recognition task. Thus, a suitably modified, approximate n-best variant of
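
    The maximum-likelihood state path mentioned above is found with the Viterbi algorithm. The sketch below is a compact generic implementation on a toy two-state model standing in for the fueling-activity profile HMMs; all probabilities are invented for illustration.

```python
# Compact Viterbi sketch: the maximum-likelihood state path through an HMM,
# the core of the recognition step described in the abstract.
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """log_pi[s]: initial, log_A[s,t]: transition, log_B[s,o]: emission."""
    delta = log_pi + log_B[:, obs[0]]
    back = []
    for o in obs[1:]:
        cand = delta[:, None] + log_A          # scores via each predecessor
        back.append(np.argmax(cand, axis=0))
        delta = np.max(cand, axis=0) + log_B[:, o]
    path = [int(np.argmax(delta))]
    for bp in reversed(back):                  # backtrack best predecessors
        path.append(int(bp[path[-1]]))
    return path[::-1], float(np.max(delta))

# States: 0 = "idle", 1 = "fuel transfer"; observations: 0 = low, 1 = high count.
log_pi = np.log([0.9, 0.1])
log_A = np.log([[0.95, 0.05], [0.10, 0.90]])
log_B = np.log([[0.9, 0.1], [0.2, 0.8]])
path, score = viterbi(log_pi, log_A, log_B, [0, 0, 1, 1, 1, 0])
print(path)    # most likely activity sequence for the observed counts
```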

  14. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems which intermix two kinds of components: discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances which cause the state variables of the system to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on cooperation between the two disciplines, control engineering and computer science, and divides the design into control loops and decision loops. The external behavior of the control loops is specified in a notation understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computing scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as the formal notation for specifying the control loops and designing the decision loops.

  15. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and, following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.

  16. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on the various corrections and assumptions between the ex-core detectors and the core required by traditional physics testing programs. It also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  17. MODELS CONCERNING PREVENTIVE VERIFICATION OF TECHNICAL EQUIPMENT

    Directory of Open Access Journals (Sweden)

    CÂRLAN M.

    2016-12-01

    Full Text Available The paper presents three operative models whose purpose is to improve the practice of preventive maintenance for a wide range of technical installations. Although the calculation criteria are different, the goal is the same: to determine the optimum time between two consecutive preventive interventions. The optimum criteria of these models are: the maximum share of technical-entity operating probabilities, in the case of the Ackoff-Sasieni method [1]; the optimum time interval for preventive verification depending on the preventive-corrective maintenance costs imposed by the deciding factor, for the Asturio-Baldin model [2]; and the minimum number of renewals - preventive and/or corrective maintenance operations [3].

  18. Verification report for SIMREP 1.1

    International Nuclear Information System (INIS)

    Tarapore, P.S.

    1987-06-01

    SIMREP 1.1 is a discrete event computer simulation of repository operations in the surface waste-handling facility. The logic for this model is provided by Fluor Technology, Inc., the Architect/Engineer of the salt repository. The verification methods included a line-by-line review of the code, a detailed examination of a generated trace of all simulated events over a given period of operations, and a comparison of the simulation output results with expected values. SIMREP 1.1 performs in the required manner under the given range of input conditions

  19. Turf Conversion Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings as a result of water conservation measures (WCMs) in energy performance contracts associated with converting turfgrass or other water-intensive plantings to water-wise and sustainable landscapes. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.

  20. Outdoor Irrigation Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with outdoor irrigation efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.

  1. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

    Process models were developed to characterize the waste vitrification at West Valley in terms of process operating constraints and achievable glass compositions. The need to verify compliance with the proposed Waste Acceptance Preliminary Specification criteria led to the development of product models, the most critical being a glass durability model. Both process and product models were used in developing a target composition for the waste glass. This target composition is designed to ensure that glasses made to it will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs

  2. CIT photoheliograph functional verification unit test program

    Science.gov (United States)

    1973-01-01

    Tests of the 2/3-meter photoheliograph functional verification unit (FVU) were performed with the FVU installed in its Big Bear Solar Observatory vacuum chamber. Interferometric tests were run in both the Newtonian (f/3.85) and Gregorian (f/50) configurations. Tests were run in both configurations with the optical axis horizontal, vertical, and at 45 deg to attempt to determine any gravity effects on the system. Gravity effects, if present, were masked by scatter in the data associated with the system wavefront error of 0.16 lambda rms (λ = 6328 Å), apparently due to problems in the primary mirror. Tests showed that the redesigned secondary mirror assembly works well.

  3. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  4. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform bit shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.

  5. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company

  6. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature not only provides first-line recognition by the person on the street, but also facilitates machine readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high-speed modules were fabricated and tested in a state-of-the-art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified, providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time it takes the cash drawer to be opened.

  7. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
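
    One common way to integrate detection across subsystems is to combine the per-technology miss probabilities, as sketched below. This is a generic rule under an independence assumption, not necessarily IVSEM's internal formulation, and the probabilities are hypothetical.

```python
# Sketch of integrating subsystem detection probabilities under an
# independence assumption: P(system detects) = 1 - prod(1 - p_i).
def combined_detection(p_subsystems):
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= (1.0 - p)          # event missed only if every subsystem misses
    return 1.0 - p_miss

# Hypothetical per-technology detection probabilities for one event:
p = {"seismic": 0.80, "infrasound": 0.40,
     "radionuclide": 0.55, "hydroacoustic": 0.10}
print(f"integrated P(detect) = {combined_detection(p.values()):.3f}")
```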

  8. Nonintrusive verification attributes for excess fissile materials

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status

  9. Advanced Technologies for Design Information Verification

    International Nuclear Information System (INIS)

    Watkins, Michael L.; Sheen, David M.; Rose, Joseph L.; Cumblidge, Stephen E.

    2009-01-01

    This paper discusses several technologies that have the potential to enhance facilities design verification. These approaches have shown promise in addressing the challenges associated with the verification of sub-component geometry and material composition for structures that are not directly accessible for physical inspection. A simple example is a pipe that extends into or through a wall or foundation. Both advanced electromagnetic and acoustic modalities will be discussed. These include advanced radar imaging, transient thermographic imaging, and guided acoustic wave imaging. Examples of current applications are provided. The basic principles and mechanisms of these inspection techniques are presented along with the salient practical features, advantages, and disadvantages of each technique. Other important considerations, such as component geometries, materials, and degree of access are also treated. The importance of, and strategies for, developing valid inspection models are also discussed. Beyond these basic technology adaptation and evaluation issues, important user interface considerations are outlined, along with approaches to quantify the overall performance reliability of the various inspection methods.

  10. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
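
    For orientation, the hydrostatic balance such a model rests on can be sketched in a few lines; the densities, depths, and single nitrogen/oil interface below are illustrative placeholders, not SPR well data.

        # Minimal sketch of the hydrostatic balance behind an HCM-style model:
        # wellhead pressure = cavern pressure minus the weight of the fluid
        # columns above it. All values are illustrative placeholders.
        G = 9.81  # m/s^2

        def wellhead_pressure(p_cavern_pa, columns):
            """columns: list of (density in kg/m^3, height in m), top down."""
            return p_cavern_pa - sum(rho * G * h for rho, h in columns)

        # Hypothetical well: 300 m of nitrogen above 400 m of crude oil
        p = wellhead_pressure(12.0e6, [(90.0, 300.0), (850.0, 400.0)])
        print(f"{p / 1e6:.2f} MPa")  # 8.40 MPa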

  11. Automatic quality verification of the TV sets

    Science.gov (United States)

    Marijan, Dusica; Zlokolica, Vladimir; Teslic, Nikola; Pekovic, Vukota; Temerinac, Miodrag

    2010-01-01

    In this paper we propose a methodology for TV set verification, intended for detecting picture quality degradation and functional failures within a TV set. In the proposed approach we compare the TV picture captured from a TV set under investigation with the reference image for the corresponding TV set, in order to assess the captured picture quality and thereby the acceptability of the TV set. The methodology framework comprises a logic block for designing the verification process flow, a block for TV set quality estimation (based on image quality assessment) and a block for generating the defect tracking database. The quality assessment algorithm is a full-reference intra-frame approach which aims at detecting various TV-set-specific digital picture degradations arising from TV system hardware and software failures and from erroneous operational modes and settings in TV sets. The proposed algorithm is a block-based scheme which incorporates the mean square error and a local variance between the reference and the tested image. The artifact detection algorithm is shown to be highly robust against brightness and contrast changes in TV sets. The algorithm is evaluated by performance comparison with other state-of-the-art image quality assessment metrics in terms of detecting TV picture degradations, such as illumination and contrast change, compression artifacts, picture misalignment, aliasing, blurring and other types of degradations that are due to defects within the TV set video chain.
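
    A toy version of such a block-based comparison is sketched below; the block size, noise model, and scoring are illustrative and not the authors' tuned algorithm.

        import numpy as np

        # Minimal sketch of a block-based full-reference check combining MSE
        # with a local-variance difference, in the spirit of the described
        # algorithm; block size and test data are illustrative.
        def block_scores(ref, test, block=16):
            scores = []
            for y in range(0, ref.shape[0] - block + 1, block):
                for x in range(0, ref.shape[1] - block + 1, block):
                    r = ref[y:y + block, x:x + block].astype(float)
                    t = test[y:y + block, x:x + block].astype(float)
                    mse = np.mean((r - t) ** 2)
                    dvar = abs(r.var() - t.var())
                    scores.append((mse, dvar))
            return np.array(scores)

        rng = np.random.default_rng(0)
        ref = rng.integers(0, 256, (64, 64))
        test = np.clip(ref + rng.normal(0, 3, ref.shape), 0, 255)
        print("worst block MSE:", block_scores(ref, test)[:, 0].max())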

  12. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements.

  13. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  14. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Uses of subsurface barriers include surrounding and/or containing buried waste, providing secondary confinement of underground storage tanks, directing or containing subsurface contaminant plumes, and restricting remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and, depending on use, have few or no breaches. A breach may form through numerous pathways, including discontinuous grout application, joints between panels, and cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers make detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  15. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  16. Clinical verification in homeopathy and allergic conditions.

    Science.gov (United States)

    Van Wassenhoven, Michel

    2013-01-01

    The literature on clinical research in allergic conditions treated with homeopathy includes a meta-analysis of randomised controlled trials (RCT) for hay fever with positive conclusions and two positive RCTs in asthma. Cohort surveys using validated Quality of Life questionnaires have shown improvement in asthma in children, general allergic conditions and skin diseases. Economic surveys have shown positive results in eczema, allergy, seasonal allergic rhinitis, asthma, food allergy and chronic allergic rhinitis. This paper reports clinical verification of homeopathic symptoms in all patients and especially in various allergic conditions in my own primary care practice. For preventive treatments in hay fever patients, Arsenicum album was the most effective homeopathic medicine followed by Nux vomica, Pulsatilla pratensis, Gelsemium, Sarsaparilla, Silicea and Natrum muriaticum. For asthma patients, Arsenicum iodatum appeared most effective, followed by Lachesis, Calcarea arsenicosa, Carbo vegetabilis and Silicea. For eczema and urticaria, Mezereum was most effective, followed by Lycopodium, Sepia, Arsenicum iodatum, Calcarea carbonica and Psorinum. The choice of homeopathic medicine depends on the presence of other associated symptoms and 'constitutional' features. Repertories should be updated by including results of such clinical verifications of homeopathic prescribing symptoms. Copyright © 2012 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  17. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  18. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  19. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42mm miniature replica step gauge developed for optical scanner verification. Errors quantification and optimization of CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  20. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  1. Modular Verification of Linked Lists with Views via Separation Logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2011-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for .NET. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since...

  2. A transformation of SDL specifications : a step towards the verification

    NARCIS (Netherlands)

    Ioustinova, N.; Sidorova, N.; Bjorner, D.; Broy, M.; Zamulin, A.

    2001-01-01

    Industrial-size specifications/models (whose state space is often infinite) cannot be model checked in a direct way; a verification model of the system is model checked instead. Program transformation is a way to build a finite-state verification model that can be submitted to a model checker.

  3. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check, at the installations where the instruments are used, that their calibration has been maintained, avoiding the use of equipment whose response to the neutron beam is inadequate.

  4. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  5. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  6. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  7. Neutron spectrometric methods for core inventory verification in research reactors

    International Nuclear Information System (INIS)

    Ellinger, A.; Filges, U.; Hansen, W.; Knorr, J.; Schneider, R.

    2002-01-01

    As a consequence of Non-Proliferation Treaty safeguards, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate. The inspection methods are continually improved. Therefore, the Core Inventory Verification method is being developed as an indirect method for the verification of the core inventory and to check the declared operation of research reactors.

  8. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  9. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  10. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  11. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  12. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes in compliance with verification and validation requirements.

  13. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, aimed at assessing that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
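
    As a concrete illustration of the solution-verification step, the sketch below estimates an observed order of accuracy by Richardson extrapolation from three grid levels; the functional values are invented for the example and are not GBS results.

        import math

        # Minimal sketch of solution verification via Richardson extrapolation:
        # estimate the observed order of accuracy from solutions on three grids
        # refined by a constant factor r. The values below are invented.
        def observed_order(f_coarse, f_medium, f_fine, r):
            return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

        f1, f2, f3 = 1.0500, 1.0125, 1.0031  # coarse, medium, fine solutions
        p = observed_order(f1, f2, f3, r=2.0)
        f_star = f3 + (f3 - f2) / (2.0 ** p - 1.0)  # extrapolated estimate
        print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_star:.4f}")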

  14. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  15. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-06-03

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman's test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
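
    The statistical core of the method is available off the shelf; a minimal sketch using SciPy follows, with the guaranteed power curve treated as one more "turbine". The power readings are invented, not data from the paper.

        from scipy.stats import friedmanchisquare

        # Minimal sketch: Friedman test over power readings of two turbines
        # plus the guaranteed curve, observed in the same wind-speed bins.
        guaranteed = [510, 980, 1450, 1890, 2250]  # kW per wind-speed bin
        turbine_a  = [505, 975, 1440, 1885, 2245]
        turbine_b  = [430, 890, 1300, 1700, 2050]  # hypothetical underperformer

        stat, p_value = friedmanchisquare(guaranteed, turbine_a, turbine_b)
        print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.4f}")
        # A small p-value flags a power-performance difference somewhere;
        # a multiple-comparison step would then locate the turbine involved.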

  16. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Science.gov (United States)

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628

  17. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2016-06-01

    Full Text Available In this paper, a method of verification of the power performance of a wind farm is presented. This method is based on the Friedman’s test, which is a nonparametric statistical inference technique, and it uses the information that is collected by the SCADA system from the sensors embedded in the wind turbines in order to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method says whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has either a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.

  18. Verification and validation of software related to nuclear power plant control and instrumentation

    International Nuclear Information System (INIS)

    Wall, N.; Kossilov, A.

    1994-01-01

    There has always been significant concern with the introduction of software in industry, and the nuclear industry is no different from any other sector, save that its safety demands are among the most onerous. The problems associated with software have led to the well-documented difficulties in the introduction of computer-based systems. An important area of concern with software in systems is the process of verification and validation. One of the many activities the IAEA is currently engaged in is the preparation of a document on the process of verification and validation of software. The document follows the safety classification of IEC 1226 but includes software important to plant operation, to establish three levels of assurance. The software that might be deployed on a plant was then identified as one of four types: new software; existing software for which full access to the code and documentation is possible; existing software of a proprietary nature; and finally configurable software. The document attempts to identify the appropriate methods and tools for conducting the verification and validation processes. (author). 5 refs, 5 figs, 7 tabs

  19. Evaluation of GafChromic EBT prototype B for external beam dose verification

    International Nuclear Information System (INIS)

    Todorovic, M.; Fischer, M.; Cremers, F.; Thom, E.; Schmidt, R.

    2006-01-01

    The capability of the new GafChromic EBT prototype B for external beam dose verification is investigated in this paper. First, the general characteristics of this film (dose response, postirradiation coloration, influence of calibration field size) were derived using a flat-bed scanner. In the dose range from 0.1 to 8 Gy, the sensitivity of the EBT prototype B film is ten times higher than the response of the GafChromic HS, which so far was the GafChromic film with the highest sensitivity. Compared with the Kodak EDR2 film, the response of the EBT is higher by a factor of 3 in the dose range from 0.1 to 8 Gy. The GafChromic EBT shows almost no temporal growth of the optical density, and there is no influence of the chosen calibration field size on the dose response curve obtained from this data. A MatLab program was written to evaluate the two-dimensional dose distributions from treatment planning systems and GafChromic EBT film measurements. Verification of external beam therapy (SRT, IMRT) using the above-mentioned approach resulted in very small differences between the planned and the applied dose. The GafChromic EBT prototype B, together with the flat-bed scanner and MatLab, is a successful approach for making the advantages of GafChromic films applicable to the verification of external beam therapy.
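
    A much-reduced sketch of the kind of 2D comparison such a tool performs is given below: a per-pixel dose-difference pass rate between planned and film-measured distributions. The 3% criterion and the synthetic dose grids are illustrative, not the authors' evaluation settings.

        import numpy as np

        # Minimal sketch of a planned-vs-measured 2D dose comparison:
        # per-pixel dose difference, scored against a global 3% criterion.
        def pass_rate(planned, measured, tol=0.03):
            return np.mean(np.abs(measured - planned) <= tol * planned.max())

        planned = np.fromfunction(
            lambda y, x: np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0),
            (64, 64))
        noise = np.random.default_rng(1).normal(0.0, 0.01, planned.shape)
        measured = planned * (1.0 + noise)  # synthetic "film" measurement
        print(f"pass rate: {pass_rate(planned, measured):.1%}")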

  20. Success in Science, Success in Collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, Mariann R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-25

    This is a series of four different scientific problems which were resolved through collaborations. They are: "Better flow cytometry through novel focusing technology", "Take Off®: Helping the Agriculture Industry Improve the Viability of Sustainable, Large-Production Crops", "The National Institutes of Health's Models of Infectious Disease Agent Study (MIDAS)", and "Expanding the capabilities of SOLVE/RESOLVE through the PHENIX Consortium." For each one, the problem is listed, the solution, advantages, bottom line, then information about the collaboration including: developing the technology, initial success, and continued success.

  1. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise, among others, precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
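
    Two of the listed items translate directly into small calculations; the sketch below computes precision as CV% on replicate counts and checks linearity with a least-squares fit over a dilution series. The measurement values and the acceptance limit are illustrative placeholders.

        import numpy as np

        # Minimal sketch of two verification items for a blood cell counter.
        replicates = np.array([7.9, 8.1, 8.0, 7.8, 8.2])  # WBC, 10^9/L
        cv_percent = 100 * replicates.std(ddof=1) / replicates.mean()

        expected = np.array([0.0, 2.5, 5.0, 7.5, 10.0])   # dilution series
        observed = np.array([0.1, 2.4, 5.1, 7.4, 10.2])
        slope, intercept = np.polyfit(expected, observed, 1)

        print(f"precision: CV = {cv_percent:.1f}% (limit, e.g., < 3%)")
        print(f"linearity: slope {slope:.3f}, intercept {intercept:.2f}")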

  2. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained in real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented in two-step correlation, so only authorized identity keys can output the discriminate auto-correlation and cross-correlation signals that satisfy the reset threshold values. Compared with the traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
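
    For orientation, a single phase-only correlation step can be sketched in a few lines; the paper's two-step scheme with nonlinear encoding is not reproduced here, and the keys below are random stand-ins.

        import numpy as np

        # Minimal sketch of one phase-only correlation (POC) step: the cross
        # spectrum is normalized to unit magnitude, so a matching key yields
        # a sharp unit peak and a mismatched key yields only noise.
        def phase_only_correlation(f, g):
            cross = np.fft.fft2(f) * np.conj(np.fft.fft2(g))
            return np.real(np.fft.ifft2(cross / (np.abs(cross) + 1e-12)))

        rng = np.random.default_rng(2)
        key = rng.random((64, 64))
        print(phase_only_correlation(key, key).max())                   # ~1.0
        print(phase_only_correlation(key, rng.random((64, 64))).max())  # small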

  3. Architecting Fault Tolerance with Exception Handling: Verification and Validation

    Institute of Scientific and Technical Information of China (English)

    Patrick H. S. Brito; Rogério de Lemos; Cecília M. F. Rubira; Eliane Martins

    2009-01-01

    When building dependable systems by integrating untrusted software components that were not originally designed to interact with each other, architectural mismatches related to assumptions about their failure behaviour are likely to occur. These mismatches, if not prevented during system design, have to be tolerated at runtime. This paper presents an architectural abstraction based on exception handling for structuring fault-tolerant software systems. This abstraction comprises several components and connectors that promote an existing untrusted software element into an idealised fault-tolerant architectural element. Moreover, it is considered in the context of a rigorous software development approach based on formal methods for representing the structure and behaviour of the software architecture. The proposed approach relies on formal specification and verification for analysing exception propagation and verifying important dependability properties, such as deadlock freedom, and scenarios of architectural reconfiguration. The formal models are automatically generated using model transformation from UML diagrams: a component diagram representing the system structure, and sequence diagrams representing the system behaviour. Finally, the formal models are also used for generating unit and integration test cases that are used for assessing the correctness of the source code. The feasibility of the proposed architectural approach was evaluated on an embedded critical case study.

  4. Weak Lensing by Galaxy Troughs in DES Science Verification Data

    Energy Technology Data Exchange (ETDEWEB)

    Gruen, D. [Ludwig Maximilian Univ., Munich (Germany); Max Planck Inst. for Extraterrestrial Physics, Garching (Germany). et al.

    2015-09-29

    We measure the weak lensing shear around galaxy troughs, i.e. the radial alignment of background galaxies relative to underdensities in projections of the foreground galaxy field over a wide range of redshift in Science Verification data from the Dark Energy Survey. Our detection of the shear signal is highly significant (10σ–15σ for the smallest angular scales) for troughs with the redshift range z ∈ [0.2, 0.5] of the projected galaxy field and angular diameters of 10 arcmin to 1°. These measurements probe the connection between the galaxy, matter density, and convergence fields. By assuming galaxies are biased tracers of the matter density with Poissonian noise, we find agreement of our measurements with predictions in a fiducial Λ cold dark matter model. Furthermore, the prediction for the lensing signal on large trough scales is virtually independent of the details of the underlying model for the connection of galaxies and matter. Our comparison of the shear around troughs with that around cylinders with large galaxy counts is consistent with a symmetry between galaxy and matter over- and underdensities. In addition, we measure the two-point angular correlation of troughs with galaxies which, in contrast to the lensing signal, is sensitive to galaxy bias on all scales. The lensing signal of troughs and their clustering with galaxies are therefore a promising probe of the statistical properties of matter underdensities and their connection to the galaxy field.

  5. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operation temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. Thus, it is vital to trace tritium behavior in the VHTR system and the potential permeation rate to the industrial process; tritium is a crucial safety issue in the fission reactor system. Therefore, it is necessary to understand the behavior of tritium, and the development of a tool that enables this is vital. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, is developed using a chemical process code called gPROMS. BOTANIC was then further verified using analytical solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications, and the adoption of a distributed permeation model. Due to these features, BOTANIC has the capability to analyze a wide range of tritium-level systems and has a higher accuracy, as it has the capacity to solve distributed models. BOTANIC was successfully developed and verified using analytical solutions and the benchmark code calculation results, which showed very good agreement with the analytical solutions and the calculation results of TPAC and COMSOL. Future work will be focused on the total system verification.
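
    As background to what such a code solves, the diffusion-limited permeation relation commonly used for hydrogen isotopes through metals can be sketched as follows; the permeability, wall thickness, and partial pressures are illustrative placeholders, not BOTANIC parameters.

        import math

        # Minimal sketch of Sieverts-law (diffusion-limited) permeation:
        # J = (Phi / d) * (sqrt(p_up) - sqrt(p_down)). Values are placeholders.
        def permeation_flux(phi, d_m, p_up_pa, p_down_pa):
            """Flux through a metal wall in mol / (m^2 s)."""
            return phi / d_m * (math.sqrt(p_up_pa) - math.sqrt(p_down_pa))

        flux = permeation_flux(phi=1.0e-10,   # mol/(m s Pa^0.5), hypothetical
                               d_m=2.0e-3, p_up_pa=1.0e-2, p_down_pa=0.0)
        print(f"{flux:.2e} mol m^-2 s^-1")  # 5.00e-09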

  6. A Compton Imaging Prototype for Range Verification in Particle Therapy

    International Nuclear Information System (INIS)

    Golnik, C.; Hueso Gonzalez, F.; Kormoll, T.; Pausch, G.; Rohling, H.; Fiedler, F.; Heidel, K.; Schoene, S.; Sobiella, M.; Wagner, A.; Enghardt, W.

    2013-06-01

    During the 2012 AAPM Annual Meeting, 33 percent of the delegates considered the range uncertainty in proton therapy to be the main obstacle to its becoming a mainstream treatment modality. Utilizing prompt gamma emission, a side product of particle-tissue interaction, opens the possibility of in-beam dose verification, due to the direct correlation between prompt gamma emission and particle dose deposition. Compton imaging has proven to be a technique to measure three-dimensional gamma emission profiles and opens the possibility of adaptive dose monitoring and treatment correction. We successfully built a Compton imaging prototype, characterized the detectors, and showed the imaging capability of the complete device. The major advantage of CZT detectors is their high energy resolution and high spatial resolution, which are key parameters for Compton imaging. However, our measurements at the proton beam accelerator facility KVI in Groningen (Netherlands) disclosed a spectrum of prompt gamma rays under proton irradiation up to 4.4 MeV. As CZT detectors of 5 mm thickness do not efficiently absorb photons in such energy ranges, another absorption stage, based on a Siemens LSO block detector, is added behind CZT1. This setup provides a higher absorption probability for high energy photons. With a size of 5.2 cm x 5.2 cm x 2.0 cm, this scintillation detector further increases the angular acceptance for Compton-scattered photons by virtue of its geometric size. (authors)
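
    The link between energy resolution and imaging performance comes from the Compton kinematics; a standard form of the cone-angle relation is

        \[
        \cos\theta \;=\; 1 - m_e c^2 \left( \frac{1}{E'_\gamma} - \frac{1}{E_\gamma} \right),
        \]

    where E_γ is the incident photon energy (the sum of the scatter and absorption deposits), E'_γ the energy of the scattered photon, and m_e c² = 511 keV. Any blur in the measured energies propagates directly into the opening angle θ of the event cone, which is why the high energy resolution of CZT is a key parameter.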

  7. Evaluation verification facilities (EVF) at MINT: concept and implementation

    International Nuclear Information System (INIS)

    Mohamed Hairul Hasmoni; Abd Nassir Ibrahim; Ab Razak Hamzah

    2003-01-01

    The available EVF facilities and components are described comprehensively. The objectives of establishing the EVF as a national centre for non-destructive testing (NDT) are discussed for various activities: method and equipment validation, R and D on quantitative NDT techniques, training and certification, and defect characterization. For the activities available at the EVF to succeed, it is vital that industry participate through funding, sponsorship, and knowledge sharing. The Malaysian Institute for Nuclear Technology Research (MINT) has invested heavily in this facility and is ready to share it under various mechanisms such as memoranda of understanding (MOU), memoranda of agreement (MOA), contract research, or letters of agreement. The facility would be open to industry, and members of the NDT community are welcome to conduct trials and discuss particular areas of interest with others in the industry. Optimising the facility by utilising the available components and adding new ones would make the EVF a national centre for NDT and a centre of excellence. This paper reviews the concept and implementation of the Evaluation Verification Facility (EVF) at MINT. The types and designs of the facilities available are described and characterised by NDT usage. (Author)

  8. A Cherenkov viewing device for used-fuel verification

    International Nuclear Information System (INIS)

    Attas, E.M.; Chen, J.D.; Young, G.J.

    1990-01-01

    A Cherenkov viewing device (CVD) has been developed to help verify declared inventories of used nuclear fuel stored in water bays. The device detects and amplifies the faint ultraviolet Cherenkov glow from the water surrounding the fuel, producing a real-time visible image on a phosphor screen. Quartz optics, a UV-pass filter and a microchannel-plate image-intensifier tube serve to form the image, which can be photographed or viewed directly through an eyepiece. Normal fuel bay lighting does not interfere with the Cherenkov light image. The CVD has been successfully used to detect anomalous PWR, BWR and CANDU (CANada Deuterium Uranium: registered trademark) fuel assemblies in the presence of normal-burnup assemblies stored in used-fuel bays. The latest version of the CVD, known as Mark IV, is being used by inspectors from the International Atomic Energy Agency for verification of light-water power-reactor fuel. Its design and operation are described, together with plans for further enhancements of the instrumentation. (orig.)

  9. Cadmium verification measurements of HFIR shroud assembly 22

    International Nuclear Information System (INIS)

    Chapman, J.A.; Schultz, F.J.

    1994-04-01

    This report discusses radiation-based nondestructive examination methods which have been used to successfully verify the presence of cadmium in High Flux Isotope Reactor (HFIR) spent-fuel shroud assembly number 22 (SA22). These measurements show, in part, that SA22 is certified to meet the criticality safety specifications for a proposed reconfiguration of the HFIR spent-fuel storage array. Measurement of the unique 558.6-keV gamma-ray from neutron radiative capture on cadmium provided conclusive evidence for the presence of cadmium in the outer shroud of the assembly. Cadmium verification in the center post and outer shroud was performed by measuring the degree of neutron transmission in SA22 relative to two calibration shroud assemblies. Each measurement was performed at a single location on the center post and outer shroud. These measurements do not provide information on the spatial distribution or uniformity of cadmium within an assembly. Separate measurements using analog and digital radiography were performed to (a) globally map the continuity of cadmium internal mass, and (b) locally determine the thickness of cadmium. Radiography results will be reported elsewhere. The measurements reported here should not be used to infer the thickness of cadmium in either the center post or outer shroud of an assembly.

  10. Triple Modular Redundancy verification via heuristic netlist analysis

    Directory of Open Access Journals (Sweden)

    Giovanni Beltrame

    2015-08-01

    Full Text Available Triple Modular Redundancy (TMR) is a common technique to protect memory elements for digital processing systems subject to radiation effects (such as in space, at high altitude, or near nuclear sources). This paper presents an approach to verify the correct implementation of TMR for the memory elements of a given netlist (i.e., a digital circuit specification) using heuristic analysis. The purpose is detecting any issues that might arise during the use of automatic tools for TMR insertion, optimization, place and route, etc. Our analysis does not require a testbench and can perform full, exhaustive coverage within less than an hour even for large designs. This is achieved by applying a divide et impera approach, splitting the circuit into smaller submodules without loss of generality, instead of applying formal verification to the whole netlist at once. The methodology has been applied to a production netlist of the LEON2-FT processor that had reported errors during radiation testing, successfully showing a number of unprotected memory elements, namely 351 flip-flops.
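
    For readers unfamiliar with the protection scheme being verified, the sketch below models the 2-out-of-3 majority vote at the heart of TMR in plain Python; it is a behavioral illustration, not a netlist-level representation.

        # Minimal sketch of TMR majority voting: three copies of a register
        # feed a voter, so any single upset is outvoted.
        def tmr_vote(a, b, c):
            """Bitwise 2-out-of-3 majority."""
            return (a & b) | (b & c) | (a & c)

        state = 0b1011
        copies = [state, state, state]
        copies[1] ^= 0b0100            # single-event upset flips one bit
        print(bin(tmr_vote(*copies)))  # 0b1011 -- the upset is masked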

  11. The Project of Success

    DEFF Research Database (Denmark)

    Kreiner, Kristian

    more complicated matter than meeting targets. While success may ultimately be justified in terms of a correspondence between aims and achievements, the understanding of both aspects is highly dependent on the project process. An example of a successful project that did not meet the original performance targets will serve to show that success is a matter of perspective as much as it is a matter of achievement. Other types of research, e.g. social psychology, have addressed the issue of success more explicitly. I draw on such literature to conceptualize project success anew and to reestablish...

  12. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  13. Analytical torque calculation and experimental verification of synchronous permanent magnet couplings with Halbach arrays

    Science.gov (United States)

    Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young

    2018-05-01

    This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. Firstly, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and a Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.
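
    For reference, the torque step of such analyses typically reduces to a Maxwell stress tensor integral over a circular contour of radius r in the air gap; a standard form (with L the axial stack length and B_r, B_θ the radial and tangential flux density components from the analytical field solution) is

        \[
        T \;=\; \frac{L\, r^{2}}{\mu_{0}} \int_{0}^{2\pi} B_{r}(r,\theta)\, B_{\theta}(r,\theta)\, d\theta .
        \]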

  14. Verification of the uncertainty principle by using diffraction of light waves

    International Nuclear Information System (INIS)

    Nikolic, D; Nesic, Lj

    2011-01-01

    We described a simple idea for experimental verification of the uncertainty principle for light waves. We used single-slit diffraction of a laser beam to measure the angular width of the zero-order diffraction maximum and obtained the corresponding wave number uncertainty, taking the uncertainty in position to be the slit width. For the acquisition of the experimental data and their further analysis, we used a computer. Because of its simplicity this experiment is very suitable for demonstration, as well as for a quantitative exercise at universities and in the final year of high school studies.
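
    The estimate behind the experiment can be written out in two lines: with the slit width a taken as the position uncertainty and the first diffraction minimum at sin θ = λ/a bounding the transverse wave-number spread,

        \[
        \Delta k_x \;\approx\; k \sin\theta \;=\; \frac{2\pi}{\lambda}\cdot\frac{\lambda}{a} \;=\; \frac{2\pi}{a},
        \qquad
        \Delta x\, \Delta k_x \;\approx\; a \cdot \frac{2\pi}{a} \;=\; 2\pi \;\geq\; 1,
        \]

    consistent with the uncertainty principle up to the order-of-unity factors introduced by the choice of width definitions.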

  15. Verification and validation of the safety parameter display system for nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Yuanfang

    1993-05-01

    During the design and development phase of the safety parameter display system for a nuclear power plant, a verification and validation (V and V) plan has been implemented to improve the quality of the system design. The V and V activities, executed in the four stages of feasibility research, system design, code development, and system integration and regulation, are briefly introduced. The evaluation plan, the process of its implementation, and the evaluation conclusion of the final technical validation of this system are also presented in detail.

  16. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

    Full Text Available Experimental verifications of two optimized adaptive digital signal processing algorithms implemented in two preset time count rate meters were performed according to appropriate standards. The random pulse generator, realized using a personal computer, was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Then measurement results for background radiation levels were obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurement results, conducted without and with radioisotopes for the specified errors of 10% and 5%, showed good agreement with theoretical predictions.
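
    The counting statistics behind a preset-error instrument fit in a few lines; the sketch below shows why the 10% and 5% error settings correspond to fixed total counts (the 100 counts/s rate is an invented example).

        import math

        # Minimal sketch: with N counts the relative standard error is
        # 1/sqrt(N), so a preset error err requires N >= 1/err^2 counts.
        def counts_needed(relative_error):
            return math.ceil(1.0 / relative_error ** 2)

        for err in (0.10, 0.05):
            n = counts_needed(err)
            print(f"{err:.0%} error -> {n} counts "
                  f"(~{n / 100:.0f} s at a 100 counts/s source)")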

  17. Final Report of the Final Meeting of Project Coordinators

    International Nuclear Information System (INIS)

    Cordero Calderon, Carlos F.

    1996-06-01

    The Costa Rican Electricity Institute has always been concerned with verifying the good condition of its works in order to guarantee their operation. For that reason, it has established several kinds of auscultation of the Arenal Dam, and investigations have been carried out to find new methods of improvement and of risk elimination in different works and projects. The Arenal Dam is one of the greatest engineering works in Costa Rica; it serves the Arenal, Corobici and Sandillal hydroelectric plants, as well as the irrigation system in the Tempisque River Valley, in the Guanacaste province. One special characteristic of the dam site is the proximity of the Arenal Volcano, fully active and located 6 km from the dam. This report has two goals: one is to serve as the traditional report of permanent measurements for the project, and the other is to present it as the final work of Project ARCAL XVIII to the International Atomic Energy Agency. This report analyses the geo-hydraulic, structural and topographic auscultation, as well as the activities accomplished during ARCAL XVIII /8/018, Application of Tracer Techniques for Leakage in Dams and Damming Project, based on information gathered through the geo-chemical auscultation until June 1996. (author). 30 ills., 80 charts, 35 tabs

  18. TomoTherapy MLC verification using exit detector data

    Energy Technology Data Exchange (ETDEWEB)

    Chen Quan; Westerly, David; Fang Zhenyu; Sheng, Ke; Chen Yu [TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); Department of Radiation Oncology, University of Colorado School of Medicine, Aurora, Colorado 80045 (United States); Xinghua Cancer Hospital, Xinghua, Jiangsu 225700 (China); Department of Radiation Oncology, University of California-Los Angeles, Los Angeles, California 90095 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States)

    2012-01-15

    Purpose: Treatment delivery verification (DV) is important in the field of intensity modulated radiation therapy (IMRT). While IMRT and image-guided radiation therapy (IGRT) allow us to create more conformal plans and enable the use of tighter margins, an erroneously executed plan can have detrimental effects on the treatment outcome. The purpose of this study is to develop a DV technique to verify TomoTherapy's multileaf collimator (MLC) using the onboard mega-voltage CT detectors. Methods: The proposed DV method uses temporal changes in the MVCT detector signal to predict actual leaf open times delivered on the treatment machine. Penumbra and scattered radiation effects may produce confounding results when determining leaf open times from the raw detector data. To reduce the impact of these effects, an iterative, Richardson-Lucy (R-L) deconvolution algorithm is applied. Optical sensors installed on each MLC leaf are used to verify the accuracy of the DV technique. The robustness of the DV technique is examined by introducing different attenuation materials in the beam. Additionally, the DV technique has been used to investigate several clinical plans which failed to pass delivery quality assurance (DQA) and was successful in identifying MLC timing discrepancies as the root cause. Results: The leaf open time extracted from the exit detector showed good agreement with the optical sensors under a variety of conditions. Detector-measured leaf open times agreed with optical sensor data to within 0.2 ms, and 99% of the results agreed within 8.5 ms. These results changed little when attenuation was added in the beam. For the clinical plans failing DQA, the dose calculated from reconstructed leaf open times played an instrumental role in discovering the root cause of the problem. Throughout the retrospective study, it was found that the reconstructed dose always agrees with measured doses to within 1%. Conclusions: The exit detectors in the TomoTherapy treatment
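
    The deconvolution step named above is a standard iterative scheme; a one-dimensional sketch is given below with a synthetic blur kernel and a synthetic "leaf open" window, not TomoTherapy data.

        import numpy as np

        # Minimal sketch of 1-D Richardson-Lucy deconvolution, the kind of
        # step used to undo penumbra/scatter blurring of a detector signal.
        def richardson_lucy(observed, kernel, iterations=50):
            estimate = np.full_like(observed, observed.mean())
            kernel_mirror = kernel[::-1]
            for _ in range(iterations):
                blurred = np.convolve(estimate, kernel, mode="same")
                ratio = observed / np.maximum(blurred, 1e-12)
                estimate = estimate * np.convolve(ratio, kernel_mirror, mode="same")
            return estimate

        kernel = np.array([0.05, 0.25, 0.40, 0.25, 0.05])  # sums to 1
        truth = np.zeros(40)
        truth[15:25] = 1.0                 # synthetic "leaf open" window
        observed = np.convolve(truth, kernel, mode="same")
        print(np.round(richardson_lucy(observed, kernel)[13:27], 2))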

  19. TomoTherapy MLC verification using exit detector data

    International Nuclear Information System (INIS)

    Chen Quan; Westerly, David; Fang Zhenyu; Sheng, Ke; Chen Yu

    2012-01-01

    Purpose: Treatment delivery verification (DV) is important in the field of intensity modulated radiation therapy (IMRT). While IMRT and image-guided radiation therapy (IGRT) allow us to create more conformal plans and enable the use of tighter margins, an erroneously executed plan can have detrimental effects on the treatment outcome. The purpose of this study is to develop a DV technique to verify TomoTherapy's multileaf collimator (MLC) using the onboard mega-voltage CT detectors. Methods: The proposed DV method uses temporal changes in the MVCT detector signal to predict actual leaf open times delivered on the treatment machine. Penumbra and scattered radiation effects may produce confounding results when determining leaf open times from the raw detector data. To reduce the impact of these effects, an iterative, Richardson-Lucy (R-L) deconvolution algorithm is applied. Optical sensors installed on each MLC leaf are used to verify the accuracy of the DV technique. The robustness of the DV technique is examined by introducing different attenuation materials in the beam. Additionally, the DV technique has been used to investigate several clinical plans which failed to pass delivery quality assurance (DQA) and was successful in identifying MLC timing discrepancies as the root cause. Results: The leaf open time extracted from the exit detector showed good agreement with the optical sensors under a variety of conditions. Detector-measured leaf open times agreed with optical sensor data to within 0.2 ms, and 99% of the results agreed within 8.5 ms. These results changed little when attenuation was added in the beam. For the clinical plans failing DQA, the dose calculated from reconstructed leaf open times played an instrumental role in discovering the root cause of the problem. Throughout the retrospective study, it was found that the reconstructed dose always agrees with measured doses to within 1%. Conclusions: The exit detectors in the TomoTherapy treatment systems

  20. Development of prompt gamma measurement system for in vivo proton beam range verification

    International Nuclear Information System (INIS)

    Min, Chul Hee

    2011-02-01

    In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered one of the most promising radiation therapy methods owing to the physical characteristics of its dose distribution: most of the dose is delivered just before the protons come to rest, at the so-called Bragg peak. Proton therapy therefore allows a very high radiation dose to the tumor volume while effectively sparing adjacent critical organs. However, uncertainty in the location of the Bragg peak, arising not only from the beam delivery system and the treatment planning method but also from anatomical changes and organ motion of the patient, can be a critical problem in proton therapy. In spite of the importance of in vivo dose verification to prevent misapplication of the Bragg peak and to guarantee both successful treatment and patient safety, there is no practical methodology to monitor the in vivo dose distribution; only a few attempts have been made so far. The present dissertation proposes the prompt gamma measurement method for monitoring the in vivo proton dose distribution during treatment. As a key part of establishing the utility of this method, a clear relationship between the prompt gamma distribution and the proton dose distribution was verified by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly by proton-induced nuclear interactions and are emitted isotropically, in less than 10⁻⁹ s, at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of a human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the
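
    The core use of the verified relationship, inferring a range deviation from the measured prompt-gamma depth profile, can be sketched as a cross-correlation of a measured profile against a reference profile. The profiles, bin size, and function names below are invented for illustration; this is not the dissertation's actual analysis.

        import numpy as np

        def estimate_range_shift(reference, measured, bin_mm=1.0):
            """Estimate the depth shift (mm) between two prompt-gamma depth
            profiles from the peak of their normalized cross-correlation."""
            ref = (reference - reference.mean()) / reference.std()
            mea = (measured - measured.mean()) / measured.std()
            corr = np.correlate(mea, ref, mode="full")
            lag = int(corr.argmax()) - (len(ref) - 1)  # lag in depth bins
            return lag * bin_mm

        # Toy profiles: a plateau with a sharp distal falloff, shifted 3 mm.
        depth = np.arange(100)
        reference = 1.0 / (1.0 + np.exp(depth - 70))  # falloff near 70 mm
        measured = 1.0 / (1.0 + np.exp(depth - 73))   # falloff near 73 mm
        print(estimate_range_shift(reference, measured))  # -> about 3.0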

  1. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)
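
    The verification-problem approach, confirming that the reworked code reproduces earlier answers, can be illustrated by a generic regression check that accepts agreement within combined Monte Carlo uncertainties. The tally numbers and the 3-sigma acceptance rule below are assumptions for illustration, not MCNP5's actual test harness.

        import math

        def tallies_agree(old_mean, old_rel_err, new_mean, new_rel_err,
                          n_sigma=3.0):
            """Accept when two Monte Carlo tallies differ by less than
            n_sigma combined standard deviations (relative errors are
            1-sigma fractional uncertainties)."""
            sigma = math.hypot(old_mean * old_rel_err, new_mean * new_rel_err)
            return abs(new_mean - old_mean) <= n_sigma * sigma

        # e.g. a flux tally: 4.52e-3 (0.8% rel. err.) vs 4.49e-3 (0.7%)
        assert tallies_agree(4.52e-3, 0.008, 4.49e-3, 0.007)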

  2. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  3. Security Protocols: Specification, Verification, Implementation, and Composition

    DEFF Research Database (Denmark)

    Almousa, Omar

    An important aspect of Internet security is the security of the cryptographic protocols that it deploys. We need to make sure that such protocols achieve their goals, whether in isolation or in composition, i.e., security protocols must not suffer from any flaw that enables hostile intruders to break their security. Among others, tools like OFMC [MV09b] and ProVerif [Bla01] are quite efficient for the automatic formal verification of a large class of protocols. These tools use different approaches such as symbolic model checking or static analysis. Either approach has its own pros and cons, and therefore we ... results. The most important generalization is the support for all security properties of the geometric fragment proposed by [Gut14].

  4. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in a situation where both qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using AHP, the Analytic Hierarchy Process. It then became possible to estimate the detection probability under integrated safeguards that has deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering the detection probabilities under integrated safeguards. (author)
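
    As a sketch of the AHP step, the snippet below derives priority weights from a pairwise-comparison matrix via its principal eigenvector and checks consistency. The factor names, comparison values, and use of the n = 3 random index are invented for illustration only.

        import numpy as np

        # Hypothetical pairwise comparisons of three safeguards factors
        # (detection probability, timeliness, deterrence) on Saaty's
        # 1-9 scale.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = eigvals.real.argmax()                 # principal eigenvalue index
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                  # normalized priority weights

        # Consistency index CI = (lambda_max - n) / (n - 1), judged against
        # the random index RI (0.58 for n = 3); CR < 0.1 counts as consistent.
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        cr = ci / 0.58
        print(weights, cr)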

  5. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
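
    A minimal sketch of the margin-versus-uncertainty comparison, assuming the margin uncertainty is normally distributed (the paper's exact statistical treatment is not reproduced here); the numbers are illustrative.

        from statistics import NormalDist

        def mov_reliability(nominal_margin, margin_sigma):
            """Estimate reliability as the probability that the true design
            margin is positive, i.e. Phi(nominal margin / 1-sigma sigma)."""
            return NormalDist().cdf(nominal_margin / margin_sigma)

        # e.g. actuator output exceeds the required thrust by 20% with an 8%
        # (1-sigma) combined uncertainty -> roughly 99.4% reliability.
        print(mov_reliability(0.20, 0.08))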

  6. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype robust biometric verification system. The system uses features extracted from the human hand using the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images, and is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01%, and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
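
    A hedged sketch of SURF-based matching with OpenCV follows; the Hessian threshold, Lowe ratio, and acceptance threshold are illustrative choices, and SURF is patented, so it requires a non-free opencv-contrib build. This is not the paper's actual pipeline.

        import cv2

        def palmprint_match_score(probe_path, enrolled_path, ratio=0.75):
            """Count SURF correspondences between two palmprint images that
            survive Lowe's ratio test; more matches suggest the same palm."""
            surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
            img_a = cv2.imread(probe_path, cv2.IMREAD_GRAYSCALE)
            img_b = cv2.imread(enrolled_path, cv2.IMREAD_GRAYSCALE)
            _, desc_a = surf.detectAndCompute(img_a, None)
            _, desc_b = surf.detectAndCompute(img_b, None)
            matcher = cv2.BFMatcher(cv2.NORM_L2)
            good = [m for m, n in matcher.knnMatch(desc_a, desc_b, k=2)
                    if m.distance < ratio * n.distance]
            return len(good)

        # Verification decision: accept when the score clears a tuned
        # (hypothetical) threshold.
        # accept = palmprint_match_score("probe.png", "enrolled.png") >= 25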

  7. Consortium for Verification Technology Fellowship Report.

    Energy Technology Data Exchange (ETDEWEB)

    Sadler, Lorraine E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-06-01

    As one recipient of the Consortium for Verification Technology (CVT) Fellowship, I spent eight days as a visiting scientist at the University of Michigan, Department of Nuclear Engineering and Radiological Sciences (NERS). During this time, I participated in multiple department and research group meetings and presentations, met with individual faculty and students, toured multiple laboratories, and taught one-half of a one-unit class on Risk Analysis in Nuclear Arms control (six 1.5 hour lectures). The following report describes some of the interactions that I had during my time as well as a brief discussion of the impact of this fellowship on members of the consortium and on me/my laboratory’s technical knowledge and network.

  8. MOV reliability evaluation and periodic verification scheduling

    International Nuclear Information System (INIS)

    Bunte, B.D.

    1996-01-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs

  9. Remedial activities effectiveness verification in tailing areas.

    Science.gov (United States)

    Kluson, J; Thinova, L; Neznal, M; Svoboda, T

    2015-06-01

    A comprehensive radiological study of a basin of sludge from uranium ore mining and preprocessing was carried out. Air kerma rates (including their spectral analysis) at the reference height of 1 m above ground were measured over the whole area, and the radiation fields were mapped during two measuring campaigns (2009 and 2014). K, U and Th concentrations in the sludge, as well as concentrations in depth profiles (including radon concentration and radon exhalation rate) at selected points, were determined by gamma spectrometry, both in situ and on laboratory samples. The results were used for the analysis, design evaluation and verification of the efficiency of the remediation measures. The efficiency of covering the sludge basin with inert material was modelled using the MicroShield code.
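
    As a rough illustration of what such cover-layer modelling estimates (not MicroShield itself), here is a first-order narrow-beam attenuation calculation with a buildup factor; every number below is invented.

        import math

        def attenuated_kerma(k0, mu_per_cm, thickness_cm, buildup=1.0):
            """First-order air kerma rate behind a cover layer: narrow-beam
            exponential attenuation scaled by a buildup factor."""
            return k0 * buildup * math.exp(-mu_per_cm * thickness_cm)

        # Illustrative: 0.5 uGy/h over bare sludge, soil cover with
        # mu ~ 0.12 /cm (around 0.6 MeV photons), 30 cm thick, buildup ~ 2
        # -> roughly 0.03 uGy/h after covering.
        print(attenuated_kerma(0.5, 0.12, 30.0, buildup=2.0))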

  10. Remedial activities effectiveness verification in tailing areas

    International Nuclear Information System (INIS)

    Kluson, J.; Thinova, L.; Svoboda, T.; Neznal, M.

    2015-01-01

    A comprehensive radiological study of a basin of sludge from uranium ore mining and preprocessing was carried out. Air kerma rates (including their spectral analysis) at the reference height of 1 m above ground were measured over the whole area, and the radiation fields were mapped during two measuring campaigns (2009 and 2014). K, U and Th concentrations in the sludge, as well as concentrations in depth profiles (including radon concentration and radon exhalation rate) at selected points, were determined by gamma spectrometry, both in situ and on laboratory samples. The results were used for the analysis, design evaluation and verification of the efficiency of the remediation measures. The efficiency of covering the sludge basin with inert material was modelled using the MicroShield code. (authors)

  11. Automated Verification of Quantum Protocols using MCMAS

    Directory of Open Access Journals (Sweden)

    F. Belardinelli

    2012-07-01

    We present a methodology for the automated verification of quantum protocols using MCMAS, a symbolic model checker for multi-agent systems. The method is based on the logical framework developed by D'Hondt and Panangaden for investigating epistemic and temporal properties, built on the model for Distributed Measurement-based Quantum Computation (DMC), an extension of the Measurement Calculus to distributed quantum systems. We describe the translation map from DMC to interpreted systems, the typical formalism for reasoning about time and knowledge in multi-agent systems. Then, we introduce dmc2ispl, a compiler into the input language of the MCMAS model checker. We demonstrate the technique by verifying the Quantum Teleportation Protocol, and discuss the performance of the tool.

  12. Biometric Subject Verification Based on Electrocardiographic Signals

    Science.gov (United States)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
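
    A minimal sketch of the enrollment and verification comparison, assuming Pearson correlation as the graph comparison metric and aligned, equal-length cycles; both assumptions are mine, since the record leaves the metrics unspecified.

        import numpy as np

        def composite_graph(cycles):
            """Average a stack of aligned PQRST cycles into one composite."""
            return np.mean(np.asarray(cycles, dtype=float), axis=0)

        def graphs_resemble(graph_a, graph_b, threshold=0.95):
            """Compare two equal-length heart-cycle graphs by Pearson
            correlation against an acceptance threshold."""
            r = np.corrcoef(graph_a, graph_b)[0, 1]
            return r >= threshold

        # Verification: accept the asserted identity when the candidate's
        # composite resembles the enrolled reference composite.
        # accept = graphs_resemble(composite_graph(candidate_cycles),
        #                          reference_composite)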

  13. 300 Area Process Trenches Verification Package

    International Nuclear Information System (INIS)

    Lerch, J.A.

    1998-03-01

    The purpose of this verification package is to document achievement of the remedial action objectives for the 300 Area Process Trenches (300 APT) located within the 300-FF-1 Operable Unit (OU). The 300 APT became active in 1975 as a replacement for the North and South Process Pond system that is also part of the 300-FF-1 OU. The trenches received 300 Area process effluent from the uranium fuel fabrication facilities. Waste from the 300 Area laboratories that was determined to be below discharge limits based on monitoring performed at the 307 retention basin was also released to the trenches. Effluent flowed through the headworks sluice gates, down a concrete apron, and into the trenches. From the beginning of operations in 1975 until 1993, a continuous, composite sampler was located at the headwork structure to analyze process effluent at the point of discharge to the environment

  14. Status and verification strategy for ITER neutronics

    Energy Technology Data Exchange (ETDEWEB)

    Loughlin, Michael, E-mail: michael.loughlin@iter.org [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Angelone, Maurizio [Associazione EURATOM-ENEA Sulla Fusione, Via E. Fermi 45, I-00044 Frascati, Roma (Italy); Batistoni, Paola [Associazione EURATOM-ENEA Sulla Fusione, Via E. Fermi 45, I-00044 Frascati, Roma (Italy); JET-EFDA, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Bertalot, Luciano [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Eskhult, Jonas [Studsvik Nuclear AB, SE-611 Nyköping (Sweden); Konno, Chikara [Japan Atomic Energy Agency Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan); Pampin, Raul [F4E Fusion for Energy, Josep Pla 2, Torres Diagonal Litoral B3, Barcelona 08019 (Spain); Polevoi, Alexei; Polunovskiy, Eduard [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2014-10-15

    The paper summarizes the current status of neutronics at ITER, and a first set of proposals for experimental programmes to be conducted in the early operational lifetime of ITER is described for the more crucial areas. These include a TF coil heating benchmark, a streaming benchmark, and streaming measurements by activation on ITER itself. Also on ITER, the measurement of activated water from triton burn-up should be planned and performed; this will require the measurement of triton burn-up in the DD phase. Measurements of neutron flux in the tokamak building during DD operations should also be carried out. The use of JET for verification of shutdown dose rate estimates is desirable. Other facilities to examine the production and behaviour of activated corrosion products and the shielding properties of concretes against high-energy (6 MeV) gamma-rays are recommended.

  15. Verification of adolescent self-reported smoking.

    Science.gov (United States)

    Kentala, Jukka; Utriainen, Pekka; Pahkala, Kimmo; Mattila, Kari

    2004-02-01

    The validity of information obtained on smoking is often questioned, in view of the widespread belief that adolescents tend to under- or over-report the habit. The aim here was to verify smoking habits as reported in a questionnaire given in conjunction with dental examinations, by asking participants directly whether they smoked or not and by performing biochemical measurements of thiocyanate in the saliva and carbon monoxide in the expired air. The series consisted of 150 pupils in the ninth grade (age 15 years). The questionnaire reports appeared to provide a reliable estimate of adolescent smoking, the sensitivity of the method being 81-96% and the specificity 77-95%. Biochemical verification or control of smoking proved unnecessary in normal dental practice. Accepting the information offered by the patient provides a good starting point for health education and for motivating and supporting self-directed breaking of the habit.
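
    The reported ranges are simple to compute from a confusion matrix; the counts below are invented solely to show the arithmetic.

        def sensitivity_specificity(tp, fn, tn, fp):
            """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
            return tp / (tp + fn), tn / (tn + fp)

        # Hypothetical split of the 150 pupils: 40 smokers, 36 of whom
        # report smoking; 110 non-smokers, 99 correctly classified.
        sens, spec = sensitivity_specificity(tp=36, fn=4, tn=99, fp=11)
        print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 90%, 90%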

  16. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design covers the optical axis, the drive, the fixture, and the wheels. The control system design covers the hardware and the software: the hardware is built around a single-chip microcontroller, and the software handles the photoelectric autocollimator readout and the automatic data acquisition process. The device can acquire verticality measurement data automatically. Its reliability is verified by experimental comparison, and the results meet the requirements of the right-angle verification procedure.

  17. Computer Generated Inputs for NMIS Processor Verification

    International Nuclear Information System (INIS)

    J. A. Mullens; J. E. Breeding; J. A. McEvers; R. W. Wysor; L. G. Chiang; J. R. Lenarduzzi; J. T. Mihalczo; J. K. Mattingly

    2001-01-01

    Proper operation of the Nuclear Materials Identification System (NMIS) processor can be verified using computer-generated inputs [BIST (Built-In Self-Test)] at the digital inputs. Preselected sequences of input pulses to all channels, with known correlation functions, are fed in and compared to the output of the processor. These types of verification have been utilized in NMIS-type correlation processors at the Oak Ridge National Laboratory since 1984. The use of this test confirmed a malfunction in an NMIS processor at the All-Russian Scientific Research Institute of Experimental Physics (VNIIEF) in 1998. The NMIS processor boards were returned to the U.S. for repair and subsequently used in NMIS passive and active measurements with Pu at VNIIEF in 1999
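
    A software analogue of such a built-in self-test: feed a deterministic pulse train whose autocorrelation is known in closed form through a stand-in processor and compare the two. The sequence, lags, and pass criterion are illustrative, not the NMIS test vectors.

        import numpy as np

        def make_test_pulses(n=1024, period=8):
            """Impulse train with an exactly known autocorrelation:
            R(k) = (n - k) // period when k is a multiple of the period,
            and 0 otherwise (n divisible by period)."""
            x = np.zeros(n)
            x[::period] = 1.0
            return x

        def processor_autocorr(x, max_lag=32):
            """Stand-in for the correlation processor under test."""
            return np.array([np.dot(x[:x.size - k], x[k:])
                             for k in range(max_lag)])

        def expected_autocorr(n=1024, period=8, max_lag=32):
            return np.array([(n - k) // period if k % period == 0 else 0
                             for k in range(max_lag)])

        # BIST passes when the processor reproduces the analytic correlation.
        assert np.allclose(processor_autocorr(make_test_pulses()),
                           expected_autocorr())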

  18. Formal Modeling and Verification for MVB

    Directory of Open Access Journals (Sweden)

    Mo Xia

    2013-01-01

    The Multifunction Vehicle Bus (MVB) is a critical component of the Train Communication Network (TCN), which is widely used in modern train technology. How to ensure the security of the MVB has become an important issue, as traditional testing cannot ensure system correctness. This paper is concerned with MVB system modeling and verification, using Petri nets and model checking methods. A Hierarchy Colored Petri Net (HCPN) approach is presented to model and simulate the Master Transfer protocol of the MVB, with synchronous and asynchronous methods proposed to describe the entities and the communication environment. An automata model of the Master Transfer protocol is designed. Based on our model checking platform M3C, the Master Transfer protocol of the MVB is verified and some critical system logic errors are found. Experimental results show the efficiency of our methods.
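
    As a toy illustration of explicit-state model checking (not the authors' HCPN model or their M3C platform), the sketch below exhaustively explores an invented two-node master-transfer model and checks a "single master" invariant.

        from collections import deque

        INITIAL = ("master", "slave")

        def successors(state):
            a, b = state
            yield (b, a)   # master transfer: the two nodes swap roles
            yield (a, b)   # idle step: nothing changes

        def invariant(state):
            return state.count("master") == 1   # never zero or two masters

        def check(initial):
            """Breadth-first exploration of all reachable states; returns a
            counterexample state if the invariant can be violated."""
            seen, queue = {initial}, deque([initial])
            while queue:
                s = queue.popleft()
                if not invariant(s):
                    return False, s
                for t in successors(s):
                    if t not in seen:
                        seen.add(t)
                        queue.append(t)
            return True, None

        print(check(INITIAL))   # -> (True, None): the invariant holds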

  19. Game-based verification and synthesis

    DEFF Research Database (Denmark)

    Vester, Steen

    ... corresponds directly to a program for the corresponding entity of the system. A strategy for a player which ensures that the player wins no matter how the other players behave then corresponds to a program ensuring that the specification of the entity is satisfied no matter how the other entities and the environment behave. Synthesis of strategies in games can thus be used for automatic generation of correct-by-construction programs from specifications. We consider verification and synthesis problems for several well-known game-based models. This includes both model-checking problems and satisfiability problems. ... can be extended to solve finitely-branching turn-based games more efficiently. Further, the novel concept of winning cores in parity games is introduced. We use this to develop a new polynomial-time under-approximation algorithm for solving parity games. Experimental results show that this algorithm ...

  20. MCNP5 development, verification, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Laboratory (United States)

    2003-07-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)