WorldWideScience

Sample records for background numerical verification

  1. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.

  2. Numerical modeling of nitrogen oxide emission and experimental verification

    Directory of Open Access Journals (Sweden)

    Szecowka Lech

    2003-12-01

    The results of nitrogen oxide reduction in the combustion process using primary methods are presented in this paper. The reduction of NOx emission by recirculation of combustion gases and by staging of fuel and of air was investigated, followed by the reduction of NOx emission through simultaneous use of the above-mentioned primary methods with pulsatory disturbances. The investigation comprises numerical modeling of NOx reduction and experimental verification of the obtained numerical results.

  3. Numerical Verification Methods for Spherical $t$-Designs

    OpenAIRE

    Chen, Xiaojun

    2009-01-01

    The construction of spherical $t$-designs with $(t+1)^2$ points on the unit sphere $S^2$ in $\mathbb{R}^3$ can be reformulated as an underdetermined system of nonlinear equations. This system is highly nonlinear and involves the evaluation of a degree $t$ polynomial in $(t+1)^4$ arguments. This paper reviews numerical verification methods using the Brouwer fixed point theorem and Krawczyk interval operator for solutions of the underdetermined system of nonlinear equations...
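    A minimal sketch of the Krawczyk-type interval check referred to above, applied to the scalar toy equation f(x) = x^2 - 2 rather than the paper's underdetermined spherical-design system. The Interval class, the candidate enclosure and the sample values are assumptions for illustration; a rigorous implementation would additionally round interval bounds outward.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Interval:
            lo: float
            hi: float
            def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
            def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
            def __mul__(self, o):
                p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
                return Interval(min(p), max(p))
            def contains(self, o): return self.lo <= o.lo and o.hi <= self.hi

        def const(c): return Interval(c, c)

        def f(x): return x * x - const(2.0)        # f(x) = x^2 - 2
        def fprime(x): return const(2.0) * x       # f'(x) = 2x

        def krawczyk(X, x0):
            """K(X) = x0 - y*f(x0) + (1 - y*f'(X)) * (X - x0), with y ~ 1/f'(x0)."""
            y = 1.0 / (2.0 * x0)
            centre = const(x0) - const(y) * f(const(x0))
            slope = const(1.0) - const(y) * fprime(X)
            return centre + slope * (X - const(x0))

        X = Interval(1.3, 1.5)                     # candidate enclosure of sqrt(2)
        K = krawczyk(X, 1.4)
        # K contained in X verifies the existence of a root of f inside X.
        print(K, "root verified:", X.contains(K))

    If K(X) lands inside X, the Brouwer/Krawczyk argument guarantees a solution in X; the methods reviewed in the paper apply analogous tests to the full underdetermined system.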

  4. On the numerical verification of industrial codes

    International Nuclear Information System (INIS)

    Montan, Sethy Akpemado

    2013-01-01

    Numerical verification of industrial codes, such as those developed at EDF R and D, is required to estimate the precision and the quality of computed results, even more so for codes running in HPC environments where millions of instructions are performed each second. These programs usually use external libraries (MPI, BLACS, BLAS, LAPACK). In this context, it is required to have a tool that is as non-intrusive as possible, to avoid rewriting the original code. In this regard, the CADNA library, which implements Discrete Stochastic Arithmetic, appears to be a promising approach for industrial applications. In the first part of this work, we are interested in an efficient implementation of the BLAS routine DGEMM (General Matrix Multiply) using Discrete Stochastic Arithmetic. A basic implementation of the matrix product with stochastic types leads to an overhead greater than 1000 for a 1024 x 1024 matrix compared to the standard and commercial versions of xGEMM. Here, we detail different solutions to reduce this overhead and the results we have obtained. A new routine, DgemmCADNA, has been designed; it reduces the overhead from 1100 to 35 compared to optimized BLAS implementations (GotoBLAS). We then focus on the numerical verification of Telemac-2D computed results. Performing a numerical validation with the CADNA library shows that more than 30% of the numerical instabilities occurring during an execution come from the dot product function. A more accurate implementation of the dot product with compensated algorithms is presented in this work. We show that implementing these kinds of algorithms to improve the accuracy of computed results does not degrade the code performance. (author)
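    The compensated dot product mentioned above can be sketched with the classical error-free transformations TwoSum and TwoProd, in the spirit of the Ogita-Rump-Oishi Dot2 algorithm. The thesis' actual implementation inside Telemac-2D is in compiled code and is not reproduced here; function names and the test vectors below are illustrative only.

        def two_sum(a, b):
            """Error-free transformation: a + b = s + e exactly in floating point."""
            s = a + b
            t = s - a
            return s, (a - (s - t)) + (b - t)

        def split(a, factor=2**27 + 1):
            """Dekker splitting of a double into high and low parts."""
            c = factor * a
            hi = c - (c - a)
            return hi, a - hi

        def two_prod(a, b):
            """Error-free transformation: a * b = p + e exactly (no FMA required)."""
            p = a * b
            a_hi, a_lo = split(a)
            b_hi, b_lo = split(b)
            e = ((a_hi * b_hi - p) + a_hi * b_lo + a_lo * b_hi) + a_lo * b_lo
            return p, e

        def dot2(x, y):
            """Dot product evaluated as if in twice the working precision."""
            s, comp = 0.0, 0.0
            for xi, yi in zip(x, y):
                p, e1 = two_prod(xi, yi)
                s, e2 = two_sum(s, p)
                comp += e1 + e2
            return s + comp

        # Ill-conditioned example: the naive sum loses the contribution of 1.0.
        x = [1e16, 1.0, -1e16]
        y = [1.0, 1.0, 1.0]
        print(sum(a * b for a, b in zip(x, y)), dot2(x, y))   # 0.0 versus 1.0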

  5. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. While testing, the best PBM is first selected for the test utterance in the maximum likelihood (ML) sense... We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types target-wrong and impostor-wrong, while it maintains comparable TD-SV performance when impostors speak a correct utterance, with respect to the conventional system.
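    A minimal sketch of the maximum-likelihood selection step described above: score the test utterance's features against each pass-phrase dependent background model and keep the most likely one. Plain scikit-learn GMMs and synthetic features stand in for the adapted PBMs and real MFCCs, so everything below is illustrative rather than the authors' pipeline.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Toy "MFCC" features for two pass-phrases, plus a test utterance of phrase 0.
        phrase_feats = [rng.normal(loc=m, size=(500, 12)) for m in (0.0, 2.0)]
        test_feats = rng.normal(loc=0.0, size=(80, 12))

        # One background model per pass-phrase (the PBMs), here trained from scratch.
        pbms = [GaussianMixture(n_components=4, random_state=0).fit(f) for f in phrase_feats]
        log_likelihoods = [gmm.score(test_feats) for gmm in pbms]   # mean log-likelihood per frame
        best = int(np.argmax(log_likelihoods))
        print("selected PBM:", best, "log-likelihoods:", log_likelihoods)

    The selected PBM would then typically serve as the background model in the likelihood-ratio scoring against the claimant's pass-phrase specific speaker model.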

  6. Determination of Solution Accuracy of Numerical Schemes as Part of Code and Calculation Verification

    Energy Technology Data Exchange (ETDEWEB)

    Blottner, F.G.; Lopez, A.R.

    1998-10-01

    This investigation is concerned with the accuracy of numerical schemes for solving partial differential equations used in science and engineering simulation codes. Richardson extrapolation methods for steady and unsteady problems with structured meshes are presented as part of the verification procedure to determine code and calculation accuracy. The local truncation error determination of a numerical difference scheme is shown to be a significant component of the verification procedure as it determines the consistency of the numerical scheme, the order of the numerical scheme, and the restrictions on the mesh variation with a non-uniform mesh. Generation of a series of co-located, refined meshes with the appropriate variation of mesh cell size is investigated and is another important component of the verification procedure. The importance of mesh refinement studies is shown to be more significant than just a procedure to determine solution accuracy. It is suggested that mesh refinement techniques can be developed to determine consistency of numerical schemes and to determine if governing equations are well posed. The present investigation provides further insight into the conditions and procedures required to effectively use Richardson extrapolation with mesh refinement studies to achieve confidence that simulation codes are producing accurate numerical solutions.
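    A minimal sketch of the Richardson-extrapolation estimate of observed order of accuracy from a mesh triplet, assuming a constant refinement ratio; the values of f1, f2, f3 and r below are illustrative, not taken from the report.

        import math

        def observed_order(f1, f2, f3, r):
            """Order p such that the discretization error shrinks like h**p."""
            return math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)

        def richardson_extrapolate(f1, f2, r, p):
            """Estimate of the zero-mesh-size solution from the two finest meshes."""
            return f1 + (f1 - f2) / (r**p - 1.0)

        # Example: fine, medium and coarse values of a quantity converging at second order.
        f1, f2, f3, r = 1.0001, 1.0004, 1.0016, 2.0
        p = observed_order(f1, f2, f3, r)
        print(f"observed order p = {p:.2f}")                      # close to 2
        print(f"extrapolated value = {richardson_extrapolate(f1, f2, r, p):.6f}")

    Comparing the observed order with the formal order of the scheme is the consistency check the report emphasizes; a mismatch can signal coding errors, insufficient refinement, or an ill-posed formulation.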

  7. Numerical verification of equilibrium chemistry software within nuclear fuel performance codes

    International Nuclear Information System (INIS)

    Piro, M.H.; Lewis, B.J.; Thompson, W.T.; Simunovic, S.; Besmann, T.M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing transport source terms, material properties, and boundary conditions in heat and mass transport modules. Consequently, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method called the Gibbs Criteria is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes. (author)
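    A minimal sketch of the kind of Gibbs-energy-minimization check the abstract refers to as the Gibbs Criteria: at a true equilibrium, the chemical potential of every stable species equals the corresponding combination of element potentials, and no candidate species lies below that plane. The exact criteria used in the paper may differ in detail; the stoichiometry and potentials below are illustrative only.

        import numpy as np

        def check_equilibrium(nu, mu, gamma, stable, tol=1e-8):
            """nu: species-by-element stoichiometry matrix, mu: species chemical
            potentials (in RT units), gamma: element potentials, stable: mask of
            species predicted to be present with nonzero moles."""
            residual = mu - nu @ gamma
            ok_stable = np.all(np.abs(residual[stable]) < tol)   # equality for stable species
            ok_meta = np.all(residual[~stable] > -tol)           # no lower-energy candidate missed
            return bool(ok_stable and ok_meta), residual

        # Two elements and three species (purely illustrative numbers).
        nu = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        gamma = np.array([-10.0, -12.0])
        mu = np.array([-10.0, -12.0, -21.5])      # third species lies above the plane
        stable = np.array([True, True, False])
        print(check_equilibrium(nu, mu, gamma, stable))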

  8. Numerical simulation code for combustion of sodium liquid droplet and its verification

    International Nuclear Information System (INIS)

    Okano, Yasushi

    1997-11-01

    Computer programs for sodium leak and burning phenomena have been developed based on a mechanistic approach. A direct numerical simulation code for sodium liquid droplet burning has been developed for the numerical analysis of droplet combustion in a forced-convection air flow. Distributions of heat generation and temperature and the reaction rates of chemical products, such as sodium oxide and hydroxide, are calculated and evaluated with this numerical code. The extended MAC method coupled with a higher-order upwind scheme had previously been used for combustion simulation of a methane-air mixture; in the numerical simulation code for combustion of a sodium liquid droplet, a chemical reaction model of sodium was coupled with the extended MAC method. Combustion of a single sodium liquid droplet is simulated in this report for the verification of the developed numerical simulation code. The changes of burning rate and reaction products with droplet diameter and inlet wind velocity were investigated. The calculated results conform qualitatively and quantitatively to experimental and computational observations in combustion engineering. It is confirmed that the numerical simulation code is suitable for the calculation of sodium liquid droplet burning. (author)

  9. Numerical verification of composite rods theory on multi-story buildings analysis

    Science.gov (United States)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

    The article presents a proposal for verifying the theory of composite rods in the structural analysis of skeletons of high-rise buildings. A test design model was formed in which the horizontal elements are represented by a multilayer cantilever beam working in transverse bending, whose slabs are connected by moment-non-transferring connections, and in which multilayer columns represent the vertical elements. These connections are sufficient to develop a shearing action that can be approximated by a shear-force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the behaviour of the multilayer rods and is solved numerically by the method of successive approximations. The proposed methodology is intended for preliminary calculations aimed at determining the rigidity characteristics of the structure, and for a qualitative assessment of results obtained by other methods when calculations are performed for verification purposes.

  10. Combining Task Execution and Background Knowledge for the Verification of Medical Guidelines

    Science.gov (United States)

    Hommersom, Arjen; Groot, Perry; Lucas, Peter; Balser, Michael; Schmitt, Jonathan

    The use of a medical guideline can be seen as the execution of computational tasks, sequentially or in parallel, in the face of patient data. It has been shown that many such guidelines can be represented as a 'network of tasks', i.e., as a number of steps that have a specific function or goal. To investigate the quality of such guidelines we propose a formalization of criteria for good practice medicine with which a guideline should comply. We use this theory in conjunction with medical background knowledge to verify the quality of a guideline dealing with diabetes mellitus type 2 using the interactive theorem prover KIV. Verification using task execution and background knowledge is a novel approach to quality checking of medical guidelines.

  11. Verification of EPA's "Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Stagich, B. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-03-29

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems with obtaining solutions, as well as to ensure that the equations are programmed correctly.

  12. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  13. Verification and validation of a numeric procedure for flow simulation of a 2x2 PWR rod bundle

    International Nuclear Information System (INIS)

    Santos, Andre A.C.; Barros Filho, Jose Afonso; Navarro, Moyses A.

    2011-01-01

    Before Computational Fluid Dynamics (CFD) can be considered a reliable tool for the analysis of flow through rod bundles, there is a need to establish the credibility of the numerical results. Procedures must be defined to evaluate the error and uncertainty due to aspects such as mesh refinement, turbulence model, wall treatment and appropriate definition of boundary conditions. These procedures are referred to as Verification and Validation (V and V) processes. In 2009 a standard was published by the American Society of Mechanical Engineers (ASME) establishing detailed procedures for V and V of CFD simulations. This paper presents a V and V evaluation of a numerical methodology applied to the simulation of a PWR rod bundle segment with a split vane spacer grid based on the ASME standard. In this study six progressively refined meshes were generated to evaluate the numerical uncertainty through the verification procedure. Experimental and analytical results available in the literature were used in this study for validation purposes. The results show that the ASME verification procedure can give highly variable predictions of uncertainty depending on the mesh triplet used for the evaluation. However, the procedure can give good insight towards optimization of the mesh size and overall result quality. Although the experimental results used for the validation were not ideal, through the validation procedure the deficiencies and strengths of the presented modeling could be detected and reasonably evaluated. Even though it is difficult to obtain reliable estimates of the uncertainty of flow quantities in turbulent flow, this study shows that the V and V process is a necessary step in a CFD analysis of a spacer grid design. (author)
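    A minimal sketch of the grid-convergence estimate prescribed by ASME-style verification procedures for a mesh triplet with a constant refinement ratio; the safety factor of 1.25 and the solution values are illustrative, not the paper's rod-bundle data.

        import math

        def gci_fine(f1, f2, f3, r, Fs=1.25):
            """Observed order p and fine-grid grid convergence index (GCI)."""
            p = math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)
            ea21 = abs((f1 - f2) / f1)            # relative change, fine vs. medium mesh
            return p, Fs * ea21 / (r**p - 1.0)

        # Fine, medium and coarse values of some integral quantity (illustrative).
        p, gci = gci_fine(f1=6.063, f2=6.067, f3=6.076, r=1.5)
        print(f"observed order = {p:.2f}, fine-grid numerical uncertainty = {100*gci:.3f}% of f1")

    The paper's observation that the estimate varies with the chosen mesh triplet corresponds to repeating this calculation for different (f1, f2, f3) combinations drawn from the six meshes.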

  14. Design and verification of controllers for longitudinal oscillations using optimal control theory and numerical simulation: Predictions for PEP-II

    International Nuclear Information System (INIS)

    Hindi, H.; Prabhakar, S.; Fox, J.; Teytelman, D.

    1997-12-01

    The authors present a technique for the design and verification of efficient bunch-by-bunch controllers for damping longitudinal multibunch instabilities. The controllers attempt to optimize the use of available feedback amplifier power--one of the most expensive components of a feedback system--and define the limits of closed loop system performance. The design technique alternates between analytic computation of single bunch optimal controllers and verification on a multibunch numerical simulator. The simulator identifies unstable coupled bunch modes and predicts their growth and damping rates. The results from the simulator are shown to be in reasonable agreement with analytical calculations based on the single bunch model. The technique is then used to evaluate the performance of a variety of controllers proposed for PEP-II

  15. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  16. Numerical verification of the theory of coupled reactors for a deuterium critical assembly using MCNP5

    International Nuclear Information System (INIS)

    Hussein, M.S.; Bonin, H.W.; Lewis, B.J.

    2013-01-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified using the probabilistic transport code MCNP5. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly (DCA). The variations of the criticality factors and the coupling coefficients were investigated by changing the water levels in the inner and outer cores. The numerical results of the model developed with the MCNP5 code were validated and verified against published results and the mathematical model based on coupled reactor theory. (author)

  17. Numerical verification of the theory of coupled reactors for a deuterium critical assembly using MCNP5

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, M.S.; Bonin, H.W.; Lewis, B.J., E-mail: mohamed.hussein@rmc.ca, E-mail: bonin-h@rmc.ca, E-mail: lewis-b@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada)

    2013-07-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified using the probabilistic transport code MCNP5. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly (DCA). The variations of the criticality factors and the coupling coefficients were investigated by changing the water levels in the inner and outer cores. The numerical results of the model developed with the MCNP5 code were validated and verified against published results and the mathematical model based on coupled reactor theory. (author)

  18. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
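    A minimal sketch of the method of manufactured solutions used in the code-verification step described above: choose an analytic field, derive the source term symbolically, force the code with it, and check the observed convergence order. The 1-D diffusion operator below is purely illustrative and is not the GBS model.

        import sympy as sp

        x, t, D = sp.symbols("x t D", positive=True)
        u_m = sp.sin(sp.pi * x) * sp.exp(-t)                  # manufactured solution
        residual = sp.diff(u_m, t) - D * sp.diff(u_m, x, 2)   # apply the model operator to u_m
        source = sp.simplify(residual)                        # forcing term to add to the code
        print("manufactured source term S(x, t) =", source)

    A solver forced with S(x, t) and supplied with the boundary and initial data of u_m should reproduce u_m up to discretization error, and the rate at which that error decays under refinement is compared with the scheme's formal order.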

  19. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    International Nuclear Information System (INIS)

    Looe, H.K.; Uphoff, Y.; Poppe, B.; Carl von Ossietzky Univ., Oldenburg; Harder, D.; Willborn, K.C.

    2012-01-01

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal image device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are briefly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)
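    A minimal sketch of kernel-based deblurring by Wiener-type deconvolution in the Fourier domain, illustrating the generic principle g = K * f (convolution) that the abstract describes; the clinical algorithm and kernel in the paper are specific to the portal imaging device and are not reproduced here.

        import numpy as np

        def wiener_deconvolve(blurred, kernel, nsr=1e-2):
            """Recover f from g = K * f, regularized by a noise-to-signal ratio."""
            K = np.fft.fft2(np.fft.ifftshift(kernel))
            G = np.fft.fft2(blurred)
            F = np.conj(K) * G / (np.abs(K) ** 2 + nsr)
            return np.real(np.fft.ifft2(F))

        # Toy example: blur a synthetic edge with a Gaussian kernel K(x, y), then restore it.
        n = 128
        yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
        kernel = np.exp(-(xx**2 + yy**2) / (2.0 * 3.0**2))
        kernel /= kernel.sum()
        truth = np.zeros((n, n)); truth[40:90, 50:60] = 1.0
        blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(kernel))))
        restored = wiener_deconvolve(blurred, kernel)
        print("edge contrast before/after:",
              blurred.max() - blurred.min(), restored.max() - restored.min())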

  20. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    Energy Technology Data Exchange (ETDEWEB)

    Looe, H.K.; Uphoff, Y.; Poppe, B. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy; Carl von Ossietzky Univ., Oldenburg (Germany). WG Medical Radiation Physics; Harder, D. [Georg August Univ., Goettingen (Germany). Medical Physics and Biophysics; Willborn, K.C. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy

    2012-02-15

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal image device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are briefly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)

  1. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  2. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  3. Visualization of Instrumental Verification Information Details (VIVID): code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code is described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.

  4. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results.

  5. On the use of advanced numerical models for the evaluation of dosimetric parameters and the verification of exposure limits at workplaces.

    Science.gov (United States)

    Catarinucci, L; Tarricone, L

    2009-12-01

    With the forthcoming transposition of the 2004/40/EC Directive, employers will become responsible for electromagnetic field levels at the workplace. To make this task easier, the scientific community is compiling practical guidelines to be followed. This work aims at enriching such guidelines, especially on dosimetric issues. More specifically, some critical aspects related to the application of numerical dosimetric techniques for verifying compliance with safety limits are highlighted. In particular, three different aspects are considered: the dependence of the dosimetric parameters on the shape and inner characterisation of the exposed subject, their dependence on the numerical algorithm used, and the correlation between reference limits and basic restrictions. Results and discussion demonstrate that, even when sophisticated numerical techniques are used, a complex interpretation of the results is in some cases mandatory.

  6. On the use of advanced numerical models for the evaluation of dosimetric parameters and the verification of exposure limits at workplaces

    International Nuclear Information System (INIS)

    Catarinucci, L.; Tarricone, L.

    2009-01-01

    With the forthcoming transposition of the 2004/40/EC Directive, employers will become responsible for electromagnetic field levels at the workplace. To make this task easier, the scientific community is compiling practical guidelines to be followed. This work aims at enriching such guidelines, especially on dosimetric issues. More specifically, some critical aspects related to the application of numerical dosimetric techniques for verifying compliance with safety limits are highlighted. In particular, three different aspects are considered: the dependence of the dosimetric parameters on the shape and inner characterisation of the exposed subject, their dependence on the numerical algorithm used, and the correlation between reference limits and basic restrictions. Results and discussion demonstrate that, even when sophisticated numerical techniques are used, a complex interpretation of the results is in some cases mandatory. (authors)

  7. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  8. Numerical simulation and experimental verification of microstructure evolution in large forged pipe used for AP1000 nuclear power plants

    International Nuclear Information System (INIS)

    Wang, Shenglong; Yang, Bin; Zhang, Mingxian; Wu, Huanchun; Peng, Jintao; Gao, Yang

    2016-01-01

    Highlights: • Establish systematically the database of 316LN stainless steel for Deform-3D. • Simulate the microstructure evolution during forging of the AP1000 primary coolant pipe. • Carry out a full-scale forging experiment for verification in engineering practice. • Obtain desirable grain sizes in simulation and experiment. • The variation trends of grain sizes in simulation and experiment are consistent. - Abstract: The AP1000 primary coolant pipe is a large special-shaped forged pipe made of 316LN stainless steel. Due to the non-uniform temperature and deformation during its forging, coarse and fine grains usually coexist in the forged pipe, resulting in a heterogeneous microstructure and anisotropic performance. To investigate the microstructure evolution during the entire forging process, in the present research the database of the 316LN stainless steel was established and a numerical simulation was performed. The results indicate that the middle body section of the forged pipe has an extremely uniform average grain size, with values smaller than 30 μm. The grain sizes at the ends of the body sections ranged from 30 μm to 60 μm. The boss sections have a relatively homogeneous microstructure with average grain sizes of 30 μm to 44 μm. Furthermore, a full-scale hot forging was carried out for verification. Comparison of the theoretical and experimental results showed good agreement and hence demonstrated the capabilities of the numerical simulation presented here. It is noteworthy that all grains in the workpiece were confirmed to be smaller than 180 μm, which meets the designer’s demands.

  9. Verification of thermo-fluidic CVD reactor model

    International Nuclear Information System (INIS)

    Lisik, Z; Turczynski, M; Ruta, L; Raj, E

    2014-01-01

    This paper describes the numerical model of a CVD (Chemical Vapour Deposition) reactor created in ANSYS CFX, whose main purpose is the evaluation of numerical approaches used for modelling heat and mass transfer inside the reactor chamber. Verification of the developed CVD model has been conducted against measurements under various thermal, pressure and gas flow rate conditions. Good agreement between experimental and numerical results confirms the correctness of the elaborated model.

  10. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
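    A minimal sketch of a phase-only correlation check of the kind the verification scheme builds on: correlate a stored lock with a presented key using only spectral phase and compare the correlation peak against a threshold. The nonlinear encoding and the dual-correlation structure of the actual system are not reproduced; arrays and the threshold below are illustrative.

        import numpy as np

        def phase_only_correlation(a, b):
            """Correlate two fields using only the phase of their cross-spectrum."""
            A, B = np.fft.fft2(a), np.fft.fft2(b)
            cross = A * np.conj(B)
            cross /= np.abs(cross) + 1e-12
            return np.real(np.fft.ifft2(cross))

        rng = np.random.default_rng(0)
        lock = rng.standard_normal((64, 64))       # real-valued random "lock"
        key_good = lock.copy()                      # authorized key, matches the lock
        key_bad = rng.standard_normal((64, 64))     # counterfeit key
        peak_good = phase_only_correlation(lock, key_good).max()
        peak_bad = phase_only_correlation(lock, key_bad).max()
        print(peak_good, peak_bad, "accept" if peak_good > 10 * peak_bad else "reject")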

  11. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  12. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured that the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and an additional user-defined meteorological option for accident dispersion.

  13. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code be evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  14. Numerical simulation of turbulent flow and heat transfer in a parallel channel. Verification of the field synergy principle

    International Nuclear Information System (INIS)

    Tian Wenxi; Su, G.H.; Qiu Suizheng; Jia Dounan

    2004-01-01

    The field synergy principle was proposed by Guo (1998) based on 2-D laminar boundary-layer flow, and it resulted from a second look at the mechanism of convective heat transfer. Numerical verification of this principle's validity for turbulent flow has been carried out by very few researchers, and mostly with commercial software such as FLUENT and CFX. In this paper, a numerical simulation of turbulent flow with recirculation was developed using the SIMPLE algorithm with the two-equation k-ε model. The extended computational region method and the wall function method were adopted to render the whole computational region geometrically regular. With the inlet Reynolds number kept constant at 10000, the simulation was conducted while changing the height of the solid obstacle, and the results showed that the wall heat flux decreased as the angle between the velocity vector and the temperature gradient increased. It is thus validated that the field synergy principle, originally based on 2-D laminar boundary-layer flow, can also be applied to complex turbulent flow, even with recirculation. (author)
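    A minimal sketch of the quantity at the heart of the field synergy principle: the local angle between the velocity vector and the temperature gradient, whose increase the simulation correlates with a drop in wall heat flux. The grid and fields below are synthetic placeholders, not the SIMPLE / k-ε solution of the paper.

        import numpy as np

        def synergy_angle(u, v, dTdx, dTdy):
            """Angle in degrees between (u, v) and grad T at each grid point."""
            dot = u * dTdx + v * dTdy
            norm = np.hypot(u, v) * np.hypot(dTdx, dTdy) + 1e-30
            return np.degrees(np.arccos(np.clip(dot / norm, -1.0, 1.0)))

        # Toy fields: mostly streamwise flow with a wall-normal temperature gradient.
        ny, nx = 32, 64
        u = np.ones((ny, nx)); v = 0.05 * np.ones((ny, nx))
        T = np.linspace(300.0, 350.0, ny)[:, None] * np.ones((1, nx))
        dTdy, dTdx = np.gradient(T)
        theta = synergy_angle(u, v, dTdx, dTdy)
        print("mean synergy angle:", round(float(theta.mean()), 1), "degrees")  # near 90: poor synergy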

  15. Fuzzy Verification of Lower Dimensional Information in a Numerical Simulation of Sea Ice

    Science.gov (United States)

    Sulsky, D.; Levy, G.

    2010-12-01

    Ideally, a verification and validation scheme should be able to evaluate and incorporate lower dimensional features (e.g., discontinuities) contained within a bulk simulation even when not directly observed or represented by model variables. Nonetheless, lower dimensional features are often ignored. Conversely, models that resolve such features and the associated physics well, yet imprecisely are penalized by traditional validation schemes. This can lead to (perceived or real) poor model performance and predictability and can become deleterious in model improvements when observations are sparse, fuzzy, or irregular. We present novel algorithms and a general framework for using information from available satellite data through fuzzy verification that efficiently and effectively remedy the known problems mentioned above. As a proof of concept, we use a sea-ice model with remotely sensed observations of leads in a one-step initialization cycle. Using the new scheme in a sixteen day simulation experiment introduces model skill (against persistence) several days earlier than in the control run, improves the overall model skill and delays its drop off at later stages of the simulation. Although sea-ice models are currently a weak link in climate models, the appropriate choice of data to use, and the fuzzy verification and evaluation of a system’s skill in reproducing lower dimensional features are important beyond the initial application to sea ice. Our strategy and framework for fuzzy verification, selective use of information, and feature extraction could be extended globally and to other disciplines. It can be incorporated in and complement existing verification and validation schemes, increasing their computational efficiency and the information they use. It can be used for model development and improvements, upscaling/downscaling models, and for modeling processes not directly represented by model variables or direct observations. Finally, if successful, it can

  16. Verification of the MOTIF code version 3.0

    International Nuclear Information System (INIS)

    Chan, T.; Guvanasen, V.; Nakka, B.W.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.

    1996-12-01

    As part of the Canadian Nuclear Fuel Waste Management Program (CNFWMP), AECL has developed a three-dimensional finite-element code, MOTIF (Model Of Transport In Fractured/porous media), for detailed modelling of groundwater flow, heat transport and solute transport in a fractured rock mass. The code solves the transient and steady-state equations of groundwater flow, solute (including one-species radionuclide) transport, and heat transport in variably saturated fractured/porous media. The initial development was completed in 1985 (Guvanasen 1985) and version 3.0 was completed in 1986. This version is documented in detail in Guvanasen and Chan (in preparation). This report describes a series of fourteen verification cases which have been used to test the numerical solution techniques and coding of MOTIF, as well as demonstrate some of the MOTIF analysis capabilities. For each case the MOTIF solution has been compared with a corresponding analytical or independently developed alternate numerical solution. Several of the verification cases were included in Level 1 of the International Hydrologic Code Intercomparison Project (HYDROCOIN). The MOTIF results for these cases were also described in the HYDROCOIN Secretariat's compilation and comparison of results submitted by the various project teams (Swedish Nuclear Power Inspectorate 1988). It is evident from the graphical comparisons presented that the MOTIF solutions for the fourteen verification cases are generally in excellent agreement with known analytical or numerical solutions obtained from independent sources. This series of verification studies has established the ability of the MOTIF finite-element code to accurately model the groundwater flow and solute and heat transport phenomena for which it is intended. (author). 20 refs., 14 tabs., 32 figs

  17. Tree dimension in verification of constrained Horn clauses

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick; Ganty, Pierre

    2018-01-01

    In this paper, we show how the notion of tree dimension can be used in the verification of constrained Horn clauses (CHCs). The dimension of a tree is a numerical measure of its branching complexity and the concept here applies to Horn clause derivation trees. Derivation trees of dimension zero c...... algorithms using these constructions to decompose a CHC verification problem. One variation of this decomposition considers derivations of successively increasing dimension. The paper includes descriptions of implementations and experimental results....
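    A minimal sketch of the tree-dimension measure the abstract refers to: a leaf or a linear chain has dimension zero, and the dimension grows by one only where two subtrees of equal (maximal) dimension meet. The nested-tuple tree encoding is an illustration only, not the tool's input format.

        def tree_dimension(tree):
            """tree is a pair (label, children); returns its dimension."""
            _, children = tree
            if not children:
                return 0
            dims = sorted((tree_dimension(c) for c in children), reverse=True)
            d = dims[0]
            # Dimension increases only if the two largest child dimensions coincide.
            return d + 1 if len(dims) > 1 and dims[1] == d else d

        leaf = ("fact", [])
        chain = ("goal", [("sub", [leaf])])                   # a linear derivation: dimension 0
        branch = ("goal", [chain, ("sub", [leaf, leaf])])     # contains a balanced split
        print(tree_dimension(chain), tree_dimension(branch))  # 0 1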

  18. Numerical simulation and experimental verification of gas flow through packed beds

    International Nuclear Information System (INIS)

    Natarajan, S.; Zhang, C.; Briens, C.

    2003-01-01

    This work is concerned with finding an effective way of eliminating oxygen from a packed bed of monomer particles. This process finds application in industries involved in the manufacture of Nylon 12. In the manufacture of the polymer Nylon 12, the polymerization reaction is hindered by the presence of oxygen. Therefore, the main objective of this study is to remove the oxygen by injecting nitrogen to displace it from the voids between the monomer particles before they are introduced into the polymerization reactor. This work involves the numerical simulation and experimental verification of the flow in a packed bed. In addition, a parametric study is carried out on parameters such as the number of injectors, the radial position of the injectors, and the position of the injectors along the circumference of the packed bed, to find the best possible combination for effective elimination of the oxygen. Nitrogen does not interact with the monomer particles and hence there is no chemical reaction involved in this process. The nitrogen is introduced into the packed bed at a flow rate that keeps the superficial velocity well below the minimum fluidization velocity of the monomer particles. The packed bed is modeled using a porous medium approach available in the commercial computational fluid dynamics (CFD) software FLUENT. The fluid flow inside the packed bed is a multicomponent gas flow through a porous medium. The simulation results are validated by comparison with the experimental results. (author)
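    A minimal sketch of the Ergun pressure-drop relation commonly used as the momentum sink when a packed bed is represented by a porous-medium approach of the kind described above; the gas properties, particle size and void fraction below are placeholders, not the monomer/nitrogen data of the study.

        def ergun_dp_per_length(u, eps, d_p, mu, rho):
            """Pressure gradient (Pa/m) for superficial velocity u through a packed bed."""
            viscous = 150.0 * mu * (1.0 - eps) ** 2 * u / (eps**3 * d_p**2)
            inertial = 1.75 * rho * (1.0 - eps) * u**2 / (eps**3 * d_p)
            return viscous + inertial

        # Nitrogen-like gas at ambient conditions through ~1 mm particles, 40% void fraction.
        print(ergun_dp_per_length(u=0.05, eps=0.4, d_p=1e-3, mu=1.8e-5, rho=1.16), "Pa/m")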

  19. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.

  20. Numerical modeling of plasma plume evolution against ambient background gas in laser blow off experiments

    International Nuclear Information System (INIS)

    Patel, Bhavesh G.; Das, Amita; Kaw, Predhiman; Singh, Rajesh; Kumar, Ajai

    2012-01-01

    Two-dimensional numerical modelling based on a simplified hydrodynamic description of an expanding plasma plume (created by laser blow-off) propagating against an ambient background gas has been carried out. A comparison with experimental observations shows that these simulations capture most features of the plasma plume expansion. The plume location and other gross features are reproduced in quantitative detail, in agreement with the experimental observations. The plume shape evolution and its dependence on the ambient background gas are in good qualitative agreement with the experiment. This suggests that a simplified hydrodynamic expansion model is adequate for the description of plasma plume expansion.

  1. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.

  2. Analytical and Numerical Studies of the Complex Interaction of a Fast Ion Beam Pulse with a Background Plasma

    International Nuclear Information System (INIS)

    Kaganovich, Igor D.; Startsev, Edward A.; Davidson, Ronald C.

    2003-01-01

    Plasma neutralization of an intense ion beam pulse is of interest for many applications, including plasma lenses, heavy ion fusion, high energy physics, etc. Comprehensive analytical, numerical, and experimental studies are underway to investigate the complex interaction of a fast ion beam with a background plasma. The positively charged ion beam attracts plasma electrons, and as a result the plasma electrons have a tendency to neutralize the beam charge and current. A suite of particle-in-cell codes has been developed to study the propagation of an ion beam pulse through the background plasma. For quasi-steady-state propagation of the ion beam pulse, an analytical theory has been developed using the assumption of long charge bunches and conservation of generalized vorticity. The analytical results agree well with the results of the numerical simulations. The visualization of the data obtained in the numerical simulations shows complex collective phenomena during beam entry into and exit from the plasma

  3. Numerical Modeling of Ablation Heat Transfer

    Science.gov (United States)

    Ewing, Mark E.; Laker, Travis S.; Walker, David T.

    2013-01-01

    A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.

  4. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.

  5. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    Energy Technology Data Exchange (ETDEWEB)

    Ashcraft, C. Chace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Niederhaus, John Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robinson, Allen C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-29

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.

  6. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  7. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models where also a combination of models is taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.) [de
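
    As generic background to sampling-based verification, the probability that random attribute sampling detects at least one falsified item is often written with a hypergeometric argument as 1 - C(N-d, n)/C(N, n). The sketch below evaluates this expression for invented numbers; it is a textbook illustration and not the two diversion models analysed in the report.

      # Illustrative attribute-sampling detection probability (hypergeometric argument);
      # the numbers are invented and do not reproduce the report's diversion models.
      from math import comb

      def detection_probability(N: int, d: int, n: int) -> float:
          """P(at least one of d falsified items among N appears in a sample of n)."""
          return 1.0 - comb(N - d, n) / comb(N, n)

      print(detection_probability(N=200, d=10, n=20))  # about 0.66 for these assumed values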

  8. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  9. LHCb: Numerical Analysis of Machine Background in the LHCb Experiment for the Early and Nominal Operation of LHC

    CERN Multimedia

    Lieng, M H; Corti, G; Talanov, V

    2010-01-01

    We consider the formation of machine background induced by proton losses in the long straight section of the LHCb experiment at the LHC. Both showers originating from the tertiary collimators located in the LHCb insertion region and local beam-gas interactions are taken into account. We present the procedure for, and results of, numerical studies of such background for various conditions. Additionally, the expected impact on the experiment and on signal characteristics is discussed.

  10. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the ... This model generally states the numerical value of knowledge .... procedures found in the field of software engineering should be ...

  11. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    International Nuclear Information System (INIS)

    Hussein, M.S; Lewis, B.J.; Bonin, H.W.

    2013-01-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k_eff calculated numerically and independently from simulations of the DCA by MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for the multipoint coupled reactor calculations. (author)

  12. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, M.S, E-mail: mohamed.hussein@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada); Lewis, B.J., E-mail: Brent.Lewis@uoit.ca [Univ. of Ontario Inst. of Technology, Faculty of Energy Systems and Nuclear Science, Oshawa, Ontario (Canada); Bonin, H.W., E-mail: bonin-h@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada)

    2013-07-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k_eff calculated numerically and independently from simulations of the DCA by MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for the multipoint coupled reactor calculations. (author)
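
    Agreement between multiplication factors computed by different methods, as quoted in the two records above, is usually expressed as a reactivity difference. A minimal way to make that comparison is sketched below; the k_eff values are purely illustrative assumptions and are not the DCA results.

      # Illustrative comparison of multiplication factors from two methods, expressed
      # as a reactivity difference in pcm; the k_eff values are assumed, not DCA results.
      def reactivity_difference_pcm(k_ref: float, k_test: float) -> float:
          # rho = 1 - 1/k, so delta-rho = 1/k_ref - 1/k_test, scaled to pcm
          return (1.0 / k_ref - 1.0 / k_test) * 1.0e5

      k_serpent = 1.00213   # assumed value
      k_theory = 1.00187    # assumed value
      print(f"{reactivity_difference_pcm(k_serpent, k_theory):+.1f} pcm")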

  13. Playing Linear Numerical Board Games Promotes Low-Income Children's Numerical Development

    Science.gov (United States)

    Siegler, Robert S.; Ramani, Geetha B.

    2008-01-01

    The numerical knowledge of children from low-income backgrounds trails behind that of peers from middle-income backgrounds even before the children enter school. This gap may reflect differing prior experience with informal numerical activities, such as numerical board games. Experiment 1 indicated that the numerical magnitude knowledge of…

  14. RECOGNITION AND VERIFICATION OF TOUCHING HANDWRITTEN NUMERALS

    NARCIS (Netherlands)

    Zhou, J.; Kryzak, A.; Suen, C.Y.

    2004-01-01

    In the field of financial document processing, recognition of touching handwritten numerals has been limited by lack of good benchmarking databases and low reliability of algorithms. This paper addresses the efforts toward solving the two problems. Two databases, IRIS-Bell'98 and TNIST, are

  15. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Background: Software reliability is of great importance for the development of embedded systems that are often used in applications that have requirements for safety. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, the toolset produces the model-verification inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  16. Accuracy verification methods theory and algorithms

    CERN Document Server

    Mali, Olli; Repin, Sergey

    2014-01-01

    The importance of accuracy verification methods was understood at the very beginning of the development of numerical analysis. Recent decades have seen a rapid growth of results related to adaptive numerical methods and a posteriori estimates. However, in this important area there often exists a noticeable gap between mathematicians creating the theory and researchers developing applied algorithms that could be used in engineering and scientific computations for guaranteed and efficient error control.   The goals of the book are to (1) give a transparent explanation of the underlying mathematical theory in a style accessible not only to advanced numerical analysts but also to engineers and students; (2) present detailed step-by-step algorithms that follow from a theory; (3) discuss their advantages and drawbacks, areas of applicability, give recommendations and examples.

  17. Calibration And Performance Verification Of LSC Packard 1900TR AFTER REPAIRING

    International Nuclear Information System (INIS)

    Satrio; Evarista-Ristin; Syafalni; Alip

    2003-01-01

    Calibration and performance verification of the LSC Packard 1900TR at the Hydrology Section, P3TIR, have been carried out. In the period from mid-1997 to July 2000, the counting system of the instrument was damaged and repaired several times. After repairing, the system was recalibrated and then verified. The calibration and verification were conducted using unquenched ³H and ¹⁴C standards and background. The calibration results show that the background count rates for ³H and ¹⁴C are 12.3 ± 0.79 cpm and 18.24 ± 0.69 cpm respectively; the FOM for ³H and ¹⁴C is 285.03 ± 15.95 and 641.06 ± 16.45 respectively; and the ³H and ¹⁴C efficiencies are 59.13 ± 0.28% and 95.09 ± 0.31% respectively. From the verification data, the SIS and tSIE parameters for ¹⁴C are within the specified limits, the ³H and ¹⁴C efficiencies are still above the minimum limit, and the background fluctuation still shows a normal condition. It can be concluded that, to date, the LSC Packard 1900TR is in good condition and can be used for counting. (author)
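
    The figures of merit quoted above can be checked against the conventional liquid scintillation counting relation FOM = E²/B, with E the counting efficiency in percent and B the background count rate in cpm. The short sketch below applies this standard relation to the tritium values from the abstract; the formula is assumed here as the usual textbook definition, not taken from the report.

      # Conventional LSC figure of merit, FOM = E^2 / B (E in %, B in cpm).
      # Standard textbook relation, assumed for illustration; the inputs are the
      # tritium values quoted in the abstract.
      def figure_of_merit(efficiency_percent: float, background_cpm: float) -> float:
          return efficiency_percent ** 2 / background_cpm

      print(figure_of_merit(59.13, 12.3))  # ~284, consistent with the reported 285 +/- 16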

  18. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification.

  19. Development of requirements tracking and verification technology for the NPP software

    International Nuclear Information System (INIS)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-01-01

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification

  20. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these software packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost efficient and easy to use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was made of the biosphere modules for dose assessments used in the earlier safety assessment project SR 97. The results obtained for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and also against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the

  1. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    International Nuclear Information System (INIS)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these software packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost efficient and easy to use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was made of the biosphere modules for dose assessments used in the earlier safety assessment project SR 97. The results obtained for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and also against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the perspective of
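
    Tensit's core task, solving coupled first-order equations for transport and decay of radionuclides, can be illustrated with a very small stand-alone system. The two-member decay chain below, solved with SciPy, is an assumed toy problem and is not one of the SR 97 biosphere modules or the Matlab/Simulink implementation described above.

      # Toy two-member decay chain dN1/dt = -l1*N1, dN2/dt = l1*N1 - l2*N2, solved
      # numerically; half-lives are assumed and this is not an SR 97 biosphere module.
      import numpy as np
      from scipy.integrate import solve_ivp

      lam1, lam2 = np.log(2) / 30.0, np.log(2) / 5.0   # assumed half-lives of 30 y and 5 y

      def chain(t, n):
          n1, n2 = n
          return [-lam1 * n1, lam1 * n1 - lam2 * n2]

      sol = solve_ivp(chain, (0.0, 100.0), [1.0, 0.0])
      print(sol.y[:, -1])   # normalised inventories after 100 years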

  2. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  3. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables of temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models among 15 sites, the integrated weighted model was found to produce more accurate forecasts for the 7 selected forecast variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
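
    The multi-category Heidke skill score used in this verification compares the observed hit rate with the hit rate expected by chance, derived from a forecast/observation contingency table. The sketch below evaluates the standard formula on a made-up 3x3 table; the numbers are illustrative assumptions and are not SNOW-V10 data.

      # Multi-category Heidke skill score from a contingency table; the 3x3 table is
      # invented for illustration and is not SNOW-V10 verification data.
      import numpy as np

      table = np.array([[50, 10,  5],
                        [ 8, 30, 12],
                        [ 2,  9, 24]], dtype=float)   # rows: forecast class, cols: observed

      N = table.sum()
      hit_rate = np.trace(table) / N
      expected = (table.sum(axis=1) @ table.sum(axis=0)) / N**2  # chance agreement
      hss = (hit_rate - expected) / (1.0 - expected)
      print(hss)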

  4. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification, possibilities for improvements in those activities as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group

  5. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budzien, Joanne Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harwell, Megan Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hickmann, Kyle Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Israel, Daniel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Magrogan, William Richard III [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Srinivasan, Gowri [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walter, Jr, John William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Woods, Charles Nathan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  6. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Irith De Baetselier

    2016-10-01

    Background: Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. Objectives: We report on an easy and cost-saving method to verify CRRs. Methods: Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Results: Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion: To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.
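
    At its core, the verification step described above amounts to counting how many results from clinically-healthy participants fall outside the candidate reference interval and flagging the interval for recalculation when too many do. The schematic sketch below shows only that counting step; the analyte values, interval limits and acceptance threshold are invented, and the actual pass/fail rule is the Sigma Diagnostics procedure cited in the article.

      # Schematic reference-range verification: count screening results outside the
      # candidate interval; data, limits and threshold are illustrative assumptions.
      def outside_interval(values, low, high):
          return [v for v in values if v < low or v > high]

      alt_results = [18, 22, 25, 31, 27, 40, 19, 24, 52, 28]   # hypothetical ALT values, U/L
      flagged = outside_interval(alt_results, low=10, high=45)
      needs_new_range = len(flagged) > 1                       # assumed acceptance criterion
      print(flagged, needs_new_range)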

  7. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for the compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving the compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is still being developed; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility in the underlying MOOSE framework, BIGHORN is quite extensible, and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent for this suite of problems is to provide baseline comparison data that demonstrates the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods, and suggest best practices when using BIGHORN.

  8. Experimental verification of numerical calculations of railway passenger seats

    Science.gov (United States)

    Ligaj, B.; Wirwicki, M.; Karolewska, K.; Jasińska, A.

    2018-04-01

    The construction of railway seats is based on industry regulations and the requirements of end users, i.e. passengers. The two main documents in this context are UIC 566 (3rd Edition, dated 7 January 1994) and EN 12663-1:2010+A1:2014. The aim of the study was to carry out static load tests of passenger seat frames. The paper presents the construction of the test bench and the results of experimental and numerical studies of railway passenger seat frames. The test bench consists of a frame, a transverse beam, two electric cylinders with a force value of 6 kN, and a strain gauge amplifier. It has a modular structure that allows for its expansion depending on the structure of the seats. Comparing the experimental results with the numerical results for points A and B made it possible to determine the existing differences; it follows that higher stress values, in the range of 0.2 MPa to 35.9 MPa, are obtained from the numerical calculations.

  9. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    The analysis of the existing problems of reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools and the structure of its realization are described, based on the use of a Simulator with the properties of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.
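
    The cross-check described above, comparing a simulation tool only against quasi-steady-state quantities available from SCADA, reduces in its simplest form to a tolerance comparison of matching quantities. The sketch below shows that step with invented bus voltages and an invented tolerance; it is only a schematic illustration, not the Simulator or verification procedure of the article.

      # Schematic cross-check of simulator steady-state values against SCADA data;
      # bus voltages (p.u.) and tolerance are invented for illustration.
      scada = {"bus1": 1.021, "bus2": 0.998, "bus3": 1.043}
      simulator = {"bus1": 1.019, "bus2": 1.001, "bus3": 1.038}

      tolerance = 0.01  # assumed acceptance band in per unit
      mismatches = {b: abs(scada[b] - simulator[b])
                    for b in scada if abs(scada[b] - simulator[b]) > tolerance}
      print("verified" if not mismatches else f"check these buses: {mismatches}")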

  10. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager that is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  11. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    Science.gov (United States)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear. Even simplifications to the equations describing nature usually end up being nonlinear partial differential equations. The transport (advection-diffusion-reaction, ADR) equation is a pivotal equation in atmospheric sciences and water quality. This nonlinear equation needs to be solved numerically for practical purposes, so academicians and engineers thoroughly rely on the assistance of numerical codes. Thus, numerical codes require verification before they are utilized for multiple applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is properly solved, as described in the design document. CFD verification is not a straightforward and well-defined course. Only a complete test suite can uncover all the limitations and bugs. Results need to be assessed to make a distinction between bug-induced defects and the innate limitations of a numerical scheme. As Roache (2009) said, numerical verification is a state-of-the-art procedure. Sometimes novel tricks work out. This study conveys a synopsis of the experiences we gained during a comprehensive verification process carried out for a transport solver. A test suite was designed including unit tests and algorithmic tests. Tests were layered in complexity in several dimensions, from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, the mesh convergence study, which is the main craft of the verification, is performed. To that end, analytical solutions of the ADR equation were gathered, and a new solution was also derived. In the more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Then, two bugs which were concealed during the mesh convergence
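
    The mesh-convergence study mentioned above boils down to measuring how fast the error shrinks under systematic refinement and comparing the observed rate with the formal order of the scheme. A minimal version of that calculation, with assumed error norms on three grids, is sketched below; the values are illustrative and not results from the transport solver discussed in the record.

      # Observed order of accuracy from errors on three systematically refined grids
      # (refinement ratio r); the error values are assumed, not solver output.
      import math

      r = 2.0
      errors = [4.0e-2, 1.1e-2, 2.9e-3]          # coarse, medium, fine (assumed L2 norms)
      p_obs = [math.log(errors[i] / errors[i + 1], r) for i in range(2)]
      print(p_obs)                                # should approach the formal order, here ~1.9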

  12. Numerical method for IR background and clutter simulation

    Science.gov (United States)

    Quaranta, Carlo; Daniele, Gina; Balzarotti, Giorgio

    1997-06-01

    The paper describes a fast and accurate algorithm for IR background noise and clutter generation for application in scene simulations. The process is based on the hypothesis that the background can be modeled as a statistical process in which the signal amplitude obeys a Gaussian distribution and zones of the same scene satisfy a correlation function of exponential form. The algorithm provides an accurate mathematical approximation of the model and also excellent fidelity to reality, as appears from a comparison with images from IR sensors. The proposed method shows advantages with respect to methods based on the filtering of white noise in the time or frequency domain, as it requires a limited number of computations and, furthermore, it is more accurate than quasi-random processes. The background generation starts from a reticule of a few points and, by means of growing rules, the process is extended to the whole scene of required dimension and resolution. The statistical properties of the model are properly maintained in the simulation process. The paper gives specific attention to the mathematical aspects of the algorithm and provides a number of simulations and comparisons with real scenes.
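
    One simple way to realise the statistical model described above, Gaussian amplitudes with an exponentially decaying correlation, is a first-order autoregressive recursion. The 1-D sketch below is a generic stand-in used only to illustrate that statistical model; it is not the paper's reticule-growing algorithm, and the correlation length is an assumed parameter.

      # 1-D Gaussian sequence with exponential autocorrelation exp(-d/L), generated by
      # an AR(1) recursion; a generic illustration, not the paper's growing-rule method.
      import numpy as np

      rng = np.random.default_rng(0)
      n, L = 512, 20.0                       # number of samples and correlation length (assumed)
      rho = np.exp(-1.0 / L)                 # one-step correlation
      x = np.empty(n)
      x[0] = rng.standard_normal()
      for i in range(1, n):
          x[i] = rho * x[i - 1] + np.sqrt(1.0 - rho**2) * rng.standard_normal()
      # x has zero mean, unit variance, and corr(x[i], x[i+d]) ~ exp(-d/L)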

  13. Verification on spray simulation of a pintle injector for liquid rocket engine

    Science.gov (United States)

    Son, Min; Yu, Kijeong; Radhakrishnan, Kanmaniraja; Shin, Bongchul; Koo, Jaye

    2016-02-01

    The pintle injector used for a liquid rocket engine is an injection system that has recently attracted renewed attention, known for its wide throttling ability with high efficiency. The pintle injector has many variations with complex inner structures due to its moving parts. In order to study the rotating flow near the injector tip, which was observed in cold flow experiments using water and air, a numerical simulation was adopted and a verification of the numerical model was later conducted. For the verification process, three types of experimental data, including velocity distributions of gas flows, spray angles and liquid distribution, were compared with simulated results. The numerical simulation was performed using a commercial simulation program with the Eulerian multiphase model and axisymmetric two-dimensional grids. The maximum and minimum velocities of gas were within the acceptable range of agreement; however, the spray angles showed up to 25% error when the momentum ratios were increased. The spray density distributions were quantitatively measured and showed good agreement. As a result of this study, it was concluded that the simulation method was properly constructed to study the specific flow characteristics of the pintle injector, despite the limitations of two-dimensional and coarse grids.

  14. On the Verification of a WiMax Design Using Symbolic Simulation

    Directory of Open Access Journals (Sweden)

    Gabriela Nicolescu

    2013-07-01

    In top-down multi-level design methodologies, design descriptions at higher levels of abstraction are incrementally refined to the final realizations. Simulation based techniques have traditionally been used to verify that such model refinements do not change the design functionality. Unfortunately, with computer simulations it is not possible to completely check that a design transformation is correct in a reasonable amount of time, as the number of test patterns required to do so increases exponentially with the number of system state variables. In this paper, we propose a methodology for the verification of conformance of models generated at higher levels of abstraction in the design process to the design specifications. We model the system behavior using sequences of recurrence equations. We then use symbolic simulation together with equivalence checking and property checking techniques for design verification. Using our proposed method, we have verified the equivalence of three WiMax system models at different levels of design abstraction, and the correctness of various system properties on those models. Our symbolic modeling and verification experiments show that the proposed verification methodology provides performance advantage over its numerical counterpart.
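
    The equivalence-checking step described above, showing symbolically that a refined description computes the same function as its specification, can be illustrated on a toy pair of expressions. The example below uses SymPy on invented update formulas; it is only a conceptual illustration and not the recurrence-equation models or tools used for the WiMax design.

      # Toy symbolic equivalence check between a "specification" expression and a
      # "refined" expression; invented formulas, not the WiMax recurrence equations.
      import sympy as sp

      a, b = sp.symbols("a b")
      spec = (a + b) ** 2
      refined = a**2 + 2 * a * b + b**2          # refined / implementation form
      print(sp.simplify(spec - refined) == 0)    # True -> the two forms are equivalent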

  15. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  16. Experimental and numerical analysis for magnetically induced vibrations of conducting structure

    International Nuclear Information System (INIS)

    Nishio, Satoshi; Nakahira, Masataka; Miura, H.; Isono, A.

    1993-01-01

    The coupling effect between the electromagnetic field and the mechanical response of a conducting structure is of importance in magnetic fusion devices such as tokamak machines. The electromagnetically induced motion of the structure due to the Lorentz force induces additional eddy currents and further modifies the dynamic characteristics of the system. This paper is concerned with numerical modeling of the dynamic field-structure interaction and its verification by experimental tests. Here, a finite element numerical model for mechanical deformation and a wire-grid numerical model for eddy currents are employed for non-ferrous and elastic conductors. A computer code has been developed for 3-D thin shell structures. Experimental tests for the code verification were carried out using a rectangular thin copper plate. Three kinds of plate supporting systems, i.e., a cantilever system, a fixed-both-ends system and a simply-supported-ends system, were investigated. A good agreement between the numerical and experimental results was obtained. Therefore, the computer code developed here is available for analyzing the electromagnetomechanical behavior of the plasma facing components of the tokamak device. (author)

  17. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  18. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  19. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  20. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  1. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  2. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  3. The analytical and numerical study of the fluorination of uranium dioxide particles

    International Nuclear Information System (INIS)

    Sazhin, S.S.

    1997-01-01

    A detailed analytical study of the equations describing the fluorination of UO2 particles is presented for some limiting cases, assuming that the mass flowrate of these particles is so small that they do not affect the state of the gas. The analytical solutions obtained can be used for approximate estimates of the effect of fluorination on particle diameter and temperature; their major application, however, is probably in the verification of self-consistent numerical solutions. Computational results are presented and discussed for a self-consistent problem in which both the effects of gas on particles and particles on gas are accounted for. It has been shown that in the limiting cases for which analytical solutions have been obtained, the coincidence between numerical and analytical results is almost exact. This can be considered as a verification of both the analytical and numerical solutions. (orig.)

  4. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); Evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and Develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  5. An Investigation into Solution Verification for CFD-DEM

    Energy Technology Data Exchange (ETDEWEB)

    Fullmer, William D. [National Energy Technology Lab. (NETL), AECOM, Morgantown, WV (United States); Musser, Jordan [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2017-10-01

    This report presents a study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically National Energy Technology Laboratory's (NETL) open source MFiX code (MFiX-DEM) with a diffusion-based particle-to-continuum filtering scheme. In particular, this study focused on determining whether the numerical method has a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate “grid-free” solution in the limit of infinite resolution. The results show that the diffusion-based scheme does yield a converging solution. However, the convergence is more complicated than encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on top of globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification and VVUQ exercises in general, as the difference between it and the reference solution largely determines the numerical uncertainty. By testing
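
    The regression step described above, fitting grid-dependent results to a power-law error model and reading off the zero-grid-size intercept, can be sketched in a few lines. The grid sizes, pressure-drop values and model form below are assumed for illustration only and are not MFiX-DEM output.

      # Least-squares extrapolation of grid-dependent results to a "grid-free" value using
      # the model f(h) = f0 + C * h**p; the data points are assumed, not MFiX-DEM results.
      import numpy as np
      from scipy.optimize import curve_fit

      def model(h, f0, C, p):
          return f0 + C * h**p

      h = np.array([1.5, 1.0, 0.5, 0.25])        # grid size / particle diameter (assumed)
      f = np.array([412.0, 405.0, 399.5, 397.2]) # e.g. pressure drop in Pa (assumed)

      (f0, C, p), _ = curve_fit(model, h, f, p0=[395.0, 10.0, 1.0])
      print(f0, p)   # extrapolated grid-free value and apparent convergence order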

  6. Regional MLEM reconstruction strategy for PET-based treatment verification in ion beam radiotherapy

    International Nuclear Information System (INIS)

    Gianoli, Chiara; Riboldi, Marco; Fattori, Giovanni; Baselli, Giuseppe; Baroni, Guido; Bauer, Julia; Debus, Jürgen; Parodi, Katia; De Bernardi, Elisabetta

    2014-01-01

    In ion beam radiotherapy, PET-based treatment verification provides a consistency check of the delivered treatment with respect to a simulation based on the treatment planning. In this work the region-based MLEM reconstruction algorithm is proposed as a new evaluation strategy in PET-based treatment verification. The comparative evaluation is based on reconstructed PET images in selected regions, which are automatically identified on the expected PET images according to homogeneity in activity values. The strategy was tested on numerical and physical phantoms, simulating mismatches between the planned and measured β+ activity distributions. The region-based MLEM reconstruction was demonstrated to be robust against noise, and the sensitivity of the strategy was comparable to three voxel units, corresponding to 6 mm in the numerical phantoms. The robustness of the region-based MLEM evaluation outperformed the voxel-based strategies. The potential of the proposed strategy was also retrospectively assessed on patient data, and further clinical validation is envisioned. (paper)
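
    The MLEM update underlying both the conventional voxel-based evaluation and the region-based variant proposed above can be written in a few lines. The sketch below shows the standard iteration on a tiny assumed system matrix with noiseless data; it is a generic illustration and not the paper's region-based algorithm or its PET system model.

      # Standard MLEM iteration: lambda <- lambda * (A^T (y / (A lambda))) / (A^T 1);
      # tiny assumed system matrix and data, not the paper's region-based variant.
      import numpy as np

      rng = np.random.default_rng(1)
      A = rng.random((8, 4))                  # assumed non-negative system (projection) matrix
      activity_true = np.array([1.0, 2.0, 0.5, 1.5])
      y = A @ activity_true                   # noiseless "measured" projections

      lam = np.ones(4)                        # uniform initial estimate
      sensitivity = A.T @ np.ones(8)
      for _ in range(500):
          lam *= (A.T @ (y / (A @ lam))) / sensitivity
      print(lam)                              # approaches activity_true for consistent data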

  7. Numerical relativity

    CERN Document Server

    Shibata, Masaru

    2016-01-01

    This book is composed of two parts. The first part describes the basics of numerical relativity, that is, the formulations and methods for the solution of Einstein's equations and general relativistic matter field equations. This part will be helpful for beginners in numerical relativity who would like to understand the content of numerical relativity and its background. The second part focuses on the applications of numerical relativity. A wide variety of scientific numerical results are introduced, focusing in particular on the merger of binary neutron stars and black holes.

  8. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of FMCT verification provision. This paper will explore the general concerns on FMCT verification; and demonstrate what verification measures might be applied to those reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non- explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently being applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of the FMCT verifications will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  9. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    e most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurerements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  10. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  11. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  12. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    International Nuclear Information System (INIS)

    Luke, S.J.

    2011-01-01

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  13. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    Energy Technology Data Exchange (ETDEWEB)

    Luke, S J

    2011-12-20

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  14. Phantoms for IMRT dose distribution measurement and treatment verification

    International Nuclear Information System (INIS)

    Low, Daniel A.; Gerber, Russell L.; Mutic, Sasa; Purdy, James A.

    1998-01-01

    Background: The verification of intensity-modulated radiation therapy (IMRT) patient treatment dose distributions is currently based on custom-built or modified dose measurement phantoms. The only commercially available IMRT treatment planning and delivery system (Peacock, NOMOS Corp.) is supplied with a film phantom that allows accurate spatial localization of the dose distribution using radiographic film. However, measurements using other dosimeters are necessary for the thorough verification of IMRT. Methods: We have developed a phantom to enable dose measurements using a cylindrical ionization chamber and the localization of prescription isodose curves using a matrix of thermoluminescent dosimetry (TLD) chips. The external phantom cross-section is identical to that of the commercial phantom, to allow direct comparisons of measurements. A supplementary phantom has been fabricated to verify the IMRT dose distributions for pelvis treatments. Results: To date, this phantom has been used for the verification of IMRT dose distributions for head and neck and prostate cancer treatments. Designs are also presented for a phantom insert to be used with polymerizing gels (e.g., BANG-2) to obtain volumetric dose distribution measurements. Conclusion: The phantoms have proven useful in the quantitative evaluation of IMRT treatments

  15. Lightning initiation mechanism based on the development of relativistic runaway electron avalanches triggered by background cosmic radiation: Numerical simulation

    International Nuclear Information System (INIS)

    Babich, L. P.; Bochkov, E. I.; Kutsyk, I. M.

    2011-01-01

    The mechanism of lightning initiation due to electric field enhancement by the polarization of a conducting channel produced by relativistic runaway electron avalanches triggered by background cosmic radiation has been simulated numerically. It is shown that the fields at which the start of a lightning leader is possible even in the absence of precipitations are locally realized for realistic thundercloud configurations and charges. The computational results agree with the in-situ observations of penetrating radiation enhancement in thunderclouds.

  16. Lightning initiation mechanism based on the development of relativistic runaway electron avalanches triggered by background cosmic radiation: Numerical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Babich, L. P., E-mail: babich@elph.vniief.ru; Bochkov, E. I.; Kutsyk, I. M. [All-Russian Research Institute of Experimental Physics, Russian Federal Nuclear Center (Russian Federation)

    2011-05-15

    The mechanism of lightning initiation due to electric field enhancement by the polarization of a conducting channel produced by relativistic runaway electron avalanches triggered by background cosmic radiation has been simulated numerically. It is shown that the fields at which the start of a lightning leader is possible even in the absence of precipitations are locally realized for realistic thundercloud configurations and charges. The computational results agree with the in-situ observations of penetrating radiation enhancement in thunderclouds.

  17. Verification of Monte Carlo transport codes by activation experiments

    OpenAIRE

    Chetvertkova, Vera

    2013-01-01

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of an excessive activation of the accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago, therefore the verification is needed to be sure that they give reasonable results. Present work is...

  18. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in insuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-ordered logic (HOL) verification system. The translated definitions become the low level basis of circuit verification which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  19. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    International Nuclear Information System (INIS)

    Weaver, Phyllis C.

    2012-01-01

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse

  20. Methods of numerical relativity

    International Nuclear Information System (INIS)

    Piran, T.

    1983-01-01

    Numerical Relativity is an alternative to analytical methods for obtaining solutions for Einstein equations. Numerical methods are particularly useful for studying generation of gravitational radiation by potential strong sources. The author reviews the analytical background, the numerical analysis aspects and techniques and some of the difficulties involved in numerical relativity. (Auth.)

  1. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (VandV) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model VandV procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model VandV program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model VandV is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define VandV methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for VandV applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of VandV procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  2. A bimodal verification cryptosystem as a framework against spoofing attacks

    OpenAIRE

    Toli, Christina-Angeliki; Preneel, Bart

    2015-01-01

    The exponential growth of immigration crisis and the recent terrorism cases revealed the increase of fraud occurrences, cloning and identity theft with numerous social, economic and political consequences. The trustworthiness of biometrics during verification processes has been compromised by spoofing attackers sprang up to exploit the security gaps. Additionally, the cryptography’s role in the area is highly important as it may promote fair assessment procedures and foster public trust by se...

  3. 22 CFR 97.3 - Requirements subject to verification in an outgoing Convention case.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Requirements subject to verification in an outgoing Convention case. 97.3 Section 97.3 Foreign Relations DEPARTMENT OF STATE LEGAL AND RELATED... background study. An accredited agency, temporarily accredited agency, or public domestic authority must...

  4. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  5. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  6. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  7. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Full Text Available Numerical simulation of temperature field of cooled turbine blades is a required element of gas turbine engine design process. The verification is usually performed on the basis of results of test of full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten metal thermostat for verification of a thermal model of cooled blade is proposed in this paper. The method allows obtaining local values of heat flux in each point of blade surface within a single experiment. The error of determination of local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals zinc melting point. The verification of thermal-hydraulic model of high-pressure turbine blade with cooling allowing asymmetrical heat removal from pressure and suction sides was carried out using the developed method. An analysis of heat transfer coefficients confirmed the high level of heat transfer in the leading edge, whose value is comparable with jet impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge to the pressure side.

  8. Verification of RESRAD-RDD. (Version 2.01)

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Flood, Paul E. [Argonne National Lab. (ANL), Argonne, IL (United States); LePoire, David [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    In this report, the results generated by RESRAD-RDD version 2.01 are compared with those produced by RESRAD-RDD version 1.7 for different scenarios with different sets of input parameters. RESRAD-RDD version 1.7 is spreadsheet-driven, performing calculations with Microsoft Excel spreadsheets. RESRAD-RDD version 2.01 revamped version 1.7 by using command-driven programs designed with Visual Basic.NET to direct calculations with data saved in Microsoft Access database, and re-facing the graphical user interface (GUI) to provide more flexibility and choices in guideline derivation. Because version 1.7 and version 2.01 perform the same calculations, the comparison of their results serves as verification of both versions. The verification covered calculation results for 11 radionuclides included in both versions: Am-241, Cf-252, Cm-244, Co-60, Cs-137, Ir-192, Po-210, Pu-238, Pu-239, Ra-226, and Sr-90. At first, all nuclidespecific data used in both versions were compared to ensure that they are identical. Then generic operational guidelines and measurement-based radiation doses or stay times associated with a specific operational guideline group were calculated with both versions using different sets of input parameters, and the results obtained with the same set of input parameters were compared. A total of 12 sets of input parameters were used for the verification, and the comparison was performed for each operational guideline group, from A to G, sequentially. The verification shows that RESRAD-RDD version 1.7 and RESRAD-RDD version 2.01 generate almost identical results; the slight differences could be attributed to differences in numerical precision with Microsoft Excel and Visual Basic.NET. RESRAD-RDD version 2.01 allows the selection of different units for use in reporting calculation results. The results of SI units were obtained and compared with the base results (in traditional units) used for comparison with version 1.7. The comparison shows that RESRAD

  9. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa.

    Science.gov (United States)

    De Baetselier, Irith; Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. We report on an easy and cost-saving method to verify CRRs. Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.

  10. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

    uses the 'Actel Libero IDE' (internally, 'Synopsys Synplify Pro') to synthesize Net list from the Verilog program, and also uses the 'EDIFtoBLIF-MV' to translate Net list into BLIF-MV. The VIS verification system is then used to prove the behavioral equivalence. This paper is organized as follows: Section 2 provides background information. Section 3 explains the developed tool, which translates EDIF to BLIF-MV. A case study with Verilog examples of a Korean nuclear power plant is presented in Section 4 and Section 5 concludes the paper and provides remarks on future research extension. This paper proposes a formal verification technique which can contribute to the correctness demonstration of commercial FPGA synthesis processes and tools in part. It formally checks the behavioral equivalence between Verilog and subsequently synthesized Net list with the VIS verification system. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into Net list worked correctly at least for the Verilog program. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIF-MV,' which translate EDIF into BLIF-MV, while preserving their behavior equivalence. The translation from EDIF into BLIF-MV consists of three steps Parsing, Pro-processing and Translation. We performed the case study with Verilog programs designed for a digital I and C system in Korea

  11. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    uses the 'Actel Libero IDE' (internally, 'Synopsys Synplify Pro') to synthesize Net list from the Verilog program, and also uses the 'EDIFtoBLIF-MV' to translate Net list into BLIF-MV. The VIS verification system is then used to prove the behavioral equivalence. This paper is organized as follows: Section 2 provides background information. Section 3 explains the developed tool, which translates EDIF to BLIF-MV. A case study with Verilog examples of a Korean nuclear power plant is presented in Section 4 and Section 5 concludes the paper and provides remarks on future research extension. This paper proposes a formal verification technique which can contribute to the correctness demonstration of commercial FPGA synthesis processes and tools in part. It formally checks the behavioral equivalence between Verilog and subsequently synthesized Net list with the VIS verification system. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into Net list worked correctly at least for the Verilog program. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIF-MV,' which translate EDIF into BLIF-MV, while preserving their behavior equivalence. The translation from EDIF into BLIF-MV consists of three steps Parsing, Pro-processing and Translation. We performed the case study with Verilog programs designed for a digital I and C system in Korea.

  12. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme that seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code lacked long of a competent testing system and precise documentation of the verification of the code. In 2006 a project was launched, in which the objective was to reorganise all the material related to the existing verification cases and place them into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LATEX document preparation system and added into the testing system in a way that the whole test documentation (this report) could easily be generated in a postscript or pdf-format. The current report is the updated version of the verification report published in 2007. At the moment the report includes mainly the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications, in which the program package has been and will be employed in the Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to the analytical, semianalytical and/or other numerical solutions proves the capability of FEFTRA to simulate such problems

  13. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  14. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    Directory of Open Access Journals (Sweden)

    Vickers Andrew J

    2008-11-01

    Full Text Available Abstract Background A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the area-under-the-curve (AUC corrected for verification bias varying both the rate and mechanism of verification. Results In a single simulated data set, varying false negatives from 0 to 4 led to verification bias corrected AUCs ranging from 0.550 to 0.852. Excess variation associated with low numbers of false negatives was confirmed in simulation studies and by analyses of published studies that incorporated verification bias correction. The 2.5th – 97.5th centile range constituted as much as 60% of the possible range of AUCs for some simulations. Conclusion Screening programs are designed such that there are few false negatives. Standard statistical methods for verification bias correction are inadequate in this circumstance.

  15. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  16. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  17. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAAs Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  18. An international cooperative verification agenda for arms reduction

    International Nuclear Information System (INIS)

    Hinderstein, C.

    2013-01-01

    The biggest challenge to the overall verification and monitoring agenda for future arms reductions may be that posed by uncertainties regarding the quantities of existing stocks of fissile material and nuclear weapons. We must develop strategies to reduce the residual uncertainties regarding completeness of initial declarations as all declared weapons-related inventories go to zero. Establishing this confidence in countries' initial baseline declarations will likely be a key point in all states' decisions to move to very low numbers, much less zero. The author reviews the questions and challenges that need to be addressed if there is to be significant progress in negotiating and implementing a verifiable fissile material cutoff treaty (FMCT) and a policy of nuclear weapon dismantling. In support of greater security as the world works towards the elimination of nuclear weapons, individual States could begin immediately by increasing the transparency of their nuclear activities. The International Verification Project is designed to bring experts from a wide array of related backgrounds together to build capacity for verification internationally in support of arms control goals (and in support of the larger objective of a world without nuclear weapons), build confidence between nuclear and non-nuclear-weapon states, promote freer flow of information among governments and between governments and non-governmental organizations (NGOs) and solve technical problems that could be barriers to progress. The paper is followed by the slides of the presentation. (A.C.)

  19. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V ampersand V) of nuclear power plant control system software: (a) use risk management to decide what and how much V ampersand V is needed; (b) classify each software application using a scheme that reflects what type and how much V ampersand V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  20. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  1. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  2. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plant were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and A TP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  3. Evaluation of DVH-based treatment plan verification in addition to gamma passing rates for head and neck IMRT

    International Nuclear Information System (INIS)

    Visser, Ruurd; Wauben, David J.L.; Groot, Martijn de; Steenbakkers, Roel J.H.M.; Bijl, Henk P.; Godart, Jeremy; Veld, Aart A. van’t; Langendijk, Johannes A.; Korevaar, Erik W.

    2014-01-01

    Background and purpose: Treatment plan verification of intensity modulated radiotherapy (IMRT) is generally performed with the gamma index (GI) evaluation method, which is difficult to extrapolate to clinical implications. Incorporating Dose Volume Histogram (DVH) information can compensate for this. The aim of this study was to evaluate DVH-based treatment plan verification in addition to the GI evaluation method for head and neck IMRT. Materials and methods: Dose verifications of 700 subsequent head and neck cancer IMRT treatment plans were categorised according to gamma and DVH-based action levels. Fractionation dependent absolute dose limits were chosen. The results of the gamma- and DVH-based evaluations were compared to the decision of the medical physicist and/or radiation oncologist for plan acceptance. Results: Nearly all treatment plans (99.7%) were accepted for treatment according to the GI evaluation combined with DVH-based verification. Two treatment plans were re-planned according to DVH-based verification, which would have been accepted using the evaluation alone. DVH-based verification increased insight into dose delivery to patient specific structures increasing confidence that the treatment plans were clinically acceptable. Moreover, DVH-based action levels clearly distinguished the role of the medical physicist and radiation oncologist within the Quality Assurance (QA) procedure. Conclusions: DVH-based treatment plan verification complements the GI evaluation method improving head and neck IMRT-QA

  4. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  5. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  6. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    OpenAIRE

    Clark, T Justin; ter Riet, Gerben; Coomarasamy, Aravinthan; Khan, Khalid S

    2004-01-01

    Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on accuracy of miniature endometrial biopsy and endometr ial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 ha...

  7. A Semi-implicit Numerical Scheme for a Two-dimensional, Three-field Thermo-Hydraulic Modeling

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Jeong, Jaejoon

    2007-07-01

    The behavior of two-phase flow is modeled, depending on the purpose, by either homogeneous model, drift flux model, or separated flow model, Among these model, in the separated flow model, the behavior of each flow phase is modeled by its own governing equation, together with the interphase models which describe the thermal and mechanical interactions between the phases involved. In this study, a semi-implicit numerical scheme for two-dimensional, transient, two-fluid, three-field is derived. The work is an extension to the previous study for the staggered, semi-implicit numerical scheme in one-dimensional geometry (KAERI/TR-3239/2006). The two-dimensional extension is performed by specifying a relevant governing equation set and applying the related finite differencing method. The procedure for employing the semi-implicit scheme is also described in detail. Verifications are performed for a 2-dimensional vertical plate for a single-phase and two-phase flows. The calculations verify the mass and energy conservations. The symmetric flow behavior, for the verification problem, also confirms the momentum conservation of the numerical scheme

  8. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and characteristics of their development cycle. It defines the concept of verification and there are identified the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented and there are established influence factors of quality verification. Software optimality verification is analyzed and some metrics are defined for the verification process.

  9. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Abstract Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.

  10. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology resarch team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the Univeristy of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency L g and P n recordings from the Eastern Canada Telemetered Network. Major differences have been found in P was attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  11. Numerical simulation of MHD flows in inhomogeneous and instationary magnetic fields

    International Nuclear Information System (INIS)

    Ehrhard, Sebastian

    2016-01-01

    In this work, I develop a numerical model for magnetohydrodynamic flows in unsteady an inhomogeneous flow. The model is implemented in the finite-volume based CFD-code OpenFOAM. Some verification and validation tests are made on several standard problems of magnetohydrodynamics. Finally I successful modelled an electromagnetic flowmeter with the code.

  12. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans......Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism...

  13. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    International Nuclear Information System (INIS)

    Azmy, Yousry; Wang, Yaqi

    2013-01-01

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code's numerical solution and its reference counterpart. The latter is either analytic or very fine- mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory's Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.

  14. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  15. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies-the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW)-share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. He three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States-India, Pakistan and Israel-from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  16. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    Directory of Open Access Journals (Sweden)

    Coomarasamy Aravinthan

    2004-05-01

    Full Text Available Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on accuracy of miniature endometrial biopsy and endometr ial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects. Of these, 16 had immediate histological verification of diagnosis while 11 had verification delayed > 24 hrs after testing. The effect of delay in verification of diagnosis on estimates of accuracy was evaluated using meta-regression with diagnostic odds ratio (dOR as the accuracy measure. This analysis was adjusted for study quality and type of test (miniature endometrial biopsy or endometrial ultrasound. Results Compared to studies with immediate verification of diagnosis (dOR 67.2, 95% CI 21.7–208.8, those with delayed verification (dOR 16.2, 95% CI 8.6–30.5 underestimated the diagnostic accuracy by 74% (95% CI 7%–99%; P value = 0.048. Conclusion Among studies of miniature endometrial biopsy and endometrial ultrasound, diagnostic accuracy is considerably underestimated if there is a delay in histological verification of diagnosis.

  17. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  18. Numerical model for verification of constitutive laws of blood vessel wall

    Czech Academy of Sciences Publication Activity Database

    Macková, H.; Chlup, Hynek; Žitný, R.

    -, 2/1 (2007), s. 66-66 ISSN 1880-9863 Institutional research plan: CEZ:AV0Z20760514 Keywords : constitutive law * numerical model * pulse wave velocity Subject RIV: BK - Fluid Dynamics http://www.jstage.jst.go.jp/browse/jbse/2/Suppl.1/_contents

  19. Numerical investigation of the late-time Kerr tails

    Energy Technology Data Exchange (ETDEWEB)

    Racz, Istvan; Toth, Gabor Zs, E-mail: iracz@rmki.kfki.hu, E-mail: tgzs@rmki.kfki.hu [RMKI, H-1121 Budapest, Konkoly Thege Miklos ut 29-33 (Hungary)

    2011-10-07

    The late-time behavior of a scalar field on fixed Kerr background is examined in a numerical framework incorporating the techniques of conformal compactification and hyperbolic initial value formulation. The applied code is 1+(1+2) as it is based on the use of the spectral method in the angular directions while in the time-radial section fourth order finite differencing, along with the method of lines, is applied. The evolution of various types of stationary and non-stationary pure multipole initial states is investigated. The asymptotic decay rates are determined not only in the domain of outer communication but along the event horizon and at future null infinity as well. The decay rates are found to be different for stationary and non-stationary initial data, and they also depend on the fall-off properties of the initial data toward future null infinity. The energy and angular momentum transfers are found to show significantly different behavior in the initial phase of the time evolution. The quasinormal ringing phase and the tail phase are also investigated. In the tail phase, the decay exponents for the energy and angular momentum losses at I+ are found to be smaller than at the horizon, which is in accordance with the behavior of the field itself and means that at late times the energy and angular momentum falling into the black hole become negligible in comparison with the energy and angular momentum radiated toward I+. The energy and angular momentum balances are used as additional verifications of the reliability of our numerical method.
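
    The time-radial ingredients mentioned in the abstract (fourth-order central finite differences combined with the method of lines) can be illustrated on a toy problem. The sketch below applies them to a 1D wave equation with an RK4 time stepper; it is not the Kerr evolution code, and all grid parameters are assumptions.

```python
import numpy as np

# Minimal sketch of the "method of lines" ingredients: 4th-order central finite
# differences in space + classical RK4 in time, applied to a toy 1D wave
# equation u_tt = u_xx with periodic boundaries. Not the Kerr evolution code.

N, L = 200, 2 * np.pi
dx = L / N
x = np.arange(N) * dx

def d2_4th(u):
    """4th-order central approximation of the second derivative (periodic)."""
    return (-np.roll(u, 2) + 16*np.roll(u, 1) - 30*u
            + 16*np.roll(u, -1) - np.roll(u, -2)) / (12 * dx**2)

def rhs(state):
    u, v = state                      # u and its time derivative v
    return np.array([v, d2_4th(u)])

def rk4_step(state, dt):
    k1 = rhs(state)
    k2 = rhs(state + 0.5*dt*k1)
    k3 = rhs(state + 0.5*dt*k2)
    k4 = rhs(state + dt*k3)
    return state + dt*(k1 + 2*k2 + 2*k3 + k4)/6

state = np.array([np.sin(x), np.zeros(N)])   # initial profile, zero velocity
dt = 0.2 * dx                                # small CFL factor for stability
for _ in range(int(2*np.pi/dt)):             # evolve roughly one period
    state = rk4_step(state, dt)

print("max |u| after one period:", np.abs(state[0]).max())
```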

  20. Numerical investigation of the late-time Kerr tails

    International Nuclear Information System (INIS)

    Racz, Istvan; Toth, Gabor Zs

    2011-01-01

    The late-time behavior of a scalar field on fixed Kerr background is examined in a numerical framework incorporating the techniques of conformal compactification and hyperbolic initial value formulation. The applied code is 1+(1+2) as it is based on the use of the spectral method in the angular directions while in the time-radial section fourth order finite differencing, along with the method of lines, is applied. The evolution of various types of stationary and non-stationary pure multipole initial states is investigated. The asymptotic decay rates are determined not only in the domain of outer communication but along the event horizon and at future null infinity as well. The decay rates are found to be different for stationary and non-stationary initial data, and they also depend on the fall-off properties of the initial data toward future null infinity. The energy and angular momentum transfers are found to show significantly different behavior in the initial phase of the time evolution. The quasinormal ringing phase and the tail phase are also investigated. In the tail phase, the decay exponents for the energy and angular momentum losses at I+ are found to be smaller than at the horizon, which is in accordance with the behavior of the field itself and means that at late times the energy and angular momentum falling into the black hole become negligible in comparison with the energy and angular momentum radiated toward I+. The energy and angular momentum balances are used as additional verifications of the reliability of our numerical method.

  1. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  2. Chemical transport in a fissured rock: verification of a numerical model

    International Nuclear Information System (INIS)

    Rasmuson, A.; Narasimham, T.N.; Neretnieks.

    1982-01-01

    Due to the very long-term, high toxicity of some nuclear waste products, models are required to predict, in certain cases, the spatial and temporal distribution of chemical concentrations of less than 0.001% of the concentration released from the repository. A numerical model, TRUMP, which solves the advective diffusion equation in general three dimensions, with or without decay and a source term, has been verified. The method is based on an integrated finite difference approach. The studies show that, as long as the magnitude of advectance is equal to or less than that of conductance for the closed surface bounding any volume element in the region (that is, the numerical Peclet number is sufficiently small), the computational errors are of the order of 10^-3 % or less. The realistic input parameters used in the sample calculations suggest that such a range of Peclet numbers is indeed likely to characterize deep groundwater systems in granitic and ancient argillaceous systems. A sensitivity analysis indicates that the errors in prediction introduced by uncertainties in input parameters are likely to be larger than the computational inaccuracies introduced by the numerical model. Currently, a disadvantage in the TRUMP model is that the iterative method of solving the set of simultaneous equations is rather slow when time constants vary widely over the flow region. Although the iterative solution may be very desirable for large three-dimensional problems in order to minimize computer storage, it seems desirable to use a direct solver technique in conjunction with the mixed explicit-implicit approach whenever possible. Work in this direction is in progress.
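
    The Peclet-number condition referred to above can be checked cell by cell when building a mesh. The sketch below computes a grid Peclet number Pe = v*dx/D for a few mesh spacings; the velocity, dispersion coefficient and acceptance threshold are illustrative assumptions, not values from the TRUMP study.

```python
# Minimal sketch: grid Peclet number check for an advection-diffusion mesh.
# Pe_cell = v * dx / D compares advective to diffusive transport across one
# volume element; keeping it small limits numerical dispersion.
# All parameter values below are illustrative, not from the TRUMP study.

def grid_peclet(velocity, dx, dispersion):
    return velocity * dx / dispersion

v = 1.0e-7      # groundwater pore velocity [m/s] (assumed)
D = 1.0e-9      # effective dispersion/diffusion coefficient [m^2/s] (assumed)
for dx in (0.005, 0.01, 0.05, 0.1):
    pe = grid_peclet(v, dx, D)
    flag = "ok" if pe <= 2.0 else "refine mesh"
    print(f"dx = {dx:5.3f} m  ->  Pe_cell = {pe:6.1f}  ({flag})")
```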

  3. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  4. Calibration and verification of numerical runoff and erosion model

    Directory of Open Access Journals (Sweden)

    Gabrić Ognjen

    2015-01-01

    Full Text Available Based on field and laboratory measurements, and in step with the development of computational techniques, runoff and erosion models built on equations that describe the physics of the process have been developed. Starting from the KINEROS2 model, this paper presents the basic principles of modelling runoff and erosion processes with the St. Venant equations. Alternative equations for the friction calculation, for the source and deposition terms and for the transport capacity are also shown. Numerical models based on the original and the alternative equations are calibrated and verified on a laboratory-scale model. According to the results, a friction calculation based on the analytic solution for laminar flow must be included in all runoff and erosion models.
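
    To illustrate the role of the friction calculation discussed above, the sketch below contrasts a laminar sheet-flow relation (derived from a Darcy-Weisbach friction factor f = K/Re) with a turbulent Manning relation for shallow overland flow. The coefficients and depths are assumptions for illustration and are not the equations implemented in KINEROS2.

```python
import math

# Minimal sketch: two alternative friction laws for shallow overland flow.
# Laminar sheet flow: f = K/Re with Re = v*h/nu and S = f*v^2/(8*g*h)
# gives the analytic relation v = 8*g*S*h^2/(K*nu).
# Turbulent flow: Manning's relation for wide sheet flow.
# All coefficients are illustrative assumptions, not KINEROS2 internals.

def velocity_laminar(h, slope, nu=1.0e-6, K=24.0):
    g = 9.81
    return 8.0 * g * slope * h**2 / (K * nu)

def velocity_manning(h, slope, n=0.03):
    return h ** (2.0 / 3.0) * math.sqrt(slope) / n

slope = 0.01
for h_mm in (0.5, 1.0, 2.0, 5.0):
    h = h_mm / 1000.0
    print(f"h = {h_mm:4.1f} mm  laminar v = {velocity_laminar(h, slope):7.4f} m/s"
          f"   manning v = {velocity_manning(h, slope):7.4f} m/s")
```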

  5. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence or ''AI'' concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition. The operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  6. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  7. Numerical simulation and experimental verification of a flat two-phase thermosyphon

    International Nuclear Information System (INIS)

    Zhang Ming; Liu Zhongliang; Ma Guoyuan; Cheng Shuiyuan

    2009-01-01

    The flat two-phase thermosyphon is placed between the heat source and the heat sink, which can achieve a uniform heat flux distribution and improve the performance of the heat sink. In this paper, a two-dimensional heat and mass transfer model for a disk-shaped flat two-phase thermosyphon is developed. By solving the equations of continuity, momentum and energy numerically, the vapor velocity and temperature distributions of the flat two-phase thermosyphon are obtained. An analysis is also carried out on the ability of the flat two-phase thermosyphon to spread heat and remove hot spots. In order to observe boiling and condensation phenomena, a transparent flat two-phase thermosyphon is manufactured and studied experimentally. The experimental results are compared with the numerical results, which verifies the physical and mathematical model of the flat two-phase thermosyphon. In order to study the main factors affecting the axial thermal resistance of the two-phase thermosyphon, the temperatures inside the flat two-phase thermosyphon are measured and analyzed.

  8. URANS simulations of the tip-leakage cavitating flow with verification and validation procedures

    Science.gov (United States)

    Cheng, Huai-yu; Long, Xin-ping; Liang, Yun-zhi; Long, Yun; Ji, Bin

    2018-04-01

    In the present paper, the Vortex Identified Zwart-Gerber-Belamri (VIZGB) cavitation model coupled with the SST-CC turbulence model is used to investigate the unsteady tip-leakage cavitating flow induced by a NACA0009 hydrofoil. A qualitative comparison between the numerical and experimental results is made. In order to quantitatively evaluate the reliability of the numerical data, the verification and validation (V&V) procedures are used in the present paper. Errors of numerical results are estimated with seven error estimators based on the Richardson extrapolation method. It is shown that though a strict validation cannot be achieved, a reasonable prediction of the gross characteristics of the tip-leakage cavitating flow can be obtained. Based on the numerical results, the influence of the cavitation on the tip-leakage vortex (TLV) is discussed, which indicates that the cavitation accelerates the fusion of the TLV and the tip-separation vortex (TSV). Moreover, the trajectory of the TLV, when the cavitation occurs, is close to the side wall.
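
    The Richardson-extrapolation-based error estimation mentioned above can be sketched on three systematically refined grids: the observed order of accuracy, an extrapolated solution and a grid convergence index (GCI) follow from three solution values and the refinement ratio. The numbers below are illustrative, not results from the cited simulations.

```python
import math

# Minimal sketch: Richardson-extrapolation error estimate on three
# systematically refined grids (the basis of several error estimators used
# in V&V procedures). Solution values below are illustrative.

f_fine, f_med, f_coarse = 0.990, 0.960, 0.840   # fine/medium/coarse grid solutions
r = 2.0                                          # constant grid refinement ratio

# Observed order of accuracy from the three solutions
p = math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)

# Richardson-extrapolated ("grid-independent") estimate
f_exact = f_fine + (f_fine - f_med) / (r**p - 1.0)

# Grid convergence index (GCI) on the fine grid, safety factor 1.25
gci_fine = 1.25 * abs((f_fine - f_med) / f_fine) / (r**p - 1.0)

print(f"observed order p      = {p:.2f}")
print(f"extrapolated solution = {f_exact:.4f}")
print(f"GCI (fine grid)       = {100*gci_fine:.2f} %")
```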

  9. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means a user matching a fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  10. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self powered detector design to perform core design verification after a core reload before power operation. A Vanadium self powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core for traditional physics testing programs. This program also eliminates the need for special rod maneuvers which are infrequently performed by plant operators during typical core design verification testing and allows for safer startup activities. (authors)

  11. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. Predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
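
    The decision rule described in the abstract can be written down directly as a small function. The sketch below maps the major criterion (dystrophy area of 25% or more) and the two minor criteria to the risk categories named above; the mapping is a plain reading of the abstract, not the published statistical model.

```python
# Minimal sketch of the criteria-based prediction rule described above:
# one major criterion (dystrophy area >= 25%) and two minor criteria
# (long horizontal lines, long vertical lines). The mapping to risk
# categories follows the abstract; it is illustrative, not the published model.

def verification_risk(dystrophy_area_pct, long_horizontal, long_vertical):
    if dystrophy_area_pct >= 25:
        return "almost always fails verification"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:
        return "high risk of verification failure"
    if minors == 1:
        return "low risk of verification failure"
    return "almost always passes verification"

print(verification_risk(30, False, False))   # major criterion met
print(verification_risk(10, True, True))     # both minor criteria
print(verification_risk(5, False, True))     # one minor criterion
print(verification_risk(0, False, False))    # no criteria met
```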

  12. Numerical methods: Analytical benchmarking in transport theory

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1988-01-01

    Numerical methods applied to reactor technology have reached a high degree of maturity. Certainly one- and two-dimensional neutron transport calculations have become routine, with several programs available on personal computer and the most widely used programs adapted to workstation and minicomputer computational environments. With the introduction of massive parallelism and as experience with multitasking increases, even more improvement in the development of transport algorithms can be expected. Benchmarking an algorithm is usually not a very pleasant experience for the code developer. Proper algorithmic verification by benchmarking involves the following considerations: (1) conservation of particles, (2) confirmation of intuitive physical behavior, and (3) reproduction of analytical benchmark results. By using today's computational advantages, new basic numerical methods have been developed that allow a wider class of benchmark problems to be considered

  13. Analytical and numerical verification of the Nernst theorem for metals

    International Nuclear Information System (INIS)

    Hoeye, Johan S.; Brevik, Iver; Ellingsen, Simen A.; Aarseth, Jan B.

    2007-01-01

    In view of the current discussion on the subject, an effort is made to show very accurately both analytically and numerically how the Drude dispersion model gives consistent results for the Casimir free energy at low temperatures. Specifically, for the free energy near T=0 we find the leading term proportional to T^2 and the next-to-leading term proportional to T^(5/2). These terms give rise to zero Casimir entropy as T→0 and are thus in accordance with Nernst's theorem.
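
    The connection to the Nernst theorem can be made explicit with a small numerical check: if the free energy behaves as F(T) ≈ F0 + a*T^2 + b*T^(5/2) near T = 0, then the entropy S = -dF/dT vanishes as T → 0. The coefficients in the sketch below are illustrative assumptions, not the values derived in the paper.

```python
# Minimal sketch: with a low-temperature expansion of the Casimir free energy
#   F(T) ~ F0 + a*T**2 + b*T**2.5,
# the entropy S = -dF/dT = -(2*a*T + 2.5*b*T**1.5) goes to zero as T -> 0,
# which is what the Nernst theorem requires. Coefficients are illustrative.

F0, a, b = -1.0, -3.0e-3, 2.0e-4   # assumed expansion coefficients

def entropy(T):
    return -(2.0 * a * T + 2.5 * b * T**1.5)

for T in (1.0, 0.1, 0.01, 0.001):
    print(f"T = {T:6.3f}  ->  S(T) = {entropy(T): .3e}")
```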

  14. Verification and accreditation schemes for climate change activities: A review of requirements for verification of greenhouse gas reductions and accreditation of verifiers—Implications for long-term carbon sequestration

    Science.gov (United States)

    Roed-Larsen, Trygve; Flach, Todd

    The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit for best practices. The UN Framework Convention for Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely with these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes of their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with intent of providing a common understanding of all efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts of securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.

  15. A new verification film system for routine quality control of radiation fields: Kodak EC-L

    International Nuclear Information System (INIS)

    Hermann, A.; Bratengeier, K.; Priske, A.; Flentje, M.

    2000-01-01

    Background: The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. Material and Methods: For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. Results: In this investigation, 68% of 175 Kodak EC-L ap/pa films were judged 'good', 18% 'moderate' and 14% 'poor', but only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be 'good'. Conclusions: The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. They could be read more accurately and the detection of set-up deviations was facilitated. (orig.) [de]

  16. Measurement of the Operating Parameters and Numerical Analysis of the Mechanical Subsystem

    Directory of Open Access Journals (Sweden)

    Božek Pavol

    2014-08-01

    Full Text Available This submission focuses on completing the information system covering the quality, operation and automatic testing of a vehicle subsystem, and on a new evaluation method for that subsystem. Numerical analysis is carried out on the basis of the automatic collection and systematic recording of commercial car operation data. The proposed new information system for the operation and trial process allows verification according to the proposed method. Critical components, verified under laboratory conditions, are detected by numerical reliability analysis. Respecting these principles results in an increased quality level, not only for the final product but also for the related automatic test laboratory for cars.

  17. Measurement of the Operating Parameters and Numerical Analysis of the Mechanical Subsystem

    Science.gov (United States)

    Božek, Pavol; Turygin, Yuri

    2014-08-01

    This submission focuses on completing the information system covering the quality, operation and automatic testing of a vehicle subsystem, and on a new evaluation method for that subsystem. Numerical analysis is carried out on the basis of the automatic collection and systematic recording of commercial car operation data. The proposed new information system for the operation and trial process allows verification according to the proposed method. Critical components, verified under laboratory conditions, are detected by numerical reliability analysis. Respecting these principles results in an increased quality level, not only for the final product but also for the related automatic test laboratory for cars.

  18. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

    The determination of soil background is one of the most important activities supporting environmental restoration and waste management on the Hanford Site. Background compositions serve as the basis for identifying soil contamination, and also as a baseline in risk assessment processes used to determine soil cleanup and treatment levels. These uses of soil background require an understanding of the extent to which analytes of concern occur naturally in the soils. This report documents the results of sampling and analysis activities designed to characterize the composition of soil background at the Hanford Site, and to evaluate the feasibility for use as Sitewide background. The compositions of naturally occurring soils in the vadose zone have been determined for nonradioactive inorganic and organic analytes and related physical properties. These results confirm that a Sitewide approach to the characterization of soil background is technically sound and is a viable alternative to the determination and use of numerous local or area backgrounds that yield inconsistent definitions of contamination. Sitewide soil background consists of several types of data and is appropriate for use in identifying contamination in all soils in the vadose zone on the Hanford Site. The natural concentrations of nearly every inorganic analyte extend to levels that exceed calculated health-based cleanup limits. The levels of most inorganic analytes, however, are well below these health-based limits. The highest measured background concentrations occur in three volumetrically minor soil types, the most important of which are topsoils adjacent to the Columbia River that are rich in organic carbon. No organic analyte levels above detection were found in any of the soil samples.

  19. EQ3/6 software test and verification report 9/94

    International Nuclear Information System (INIS)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ''V and V'' report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three phases depended upon solving an n x n matrix by the Newton-Raphson method. Thus, a great deal of emphasis on the test and verification of this procedure was carried out on the first code in the software package, EQPT.
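
    The Newton-Raphson core identified in the walkthroughs can be illustrated on a toy system. The sketch below iterates x_{k+1} = x_k - J(x_k)^{-1} r(x_k) for a 2x2 nonlinear system; it shows the numerical procedure only and has nothing to do with the actual geochemical equations solved by EQ3/6.

```python
import numpy as np

# Minimal sketch: Newton-Raphson iteration on an n x n nonlinear system,
# the common numerical core identified in the EQ3/6 walkthroughs.
# The 2x2 test system below is a toy example, not the geochemical equations.

def residual(x):
    return np.array([x[0]**2 + x[1]**2 - 4.0,     # circle of radius 2
                     x[0] * x[1] - 1.0])           # hyperbola x*y = 1

def jacobian(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [x[1],       x[0]]])

x = np.array([2.0, 0.3])                           # initial guess
for it in range(20):
    r = residual(x)
    if np.linalg.norm(r) < 1e-12:
        break
    x = x - np.linalg.solve(jacobian(x), r)        # Newton update

print(f"converged in {it} iterations to x = {x}")
```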

  20. EQ3/6 software test and verification report 9/94

    Energy Technology Data Exchange (ETDEWEB)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ``V and V`` report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three phases depended upon solving an n x n matrix by the Newton-Raphson method. Thus, a great deal of emphasis on the test and verification of this procedure was carried out on the first code in the software package, EQPT.

  1. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
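
    A common way to realise the dose/distance-to-agreement criterion mentioned above is a gamma evaluation. The sketch below computes a 1D gamma index with a 3%/3 mm criterion and global normalisation; the calculated and measured profiles are synthetic illustrations, not data from the study.

```python
import numpy as np

# Minimal sketch: 1D gamma evaluation with a 3% / 3 mm criterion, the kind of
# quantitative dose/distance-to-agreement parameter advocated above.
# The measured and calculated profiles are synthetic, not patient data.

def gamma_1d(x, dose_calc, dose_meas, dose_crit=0.03, dist_crit=3.0):
    """Gamma value at each measurement point (global normalisation)."""
    d_ref = dose_meas.max()
    gammas = np.empty_like(dose_meas)
    for i, (xi, dm) in enumerate(zip(x, dose_meas)):
        dose_term = (dose_calc - dm) / (dose_crit * d_ref)
        dist_term = (x - xi) / dist_crit
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

x = np.linspace(-50.0, 50.0, 201)                       # positions in mm
dose_calc = np.exp(-(x / 30.0) ** 4)                    # calculated profile
dose_meas = np.exp(-((x - 1.0) / 30.0) ** 4) * 1.01     # shifted, rescaled "measurement"

g = gamma_1d(x, dose_calc, dose_meas)
print(f"gamma pass rate (gamma <= 1): {100.0 * np.mean(g <= 1.0):.1f} %")
```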

  2. Study of natural convection heat transfer characteristics. (2) Verification for numerical simulation

    International Nuclear Information System (INIS)

    Ikeda, Hiroshi; Nakada, Kotaro; Ikeda, Tatsumi; Wakamatsu, Mitsuo; Iwaki, Chikako; Morooka, Shinichi; Masaki, Yoshikazu

    2008-01-01

    In a natural cooling system for waste storage, it is important to confirm that the flow induced by natural draft is sufficient to remove the decay heat from the waste. In this study, we carried out a fundamental study of natural convection on a vertical cylindrical heater by experiment and numerical simulation. The test facility is about 4 m in height with a single heater. The heating power is varied in the range of 33-110 W, where the Rayleigh number is over 10^10. We surveyed the velocity distribution around the heater for several turbulence models, mesh sizes near the heated wall and turbulent Prandtl numbers. The results of the numerical simulation for the velocity distribution and the averaged heat transfer coefficient agreed well with the experimental data and references. (author)
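
    The Rayleigh-number regime quoted above can be estimated from the heater height and the wall-to-ambient temperature difference. The sketch below does this for air with assumed property values; it is a plausibility check only, not the analysis performed in the study.

```python
# Minimal sketch: Rayleigh number for natural convection on a heated vertical
# surface in air, to illustrate how the Ra > 1e10 (turbulent) regime quoted
# above can be checked. Air properties at ~300 K are rough assumptions.

g     = 9.81         # gravity [m/s^2]
beta  = 1.0 / 300.0  # thermal expansion coefficient of air [1/K]
nu    = 1.6e-5       # kinematic viscosity [m^2/s]
alpha = 2.2e-5       # thermal diffusivity [m^2/s]

def rayleigh(delta_T, height):
    return g * beta * delta_T * height**3 / (nu * alpha)

for dT in (5.0, 20.0, 50.0):
    ra = rayleigh(dT, height=4.0)        # ~4 m heater, as in the test facility
    regime = "turbulent" if ra > 1e9 else "laminar/transitional"
    print(f"dT = {dT:4.1f} K  ->  Ra = {ra:.2e}  ({regime})")
```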

  3. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  4. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore to determine also that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are however not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors and through them the safeguards verification system gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  5. ESTRO ACROP guidelines for positioning, immobilisation and position verification of head and neck patients for radiation therapists

    Directory of Open Access Journals (Sweden)

    Michelle Leech

    2017-03-01

    Full Text Available Background and purpose: Over the last decade, the management of locally advanced head and neck cancers (HNCs) has seen a substantial increase in the use of chemoradiation. These guidelines have been developed to assist Radiation Therapists (RTTs) in positioning, immobilisation and position verification for head and neck cancer patients. Materials and methods: A critical review of the literature was undertaken by the writing committee. Based on the literature review, a survey was developed to ascertain the current positioning, immobilisation and position verification methods for head and neck radiation therapy across Europe. The survey was translated into Italian, German, Greek, Portuguese, Russian, Croatian, French and Spanish. Guidelines were subsequently developed by the writing committee. Results: Results from the survey indicated that a wide variety of treatment practices and treatment verification protocols are in operation for head and neck cancer patients across Europe currently. The guidelines developed are based on the experience and expertise of the writing committee, remaining cognisant of the variations in imaging and immobilisation techniques used currently in Europe. Conclusions: These guidelines have been developed to provide RTTs with guidance on positioning, immobilisation and position verification of HNC patients. The guidelines will also provide RTTs with the means to critically reflect on their own daily clinical practice with this patient group. Keywords: Head and neck, Immobilisation, Positioning, Verification

  6. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
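
    As an example of the kind of spreadsheet-style hand check described above, the sketch below evaluates the steady-state airborne concentration in a single well-mixed room with a constant source injection rate, ventilation and radioactive decay. The parameters and the simple one-box balance are assumptions for illustration; they are not the RESRAD-BUILD equations themselves.

```python
import math

# Minimal sketch of a spreadsheet-style hand check: steady-state airborne
# concentration in a single well-mixed room with a constant source injection
# rate, removal by ventilation and by radioactive decay. Parameters are
# illustrative and this is not the actual RESRAD-BUILD formulation.

S           = 1.0e3     # source injection rate into room air [Bq/h]
V           = 50.0      # room volume [m^3]
air_exch    = 0.8       # air exchange rate [1/h]
half_life_h = 12.0      # radionuclide half-life [h]

lam = math.log(2.0) / half_life_h            # decay constant [1/h]
C_steady = S / (V * (air_exch + lam))        # from dC/dt = S/V - (air_exch + lam)*C = 0

print(f"steady-state air concentration: {C_steady:.2f} Bq/m^3")
```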

  7. Numerical analysis

    CERN Document Server

    Scott, L Ridgway

    2011-01-01

    Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...

  8. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before the final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. The Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in the planning of the verification and in developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal, e.g. criteria connected to the selection of the best place to perform the verification measurements. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage. The other option is the encapsulation plant. Crucial viewpoints are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  9. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Some of the uses of subsurface barriers include surrounding and/or containing buried waste, serving as secondary confinement of underground storage tanks, directing or containing subsurface contaminant plumes, and restricting remediation methods, such as vacuum extraction, to a limited area. To be most effective, the barriers should be continuous and, depending on use, have few or no breaches. A breach may be formed through numerous pathways, including discontinuous grout application, joints between panels, and cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and commercial sector and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers make detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  10. Verification of the STELE code within the framework of the CHEMVAL project

    International Nuclear Information System (INIS)

    Jamet, Ph.; Jacquier, Ph.

    1990-07-01

    Phase 3 of the international CHEMVAL exercise has made it possible to devise verification tests for models coupling geochemistry and transport. The STELE model, built at the Centre d'Informatique Geologique, is a two-step coupled model, solving the geochemical and transport equations in two stages. The solution of two of the exercises from phase 3 of CHEMVAL is presented. One has an analytical solution which compares satisfactorily with the results of the calculation by STELE. The other raises numerical problems typical of a two-step algorithm. In its present state the STELE model is nevertheless a calculation tool well suited to systems with weak geochemical contrasts. Future development of STELE should mainly focus on the numerical aspects in order to extend the applicability of the model. (author)
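
    A two-step (operator-splitting) coupling of the kind described above can be sketched with a transport step followed by a local chemistry step in every cell. The sketch below uses first-order upwind advection and a linear sorption equilibrium as the "chemistry"; both are illustrative stand-ins and not the STELE formulation.

```python
import numpy as np

# Minimal sketch of a "two-step" coupled scheme: each time step first advects
# the aqueous concentration, then re-equilibrates chemistry locally in every
# cell. The chemistry here is a simple linear sorption equilibrium (Kd),
# purely illustrative -- not the STELE model.

N, dx, dt, v = 100, 1.0, 0.5, 1.0          # grid cells, spacing, time step, velocity
Kd = 2.0                                    # linear sorption distribution coefficient
c_aq = np.zeros(N); c_aq[10] = 1.0          # aqueous concentration, pulse at cell 10
c_sorb = Kd * c_aq                          # sorbed phase initially in equilibrium

def transport_step(c):
    """First-order upwind advection (stable for v*dt/dx <= 1)."""
    cn = c.copy()
    cn[1:] -= v * dt / dx * (c[1:] - c[:-1])
    return cn

def chemistry_step(c_aq, c_sorb):
    """Re-equilibrate total mass in each cell so that c_sorb = Kd * c_aq."""
    total = c_aq + c_sorb
    c_aq_new = total / (1.0 + Kd)
    return c_aq_new, Kd * c_aq_new

for _ in range(50):
    c_aq = transport_step(c_aq)                    # step 1: transport only
    c_aq, c_sorb = chemistry_step(c_aq, c_sorb)    # step 2: chemistry only

print("peak of aqueous plume (cell index):", int(np.argmax(c_aq)))
```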

  11. Dose concentration and dose verification for radiotherapy of cancer

    International Nuclear Information System (INIS)

    Maruyama, Koichi

    2005-01-01

    The number of cancer treatments using radiation therapy is increasing. The background of this increase is the accumulated evidence that the number of successful cases is comparable to or even better than with surgery for some types of cancer, due to improvements in irradiation technology and radiation planning technology. This review describes the principles and technology of radiation therapy, its characteristics, particle therapy that improves the dose concentration, its historical background, the importance of dose concentration, the present situation and future possibilities. There are serious problems that hinder the superior dose concentration of particle therapy. Recent programs and our efforts to solve these problems are described. A new concept is required to satisfy the notion of evidence based medicine, i.e., one has to develop a method of dose verification, which is not yet available. This review is for researchers, medical doctors and radiation technologists who are developing this field. (author)

  12. Knowledge-based verification of clinical guidelines by detection of anomalies.

    Science.gov (United States)

    Duftschmid, G; Miksch, S

    2001-04-01

    As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general and in the domain of knowledge-based systems (KBS) in particular, a common strategy to examine a system for potential defects consists in its verification. The focus of this work is to present an approach which helps to ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general in order to allow its application to several other guideline representation formats.

  13. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  14. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  15. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1-year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
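
    The trade-off between vehicle speed and detectability that underlies the quoted 2 mph figure can be sketched with a simple counting argument: the dwell time in the detector's field of view sets the expected source and background counts, and a Currie-style critical level decides whether the source term is distinguishable from background. All rates and geometry in the sketch below are illustrative assumptions, not the MPVS design values.

```python
import math

# Minimal sketch of the speed / detectability trade-off for a "drive-by"
# measurement: dwell time in front of each container sets the expected source
# and background counts, and a simple Currie-style decision threshold is
# applied. All count rates and geometry below are illustrative assumptions.

def dwell_time(field_of_view_m, speed_mph):
    speed_ms = speed_mph * 0.44704          # mph -> m/s
    return field_of_view_m / speed_ms

def detectable(source_rate_cps, bkg_rate_cps, t):
    source_counts = source_rate_cps * t
    bkg_counts = bkg_rate_cps * t
    critical_level = 1.645 * math.sqrt(2.0 * bkg_counts)   # approximate Currie L_C
    return source_counts > critical_level

for speed in (1.0, 2.0, 5.0, 10.0):
    t = dwell_time(field_of_view_m=0.5, speed_mph=speed)
    ok = detectable(source_rate_cps=60.0, bkg_rate_cps=200.0, t=t)
    print(f"{speed:4.1f} mph  dwell = {t:5.2f} s  detectable: {ok}")
```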

  16. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  17. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  18. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  19. Numerical Analysis and Experimental Verification of Stresses Building up in Microelectronics Packaging

    NARCIS (Netherlands)

    Rezaie Adli, A.R.

    2017-01-01

    This thesis comprises a thorough study of the microelectronics packaging process by means of various experimental and numerical methods to estimate the process induced residual stresses. The main objective of the packaging is to encapsulate the die, interconnections and the other exposed internal

  20. Numerical analysis of modified Central Solenoid insert design

    Energy Technology Data Exchange (ETDEWEB)

    Khodak, Andrei, E-mail: akhodak@pppl.gov [Princeton Plasma Physics Laboratory, Princeton, NJ (United States); Martovetsky, Nicolai; Smirnov, Aleksandre [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Titus, Peter [Princeton Plasma Physics Laboratory, Princeton, NJ (United States)

    2015-10-15

    Highlights: • Modified design of coil for testing ITER superconducting cable is presented. • Numerical analysis allowed design verification. • Three-dimensional current sharing temperature distributions are obtained from the results. - Abstract: The United States ITER Project Office (USIPO) is responsible for fabrication of the Central Solenoid (CS) for the ITER project. The ITER machine is currently under construction by seven parties in Cadarache, France. The CS Insert (CSI) project should provide a verification of the conductor performance in relevant conditions of temperature, field, currents and mechanical strain. The US IPO designed the CSI that will be tested at the Central Solenoid Model Coil (CSMC) Test Facility at JAEA, Naka. To validate the modified design, three-dimensional numerical simulations were performed using a coupled solver for simultaneous structural, thermal and electromagnetic analysis. Thermal and electromagnetic simulations supported structural calculations providing necessary loads and strains. According to the current analysis, the design of the modified coil satisfies ITER magnet structural design criteria for the following conditions: (1) room temperature, no current, (2) temperature 4 K, no current, (3) temperature 4 K, current 60 kA direct charge, and (4) temperature 4 K, current 60 kA reverse charge. Fatigue life assessment analysis is performed for the alternating conditions of: temperature 4 K, no current, and temperature 4 K, current 45 kA direct charge. Results of fatigue analysis show that parts of the coil assembly can be qualified for up to 1 million cycles. Distributions of the Current Sharing Temperature (TCS) in the superconductor were obtained from numerical results using parameterization of the critical surface in a form similar to that proposed for ITER. Special APDL scripts were developed for ANSYS allowing one-dimensional representation of TCS along the cable, as well as three-dimensional fields of TCS in the superconductor.

  1. Energy- and time-resolved detection of prompt gamma-rays for proton range verification.

    Science.gov (United States)

    Verburg, Joost M; Riley, Kent; Bortfeld, Thomas; Seco, Joao

    2013-10-21

    In this work, we present experimental results of a novel prompt gamma-ray detector for proton beam range verification. The detection system features an actively shielded cerium-doped lanthanum(III) bromide scintillator, coupled to a digital data acquisition system. The acquisition was synchronized to the cyclotron radio frequency to separate the prompt gamma-ray signals from the later-arriving neutron-induced background. We designed the detector to provide a high energy resolution and an effective reduction of background events, enabling discrete proton-induced prompt gamma lines to be resolved. Measuring discrete prompt gamma lines has several benefits for range verification. As the discrete energies correspond to specific nuclear transitions, the magnitudes of the different gamma lines have unique correlations with the proton energy and can be directly related to nuclear reaction cross sections. The quantification of discrete gamma lines also enables elemental analysis of tissue in the beam path, providing a better prediction of prompt gamma-ray yields. We present the results of experiments in which a water phantom was irradiated with proton pencil-beams in a clinical proton therapy gantry. A slit collimator was used to collimate the prompt gamma-rays, and measurements were performed at 27 positions along the path of proton beams with ranges of 9, 16 and 23 g cm⁻² in water. The magnitudes of discrete gamma lines at 4.44, 5.2 and 6.13 MeV were quantified. The prompt gamma lines were found to be clearly resolved in dimensions of energy and time, and had a reproducible correlation with the proton depth-dose curve. We conclude that the measurement of discrete prompt gamma-rays for in vivo range verification of clinical proton beams is feasible, and plan to further study methods and detector designs for clinical use.
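
    As an illustration of the energy- and time-resolved gating described here, the sketch below selects list-mode events inside a prompt-time window (relative to the cyclotron RF reference) and counts those falling near the 4.44, 5.2 and 6.13 MeV lines. The window widths and the synthetic event arrays are assumptions, not the authors' acquisition pipeline.

```python
# Minimal sketch (not the authors' acquisition software): gating list-mode
# events in time (relative to the cyclotron RF phase) and energy to isolate
# discrete prompt gamma lines from the later neutron-induced background.
# Window widths and the event arrays are hypothetical.
import numpy as np

def line_yields(energy_mev, time_ns, prompt_window=(2.0, 8.0),
                lines=(4.44, 5.2, 6.13), half_width=0.15):
    """Count events in the prompt time window near each gamma line."""
    prompt = (time_ns >= prompt_window[0]) & (time_ns <= prompt_window[1])
    return {e: int(np.sum(prompt & (np.abs(energy_mev - e) < half_width)))
            for e in lines}

rng = np.random.default_rng(0)
energy = rng.uniform(1.0, 7.0, 100_000)   # fake list-mode energies, MeV
time = rng.uniform(0.0, 100.0, 100_000)   # fake arrival times, ns after RF reference
print(line_yields(energy, time))
```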

  2. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  3. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  4. Numerical Study of the Ghost-Ghost-Gluon Vertex on the Lattice

    International Nuclear Information System (INIS)

    Mihara, A.; Cucchieri, A.; Mendes, T.

    2004-01-01

    It is well known that, in Landau gauge, the renormalization function of the ghost-ghost-gluon vertex Z̃₁(p²) is finite and constant, at least to all orders of perturbation theory. On the other hand, a direct non-perturbative verification of this result using numerical simulations of lattice QCD is still missing. Here we present a preliminary numerical study of the ghost-ghost-gluon vertex and of its corresponding renormalization function using Monte Carlo simulations in SU(2) lattice Landau gauge. Data were obtained in 4 dimensions for lattice couplings β = 2.2, 2.3, 2.4 and lattice sides N = 4, 8, 16.

  5. Numerical study of the ghost-ghost-gluon vertex on the lattice

    International Nuclear Information System (INIS)

    Mihara, A.; Cucchieri, A.; Mendes, T.

    2004-01-01

    It is well known that, in Landau gauge, the renormalization function of the ghost-ghost-gluon vertex Z̃₁(p²) is finite and constant, at least to all orders of perturbation theory. On the other hand, a direct non-perturbative verification of this result using numerical simulations of lattice QCD is still missing. Here we present a preliminary numerical study of the ghost-ghost-gluon vertex and of its corresponding renormalization function using Monte Carlo simulations in SU(2) lattice Landau gauge. Data were obtained in 4 dimensions for lattice couplings β = 2.2, 2.3, 2.4 and lattice sides N = 4, 8, 16. (author)

  6. Point splitting in a curved space-time background

    International Nuclear Information System (INIS)

    Liggatt, P.A.J.; Macfarlane, A.J.

    1979-01-01

    A prescription is given for point splitting in a curved space-time background which is a natural generalization of that familiar in quantum electrodynamics and Yang-Mills theory. It is applied (to establish its validity) to the verification of the gravitational anomaly in the divergence of a fermion axial current. Notable features of the prescription are that it defines a point-split current that can be differentiated straightforwardly, and that it involves a natural way of averaging (four-dimensionally) over the directions of point splitting. The method can extend directly from the spin-1/2 fermion case treated to other cases, e.g., to spin-3/2 Rarita-Schwinger fermions. (author)

  7. Entanglement verification and its applications in quantum communication

    International Nuclear Information System (INIS)

    Haeseler, Hauke

    2010-01-01

    In this thesis, we investigate the uses of entanglement and its verification in quantum communication. The main object here is to develop a verification procedure which is adaptable to a wide range of applications, and whose implementation has low requirements on experimental resources. We present such a procedure in the form of the Expectation Value Matrix. The structure of this thesis is as follows: Chapters 1 and 2 give a short introduction and background information on quantum theory and the quantum states of light. In particular, we discuss the basic postulates of quantum mechanics, quantum state discrimination, the description of quantum light and the homodyne detector. Chapter 3 gives a brief introduction to quantum information and in particular to entanglement, and we discuss the basics of quantum key distribution and teleportation. The general framework of the Expectation Value Matrix is introduced. The main matter of this thesis is contained in the subsequent three chapters, which describe different quantum communication protocols and the corresponding adaptation of the entanglement verification method. The subject of Chapter 4 is quantum key distribution, where the detection of entanglement is a means of excluding intercept-resend attacks, and the presence of quantum correlations in the raw data is a necessary precondition for the generation of secret key. We investigate a continuous-variable version of the two-state protocol and develop the Expectation Value Matrix method for such qubit-mode systems. Furthermore, we analyse the role of the phase reference with respect to the security of the protocol and raise awareness of a corresponding security threat. For this, we adapt the verification method to different settings of Stokes operator measurements. In Chapter 5, we investigate quantum memory channels and propose a fundamental benchmark for these based on the verification of entanglement. After describing some physical effects which can be used for the

  8. SIVEH: Numerical Computing Simulation of Wireless Energy-Harvesting Sensor Nodes

    Directory of Open Access Journals (Sweden)

    Pedro Yuste

    2013-09-01

    Full Text Available The paper presents a numerical energy harvesting model for sensor nodes, SIVEH (Simulator I–V for EH), based on I–V hardware tracking. I–V tracking is demonstrated to be more accurate than traditional energy modeling techniques when some of the components present different power dissipation at either different operating voltages or drawn currents. SIVEH numerical computing allows fast simulation of long periods of time—days, weeks, months or years—using real solar radiation curves. Moreover, SIVEH modeling has been enhanced with dynamic adjustment of the sleep time rate, while seeking energy-neutral operation. This paper presents the model description, a functional verification and a critical comparison with the classic energy approach.
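
    A minimal sketch of the kind of time-stepped energy bookkeeping such a simulator performs is shown below, assuming a toy I–V curve for the harvester and a duty-cycle adjustment that steers toward energy-neutral operation. All device parameters are invented; this is not the SIVEH code.

```python
# Illustrative sketch (not SIVEH itself) of I-V based energy bookkeeping for a
# harvesting node: at each time step the harvester operating point depends on
# the storage voltage, and the node's sleep rate is nudged toward
# energy-neutral operation. All device parameters are made up.
import numpy as np

def harvester_current(v_store, irradiance):
    """Toy I-V curve: available current falls off as the storage voltage rises."""
    i_sc = 0.05 * irradiance       # short-circuit current, A
    v_oc = 4.0                     # open-circuit voltage, V
    return max(i_sc * (1.0 - v_store / v_oc), 0.0)

C = 25.0        # storage capacitance, F
v = 3.0         # initial storage voltage, V
duty = 0.05     # fraction of time the node is awake
dt = 1.0        # time step, s
for step in range(24 * 3600):                               # one simulated day
    irr = max(np.sin(2 * np.pi * step / 86400.0), 0.0)      # crude solar profile
    i_in = harvester_current(v, irr)
    i_load = duty * 0.030 + (1 - duty) * 0.0001             # awake vs sleep current, A
    v += (i_in - i_load) * dt / C
    duty = min(max(duty + 1e-7 * (v - 3.0), 0.01), 0.5)     # steer toward neutrality
print(round(v, 3), round(duty, 4))
```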

  9. Poster - 16: Time-resolved diode dosimetry for in vivo proton therapy range verification: calibration through numerical modeling

    Energy Technology Data Exchange (ETDEWEB)

    Toltz, Allison; Hoesl, Michaela; Schuemann, Jan; Seuntjens, Jan; Lu, Hsiao-Ming; Paganetti, Harald [McGill University, Harvard University, Massachusetts General Hospital, McGill University, Massachusetts General Hospital, Massachusetts General Hospital (United States)

    2016-08-15

    Purpose: A method to refine the implementation of an in vivo, adaptive proton therapy range verification methodology was investigated. Simulation experiments and in-phantom measurements were compared to validate the calibration procedure of a time-resolved diode dosimetry technique. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification by correlating properties of the detector signal to the water equivalent path length (WEPL). The implementation of this system requires a set of calibration measurements to establish a beam-specific diode response to WEPL fit for the selected ‘scout’ beam in a solid water phantom. This process is both tedious, as it necessitates a separate set of measurements for every ‘scout’ beam that may be appropriate to the clinical case, and inconvenient due to limited access to the clinical beamline. The diode response to WEPL relationship for a given ‘scout’ beam may instead be determined within a simulation environment, facilitating the applicability of this dosimetry technique. Measurements for three ‘scout’ beams were compared against detector response simulated with Monte Carlo methods using the Tool for Particle Simulation (TOPAS). Results: Detector response in water equivalent plastic was successfully validated against simulation for spread out Bragg peaks of range 10 cm, 15 cm, and 21 cm (168 MeV, 177 MeV, and 210 MeV) with an adjusted R² of 0.998. Conclusion: Feasibility has been shown for performing calibration of detector response for a given ‘scout’ beam through simulation for the time-resolved diode dosimetry technique.
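
    The calibration step, i.e., fitting a beam-specific diode-response-to-WEPL relationship from (simulated) detector responses, can be sketched as a simple polynomial fit. The data points and the quadratic form below are hypothetical, not the calibration data from this work.

```python
# Hedged sketch: calibrating a diode-signal metric against water-equivalent
# path length (WEPL) with a simple polynomial fit, as one might do with
# simulated detector responses. The data points below are invented.
import numpy as np

wepl_cm = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])       # known phantom depths
signal = np.array([0.92, 0.81, 0.69, 0.55, 0.40, 0.22])    # simulated diode metric

coeffs = np.polyfit(signal, wepl_cm, deg=2)   # invert: signal -> WEPL
calib = np.poly1d(coeffs)

measured = 0.60
print(f"estimated WEPL = {calib(measured):.2f} cm")

# crude goodness-of-fit, analogous to the adjusted R^2 quoted in the abstract
pred = np.poly1d(np.polyfit(wepl_cm, signal, 2))(wepl_cm)
ss_res, ss_tot = np.sum((signal - pred)**2), np.sum((signal - signal.mean())**2)
print("R^2 =", 1 - ss_res / ss_tot)
```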

  10. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  11. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  12. Numerical solution of stiff burnup equation with short half lived nuclides by the Krylov subspace method

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Tatsumi, Masahiro; Sugimura, Naoki

    2007-01-01

    The Krylov subspace method is applied to solve nuclide burnup equations used for lattice physics calculations. The Krylov method is an efficient approach for solving ordinary differential equations of a stiff nature, such as nuclide burnup with short-lived nuclides. Some mathematical fundamentals of the Krylov subspace method and its application to burnup equations are discussed. Verification calculations are carried out in a PWR pin-cell geometry with UO₂ fuel. A detailed burnup chain that includes 193 fission products and 28 heavy nuclides is used in the verification calculations. The shortest half-life found in the present burnup chain is approximately 30 s (¹⁰⁶Rh). Therefore, conventional methods (e.g., the Taylor series expansion with scaling and squaring) tend to require longer computation time due to numerical stiffness. Comparison with other numerical methods (e.g., the 4th-order Runge-Kutta-Gill) reveals that the Krylov subspace method can provide an accurate solution for the detailed burnup chain used in the present study with short computation time. (author)
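
    The core idea, approximating the action of the burnup matrix exponential in a small Krylov subspace rather than forming exp(At) for the full stiff system, can be sketched as a generic Arnoldi-based routine. This is not the lattice-code implementation; the three-nuclide chain and its rate constants are placeholders.

```python
# Illustrative sketch of the Krylov-subspace idea for burnup equations
# dN/dt = A N, where A is the (stiff) burnup matrix. This is a generic
# Arnoldi approximation of exp(t*A) @ v, not the authors' lattice-code solver.
import numpy as np
from scipy.linalg import expm

def krylov_expm_multiply(A, v, t, m=20):
    """Approximate exp(t*A) @ v in an m-dimensional Krylov subspace."""
    n = len(v)
    m = min(m, n)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(v)
    V[:, 0] = v / beta
    for j in range(m):                      # Arnoldi iteration
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:             # Krylov space exhausted
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m); e1[0] = 1.0
    return beta * V[:, :m] @ (expm(t * H[:m, :m]) @ e1)

# toy 3-nuclide decay/transmutation chain (rates are placeholders)
A = np.array([[-2.0e-2,  0.0,     0.0],
              [ 2.0e-2, -3.0e-1,  0.0],
              [ 0.0,     3.0e-1, -1.0e-5]])
N0 = np.array([1.0, 0.0, 0.0])
print(krylov_expm_multiply(A, N0, t=100.0))
print(expm(100.0 * A) @ N0)                 # dense reference for comparison
```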

  13. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values. (Authors)

  14. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  15. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  16. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modelling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional approach to that of the classical verification, showing a huge difference.

  17. A novel numerical framework for self-similarity in plasticity: Wedge indentation in single crystals

    DEFF Research Database (Denmark)

    Juul, K. J.; Niordson, C. F.; Nielsen, K. L.

    2018-01-01

    A novel numerical framework for analyzing self-similar problems in plasticity is developed and demonstrated. Self-similar problems of this kind include processes such as stationary cracks, void growth, indentation, etc. The proposed technique offers a simple and efficient method for handling such problems, and is demonstrated here for wedge indentation of an elastic-viscoplastic single crystal. However, the framework may be readily adapted to any constitutive law of interest. The main focus herein is the development of the self-similar framework, while the indentation study serves primarily as verification of the technique by comparing to existing numerical and analytical solutions.

  18. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  19. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  20. Finite element program ARKAS: verification for IAEA benchmark problem analysis on core-wide mechanical analysis of LMFBR cores

    International Nuclear Information System (INIS)

    Nakagawa, M.; Tsuboi, Y.

    1990-01-01

    'ARKAS' code verification, with the problems set in the International Working Group on Fast Reactors (IWGFR) Coordinated Research Programme (CRP) on the inter-comparison between liquid metal cooled fast breeder reactor (LMFBR) Core Mechanics Codes, is discussed. The CRP was co-ordinated by the IWGFR around problems set by Dr. R.G. Anderson (UKAEA) and arose from the IWGFR specialists' meeting on The Predictions and Experience of Core Distortion Behaviour (ref. 2). The problems for the verification ('code against code') and validation ('code against experiment') were set and calculated by eleven core mechanics codes from nine countries. All the problems have been completed and were solved with the core structural mechanics code ARKAS. Predictions by ARKAS agreed very well with other solutions for the well-defined verification problems. For the validation problems based on Japanese ex-reactor 2-D thermo-elastic experiments, the agreements between measured and calculated values were fairly good. This paper briefly describes the numerical model of the ARKAS code, and discusses some typical results. (author)

  1. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  2. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  3. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  4. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  5. Spontaneous Radiation Background Calculation for LCLS

    CERN Document Server

    Reiche, Sven

    2004-01-01

    The intensity of undulator radiation, not amplified by the FEL interaction, can be larger than the maximum FEL signal in the case of an X-ray FEL. In the commissioning of a SASE FEL it is essential to extract an amplified signal early to diagnose possible misalignment of undulator modules or errors in the undulator field strength. We developed a numerical code to calculate the radiation pattern at any position behind a multi-segmented undulator with arbitrary spacing and field profiles. The output can be run through numerical spatial and frequency filters to model the radiation beam transport and diagnostic. In this presentation we estimate the expected background signal for the FEL diagnostic and at what point along the undulator the FEL signal can be separated from the background. We also discuss how much information on the undulator field and alignment can be obtained from the incoherent radiation signal itself.

  6. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  7. Modified Truncated Multiplicity Analysis to Improve Verification of Uranium Fuel Cycle Materials

    International Nuclear Information System (INIS)

    LaFleur, A.; Miller, K.; Swinhoe, M.; Belian, A.; Croft, S.

    2015-01-01

    Accurate verification of 235U enrichment and mass in UF6 storage cylinders and the UO2F2 holdup contained in the process equipment is needed to improve international safeguards and nuclear material accountancy at uranium enrichment plants. Small UF6 cylinders (1.5'' and 5'' diameter) are used to store the full range of enrichments from depleted to highly-enriched UF6. For independent verification of these materials, it is essential that the 235U mass and enrichment measurements do not rely on facility operator declarations. Furthermore, in order to be deployed by IAEA inspectors to detect undeclared activities (e.g., during complementary access), it is also imperative that the measurement technique is quick, portable, and sensitive to a broad range of 235U masses. Truncated multiplicity analysis is a technique that reduces the variance in the measured count rates by only considering moments 1, 2, and 3 of the multiplicity distribution. This is especially important for reducing the uncertainty in the measured doubles and triples rates in environments with a high cosmic ray background relative to the uranium signal strength. However, we believe that the existing truncated multiplicity analysis throws away too much useful data by truncating the distribution after the third moment. This paper describes a modified truncated multiplicity analysis method that determines the optimal moment to truncate the multiplicity distribution based on the measured data. Experimental measurements of small UF6 cylinders and UO2F2 working reference materials were performed at Los Alamos National Laboratory (LANL). The data were analyzed using traditional and modified truncated multiplicity analysis to determine the optimal moment to truncate the multiplicity distribution to minimize the uncertainty in the measured count rates. The results from this analysis directly support nuclear safeguards at enrichment plants and provide a more accurate verification method for UF6
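
    To make the moment bookkeeping concrete, the sketch below computes the first three reduced factorial moments (the singles-, doubles-, and triples-like quantities) from a measured multiplicity histogram. The histogram values are invented, and the modified method's data-driven choice of truncation point is not reproduced here.

```python
# Hedged sketch of the moment bookkeeping behind truncated multiplicity
# analysis: only the first three (reduced factorial) moments of the measured
# multiplicity distribution are retained. The histogram below is invented.
import numpy as np

def reduced_factorial_moments(p):
    """p[n] = probability of registering n counts in a coincidence gate."""
    n = np.arange(len(p))
    m1 = np.sum(n * p)
    m2 = np.sum(n * (n - 1) * p) / 2.0
    m3 = np.sum(n * (n - 1) * (n - 2) * p) / 6.0
    return m1, m2, m3

counts = np.array([9100, 700, 150, 40, 8, 2], dtype=float)   # fake gate histogram
p = counts / counts.sum()
singles, doubles, triples = reduced_factorial_moments(p)
print(singles, doubles, triples)
```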

  8. Doppler Radar and Analysis for Climate Model Verification and Numerical Weather Prediction

    National Research Council Canada - National Science Library

    Xu, Qin

    1998-01-01

    ... (Qiu and Xu, 1996, Mon. Wea. Rev., 1132-1144). The LS method was further upgraded to include background wind fields and used to improve the initial condition for the ARPS model's short-term prediction...

  9. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  10. Numerical linear algebra with applications using Matlab

    CERN Document Server

    Ford, William

    2014-01-01

    Designed for those who want to gain a practical knowledge of modern computational techniques for the numerical solution of linear algebra problems, Numerical Linear Algebra with Applications contains all the material necessary for a first year graduate or advanced undergraduate course on numerical linear algebra with numerous applications to engineering and science. With a unified presentation of computation, basic algorithm analysis, and numerical methods to compute solutions, this book is ideal for solving real-world problems. It provides necessary mathematical background information for

  11. Background-cross-section-dependent subgroup parameters

    International Nuclear Information System (INIS)

    Yamamoto, Toshihisa

    2003-01-01

    A new set of subgroup parameters was derived that can reproduce the self-shielded cross section against a wide range of background cross sections. The subgroup parameters are expressed as a rational equation whose numerator and denominator are expansion series in the background cross section, so that the background-cross-section dependence is taken into account exactly in the parameters. The advantage of the new subgroup parameters is that they can reproduce the self-shielding effect not only on a group basis but also on a subgroup basis. An adaptive method is also proposed which uses a fitting procedure to evaluate the background-cross-section dependence of the parameters. One of the simple fitting formulas was able to reproduce the self-shielded subgroup cross section to within 1% of the precise evaluation. (author)
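
    A minimal sketch of the fitting idea: represent the self-shielded cross section as a low-order rational function of the background cross section and fit its coefficients to tabulated points. The functional form, the data, and the starting values are illustrative assumptions, not the parameters derived in this record.

```python
# Minimal sketch of the idea in this record: express the self-shielded cross
# section as a rational function of the background cross section sigma_b and
# fit its coefficients. Functional form and data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def shielded_xs(sigma_b, a0, a1, b0, b1):
    """sigma_eff(sigma_b) = (a0 + a1*sigma_b) / (b0 + b1*sigma_b)"""
    return (a0 + a1 * sigma_b) / (b0 + b1 * sigma_b)

sigma_b = np.array([1.0, 10.0, 1e2, 1e3, 1e4, 1e5])        # barns (placeholder)
sigma_eff = np.array([2.1, 3.0, 6.5, 14.0, 19.0, 20.0])    # placeholder table

popt, _ = curve_fit(shielded_xs, sigma_b, sigma_eff, p0=[500.0, 20.0, 250.0, 1.0])
print(popt)
print(shielded_xs(50.0, *popt))   # interpolate at an arbitrary dilution
```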

  12. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  13. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of such systems ... The approach is demonstrated on a number of case studies, tackled using a prototypical implementation.

  14. Numerical simulation of seismic performance of the underground structure buried in the dense saturated sand

    International Nuclear Information System (INIS)

    Kawai, Tadashi

    2006-01-01

    The applicability of the advanced earthquake-resistant performance verification method for reinforced concrete underground structures developed by CRIEPI had previously been investigated for structures buried in dry sand. To advance the method toward practical use, its applicability to structures buried in saturated ground also needs to be verified. In this study, the applicability of the effective-stress-based soil modeling method in numerical analysis, proposed through modification of the model formerly developed by CRIEPI, was verified through non-linear dynamic numerical simulations of large centrifuge tests conducted using a model comprising fully saturated sand and an aluminium duct-type structure specially prepared to measure the load acting on the structure surface due to soil-structure interaction. The magnitudes of the simulated loads and the resultant deformations of the structure were almost the same as those of the experiments. As a result, it is confirmed that the performance verification method, used together with the proposed effective-stress-based ground modeling method, is applicable to structures buried in saturated ground. (author)

  15. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  16. Numerical investigation on the implications of spring temperature and discharge rate with respect to the geothermal background in a fault zone

    Science.gov (United States)

    Jiang, Zhenjiao; Xu, Tianfu; Mariethoz, Gregoire

    2018-04-01

    Geothermal springs are some of the most obvious indicators of the existence of high-temperature geothermal resources in the subsurface. However, geothermal springs can also occur in areas of low average subsurface temperatures, which makes it difficult to assess exploitable zones. To address this problem, this study quantitatively analyzes the conditions associated with the formation of geothermal springs in fault zones, and numerically investigates the implications that outflow temperature and discharge rate from geothermal springs have on the geothermal background in the subsurface. It is concluded that the temperature of geothermal springs in fault zones is mainly controlled by the recharge rate from the country rock and the hydraulic conductivity in the fault damage zone. Importantly, the topography of the fault trace on the land surface plays an important role in determining the thermal temperature. In fault zones with a permeability higher than 1 mD and a lateral recharge rate from the country rock higher than 1 m3/day, convection plays a dominant role in the heat transport rather than thermal conduction. The geothermal springs do not necessarily occur in the place having an abnormal geothermal background (with the temperature at certain depth exceeding the temperature inferred by the global average continental geothermal gradient of 30 °C/km). Assuming a constant temperature (90 °C here, to represent a normal geothermal background in the subsurface at a depth of 3,000 m), the conditions required for the occurrence of geothermal springs were quantitatively determined.

  17. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  18. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  19. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification or to evaluate, using independent software, the performance of specific functions in the code. Here, we will describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We will also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  20. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting a Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with an structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  1. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d' , and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  2. Verification of the geostatistical inference code INFERENS, Version 1.1, and demonstration using data from Finnsjoen

    International Nuclear Information System (INIS)

    Geier, J.

    1993-06-01

    This report describes preliminary verification and demonstration of the geostatistical inference code, INFERENS Version 1.1. This code performs regularization of packer test conductivities, and iterative generalized least-squares estimation (IGLSE) of nested covariance models and spatial trends for the regularized data. Cross-validation is used to assess the quality of the estimated models in terms of statistics for the kriging errors. The code includes a capability to generate synthetic datasets for a given configuration of packer tests; this capability can be used for verification exercises and numerical experiments to aid in the design of packer testing programs. The report presents the results of a set of verification test cases. The test cases were designed to test the ability of INFERENS 1.1 to estimate the parameters of a variety of covariance models, with or without trends. This was done using synthetic datasets. This report also describes an application of INFERENS 1.1 to the dataset from the Finnsjoen site. The results are roughly similar to those obtained previously by Norman (1992a) using INFERENS 1.0, for the comparable cases. The actual numerical results are different, which may be due to changes in the fitting algorithms, and differences in how the lag pairs are divided into lag classes. The demonstrations confirm the result previously obtained by Norman, that the fitted horizontally isotropic models are less good, in terms of their cross-validation statistics, than the corresponding isotropic models. The use of nested covariance models is demonstrated to give visually improved fits to the sample semivariograms, at both short and long lag distances. However, despite the good match to the semivariograms, the nested models obtained are not better than the simple models, in terms of cross-validation statistics
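
    The nested-covariance fitting mentioned here can be sketched by least-squares fitting a nugget plus two spherical structures to a sample semivariogram. The lag classes, sample values, and model choice below are invented, and this is not the INFERENS estimation (which uses iterative generalized least squares and cross-validation).

```python
# Hedged sketch of fitting a nested (two-structure) variogram model to a
# sample semivariogram, in the spirit of the nested covariance models
# discussed in this record. The sample points and model choice are invented.
import numpy as np
from scipy.optimize import curve_fit

def spherical(h, sill, a):
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def nested(h, nugget, s1, a1, s2, a2):
    """Nugget plus short-range and long-range spherical structures."""
    return nugget + spherical(h, s1, a1) + spherical(h, s2, a2)

lag = np.array([10, 25, 50, 100, 200, 400, 800.0])            # lag distance, m
gamma = np.array([0.20, 0.45, 0.70, 0.85, 1.00, 1.15, 1.25])  # sample semivariogram

p0 = [0.1, 0.5, 60.0, 0.6, 600.0]
popt, _ = curve_fit(nested, lag, gamma, p0=p0, bounds=(1e-6, np.inf), maxfev=10_000)
print(dict(zip(["nugget", "sill1", "range1", "sill2", "range2"], popt)))
```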

  3. A new approach for the verification of optical systems

    Science.gov (United States)

    Siddique, Umair; Aravantinos, Vincent; Tahar, Sofiène

    2013-09-01

    Optical systems are increasingly used in microsystems, telecommunication, aerospace, and the laser industry. Due to the complexity and sensitivity of optical systems, their verification poses many challenges to engineers. Traditionally, the analysis of such systems has been carried out by paper-and-pencil based proofs and numerical computations. However, these techniques cannot provide perfectly accurate results due to the risk of human error and inherent approximations of numerical algorithms. In order to overcome these limitations, we propose to use theorem proving (i.e., a computer-based technique that allows one to express mathematical expressions and reason about them by taking into account all the details of mathematical reasoning) as an alternative to computational and numerical approaches to improve optical system analysis in a comprehensive framework. In particular, this paper provides a higher-order logic (a language used to express mathematical theories) formalization of ray optics in the HOL Light theorem prover. Based on the multivariate analysis library of HOL Light, we formalize the notion of light ray and optical system (by defining medium interfaces, mirrors, lenses, etc.), i.e., we express these notions mathematically in the software. This allows us to derive general theorems about the behavior of light in such optical systems. In order to demonstrate the practical effectiveness, we present the stability analysis of a Fabry-Perot resonator.

  4. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  5. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  6. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  7. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Directory of Open Access Journals (Sweden)

    Byeong Hak Kim

    2017-12-01

    Full Text Available Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  8. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-01-01

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970

  9. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications.

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-12-27

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.
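
    The robust principal component analysis step named in the records above can be illustrated in isolation. The sketch below is a minimal principal component pursuit solver (alternating singular-value shrinkage and soft thresholding), run on synthetic data rather than registered IR frames, so it only shows the low-rank background / sparse noise split and is not the BRANF pipeline itself.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise shrinkage operator
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_shrink(X, tau):
    # Singular-value shrinkage (proximal operator of the nuclear norm)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca(D, lam=None, mu=None, n_iter=200, tol=1e-7):
    """Decompose D into a low-rank part L (background) and a sparse part S (noise)."""
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))
    if mu is None:
        mu = 0.25 * m * n / np.sum(np.abs(D))
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)          # scaled dual variable
    for _ in range(n_iter):
        L = svd_shrink(D - S + Y / mu, 1.0 / mu)
        S = soft_threshold(D - L + Y / mu, lam / mu)
        residual = D - L - S
        Y = Y + mu * residual
        if np.linalg.norm(residual) <= tol * np.linalg.norm(D):
            break
    return L, S

# Synthetic demo: rank-2 "background" plus sparse impulsive "noise"
rng = np.random.default_rng(0)
background = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 50))
noise = np.zeros((100, 50))
idx = rng.random((100, 50)) < 0.05
noise[idx] = 10.0 * rng.standard_normal(idx.sum())
L_est, S_est = rpca(background + noise)
print(np.linalg.norm(L_est - background) / np.linalg.norm(background))
```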

  10. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  11. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  12. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on the top of the rock; it is hence groundwater flow which is the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  13. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on the top of the rock; it is hence groundwater flow which is the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)
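
    The verification strategy described for DarcyTools, comparing numerical results with analytical solutions for idealized situations, can be illustrated with a small sketch (not DarcyTools itself): a finite-difference solution of steady one-dimensional groundwater flow between two fixed heads is compared with the linear analytical head profile, using a relative root-mean-square error as the metric. Boundary heads and grid size are made-up values.

```python
import numpy as np

def solve_steady_head(n, h_left, h_right):
    """Finite-difference solution of d2h/dx2 = 0 on n interior nodes of a unit-length domain."""
    A = np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    b = np.zeros(n)
    b[0] -= h_left      # Dirichlet boundary contributions
    b[-1] -= h_right
    return np.linalg.solve(A, b)

n = 49
x = np.linspace(0.0, 1.0, n + 2)[1:-1]        # interior node coordinates
h_num = solve_steady_head(n, h_left=10.0, h_right=4.0)
h_ana = 10.0 + (4.0 - 10.0) * x               # linear analytical head profile
rel_rms = np.sqrt(np.mean((h_num - h_ana) ** 2)) / np.sqrt(np.mean(h_ana ** 2))
print(f"relative RMS error: {rel_rms:.2e}")   # near machine precision for this idealized case
```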

  14. CaveMan Enterprise version 1.0 Software Validation and Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Hart, David

    2014-10-01

    The U.S. Department of Energy Strategic Petroleum Reserve stores crude oil in caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. The CaveMan software program has been used since the late 1990s as one tool to analyze pressure measurements monitored at each cavern. The purpose of this monitoring is to catch potential cavern integrity issues as soon as possible. The CaveMan software was written in Microsoft Visual Basic, and embedded in a Microsoft Excel workbook; this method of running the CaveMan software is no longer sustainable. As such, a new version called CaveMan Enterprise has been developed. CaveMan Enterprise version 1.0 does not have any changes to the CaveMan numerical models. CaveMan Enterprise represents, instead, a change from desktop-managed workbooks to an enterprise framework, moving data management into coordinated databases and porting the numerical modeling codes into the Python programming language. This document provides a report of the code validation and verification testing.

  15. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered as an improved IP algorithm by further incorporating the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
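
    A minimal sketch of the interval-passing idea is given below, simplified to one [lower, upper] interval per signal entry instead of per-edge messages, for a binary measurement matrix and a nonnegative signal. It is only an illustration of interval propagation on y = Ax with x >= 0, not the algorithm proposed in the paper, and the small matrix and signal are made-up examples.

```python
import numpy as np

def interval_propagation(A, y, n_iter=50):
    """Propagate per-entry [lower, upper] bounds for a nonnegative x with y = A @ x, A binary.

    Assumes every column of A has at least one nonzero entry.
    """
    m, n = A.shape
    lo = np.zeros(n)
    hi = np.empty(n)
    for j in range(n):
        hi[j] = y[A[:, j] > 0].min()          # x_j cannot exceed any measurement it joins
    for _ in range(n_iter):
        for c in range(m):
            idx = np.flatnonzero(A[c])
            for v in idx:
                others = idx[idx != v]
                # x_v = y_c - (sum of the other entries in this measurement)
                hi[v] = min(hi[v], y[c] - lo[others].sum())
                lo[v] = max(lo[v], y[c] - hi[others].sum(), 0.0)
    return lo, hi

# Tiny demo with a sparse binary measurement matrix (made-up example)
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1]])
x_true = np.array([2.0, 0.0, 0.0, 5.0])
lo, hi = interval_propagation(A, A @ x_true)
print(lo, hi)   # for this sparse example the intervals collapse onto x_true
```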

  16. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output files and the old output files. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error
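
    The file-comparison step described above is straightforward to script. The sketch below (not the MCNP-supplied tooling; directory names are hypothetical) compares each new output file with its reference copy line by line and reports the first differing lines, leaving the judgement of whether a difference is benign to the analyst.

```python
from pathlib import Path

def compare_outputs(new_dir, old_dir, pattern="*.out"):
    """Compare every output file against its reference copy and report differences."""
    failures = []
    for new_file in sorted(Path(new_dir).glob(pattern)):
        old_file = Path(old_dir) / new_file.name
        if not old_file.exists():
            failures.append((new_file.name, "missing reference file"))
            continue
        new_lines = new_file.read_text().splitlines()
        old_lines = old_file.read_text().splitlines()
        diffs = [i + 1 for i, (a, b) in enumerate(zip(new_lines, old_lines)) if a != b]
        if diffs or len(new_lines) != len(old_lines):
            failures.append((new_file.name, f"first differing lines: {diffs[:5]}"))
    return failures

# Hypothetical directory names; a reported difference still needs manual inspection
# (e.g. timestamps or platform banners) before it is treated as a real error.
for name, reason in compare_outputs("runs/new", "runs/reference"):
    print(f"VERIFICATION DIFFERENCE in {name}: {reason}")
```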

  17. Random distribution of background charge density for numerical simulation of discharge inception

    International Nuclear Information System (INIS)

    Grange, F.; Loiseau, J.F.; Spyrou, N.

    1998-01-01

    The models of electric streamers based on a uniform background density of electrons may appear not to be physical, as the number of electrons in the small active region located in the vicinity of the electrode tip under regular conditions can be less than one. To avoid this, the electron background is modelled by a random density distribution such that, after a certain time lag, at least one electron is present in the grid close to the point electrode. The modelling performed shows that the streamer inception is not very sensitive to the initial location of the charged particles; the ionizing front, however, may be delayed by several tens of nanoseconds, depending on the path the electron has to drift along before reaching the anode. (J.U.)

  18. Profiled Deck Composite Slab Strength Verification: A Review

    Directory of Open Access Journals (Sweden)

    K. Mohammed

    2017-12-01

    Full Text Available The purpose of this article is to present an overview of alternative approaches to profiled deck composite slab (PDCS) strength verification that avoid the expensive and complex laboratory procedures used to establish its longitudinal shear capacity. Despite several deterministic research findings that have led to proposals and modifications concerning the complex shear characteristics defining PDCS strength behaviour, laboratory performance testing remains the only accurate means of PDCS strength assessment. The issue is critical and warrants further thought from perspectives other than the deterministic approaches, which are rather expensive and time consuming. Hence, the development of a rational numerical test load function based on longitudinal shear capacity considerations is needed to augment previous unsuccessful attempts to determine PDCS strength without costly and time-consuming laboratory procedures.

  19. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  20. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  1. Numerical computations with GPUs

    CERN Document Server

    Kindratenko, Volodymyr

    2014-01-01

    This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including solution of linear equations and FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to

  2. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  3. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  4. Numerical Analysis of Aerodynamic Characteristics of the Finned Surfaces with Cross-inclined Fins

    Directory of Open Access Journals (Sweden)

    Lagutin A. E.

    2016-12-01

    Full Text Available This paper presents the results of numerical research and analysis of the air-side hydraulic performance of tube bundles with cross-inclined fins. The numerical simulation of the fin-tube heat exchanger was performed using the Comsol Femlab software. The results of modeling show the influence of fin inclination angle and tube pitch on the hydraulic characteristics of finned surfaces. A series of numerical tests were carried out for tube bundles with different inclination angles (γ = 90°, 85°, 65°, 60°) and a fin pitch u = 4 mm. The results indicate that tube bundles with cross-inclined fins can significantly enhance the average integral value of the air flow rate in the channel between fins in comparison with conventional straight fins. Aerodynamic processes on both sides of the modified channel between inclined fins were analyzed. The numerical modeling results were verified against experimental data.

  5. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  6. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of an IMRT (intensity modulated radiation therapy) plan, as used in the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (multilamellar intensity modulated collimator), and the overall process of verifying the created plan. The aim of verification is in particular good control of the functions of MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  7. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate the verification and validation of the software on board satellites. The process is implemented in the software control unit of the energy particle detector, a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, the verification evidence is extracted as instrumented points. The constraints are fed with this evidence; if any of the constraints is not satisfied by the on-target evidence, the scheduling analysis is not valid.
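
    As a rough illustration of feeding instrumented evidence into constraints, the sketch below checks deadline constraints of tasks against timestamped instrumentation points. The trace format, task names and deadlines are hypothetical, and the sketch does not reproduce the timed-automata machinery or the Model Driven Engineering tool chain of the paper.

```python
from dataclasses import dataclass

@dataclass
class Constraint:
    task: str
    deadline_us: float     # maximum allowed response time in microseconds

def check_deadlines(trace, constraints):
    """trace: list of (task, 'start'|'end', timestamp_us) instrumentation points."""
    start = {}
    worst = {c.task: 0.0 for c in constraints}
    for task, event, t in trace:
        if event == "start":
            start[task] = t
        elif event == "end" and task in start:
            worst[task] = max(worst[task], t - start.pop(task))
    return {c.task: (worst[c.task] <= c.deadline_us, worst[c.task]) for c in constraints}

# Hypothetical evidence extracted from instrumented points on target
trace = [("acq", "start", 0.0), ("acq", "end", 180.0),
         ("proc", "start", 200.0), ("proc", "end", 950.0),
         ("acq", "start", 1000.0), ("acq", "end", 1230.0)]
constraints = [Constraint("acq", 250.0), Constraint("proc", 700.0)]
for task, (ok, wcrt) in check_deadlines(trace, constraints).items():
    print(task, "satisfied" if ok else "VIOLATED", f"(observed worst case {wcrt} us)")
```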

  8. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    Science.gov (United States)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

    A semiconductor laser with a two-external-cavity feedback structure for the laser self-mixing interference (SMI) phenomenon is investigated and analyzed. An SMI model with two feedback directions based on the F-P cavity is deduced, and numerical simulation and experimental verification were conducted. Experimental results show that the SMI signal with the two-external-cavity feedback structure under weak optical feedback is similar to the sum of two single-cavity SMI signals.
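
    A hedged numerical sketch of the "sum of two SMIs" observation: for each external cavity the usual weak-feedback excess-phase equation is solved by fixed-point iteration and the two modulation terms are added. All parameter values (wavelength, cavity lengths, feedback levels, linewidth enhancement factor) are illustrative assumptions, not those of the paper.

```python
import numpy as np

def excess_phase(phi0, C, alpha, n_iter=100):
    """Solve phi_F = phi_0 - C*sin(phi_F + arctan(alpha)) by fixed-point iteration (valid for C < 1)."""
    phi = phi0.copy()
    for _ in range(n_iter):
        phi = phi0 - C * np.sin(phi + np.arctan(alpha))
    return phi

lam = 650e-9                      # laser wavelength (m), illustrative
t = np.linspace(0.0, 1.0, 5000)
L1 = 0.10 + 2e-6 * np.sin(2 * np.pi * 3 * t)    # first external cavity length (m)
L2 = 0.15 + 1e-6 * np.sin(2 * np.pi * 7 * t)    # second external cavity length (m)

phi1 = excess_phase(4 * np.pi * L1 / lam, C=0.3, alpha=4.0)
phi2 = excess_phase(4 * np.pi * L2 / lam, C=0.2, alpha=4.0)

# Weak feedback: the detected power modulation is approximated as the sum of the
# two single-cavity self-mixing terms.
m1, m2 = 0.05, 0.03
power = 1.0 + m1 * np.cos(phi1) + m2 * np.cos(phi2)
print(power[:5])
```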

  9. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  10. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is a task to check whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not make any assumption that the identically and independently distributed copies of the same states are given: Our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.

  11. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  12. An improved algorithm of laser spot center detection in strong noise background

    Science.gov (United States)

    Zhang, Le; Wang, Qianqian; Cui, Xutai; Zhao, Yu; Peng, Zhong

    2018-01-01

    Laser spot center detection is demanded in many applications. Common algorithms for laser spot center detection, such as the centroid and Hough transform methods, have poor anti-interference ability and low detection accuracy under strong background noise. In this paper, firstly, median filtering was used to remove noise while preserving the edge details of the image. Secondly, binarization of the laser facula image was carried out to extract the target image from the background. Then morphological filtering was performed to eliminate noise points inside and outside the spot. At last, the edge of the pretreated facula image was extracted and the laser spot center was obtained using the circle fitting method. On the foundation of the circle fitting algorithm, the improved algorithm adds median filtering, morphological filtering and other processing steps. Theoretical analysis and experimental verification show that this method can effectively filter background noise, which enhances the anti-interference ability of laser spot center detection and also improves the detection accuracy.
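
    The processing chain described above (median filtering, binarization, morphological filtering, edge extraction, circle fitting) maps naturally onto standard OpenCV calls. The sketch below is one possible arrangement, with kernel sizes and thresholds as illustrative choices and the minimum enclosing circle standing in for the least-squares circle fit of the paper.

```python
import cv2

def spot_center(image_path):
    """Estimate the laser spot center of a noisy gray-scale image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # 1. Median filtering: remove impulsive noise while keeping edge details
    filtered = cv2.medianBlur(img, 5)
    # 2. Binarization (Otsu) to separate the spot from the background
    _, binary = cv2.threshold(filtered, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 3. Morphological opening/closing to remove residual noise points
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    cleaned = cv2.morphologyEx(cleaned, cv2.MORPH_CLOSE, kernel)
    # 4. Edge extraction and a simple circle estimate on the largest contour
    contours, _ = cv2.findContours(cleaned, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (cx, cy), radius = cv2.minEnclosingCircle(largest)
    return cx, cy, radius

# Usage (hypothetical file name)
# print(spot_center("spot_with_noise.png"))
```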

  13. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and carried out verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking and dense-graded asphalt pavement when applied to road decontamination. When it was applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. It was thought that Cs-134 and Cs-137 attached to the fine sludge scraped off from a decontamination object, and the sludge was found to be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less of that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  14. Numerical analysis of the Anderson localization

    International Nuclear Information System (INIS)

    Markos, P.

    2006-01-01

    The aim of this paper is to demonstrate, by simple numerical simulations, the main transport properties of disordered electron systems. These systems undergo the metal insulator transition when either Fermi energy crosses the mobility edge or the strength of the disorder increases over critical value. We study how disorder affects the energy spectrum and spatial distribution of electronic eigenstates in the diffusive and insulating regime, as well as in the critical region of the metal-insulator transition. Then, we introduce the transfer matrix and conductance, and we discuss how the quantum character of the electron propagation influences the transport properties of disordered samples. In the weakly disordered systems, the weak localization and anti-localization as well as the universal conductance fluctuation are numerically simulated and discussed. The localization in the one dimensional system is described and interpreted as a purely quantum effect. Statistical properties of the conductance in the critical and localized regimes are demonstrated. Special attention is given to the numerical study of the transport properties of the critical regime and to the numerical verification of the single parameter scaling theory of localization. Numerical data for the critical exponent in the orthogonal models in dimension 2 < d ≤ 5 are compared with theoretical predictions. We argue that the discrepancy between the theory and numerical data is due to the absence of the self-averaging of transmission quantities. This complicates the analytical analysis of the disordered systems. Finally, theoretical methods of description of weakly disordered systems are explained and their possible generalization to the localized regime is discussed. Since we concentrate on the one-electron propagation at zero temperature, no effects of electron-electron interaction and incoherent scattering are discussed in the paper (Author)
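
    The transfer-matrix approach mentioned above is easiest to see in the one-dimensional Anderson model. The sketch below estimates the Lyapunov exponent (inverse localization length) of a disordered tight-binding chain from a long product of 2x2 transfer matrices; the disorder strengths, energy and chain length are illustrative choices.

```python
import numpy as np

def lyapunov_exponent(n_sites, disorder_w, energy=0.0, hopping=1.0, seed=0):
    """Lyapunov exponent of the 1-D Anderson model from a renormalized transfer-matrix product."""
    rng = np.random.default_rng(seed)
    v = np.array([1.0, 0.0])
    log_norm = 0.0
    for _ in range(n_sites):
        eps = rng.uniform(-disorder_w / 2, disorder_w / 2)   # random on-site energy
        T = np.array([[(energy - eps) / hopping, -1.0],
                      [1.0, 0.0]])
        v = T @ v
        norm = np.linalg.norm(v)
        log_norm += np.log(norm)          # accumulate growth, then renormalize to avoid overflow
        v /= norm
    return log_norm / n_sites             # Lyapunov exponent = 1 / localization length

for w in (0.5, 1.0, 2.0, 4.0):
    gamma = lyapunov_exponent(100000, w)
    print(f"W = {w}: localization length ~ {1.0 / gamma:.1f} sites")
```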

  15. Prediction of qualitative parameters of slab steel ingot using numerical modelling

    Directory of Open Access Journals (Sweden)

    M. Tkadlečková

    2016-07-01

    Full Text Available The paper describes the verification of casting and solidification of a heavy 40 t slab ingot made of tool steel by means of numerical modelling with the finite element method. The pre-processing, processing and post-processing phases of numerical modelling are outlined. Also, the problems with determination of the thermodynamic properties of materials and of the heat transfer between the individual parts of the casting system are discussed. The final porosity, macrosegregation and the risk of cracks were predicted. The results allowed us to use the slab ingot instead of the conventional heavy steel ingot and to improve the ratio, the chamfer and the external shape of the wall of the new slab ingot design.

  16. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  17. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  18. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level, at which the visual verification was performed and documented

  19. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  20. Experimental verification of active IR stealth technology by controlling the surface temperature using a thermoelectric element

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Geon; Han, Kuk Il; Choi, Jun Hyuk; Kim, Tae Kuk [Dept. of Mechanical Engineering, Chung Ang University, Seoul (Korea, Republic of)

    2016-10-15

    In this paper, we propose a technique for IR low-observability that uses active IR signal tuning through real-time control of the object surface temperature according to the varying background environment. This is achieved by applying the proper object surface temperature, obtained so as to result in the minimum radiance difference between the object and the background. Experimental verification using the thermoelectric temperature control element shows that the IR radiance contrast between the object and the background can be reduced by up to 99% during the night and up to 95% during the day time as compared to the un-tuned original radiance contrast values. The stealth technology demonstrated in this paper may be applied to many military systems that need IR stealth performance once a suitable temperature control unit is developed.

  1. Experimental verification of active IR stealth technology by controlling the surface temperature using a thermoelectric element

    International Nuclear Information System (INIS)

    Kim, Dong Geon; Han, Kuk Il; Choi, Jun Hyuk; Kim, Tae Kuk

    2016-01-01

    In this paper, we propose a technique for IR low-observability that uses active IR signal tuning through real-time control of the object surface temperature according to the varying background environment. This is achieved by applying the proper object surface temperature, obtained so as to result in the minimum radiance difference between the object and the background. Experimental verification using the thermoelectric temperature control element shows that the IR radiance contrast between the object and the background can be reduced by up to 99% during the night and up to 95% during the day time as compared to the un-tuned original radiance contrast values. The stealth technology demonstrated in this paper may be applied to many military systems that need IR stealth performance once a suitable temperature control unit is developed
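
    The control objective described in the two records above, driving the surface to the temperature that minimizes the band radiance contrast with the background, can be sketched with Planck's law integrated over an LWIR band. Emissivities, band limits and temperatures below are illustrative assumptions, not measured values from the paper.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def band_radiance(T, emissivity, lam_lo=8e-6, lam_hi=12e-6, n=500):
    """Gray-body spectral radiance integrated over a wavelength band (W m^-2 sr^-1)."""
    lam = np.linspace(lam_lo, lam_hi, n)
    spectral = 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1.0)
    return emissivity * np.trapz(spectral, lam)

def matching_temperature(T_background, eps_obj=0.92, eps_bg=0.95):
    """Surface temperature that minimizes the object/background band radiance contrast."""
    target = band_radiance(T_background, eps_bg)
    candidates = np.linspace(230.0, 330.0, 2001)
    contrast = [abs(band_radiance(T, eps_obj) - target) for T in candidates]
    return candidates[int(np.argmin(contrast))]

# Example: night-time background at 283 K; the object surface is driven slightly
# warmer because its assumed emissivity is lower than the background's.
print(matching_temperature(283.0))
```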

  2. Numerical Investigations of Moisture Distribution in a Selected Anisotropic Soil Medium

    Science.gov (United States)

    Iwanek, M.

    2018-01-01

    The moisture of a soil profile changes both in time and space and depends on many factors. Changes in the quantity of water in soil can be determined on the basis of in situ measurements, but numerical methods are increasingly used for this purpose. The quality of the results obtained using pertinent software packages depends on appropriate description and parameterization of the soil medium. Thus, the issue of accounting for the soil anisotropy phenomenon gains importance. Although anisotropy can be taken into account in many numerical models, isotropic soil is often assumed in the research process. However, this assumption can be a source of incorrect results in simulations of water changes in a soil medium. In this article, results of numerical simulations of moisture distribution in the selected soil profile are presented. The calculations were conducted assuming isotropic and anisotropic conditions. Empirical verification of the results obtained in the numerical investigations indicated statistically significant discrepancies between the two analyzed conditions. However, a better fit between measured and calculated moisture values was obtained when anisotropy was included in the simulation model.

  3. Testability of numerical systems

    International Nuclear Information System (INIS)

    Soulas, B.

    1992-01-01

    In order to face up to the growing complexity of systems, the authors undertook to define a new approach for the qualification of systems. This approach is based on the concept of Testability which, supported by system modelization, validation and verification methods and tools, would allow an Integrated Qualification process applied throughout the life-span of systems. The general principles of this approach are introduced in the general case of numerical systems; in particular, this presentation points out the difference between the specification activity and the modelization and validation activity. The approach is illustrated firstly by the study of a global system and then by the case of a communication protocol from the software point of view. Finally, MODEL, the tool which supports this approach, is described. MODEL is a commercial tool providing modelization and validation techniques based on Petri Nets with a triple extension: Predicate/Transition, Timed and Stochastic Petri Nets

  4. Multicentre validation of IMRT pre-treatment verification: Comparison of in-house and external audit

    International Nuclear Information System (INIS)

    Jornet, Núria; Carrasco, Pablo; Beltrán, Mercè; Calvo, Juan Francisco; Escudé, Lluís; Hernández, Victor; Quera, Jaume; Sáez, Jordi

    2014-01-01

    Background and purpose: We performed a multicentre intercomparison of IMRT optimisation and dose planning and IMRT pre-treatment verification methods and results. The aims were to check consistency between dose plans and to validate whether in-house pre-treatment verification results agreed with those of an external audit. Materials and methods: Participating centres used two mock cases (prostate and head and neck) for the intercomparison and audit. Compliance with dosimetric goals and the total number of MU per plan were collected. A simple quality index to compare the different plans was proposed. We compared gamma index pass rates using the centre’s equipment and methodology to those of an external audit. Results: While for the prostate case all centres fulfilled the dosimetric goals and plan quality was homogeneous, that was not the case for the head and neck case. The number of MU did not correlate with the plan quality index. Pre-treatment verification results of the external audit did not agree with those of the in-house measurements for two centres, being within tolerance for the in-house measurements and unacceptable for the audit, or the other way round. Conclusions: Although all plans fulfilled dosimetric constraints, plan quality is highly dependent on the planner's expertise. External audits are an excellent tool to detect errors in IMRT implementation and cannot be replaced by intercomparison using results obtained by centres
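
    The gamma index pass rates compared above combine a dose-difference and a distance-to-agreement criterion. The sketch below is a simple one-dimensional global gamma evaluation (3%/3 mm) on made-up dose profiles; clinical gamma analysis is performed in 2-D or 3-D with interpolation, which this sketch omits.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, spacing_mm, dose_crit=0.03, dta_mm=3.0):
    """Global 1-D gamma index of an evaluated dose profile against a reference profile."""
    x = np.arange(len(dose_ref)) * spacing_mm
    norm = dose_crit * dose_ref.max()          # global dose-difference criterion
    gamma = np.empty(len(dose_ref))
    for i, (xi, d_ref) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((dose_eval - d_ref) / norm) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))   # minimum over all evaluated points
    return gamma

# Made-up profiles: the evaluated profile is slightly shifted and scaled
x = np.linspace(-50, 50, 201)                  # 0.5 mm spacing
ref = np.exp(-(x / 20.0) ** 2)
eva = 1.02 * np.exp(-((x - 1.0) / 20.0) ** 2)
g = gamma_1d(ref, eva, spacing_mm=0.5)
print(f"gamma pass rate (3%/3 mm): {100.0 * np.mean(g <= 1.0):.1f}%")
```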

  5. Industrial hardware and software verification with ACL2.

    Science.gov (United States)

    Hunt, Warren A; Kaufmann, Matt; Moore, J Strother; Slobodova, Anna

    2017-10-13

    The ACL2 theorem prover has seen sustained industrial use since the mid-1990s. Companies that have used ACL2 regularly include AMD, Centaur Technology, IBM, Intel, Kestrel Institute, Motorola/Freescale, Oracle and Rockwell Collins. This paper introduces ACL2 and focuses on how and why ACL2 is used in industry. ACL2 is well-suited to its industrial application to numerous software and hardware systems, because it is an integrated programming/proof environment supporting a subset of the ANSI standard Common Lisp programming language. As a programming language ACL2 permits the coding of efficient and robust programs; as a prover ACL2 can be fully automatic but provides many features permitting domain-specific human-supplied guidance at various levels of abstraction. ACL2 specifications and models often serve as efficient execution engines for the modelled artefacts while permitting formal analysis and proof of properties. Crucially, ACL2 also provides support for the development and verification of other formal analysis tools. However, ACL2 did not find its way into industrial use merely because of its technical features. The core ACL2 user/development community has a shared vision of making mechanized verification routine when appropriate and has been committed to this vision for the quarter century since the Computational Logic, Inc., Verified Stack. The community has focused on demonstrating the viability of the tool by taking on industrial projects (often at the expense of not being able to publish much).This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).

  6. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focusses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete ''design verification'' strategy comprises: informing the Agency of any changes in the plant system which are defined as ''safeguards relevant''; and reverification by the Agency, upon receiving notice from the Operator of any changes, of the ''design information''. 13 refs

  7. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  8. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen' s University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  9. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  10. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  11. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  12. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  13. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
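
    A toy illustration of the rule-based, forward-chaining monitoring style (not the FLTL monitor construction of the paper): Horn-style rules in implication form fire as the events of an expanding trace arrive, and a verdict is reported once a rule concludes it. The rules and event names are made up.

```python
# Each rule is (set_of_premises, conclusion): when all premises are in the fact
# base, the conclusion is added. Facts arrive incrementally from the trace.
RULES = [
    ({"request"}, "pending"),
    ({"pending", "grant"}, "satisfied"),
    ({"pending", "timeout"}, "violated"),
]

def forward_chain(facts, rules):
    """Saturate the fact base by repeatedly firing applicable rules."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def monitor(trace):
    facts = set()
    for event in trace:            # the trace is finite but expanding
        facts.add(event)
        forward_chain(facts, RULES)
        if "violated" in facts:
            return "violated"
        if "satisfied" in facts:
            return "satisfied"
    return "inconclusive"

print(monitor(["idle", "request", "grant"]))      # satisfied
print(monitor(["request", "timeout"]))            # violated
```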

  14. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization mainly in the algorithms used and the criteria and verification limits applied. A survey in clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of them with autoverification ability. Criteria and rules for seven routine biochemical tests were obtained.

  15. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies (high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy) are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.

  16. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available Design process of computing systems gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure that encompasses definition of the specification language and denotation and execution of conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  17. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
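
    A short sketch of the quantitative comparison described above: a relative root-mean-square (RMS) measure between the code output and an analytical solution. The example head profile and the acceptance threshold are illustrative assumptions, not the FLASH acceptance criteria.

    ```python
    # Relative RMS comparison of a numerical solution against an analytical one.
    import numpy as np

    def relative_rms(numerical, analytical):
        numerical, analytical = np.asarray(numerical), np.asarray(analytical)
        return np.sqrt(np.mean((numerical - analytical) ** 2)) / np.sqrt(np.mean(analytical ** 2))

    x = np.linspace(0.0, 1.0, 101)
    analytical = 1.0 - x                                # e.g. a steady 1-D profile
    numerical = analytical + 1e-3 * np.sin(np.pi * x)   # stand-in for code output

    err = relative_rms(numerical, analytical)
    print(f"relative RMS = {err:.2e}", "-> acceptable" if err < 1e-2 else "-> investigate")
    ```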

  18. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations, as used in nuclear power plant quality assurance programs in other countries, was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  19. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  20. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  1. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112 as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP

  2. Numerical verification of B-WIM system using reaction force signals

    International Nuclear Information System (INIS)

    Chang, Sung Jin; Kim, Nam Sik

    2012-01-01

    Bridges are fundamental road facilities, part of the social overhead capital, and they are designed to remain safe throughout their life cycles. However, as time passes, a bridge can be damaged by changes in external forces and traffic environments. Therefore, a bridge should be repaired and maintained to extend its life cycle. The working load on a bridge is one of the most important factors for safety, so it should be calculated accurately. The most important working load is the live load imposed by vehicles. Thus, vehicle travel characteristics and weights are useful for bridge maintenance if they can be estimated with high reliability. In this study, a bridge weigh-in-motion (B-WIM) system, in which the bridge itself is used as a scale, has been developed to measure vehicle loads without stopping the vehicle. The developed B-WIM system estimates vehicle loads from the reaction responses at the supporting points. The algorithm of the developed B-WIM system has been verified by numerical analysis.
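
    The static core of such a B-WIM estimate can be sketched as a least-squares fit of axle loads to the measured support reaction, with each axle contributing the reaction influence line evaluated at its current position (a Moses-type formulation). The influence line, axle spacing, speed, and noise level below are illustrative assumptions rather than the system described in the paper.

    ```python
    # Least-squares axle-load estimation from a reaction response (sketch).
    import numpy as np

    L = 30.0                               # span length [m]
    def influence_reaction(x):
        """Influence line of the left support reaction of a simple span."""
        return np.where((x >= 0) & (x <= L), 1.0 - x / L, 0.0)

    speed, spacing = 20.0, 4.0             # vehicle speed [m/s], axle spacing [m]
    t = np.linspace(0.0, 2.0, 400)
    axle_positions = [speed * t, speed * t - spacing]   # two axles crossing the span

    # Design matrix: one column of influence ordinates per axle.
    A = np.column_stack([influence_reaction(x) for x in axle_positions])

    true_loads = np.array([80.0, 120.0])   # kN, used only to synthesise a "measurement"
    measured = A @ true_loads + np.random.normal(0.0, 1.0, size=t.size)

    estimated, *_ = np.linalg.lstsq(A, measured, rcond=None)
    print("estimated axle loads [kN]:", np.round(estimated, 1))
    ```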

  3. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  4. Numerical semigroups and applications

    CERN Document Server

    Assi, Abdallah

    2016-01-01

    This work presents applications of numerical semigroups in Algebraic Geometry, Number Theory, and Coding Theory. Background on numerical semigroups is presented in the first two chapters, which introduce basic notation and fundamental concepts and irreducible numerical semigroups. The focus is in particular on free semigroups, which are irreducible; semigroups associated with planar curves are of this kind. The authors also introduce semigroups associated with irreducible meromorphic series, and show how these are used in order to present the properties of planar curves. Invariants of non-unique factorizations for numerical semigroups are also studied. These invariants are computationally accessible in this setting, and thus this monograph can be used as an introduction to Factorization Theory. Since factorizations and divisibility are strongly connected, the authors show some applications to AG Codes in the final section. The book will be of value for undergraduate students (especially those at a higher leve...
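
    As a small taste of the background material, the computation below lists the gaps, genus, and Frobenius number of the numerical semigroup generated by a given set of integers. The generators are an arbitrary example, and the search bound is a crude choice that is ample for small cases.

    ```python
    # Gaps, genus and Frobenius number of a numerical semigroup (sketch).

    def numerical_semigroup(generators, bound=None):
        """Membership table for <generators> up to a bound large enough to see all gaps."""
        if bound is None:
            bound = max(generators) ** 2          # crude bound, ample for small examples
        reachable = [False] * (bound + 1)
        reachable[0] = True
        for n in range(1, bound + 1):
            reachable[n] = any(n >= g and reachable[n - g] for g in generators)
        return reachable

    reachable = numerical_semigroup([5, 7])       # gcd(5, 7) = 1, so finitely many gaps
    gaps = [n for n, ok in enumerate(reachable) if not ok]
    print("gaps:", gaps)
    print("genus:", len(gaps), " Frobenius number:", max(gaps))
    ```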

  5. Arithmetic and algebraic problem solving and resource allocation: the distinct impact of fluid and numerical intelligence.

    Science.gov (United States)

    Dix, Annika; van der Meer, Elke

    2015-04-01

    This study investigates cognitive resource allocation dependent on fluid and numerical intelligence in arithmetic/algebraic tasks varying in difficulty. Sixty-six 11th grade students participated in a mathematical verification paradigm, while pupil dilation as a measure of resource allocation was collected. Students with high fluid intelligence solved the tasks faster and more accurately than those with average fluid intelligence, as did students with high compared to average numerical intelligence. However, fluid intelligence sped up response times only in students with average but not high numerical intelligence. Further, high fluid but not numerical intelligence led to greater task-related pupil dilation. We assume that fluid intelligence serves as a domain-general resource that helps to tackle problems for which domain-specific knowledge (numerical intelligence) is missing. The allocation of this resource can be measured by pupil dilation. Copyright © 2014 Society for Psychophysiological Research.

  6. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  7. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...

  8. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret; Yu, Yi-Hsiang

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  9. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  10. Numerical Analysis on the Free Fall Motion of the Control Rod Assembly for the Sodium Cooled Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Se-Hong; Choi, Choengryul; Son, Sung-Man [ELSOLTEC, Yongin (Korea, Republic of); Kim, Jae-Yong; Yoon, Kyung-Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    On receiving the scram signal, the control rod assemblies are released to fall into the reactor core under their own weight. Thus, the drop time and falling velocity of the control rod assembly must be estimated for the safety evaluation. However, because of its complex shape, it is difficult to estimate the drop time by theoretical methods. In this study, numerical analysis has been carried out to estimate the drop time and falling velocity of the control rod assembly of a sodium-cooled fast reactor and to provide the underlying data for design optimization. Before performing the numerical analysis for the control rod assembly, a sphere-drop experiment was carried out to verify the CFD methodology; the numerical result for this verification case is nearly identical to the experimental result. The falling velocity and drag force increase rapidly at the beginning and then approach a steady state. When the piston head of the control rod assembly is inserted into the damper, the drag force increases instantaneously and the falling velocity decreases quickly; the damper reduces the falling velocity by about 14%. The total drop time of the control rod assembly is about 1.47 s. In the next study, an experiment on the control rod assembly will be carried out and its results will be compared with the CFD analysis.
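
    A minimal sketch of the kind of drop-time estimate discussed above: the equation of motion m dv/dt = mg - F_drag(v) is integrated explicitly until the assembly has fallen through the prescribed height. The mass, drag coefficient, and drop height are illustrative assumptions, not the sodium-cooled fast reactor design data.

    ```python
    # Drop-time estimate with quadratic hydraulic drag (illustrative sketch).

    m, g, height = 50.0, 9.81, 1.0          # mass [kg], gravity [m/s^2], drop height [m]
    c_drag = 40.0                           # lumped drag coefficient [N s^2/m^2], assumed

    dt, t, v, s = 1e-4, 0.0, 0.0, 0.0
    while s < height:
        a = g - c_drag * v * abs(v) / m     # quadratic drag opposing motion
        v += a * dt
        s += v * dt
        t += dt

    print(f"drop time ~ {t:.2f} s, impact velocity ~ {v:.2f} m/s")
    ```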

  11. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  12. Verification of EPA's "Preliminary Remediation Goals for radionuclides" (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, Tim [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Stagich, Brooke [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-08-28

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their updated “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides PRGs for radionuclides that are used as a screening tool at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and Resource Conservation and Recovery Act (RCRA) sites. These risk-based PRGs establish concentration limits under specific exposure scenarios. The purpose of this verification study is to determine that the calculator has no inherent numerical problems with obtaining solutions as well as to ensure that the equations are programmed correctly. There are 167 equations used in the calculator. To verify the calculator, all equations for each of seven receptor types (resident, construction worker, outdoor and indoor worker, recreator, farmer, and composite worker) were hand calculated using the default parameters. The same four radionuclides (Am-241, Co-60, H-3, and Pu-238) were used for each calculation for consistency throughout.
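
    The verification approach described above can be illustrated generically: an equation is re-implemented independently with the default parameters and compared against the calculator's output within a tolerance. The equation form, parameter values, and tolerance below are hypothetical placeholders, not the actual EPA PRG equations.

    ```python
    # Generic independent-recomputation check (hypothetical equation and values).

    def prg_hand_calc(target_risk, slope_factor, intake_rate, exposure_freq,
                      exposure_duration, averaging_time):
        """Hand-calculated screening level for a notional ingestion pathway."""
        intake = intake_rate * exposure_freq * exposure_duration / averaging_time
        return target_risk / (slope_factor * intake)

    hand_value = prg_hand_calc(1e-6, 3.0e-11, 100.0, 350.0, 26.0, 26.0 * 365.0)
    calculator_value = hand_value * (1.0 + 2.0e-7)   # stand-in for the tool's output

    rel_diff = abs(hand_value - calculator_value) / hand_value
    print(f"relative difference {rel_diff:.1e}",
          "-> verified" if rel_diff < 1e-4 else "-> flag for review")
    ```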

  13. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  14. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). The plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision I. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files.

  15. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  16. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  17. In situ object counting system (ISOCS™) technique: A cost-effective tool for NDA verification in IAEA Safeguards

    International Nuclear Information System (INIS)

    Nizhnik, V.; Belian, A.; Shephard, A.; Lebrun, A.

    2011-01-01

    Nuclear material measurements using the ISOCS technique are playing an increasing role in IAEA verification activities. The ISOCS capabilities include: a high sensitivity to the presence of U and Pu; the ability to detect very small amounts of material; and the ability to measure items of different shapes and sizes. In addition, the numerical absolute efficiency calibration of a germanium detector used in the technique does not require any calibration standards or reference materials. The ISOCS modelling software performs an absolute efficiency calibration for items with various container shapes, container wall materials, material compositions, material fill-heights, U/Pu weight fractions and even heterogeneously distributed emitting materials. In a number of cases, some key parameters, such as the matrix density and U/Pu weight fraction, can be determined in addition to the emitting material mass and isotopic composition. These capabilities provide a verification solution suitable for a majority of cases where quantitative and isotopic analysis should be performed. Taking into account these advantages, the technique becomes a cost-effective solution for nuclear material non-destructive assay (NDA) verification. At present, the IAEA uses the ISOCS for a wide range of applications including the quantitative analysis of U scrap materials, U/Pu contaminated solid wastes, U fuel elements, U hold-up materials. Additionally, the ISOCS is also applied to some specific verification cases such as the measurement of PuBe neutron sources and the quantification of fission products in solid wastes. In reprocessing facilities with U/Pu waste compaction or facilities with item re-batching, the continuity-of-knowledge can be assured by applying either video surveillance systems together with seals (requiring attaching/detaching and verification activities for each seal) or verification of operator declarations using quantitative measurements for items selected on a random basis

  18. Solution verification, goal-oriented adaptive methods for stochastic advection–diffusion problems

    KAUST Repository

    Almeida, Regina C.

    2010-08-01

    A goal-oriented analysis of linear, stochastic advection-diffusion models is presented which provides both a method for solution verification as well as a basis for improving results through adaptation of both the mesh and the way random variables are approximated. A class of model problems with random coefficients and source terms is cast in a variational setting. Specific quantities of interest are specified which are also random variables. A stochastic adjoint problem associated with the quantities of interest is formulated and a posteriori error estimates are derived. These are used to guide an adaptive algorithm which adjusts the sparse probabilistic grid so as to control the approximation error. Numerical examples are given to demonstrate the methodology for a specific model problem. © 2010 Elsevier B.V.

  19. Solution verification, goal-oriented adaptive methods for stochastic advection–diffusion problems

    KAUST Repository

    Almeida, Regina C.; Oden, J. Tinsley

    2010-01-01

    A goal-oriented analysis of linear, stochastic advection-diffusion models is presented which provides both a method for solution verification as well as a basis for improving results through adaptation of both the mesh and the way random variables are approximated. A class of model problems with random coefficients and source terms is cast in a variational setting. Specific quantities of interest are specified which are also random variables. A stochastic adjoint problem associated with the quantities of interest is formulated and a posteriori error estimates are derived. These are used to guide an adaptive algorithm which adjusts the sparse probabilistic grid so as to control the approximation error. Numerical examples are given to demonstrate the methodology for a specific model problem. © 2010 Elsevier B.V.

  20. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project "Verification of the thermal design of electronic equipment" studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the "Cool Electronics" research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  1. Effectiveness of short-term numerical weather prediction in predicting growing degree days and meteorological conditions for apple scab appearance

    Czech Academy of Sciences Publication Activity Database

    Lalic, B.; Francia, M.; Eitzinger, Josef; Podrascanin, Z.; Arsenic, I.

    2016-01-01

    Vol. 23, No. 1 (2016), pp. 50-56 ISSN 1350-4827 Institutional support: RVO:86652079 Keywords: venturia-inaequalis * temperature * equation * schemes * model * numerical weather prediction * disease prediction * verification * apple scab * growing degree days Subject RIV: DG - Atmosphere Sciences, Meteorology OBOR OECD: Meteorology and atmospheric sciences Impact factor: 1.411, year: 2016
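
    For context, the growing-degree-day quantity being forecast and verified here is typically accumulated as GDD = max(0, (Tmax + Tmin)/2 - Tbase) over consecutive days; the sketch below compares a forecast accumulation against an observed one. The base temperature and daily temperatures are illustrative assumptions.

    ```python
    # Growing-degree-day accumulation and a forecast-versus-observation comparison.

    def accumulated_gdd(daily_tmax, daily_tmin, t_base=0.0):
        return sum(max(0.0, (tmax + tmin) / 2.0 - t_base)
                   for tmax, tmin in zip(daily_tmax, daily_tmin))

    forecast_gdd = accumulated_gdd([12.0, 15.0, 18.0], [3.0, 5.0, 8.0])
    observed_gdd = accumulated_gdd([11.0, 16.0, 17.0], [4.0, 6.0, 7.0])
    print(f"forecast {forecast_gdd:.1f} vs observed {observed_gdd:.1f} degree-days")
    ```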

  2. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, security, adaptability to small changes in the environment, portability, and compatibility. The methods differ both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. The review of static analysis discusses the deductive method and model checking, emphasizing the pros and cons of each method and considering a classification of the associated testing techniques. The paper also presents and analyzes the characteristics and mechanisms of static dependency analysis, including forms that can reduce the number of false positives in situations where the current program state combines two or more states obtained on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), heap area sizes, string lengths, and the number of initialized array elements in the code under static verification. The article also addresses the identification of dependencies within the framework of abstract interpretation and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, together with the kinds of tools applicable when using dynamic analysis. The conclusion summarizes the most relevant problems of these analysis techniques and methods for their solution.

  3. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  4. Numerical linear algebra theory and applications

    CERN Document Server

    Beilina, Larisa; Karchevskii, Mikhail

    2017-01-01

    This book combines a solid theoretical background in linear algebra with practical algorithms for numerical solution of linear algebra problems. Developed from a number of courses taught repeatedly by the authors, the material covers topics like matrix algebra, theory for linear systems of equations, spectral theory, vector and matrix norms combined with main direct and iterative numerical methods, least squares problems, and eigen problems. Numerical algorithms illustrated by computer programs written in MATLAB® are also provided as supplementary material on SpringerLink to give the reader a better understanding of professional numerical software for the solution of real-life problems. Perfect for a one- or two-semester course on numerical linear algebra, matrix computation, and large sparse matrices, this text will interest students at the advanced undergraduate or graduate level.

  5. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  6. Application of verification and validation on safety parameter display systems

    International Nuclear Information System (INIS)

    Thomas, N.C.

    1983-01-01

    Offers some explanation of how verification and validation (V&V) can support development and licensing of the Safety Parameter Display Systems (SPDS). Advocates that V&V can be more readily accepted within the nuclear industry if a better understanding exists of what the objectives of V&V are and should be. Includes a discussion regarding a reasonable balance of costs and benefits of V&V as applied to the SPDS and to other digital systems. Represents the author's perception of the regulator's perspective based on background information and experience, and discussions with regulators about their current concerns and objectives. Suggests that the introduction of the SPDS into the Control Room is a first step towards growing dependency on use of computers.

  7. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new molecular antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; and to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.

  8. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later. Land Information System Verification Toolkit (LVT) NOSA.

  9. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  10. Average-case analysis of numerical problems

    CERN Document Server

    2000-01-01

    The average-case analysis of numerical problems is the counterpart of the more traditional worst-case approach. The analysis of average error and cost leads to new insight on numerical problems as well as to new algorithms. The book provides a survey of results that were mainly obtained during the last 10 years and also contains new results. The problems under consideration include approximation/optimal recovery and numerical integration of univariate and multivariate functions as well as zero-finding and global optimization. Background material, e.g. on reproducing kernel Hilbert spaces and random fields, is provided.

  11. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Science.gov (United States)

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  12. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  13. A Readout Integrated Circuit (ROIC) employing self-adaptive background current compensation technique for Infrared Focal Plane Array (IRFPA)

    Science.gov (United States)

    Zhou, Tong; Zhao, Jian; He, Yong; Jiang, Bo; Su, Yan

    2018-05-01

    A novel self-adaptive background current compensation circuit for infrared focal plane arrays is proposed in this paper, which can compensate the background current generated under different conditions. The designed double-threshold detection strategy estimates and eliminates the background currents, which significantly reduces the hardware overhead and improves the uniformity among pixels. In addition, the circuit is compatible with various categories of infrared thermo-sensitive materials. The test results of a 4 × 4 experimental chip showed that the proposed circuit achieves high precision, wide applicability and high intelligence. Tape-out of the 320 × 240 readout circuit, as well as the bonding, encapsulation and imaging verification of the uncooled infrared focal plane array, has also been completed.

  14. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimentation results show that the proposed technique can effectively improve the verification effort up to 20% for the complex vision chip design while reducing the simulation and debugging overheads.

  15. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOVs) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient design verification method for SOVs, aimed at design engineers who rely on experience and experiment during the design and development of SOVs. First, we summarize a detailed procedure for designing SOVs for industrial applications: all of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Secondly, we suggest a method for verifying this design using theoretical relationships, which enables optimal SOV design from the point of view of the safety factor on the design attraction force. Lastly, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new solenoid models. We believe that this verification process is novel and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.

  16. Visual search in the real world: Color vision deficiency affects peripheral guidance, but leaves foveal verification largely unaffected

    Directory of Open Access Journals (Sweden)

    Günter eKugler

    2015-12-01

    Full Text Available Background: People with color vision deficiencies report numerous limitations in daily life. However, they use basic color terms systematically and in a similar manner to people with normal color vision. We hypothesize that a possible explanation for this discrepancy between color perception and behavioral consequences might be found in the gaze behavior of people with color vision deficiency. Methods: A group of participants with color vision deficiencies and a control group performed several search tasks in a naturalistic setting on a lawn. Results: Search performance was similar in both groups in a color-unrelated search task as well as in a search for yellow targets. While searching for red targets, color vision deficient participants exhibited strongly degraded performance. This was closely matched by the number of fixations on red objects shown by the two groups. Importantly, once they fixated a target, participants with color vision deficiencies exhibited only a few identification errors. Conclusions: Participants with color vision deficiencies are not able to enhance their search for red targets on a (green) lawn by an efficient guiding mechanism. The data indicate that impaired guiding is the main influence on search performance, while foveal identification (verification) is largely unaffected.

  17. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  18. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    ...of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  19. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure including the order-of-accuracy test on a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type, we record the stage of the verification procedure where the error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine the reason why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes, but rarely (1.44% of the time), and under certain circumstances, it can fail. (authors)
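
    The order-of-accuracy test at the heart of this procedure can be sketched in a few lines: with a manufactured solution the exact discretization error is available on each mesh, and the observed order p = ln(e_coarse/e_fine)/ln(h_coarse/h_fine) is compared against the formal order of the scheme. The mesh sizes and errors below are illustrative, not AHOTN results.

    ```python
    # Observed order of convergence from errors on successively refined meshes.
    import math

    h = [0.2, 0.1, 0.05]                   # successive mesh sizes
    e = [4.1e-3, 1.0e-3, 2.6e-4]           # corresponding discretization errors

    for (h1, e1), (h2, e2) in zip(zip(h, e), zip(h[1:], e[1:])):
        p = math.log(e1 / e2) / math.log(h1 / h2)
        print(f"h: {h1:>5} -> {h2:>5}   observed order p = {p:.2f}")

    # A coding mistake typically degrades p below the formal order of the scheme,
    # which is how the procedure flags the mutated code.
    ```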

  20. Numerical solution of matrix exponential in burn-up equation using mini-max polynomial approximation

    International Nuclear Information System (INIS)

    Kawamoto, Yosuke; Chiba, Go; Tsuji, Masashi; Narabayashi, Tadashi

    2015-01-01

    Highlights: • We propose a new numerical solution of the matrix exponential in burn-up depletion calculations. • Depletion calculations with extremely short half-lived nuclides can be performed in a numerically stable way with this method. • The computational time is shorter than with other conventional methods. - Abstract: Nuclear fuel burn-up depletion calculations are essential to compute the transition of the nuclear fuel composition. In burn-up calculations, the matrix exponential method has been widely used. In the present paper, we propose a new numerical solution of the matrix exponential, the Mini-Max Polynomial Approximation (MMPA) method. This method is numerically stable for burn-up matrices with extremely short half-lived nuclides, as is the Chebyshev Rational Approximation Method (CRAM), and it has several advantages over CRAM. We also propose a multi-step calculation, a computational time reduction scheme for the MMPA method, which can simultaneously perform burn-up calculations over several time periods. The applicability of these methods has been theoretically and numerically proved for general burn-up matrices. Numerical verification has been performed, and it has been shown that these methods have high precision, equivalent to CRAM
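
    As an illustration of the depletion step being solved, the sketch below advances dN/dt = AN over one time step with a dense matrix exponential, N(t) = exp(At) N(0). SciPy's expm stands in here for the MMPA and CRAM approximations discussed in the paper, and the two-nuclide chain and constants are illustrative assumptions.

    ```python
    # One depletion step via the matrix exponential for a tiny decay chain.
    import numpy as np
    from scipy.linalg import expm

    lam1, lam2 = 1.0e-2, 1.0e-4            # decay constants [1/s], illustrative
    A = np.array([[-lam1,  0.0],
                  [ lam1, -lam2]])         # parent -> daughter chain
    N0 = np.array([1.0e20, 0.0])           # initial number densities

    dt = 3600.0                            # one-hour step
    N = expm(A * dt) @ N0
    print("number densities after one step:", N)
    ```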

  1. Uncertain Portfolio Selection with Background Risk and Liquidity Constraint

    Directory of Open Access Journals (Sweden)

    Jia Zhai

    2017-01-01

    Full Text Available This paper discusses an uncertain portfolio selection problem with consideration of background risk and asset liquidity. In addition, the transaction costs are also considered. The security returns, background asset return, and asset liquidity are estimated by experienced experts instead of historical data. Regarding them as uncertain variables, a mean-risk model with background risk, liquidity, and transaction costs is proposed for portfolio selection and the crisp forms of the model are provided when security returns obey different uncertainty distributions. Moreover, for better understanding of the impact of background risk and liquidity on portfolio selection, some important theorems are proved. Finally, numerical experiments are presented to illustrate the modeling idea.

  2. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  3. Lessons learned in the verification, validation and application of a coupled heat and fluid flow code

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1986-01-01

    A summary is given of the author's recent studies in the verification, validation and application of a coupled heat and fluid flow code. Verification has been done against eight analytic and semi-analytic solutions. These solutions include those involving thermal buoyancy flow and fracture flow. Comprehensive field validation studies over a period of four years are discussed. The studies are divided into three stages: (1) history matching, (2) double-blind prediction and confirmation, (3) design optimization. At each stage, parameter sensitivity studies are performed. To study the applications of mathematical models, a problem proposed by the International Energy Agency (IEA) is solved using this verified and validated numerical model as well as two simpler models. One of the simpler models is a semi-analytic method assuming the uncoupling of the heat and fluid flow processes. The other is a graphical method based on a large number of approximations. Variations are added to the basic IEA problem to point out the limits of the ranges of applications of each model. A number of lessons are learned from the above investigations. These are listed and discussed.

  4. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  5. Interface COMSOL-PHREEQC (iCP), an efficient numerical framework for the solution of coupled multiphysics and geochemistry

    Science.gov (United States)

    Nardi, Albert; Idiart, Andrés; Trinchero, Paolo; de Vries, Luis Manuel; Molinero, Jorge

    2014-08-01

    This paper presents the development, verification and application of an efficient interface, denoted as iCP, which couples two standalone simulation programs: the general purpose Finite Element framework COMSOL Multiphysics® and the geochemical simulator PHREEQC. The main goal of the interface is to maximize the synergies between the aforementioned codes, providing a numerical platform that can efficiently simulate a wide number of multiphysics problems coupled with geochemistry. iCP is written in Java and uses the IPhreeqc C++ dynamic library and the COMSOL Java-API. Given the large computational requirements of the aforementioned coupled models, special emphasis has been placed on numerical robustness and efficiency. To this end, the geochemical reactions are solved in parallel by balancing the computational load over multiple threads. First, a benchmark exercise is used to test the reliability of iCP regarding flow and reactive transport. Then, a large scale thermo-hydro-chemical (THC) problem is solved to show the code capabilities. The results of the verification exercise are successfully compared with those obtained using PHREEQC and the application case demonstrates the scalability of a large scale model, at least up to 32 threads.
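
    The parallel strategy mentioned above can be sketched schematically: within one transport step the per-cell geochemical updates are independent, so they can be spread over a pool of worker threads. The solve_cell_chemistry function and the cell representation below are hypothetical placeholders; in iCP this work is delegated to the IPhreeqc library rather than implemented in Python.

    ```python
    # Schematic load balancing of per-cell chemistry solves over threads.
    from concurrent.futures import ThreadPoolExecutor

    def solve_cell_chemistry(cell_state):
        """Placeholder for one cell's equilibrium/kinetic chemistry solve."""
        # A real coupling would call into a geochemical engine here.
        return {**cell_state, "pH": cell_state["pH"] + 0.01}

    cells = [{"id": i, "pH": 7.0} for i in range(1000)]   # illustrative grid cells

    with ThreadPoolExecutor(max_workers=8) as pool:
        updated = list(pool.map(solve_cell_chemistry, cells))

    print("updated cells:", len(updated))
    ```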

  6. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics appears to be one of the solutions capable of addressing the problems that occur with password-based data access; for example, passwords may be forgotten, and it is hard to recall many different passwords. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics is chosen for its low-cost implementation and because it generates quite accurate results for user identification. The face verification system adopted in this research is based on the Compressive Sensing (CS) technique, which aims to reduce the dimension of, as well as encrypt, the facial test image, with the image represented as sparse signals. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, namely Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signals are then compared, via the Euclidean norm, with the sparse signal of the user previously saved in the system to determine the validity of the facial test image. The system accuracy obtained in this research is 99% for IRLS (face verification response time of 4.917 seconds) and 96.33% for OMP (0.4046 seconds) with a non-optimized sensing matrix, and 99% for IRLS (13.4791 seconds) and 98.33% for OMP (3.1571 seconds) with an optimized sensing matrix.
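
    As an illustration of one of the sparse-coding reconstructions named above, the sketch below implements a basic Orthogonal Matching Pursuit: atoms of the sensing matrix are selected greedily and the coefficients are re-fitted by least squares at each step. The matrix sizes, sparsity level, and random sensing matrix are illustrative assumptions, not the face-verification configuration of the paper.

    ```python
    # Basic Orthogonal Matching Pursuit (OMP) reconstruction sketch.
    import numpy as np

    def omp(Phi, y, sparsity):
        residual, support = y.copy(), []
        for _ in range(sparsity):
            correlations = np.abs(Phi.T @ residual)
            correlations[support] = 0.0            # do not reselect chosen atoms
            support.append(int(np.argmax(correlations)))
            coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coeffs
        x_hat = np.zeros(Phi.shape[1])
        x_hat[support] = coeffs
        return x_hat

    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((64, 256)) / np.sqrt(64)   # sensing matrix
    x = np.zeros(256); x[[10, 50, 200]] = [1.0, -0.5, 2.0]
    y = Phi @ x                                          # compressed measurements

    x_hat = omp(Phi, y, sparsity=3)
    print("recovery error:", np.linalg.norm(x_hat - x))
    ```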

  7. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  8. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  9. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  10. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  11. Interpolant tree automata and their application in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2016-01-01

    This paper investigates the combination of abstract interpretation over the domain of convex polyhedra with interpolant tree automata, in an abstraction-refinement scheme for Horn clause verification. These techniques have been previously applied separately, but are combined in a new way in this paper... Evaluation on Horn clause verification problems indicates that the combination of interpolant tree automata with abstract interpretation gives some increase in the power of the verification tool, while sometimes incurring a performance overhead.

  12. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    In the implementation of all arms control agreements, accurate verification is essential. In setting a course for verifying compliance with a given treaty - whether the NPT or the CWC, one must make a technical comparison of existing information-gathering capabilities against the constraints in an agreement. Then it must be decided whether this level of verifiability is good enough. Generally, the policy standard of 'effective verification' includes the ability to detect significant violations, with high confidence, in sufficient time to respond effectively with policy adjustments or other responses, as needed. It is at this juncture where verification approaches have traditionally diverged. Nuclear safeguards requirements have taken one path while chemical verification methods have pursued another. However, recent technological advances have brought a number of changes affecting verification, and lately their pace has been accelerating. First, all verification regimes have more and better information as a result of new kinds of sensors, imagery, and other technologies. Second, the verification provisions in agreements have also advanced, to include on-site inspections, portal monitoring, data exchanges, and a variety of transparency, confidence-building, and other cooperative measures, Together these developments translate into a technological overlap of certain institutional verification measures such as the NPT's safeguards requirements and the IAEA and the CWC's verification visions and the OPCW. Hence, a priority of international treaty-implementing organizations is exploring the development of a synergistic and coordinated approach to WMD policy making that takes into account existing inter-linkages between nuclear, chemical, and biological weapons issues. Specific areas of coordination include harmonizing information systems and information exchanges and the shared application of scientific mechanisms, as well as collaboration on technological developments

  13. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  14. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (interlock Block Diagrams) in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatches between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)

  15. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.

  16. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for an automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
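
    The idea of obtaining both the value and the gradient of a benchmark function from a single description can be sketched in a few lines with forward-mode dual numbers; this is only an analogy to the C++ library's functionality, not its API, and the Rosenbrock example is an assumed benchmark.

        class Dual:
            """Minimal forward-mode AD value: carries a value and a gradient vector."""
            def __init__(self, val, grad):
                self.val, self.grad = float(val), list(grad)
            def __add__(self, o):
                return Dual(self.val + o.val, [a + b for a, b in zip(self.grad, o.grad)])
            def __sub__(self, o):
                return Dual(self.val - o.val, [a - b for a, b in zip(self.grad, o.grad)])
            def __mul__(self, o):
                return Dual(self.val * o.val,
                            [self.val * b + o.val * a for a, b in zip(self.grad, o.grad)])

        def const(c, n):
            return Dual(c, [0.0] * n)

        # One description of a benchmark (here: 2-D Rosenbrock) yields both the
        # value and the gradient at a point.
        def rosenbrock(x, y):
            return (const(1.0, 2) - x) * (const(1.0, 2) - x) + \
                   const(100.0, 2) * (y - x * x) * (y - x * x)

        x = Dual(1.5, [1.0, 0.0])   # seed dx/dx = 1
        y = Dual(2.0, [0.0, 1.0])   # seed dy/dy = 1
        f = rosenbrock(x, y)
        print(f.val, f.grad)        # -> 6.5 and [151.0, -50.0]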

  17. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautic and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing at the prototype stage where error discovery can be quite costly and the inability to test for all potential interactions leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  18. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    International audience; Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  19. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  1. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to-date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.

  2. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  3. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  4. Numerical simulation of turbulent liquid metal flows in plane channels and annuli

    International Nuclear Information System (INIS)

    Groetzbach, G.

    1980-06-01

    The method of direct numerical simulation is used to study heat transfer and statistical data for fully developed turbulent liquid metal flows in plane channels and annuli. Subgrid scale models using one transport equation account for the high wave-number turbulence not resolved by the finite difference grid. A special subgrid-scale heat flux model is deduced together with an approximative theory to calculate all model coefficients. This model can be applied over the total Peclet number range of technical liquid metal flows. In particular, it can be used for very small Peclet numbers, where the results are independent of model parameters. A verification of the numerical results for liquid sodium and mercury flows is undertaken using the Nusselt number in plane channels and the radial temperature and eddy conductivity profiles in annuli. The numerically determined Nusselt numbers for annuli indicate that many empirical correlations overestimate the influence of the ratio of radii. The numerical results for the eddy conductivity profiles may be used to remove these problems. The statistical properties of the simulated temperature fluctuations are within the wide scatter-band of experimental data. The numerical results give reasonable heat flux correlation coefficients which depend only weakly on the problem marking parameters. (orig.) [de

  5. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    ...additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50...). ...were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  6. Improved features of MARS 1.4 and verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Don; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-09-01

    MARS 1.4 code has been developed as a basic code frame for multi-dimensional thermal-hydraulic analysis of light water reactor transients. This report describes the newly improved features of MARS 1.4 and their verification results. The new features of MARS 1.4 include the implementation of a point kinetics model in the 3D module, the coupled heat structure model, the extension of control functions and input check functions in the 3D module, the implementation of new features of the RELAP5/MOD3.2.2 version, the addition of an automatic initialization function for fuel 3-D analysis and the unification of material properties and forcing functions, etc. These features have been implemented in the code in order to extend the code modeling capability and to enhance user friendliness. Among these features, this report describes the implementation of new features of the RELAP5/MOD3.3.3 version such as the reflood model and critical heat flux models, etc., the automatic initialization function, the unification of material properties and forcing functions and the other code improvements and error corrections, which were not reported in the previous report. Through the verification calculations, the new features of MARS 1.4 have been verified to be well implemented in the code. In conclusion, MARS 1.4 has been developed and verified as a multi-dimensional system thermal-hydraulic analysis tool. It can play its role as a basic code frame for the future development of a multi-purpose consolidated code, MARS 2.x, for coupled analysis of multi-dimensional system thermal hydraulics, 3D core kinetics, core CHF and containment as well as for further improvement of thermal-hydraulic and numerical models. 4 refs., 10 figs. (Author)

  7. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeom (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design with the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using regulation knowledge base and Web services. Regulation knowledge base consisting of product ontology and rules was built with a hybrid technique combining ontology and programming languages. Web service for design verification was developed ensuring the flexible extension of knowledge base. By virtue of two technical features, design verification is served to various products while the change of system architecture is minimized.

  8. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design with the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using regulation knowledge base and Web services. Regulation knowledge base consisting of product ontology and rules was built with a hybrid technique combining ontology and programming languages. Web service for design verification was developed ensuring the flexible extension of knowledge base. By virtue of two technical features, design verification is served to various products while the change of system architecture is minimized.

  9. Compromises produced by the dialectic between self-verification and self-enhancement.

    Science.gov (United States)

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  10. Numerical simulation of MHD flows in inhomogeneous and instationary magnetic fields; Numerische Simulation von MHD-Stroemungen in inhomogenen und instationaeren Magnetfeldern

    Energy Technology Data Exchange (ETDEWEB)

    Ehrhard, Sebastian

    2016-07-01

    In this work, I develop a numerical model for magnetohydrodynamic flows in unsteady and inhomogeneous magnetic fields. The model is implemented in the finite-volume-based CFD code OpenFOAM. Verification and validation tests are carried out on several standard problems of magnetohydrodynamics. Finally, I successfully modelled an electromagnetic flowmeter with the code.

  11. Calibration and verification of surface contamination meters --- Procedures and techniques

    International Nuclear Information System (INIS)

    Schuler, C; Butterweck, G.; Wernli, C.; Bochud, F.; Valley, J.-F.

    2007-03-01

    A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims at making surface contamination measurements simply and safely interpretable. Essential for the approach is the introduction and common use of the radionuclide specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type or for SCM of different types using the same principle of detection. A SCM multi source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate, representing the activity per unit area of a surface contamination of one guideline value, can be calculated for any radionuclide using instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm²), and the radionuclide specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm². The verification for a given list of radionuclides is based on the radionuclide specific quantity guideline value count rate. Guideline value count rates for groups of radionuclides can be represented within the maximum
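
    The guideline value count rate can be assembled from the factors listed in the abstract; the sketch below only shows the arithmetic, and the numbers used are hypothetical, not values from the Swiss ordinance or from ISO 8769-2 sources.

        def guideline_value_count_rate(instrument_efficiency, emissions_per_decay,
                                       source_efficiency, guideline_value_bq_per_cm2,
                                       averaging_area_cm2=100.0):
            """Count rate (counts/s) expected from a surface contaminated at exactly
            one guideline value, assembled from the factors listed in the abstract."""
            activity_bq = guideline_value_bq_per_cm2 * averaging_area_cm2
            # decay data (emissions per decay) and source efficiency give the
            # particle emission rate from the surface
            emission_rate = activity_bq * emissions_per_decay * source_efficiency
            return emission_rate * instrument_efficiency

        # Hypothetical nuclide: guideline value 3 Bq/cm2, 1 beta per decay,
        # source efficiency 0.5, instrument efficiency 0.30, 100 cm2 averaging area.
        print(guideline_value_count_rate(0.30, 1.0, 0.5, 3.0))   # -> 45.0 counts/s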

  12. Five-equation and robust three-equation methods for solution verification of large eddy simulation

    Science.gov (United States)

    Dutta, Rabijit; Xing, Tao

    2018-02-01

    This study evaluates the recently developed general framework for solution verification methods for large eddy simulation (LES) using implicitly filtered LES of periodic channel flows at a friction Reynolds number of 395 on eight systematically refined grids. The seven-equation method shows that the coupling error based on Hypothesis I is much smaller than the numerical and modeling errors and can therefore be neglected. The authors recommend the five-equation method based on Hypothesis II, which shows a monotonic convergence behavior of the predicted numerical benchmark (S_C) and provides realistic error estimates without the need to fix the orders of accuracy for either numerical or modeling errors. Based on the results from the seven-equation and five-equation methods, less expensive three- and four-equation methods for practical LES applications were derived. It was found that the new three-equation method is robust, as it can be applied to any convergence type and reasonably predicts the error trends. It was also observed that the numerical and modeling errors usually have opposite signs, which suggests that error cancellation plays an essential role in LES. When the Reynolds-averaged Navier-Stokes (RANS) based error estimation method is applied, it shows significant error in the prediction of S_C on coarse meshes. However, it predicts reasonable S_C when the grids resolve at least 80% of the total turbulent kinetic energy.
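
    A full five- or seven-equation verification system is beyond a short sketch, but the underlying idea of estimating a grid-converged benchmark from systematically refined grids can be illustrated with a basic generalized Richardson extrapolation, assuming S(h) = S_C + a*h^p over three grids; this is only an illustration of the concept, not the authors' method, and the sample values are invented.

        import math

        def richardson_extrapolate(s_fine, s_medium, s_coarse, r=2.0):
            """Estimate the grid-converged benchmark S_C and the observed order p
            from three systematically refined solutions (refinement ratio r),
            assuming S(h) = S_C + a*h^p."""
            eps21 = s_medium - s_fine
            eps32 = s_coarse - s_medium
            p = math.log(eps32 / eps21) / math.log(r)      # observed order of accuracy
            s_c = s_fine - eps21 / (r ** p - 1.0)          # extrapolated benchmark
            numerical_error_fine = s_fine - s_c            # error estimate on fine grid
            return s_c, p, numerical_error_fine

        # Invented solutions on fine/medium/coarse grids with refinement ratio 2:
        print(richardson_extrapolate(10.02, 10.08, 10.32))   # -> (10.0, 2.0, 0.02)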

  13. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks consistently, from knowledge acquisition to knowledge verification

  14. In Situ Object Counting System (ISOCS) Technique: Cost-Effective Tool for NDA Verification in IAEA Safeguards

    International Nuclear Information System (INIS)

    Braverman, E.; Lebrun, A.; Nizhnik, V.; Rorif, F.

    2010-01-01

    Uranium materials measurements using the ISOCS technique play an increasing role in IAEA verification activities. This methodology provides high uranium/plutonium sensitivity and a low detection limit together with the capability to measure items with different shapes and sizes. In addition, the numerical absolute efficiency calibration of a germanium detector which is used by the technique does not require any calibration standards or reference materials. ISOCS modelling software allows performing absolute efficiency calibration for items of arbitrary container shape and wall material, matrix chemical composition, material fill-height, uranium or plutonium weight fraction inside the matrix and even nuclear material/matrix non-homogeneous distribution. Furthermore, in a number of cases, some key parameters such as matrix density and U/Pu weight fraction can be determined along with analysis of nuclear material mass and isotopic composition. These capabilities provide a verification solution suitable for a majority of cases where quantitative and isotopic analysis should be performed. Today, the basic tool for uranium and plutonium mass measurement used in Safeguards verification activities is the neutron counting technique which employs neutron coincidence and multiplicity counters. Compared with the neutron counting technique, ISOCS-calibrated detectors have a relatively low cost. Taking into account its advantages, this methodology becomes a cost-effective solution for nuclear material NDA verification. At present, the Agency uses ISOCS for quantitative analysis in a wide range of applications: - Uranium scrap materials; - Uranium contaminated solid wastes; - Uranium fuel elements; - Some specific verification cases like measurement of Pu-Be neutron sources, quantification of fission products in solid wastes etc. For uranium hold-up measurements, ISOCS is the only available methodology for quantitative and isotopic composition analysis of nuclear materials deposited

  15. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.

  16. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  17. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    Science.gov (United States)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on some problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code for two analytical problems has been performed. The first is a point heat source with exponentially decreasing power; the second is a linear heat source with similar behavior. Analytical solutions have been obtained by the authors. The problems have been chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste. Verification was performed for several meshes with different resolution. Good convergence between analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in a rock at a depth of several hundred meters with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste leads to heating of the containers, engineered safety barriers and host rock. Maximum temperatures and corresponding times of their establishment have been determined.
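
    For the first verification problem, a point source whose power decays exponentially in an infinite homogeneous medium, an analytical reference temperature can be obtained by convolving the instantaneous point-source Green's function with the decaying power. The sketch below evaluates that convolution by numerical quadrature; the material parameters and source strength are made up for illustration and are not the values used for FENIA.

        import numpy as np
        from scipy.integrate import quad

        def point_source_temperature(r, t, q0=1.0e3, lam=1.0e-7,
                                     rho=2500.0, cp=900.0, k=2.0):
            """Temperature rise at radius r (m) and time t (s) due to a point source
            with power q0*exp(-lam*tau) W in an infinite homogeneous medium."""
            alpha = k / (rho * cp)                      # thermal diffusivity (m^2/s)
            def integrand(tau):
                dt = t - tau
                if dt <= 0.0:
                    return 0.0
                green = np.exp(-r**2 / (4.0 * alpha * dt)) / (4.0 * np.pi * alpha * dt) ** 1.5
                return q0 * np.exp(-lam * tau) * green / (rho * cp)
            val, _ = quad(integrand, 0.0, t, limit=200)
            return val

        # Temperature rise 1 m from the source after one year (illustrative numbers).
        print(point_source_temperature(r=1.0, t=3.15e7))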

  18. Modelling of cardiovascular system: development of a hybrid (numerical-physical) model.

    Science.gov (United States)

    Ferrari, G; Kozarski, M; De Lazzari, C; Górczyńska, K; Mimmo, R; Guaragno, M; Tosti, G; Darowski, M

    2003-12-01

    Physical models of the circulation are used for research, training and for testing of implantable active and passive circulatory prosthetic and assistance devices. However, in comparison with numerical models, they are rigid and expensive. To overcome these limitations, we have developed a model of the circulation based on the merging of a lumped parameter physical model into a numerical one (producing therefore a hybrid). The physical model is limited to the barest essentials and, in this application, developed to test the principle, it is a windkessel representing the systemic arterial tree. The lumped parameters numerical model was developed in LabVIEW environment and represents pulmonary and systemic circulation (except the systemic arterial tree). Based on the equivalence between hydraulic and electrical circuits, this prototype was developed connecting the numerical model to an electrical circuit--the physical model. This specific solution is valid mainly educationally but permits the development of software and the verification of preliminary results without using cumbersome hydraulic circuits. The interfaces between numerical and electrical circuits are set up by a voltage controlled current generator and a voltage controlled voltage generator. The behavior of the model is analyzed based on the ventricular pressure-volume loops and on the time course of arterial and ventricular pressures and flow in different circulatory conditions. The model can represent hemodynamic relationships in different ventricular and circulatory conditions.
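
    The arterial windkessel that forms the physical part of the hybrid model can also be written down numerically. The sketch below integrates a two-element windkessel (peripheral resistance R and compliance C, driven by an assumed half-sine inflow waveform) with an explicit Euler step, only to make the lumped-parameter idea concrete; the parameter values are illustrative, not those of the prototype.

        import math

        def windkessel_pressure(t_end=10.0, dt=1e-3, R=1.0, C=1.5, heart_rate=75.0):
            """Two-element windkessel: C*dP/dt = Q_in(t) - P/R, explicit Euler.
            Assumed units: P in mmHg, Q in mL/s, R in mmHg*s/mL, C in mL/mmHg."""
            period = 60.0 / heart_rate
            def q_in(t):
                # crude half-sine ejection during the first third of each beat
                phase = t % period
                if phase < period / 3.0:
                    return 400.0 * math.sin(math.pi * phase / (period / 3.0))
                return 0.0
            p = 80.0                      # initial arterial pressure
            trace = []
            for i in range(int(t_end / dt)):
                t = i * dt
                p += dt * (q_in(t) - p / R) / C
                trace.append((t, p))
            return trace

        trace = windkessel_pressure()
        print(max(p for _, p in trace), min(p for _, p in trace))   # systolic/diastolic-like extremes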

  19. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

    Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for an automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.

  20. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  1. Measurement techniques for the verification of excess weapons materials

    International Nuclear Information System (INIS)

    Tape, J.W.; Eccleston, G.W.; Yates, M.A.

    1998-01-01

    The end of the superpower arms race has resulted in an unprecedented reduction in stockpiles of deployed nuclear weapons. Numerous proposals have been put forward and actions have been taken to ensure the irreversibility of nuclear arms reductions, including unilateral initiatives such as those made by President Clinton in September 1993 to place fissile materials no longer needed for a deterrent under international inspection, and bilateral and multilateral measures currently being negotiated. For the technologist, there is a unique opportunity to develop the technical means to monitor nuclear materials that have been declared excess to nuclear weapons programs, to provide confidence that reductions are taking place and that the released materials are not being used again for nuclear explosive programs. However, because of the sensitive nature of these materials, a fundamental conflict exists between the desire to know that the bulk materials or weapon components in fact represent evidence of warhead reductions, and treaty commitments and national laws that require the protection of weapons design information. This conflict presents a unique challenge to technologists. The flow of excess weapons materials, from deployed warheads through storage, disassembly, component storage, conversion to bulk forms, and disposition, will be described in general terms. Measurement approaches based on the detection of passive or induced radiation will be discussed along with the requirement to protect sensitive information from release to unauthorized parties. Possible uses of measurement methods to assist in the verification of arms reductions will be described. The concept of measuring attributes of items rather than quantitative mass-based inventory verification will be discussed along with associated information-barrier concepts required to protect sensitive information

  2. Verification of Monte Carlo transport codes by activation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera

    2012-12-18

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of an excessive activation of the accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago, therefore the verification is needed to be sure that they give reasonable results. Present work is focused on obtaining the experimental data on activation of the targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as interaction of nitrogen and argon beams with copper targets was studied. After the irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, the γ-spectroscopy analysis was done: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, their amount and depth distribution were detected. The obtained experimental results were compared with the results of the Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out. The origin of discrepancies is discussed. Obtained results allow for a better verification of the Monte Carlo transport codes, and also provide information for their further development. The necessity of the activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA-simulations were done to determine the most preferable from the radiation protection point of view materials for use in accelerator components.

  3. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to represent accurately land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument is being incorporated

  4. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations
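
    Because each radiochemical result carries an uncertainty, the statistical comparison mentioned above can be illustrated with a simple two-sample z-type test of a measured result against a reference or duplicate value; the 1.96 acceptance multiplier is an assumed choice for illustration, not a prescribed validation criterion.

        import math

        def results_differ(value_a, sigma_a, value_b, sigma_b, k=1.96):
            """Return True if two radiochemical results (value ± 1-sigma uncertainty)
            differ significantly at roughly the 95 % level (k = 1.96 assumed)."""
            combined_sigma = math.sqrt(sigma_a**2 + sigma_b**2)
            return abs(value_a - value_b) > k * combined_sigma

        # Example: sample result 12.4 ± 1.1 Bq/kg versus duplicate 10.9 ± 1.0 Bq/kg
        print(results_differ(12.4, 1.1, 10.9, 1.0))   # -> False (difference not significant)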

  5. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  6. Neoclassical toroidal viscosity calculations in tokamaks using a δf Monte Carlo simulation and their verifications.

    Science.gov (United States)

    Satake, S; Park, J-K; Sugama, H; Kanno, R

    2011-07-29

    Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified with a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV since the complexities in guiding-center orbits of particles and their collisions cannot be fully investigated by any means of analytic theories alone. Results yielded the details of the complex NTV dependency on particle precessions and collisions, which were predicted roughly in a combined analytic theory. Both numerical and analytic methods can be utilized and extended based on these successful verifications.

  7. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  8. Experimental quantum verification in the presence of temporally correlated noise

    Science.gov (United States)

    Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.

    2018-02-01

    Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped 171Yb+ ion-qubit and inject engineered noise (∝ σ_z) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution varying between approximately Gaussian, and a broad, highly skewed distribution for rapidly and slowly varying noise, respectively. Similarly we find a strong gate set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ_z errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ_x or σ_y errors or depolarising noise processes, highlighting the impact of the critical interplay of selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.
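
    The reported contrast between quasi-DC and rapidly varying noise can be reproduced qualitatively with a toy simulation (this is not the experiment's analysis): over-rotation errors are drawn either once per sequence (slow noise) or independently per gate (fast noise), and the resulting spread of sequence fidelities is compared. The error model, sequence length and noise amplitude are all assumed.

        import numpy as np

        rng = np.random.default_rng(1)

        def sequence_fidelities(n_seq=2000, length=50, sigma=0.05, quasi_dc=True):
            """Toy model: each gate carries an over-rotation error delta; the sequence
            fidelity is approximated by prod(cos^2(delta/2)) over the gates."""
            if quasi_dc:
                # one miscalibration value per sequence, constant over all gates
                deltas = np.repeat(rng.normal(0.0, sigma, (n_seq, 1)), length, axis=1)
            else:
                # independent error for every gate (white-noise-like)
                deltas = rng.normal(0.0, sigma, (n_seq, length))
            return np.prod(np.cos(deltas / 2.0) ** 2, axis=1)

        slow = sequence_fidelities(quasi_dc=True)    # broad, skewed distribution
        fast = sequence_fidelities(quasi_dc=False)   # narrow, near-Gaussian distribution
        print(slow.mean(), slow.std(), fast.mean(), fast.std())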

  9. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flow and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three different types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics DOF and MEEF is examined.

  10. Numerical prediction and measurement of optoacoustic signals generated in PVA-H tissue phantoms

    Science.gov (United States)

    Melchert, Oliver; Blumenröther, Elias; Wollweber, Merve; Roth, Bernhard

    2018-01-01

    We present numerical simulations of optoacoustic (OA) signals, complementing laboratory experiments on melanin doped polyvinyl alcohol hydrogel (PVA-H) tissue phantoms. We review the computational approach to model the underlying mechanisms, i.e. optical absorption of laser energy and acoustic propagation of mechanical stress, geared toward experiments that involve absorbing media with homogeneous acoustic properties. We apply the numerical procedure to predict signals observed in the acoustic near- and farfield in both, forward and backward detection mode, in PVA-H tissue phantoms (i.e. an elastic solid). Further, we report on verification tests of our research code based on OA experiments on dye solution (i.e. a liquid) detailed in the literature and benchmark our 3D procedure via limiting cases described in terms of effectively 1D theoretical approaches.
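
    The optical-absorption step of such a model can be sketched for a homogeneous absorber: under the usual stress-confinement assumption, the initial pressure follows Beer-Lambert attenuation of the laser fluence scaled by the Grüneisen parameter. The absorption coefficient, fluence and Grüneisen value below are illustrative, not those of the PVA-H phantoms.

        import numpy as np

        def initial_pressure(depth_m, fluence_j_per_m2=100.0, mu_a=500.0, grueneisen=0.2):
            """Initial optoacoustic pressure p0(z) = Gamma * mu_a * F * exp(-mu_a*z)
            for a homogeneous absorber illuminated from z = 0 (stress confinement).
            mu_a in 1/m, fluence in J/m^2, result in Pa."""
            return grueneisen * mu_a * fluence_j_per_m2 * np.exp(-mu_a * depth_m)

        z = np.linspace(0.0, 0.01, 5)      # 0 to 10 mm depth
        print(initial_pressure(z))          # initial pressure profile in Pa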

  11. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  12. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
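
    The 2x2 contingency-table assessment mentioned for CME arrival-time forecasts can be sketched as follows; the counts and the choice of the Heidke skill score are illustrative assumptions, not MOSWOC results.

      def contingency_scores(hits, misses, false_alarms, correct_negatives):
          """Basic skill measures derived from a 2x2 contingency table."""
          n = hits + misses + false_alarms + correct_negatives
          pod = hits / (hits + misses)                # probability of detection
          far = false_alarms / (hits + false_alarms)  # false alarm ratio
          # expected number of correct forecasts under random chance
          expected = ((hits + misses) * (hits + false_alarms)
                      + (false_alarms + correct_negatives) * (misses + correct_negatives)) / n
          hss = (hits + correct_negatives - expected) / (n - expected)  # Heidke skill score
          return pod, far, hss

      print(contingency_scores(hits=20, misses=8, false_alarms=10, correct_negatives=40))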

  13. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  14. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
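
    A back-of-the-envelope sketch of the workload-distribution question discussed above: given per-step instruction counts and the processing speeds of the card and the reader, estimate the execution time of a candidate assignment. All numbers and step names are hypothetical, not the figures evaluated in the paper.

      # Hypothetical per-step instruction counts (millions) and processor speeds (MIPS)
      steps = {"image_enhancement": 150.0, "minutiae_extraction": 80.0, "matching": 5.0}
      card_mips, reader_mips = 5.0, 200.0

      def execution_time(assignment):
          """assignment maps each step to 'card' or 'reader'; returns seconds
          (communication and cryptographic overheads are ignored in this sketch)."""
          return sum(mi / (card_mips if assignment[step] == "card" else reader_mips)
                     for step, mi in steps.items())

      match_on_card = {"image_enhancement": "reader", "minutiae_extraction": "reader", "matching": "card"}
      print(f"match-on-card scenario: {execution_time(match_on_card):.2f} s")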

  15. Verification of some numerical models for operationally predicting mesoscale winds aloft

    International Nuclear Information System (INIS)

    Cornett, J.S.; Randerson, D.

    1977-01-01

    Four numerical models are described for predicting mesoscale winds aloft over a 6 h period. These models are all tested statistically against persistence as the control forecast and against predictions made by operational forecasters. Mesoscale winds-aloft data were used to initialize the models and to verify the predictions on an hourly basis. The model yielding the smallest root-mean-square vector errors (RMSVEs) was the one based on the most physics, which included advection, ageostrophic acceleration, vertical mixing and friction. Horizontal advection was found to be the most important term in reducing the RMSVEs, followed by ageostrophic acceleration, vertical advection, surface friction and vertical mixing. From a comparison of the mean absolute errors based on up to 72 independent wind-profile predictions made by operational forecasters, by the most complete model, and by persistence, we conclude that the model is the best wind predictor in the free air. In the boundary layer, the results tend to favor the forecaster for direction predictions. The speed predictions showed no overall superiority for any of the three predictors.
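
    A short sketch of the root-mean-square vector error used to rank the models, assuming predicted and observed winds are given as (u, v) component pairs; the sample values are invented.

      import numpy as np

      def rmsve(pred_uv, obs_uv):
          """Root-mean-square vector error between predicted and observed (u, v) winds."""
          diff = np.asarray(pred_uv, dtype=float) - np.asarray(obs_uv, dtype=float)
          return np.sqrt(np.mean(np.sum(diff**2, axis=1)))

      # toy example: three predicted and observed wind vectors in m/s
      print(rmsve([[5.0, 1.0], [7.0, -2.0], [3.5, 0.0]],
                  [[4.0, 1.5], [6.0, -1.0], [4.0, 0.5]]))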

  16. Numerical aspects of the modelling of the local effects of a high level waste repository

    International Nuclear Information System (INIS)

    Ferreri, J.C.; Ventura, M.A.

    1985-01-01

    The numerical approximations adopted in the development of computational models for predicting the effects of the emplacement of a high-level waste repository are reviewed. The problems considered include: the thermal history of the rock mass constituting the burial medium, the flow of underground water, and the associated migration of radionuclides in the same medium. Results associated with verification of the implemented codes are presented. Their limitations and advantages are discussed. (Author)

  17. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    45 CFR 1626.7 (Public Welfare), Restrictions on Legal Assistance to Aliens: Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  18. Simplified numerical model for predicting onset of flow instability in parallel heated channels

    International Nuclear Information System (INIS)

    Noura Rassoul; El-Khider Si-Ahmed; Tewfik Hamidouche; Anis Bousbia-Salah

    2005-01-01

    Full text of publication follows: Flow instabilities are undesirable phenomena in heated channels, since changes in flow rate affect the local heat transfer characteristics and may result in premature burnout. For instance, two-phase flow excursion (Ledinegg) instability in boiling channels is of great concern in the design and operation of numerous practical systems, especially MTR-fuel-type research reactors. For heated parallel channels, the slope of a segment of the pressure drop-flow rate characteristic (demand curve) of a boiling channel becomes negative. Such instability can lead to a significant reduction in channel flow, thereby causing premature burnout of the heated channel before the CHF point. Furthermore, as a consequence of this flow decrease, other types of flow instability may appear and can also induce (density wave) flow oscillations of constant or diverging amplitude. The present work focuses on a numerical simulation of pressure drop in forced convection boiling in vertical, narrow, parallel, uniformly heated channels. The objective is to determine the point of onset of flow instability by varying the inlet flow rate, without any consideration of density wave oscillations; the axial void distribution is also provided. The numerical model is based on the finite difference method, which transforms the partial differential conservation equations of mass, momentum and energy into algebraic equations. Closure relationships such as the drift flux model and other constitutive equations are used to determine the channel pressure drop under steady-state boiling conditions. The model validation is performed by confronting the calculations with the Oak Ridge National Laboratory Thermal Hydraulic Test Loop (THTL) experimental data set. Further verification of this model is performed by code-to-code comparison using the results of the RELAP5/Mod 3.2 code. (authors)
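
    The Ledinegg criterion referred to above amounts to locating where the channel demand curve (pressure drop versus mass flow) acquires a negative slope. A minimal sketch under that reading, with a synthetic demand curve in place of the model output:

      import numpy as np

      def onset_of_flow_instability(mass_flow, pressure_drop):
          """Highest flow rate at which the slope d(dP)/dG of the demand curve is negative
          (Ledinegg criterion); returns None if the slope is positive everywhere."""
          g = np.asarray(mass_flow, dtype=float)
          dp = np.asarray(pressure_drop, dtype=float)
          order = np.argsort(g)
          g, dp = g[order], dp[order]
          slope = np.gradient(dp, g)
          negative = np.where(slope < 0.0)[0]
          return g[negative[-1]] if negative.size else None

      # synthetic demand curve with a negative-slope branch (illustrative only)
      G = np.linspace(0.1, 2.0, 200)
      dP = 0.8 * G**2 - 1.2 * G + 1.0
      print(onset_of_flow_instability(G, dP))   # about 0.75 for this curve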

  19. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    Science.gov (United States)

    1996-09-01

    A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199. ...implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  20. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and a cost-sensitive classifier is found to produce better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.
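
    A sketch of how a verification rate such as the quoted 96% can be computed from genuine and impostor match scores at a fixed decision threshold; the scores below are invented for illustration.

      def verification_rates(genuine_scores, impostor_scores, threshold):
          """Fraction of genuine attempts accepted and of impostor attempts rejected."""
          tar = sum(s >= threshold for s in genuine_scores) / len(genuine_scores)
          trr = sum(s < threshold for s in impostor_scores) / len(impostor_scores)
          return tar, trr

      genuine = [0.91, 0.85, 0.78, 0.96, 0.88]   # matching scores for true claimants (toy data)
      impostor = [0.42, 0.55, 0.30, 0.61]        # matching scores for impostors (toy data)
      print(verification_rates(genuine, impostor, threshold=0.7))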

  1. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  2. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  3. Portal verification using the KODAK ACR 2000 RT storage phosphor plate system and EC registered films. A semiquantitative comparison

    International Nuclear Information System (INIS)

    Geyer, P.; Blank, H.; Alheit, H.

    2006-01-01

    Background and Purpose: the suitability of the storage phosphor plate system ACR 2000 RT (Eastman Kodak Corp., Rochester, MN, USA), which is intended for portal verification as well as for portal simulation imaging in radiotherapy, had to be proven by comparison with a highly sensitive verification film. Material and Methods: the comparison included portal verification images of different regions (head and neck, thorax, abdomen, and pelvis) irradiated with 6- and 15-MV photons and electrons. Each portal verification image was acquired with both the storage screen and the EC registered film, using EC-L registered cassettes (both: Eastman Kodak Corp., Rochester, MN, USA) for the two systems. The soft-tissue and bony contrast and the brightness were evaluated and compared in a ranking of the two images. Different phantoms were irradiated to investigate the high- and low-contrast resolution. To account for the quality assurance application, the short-time exposure of the unpacked and irradiated storage screen to green and red room lasers was also investigated. Results: in general, the quality of the processed ACR images was slightly higher than that of the films, mostly due to cases of insufficient exposure of the film. The storage screen was able to verify electron portals even for low electron energies with only minor photon contamination. The laser lines were sharply and clearly visible on the ACR images. Conclusion: the ACR system may replace the film without any noticeable decrease in image quality, thereby reducing processing time, saving the cost of films, and avoiding incorrect exposures. (orig.)

  4. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines to be decommissioned in Canada, and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine whether protocols were followed and whether the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed, as are aspects of data management and the analysis methods required for the large amount of data collected during these surveys. Recommendations were made for the implementation of future surveys and for reporting the data from those surveys in order to ensure that remediation was complete. (authors)

  5. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  6. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  7. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications

    Directory of Open Access Journals (Sweden)

    Kobayashi Hiroki

    2012-04-01

    Full Text Available Abstract Background: Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. Results: We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression could be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved, and we newly identified three incednine-binding proteins. Conclusions: This study revealed that our proposed protocol of predicting target proteins, combining in silico screening and experimental verification, is useful, and provides new insight into a strategy for identifying the target proteins of small molecules.

  8. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  9. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  10. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took
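
    A sketch of the dose-comparison step described in these records, assuming planned and reconstructed 3D dose arrays and boolean masks for the target and non-target volumes; the tolerance values are placeholders and D2 is approximated by the 98th percentile.

      import numpy as np

      def check_dose(planned, reconstructed, target_mask, nontarget_mask,
                     mean_tol=0.05, d2_tol=0.05):
          """Compare mean dose in the target and non-target volumes and the near-maximum
          dose (D2) in the non-target volume; return True if the linac should be halted."""
          halt = False
          for mask in (target_mask, nontarget_mask):
              ref = planned[mask].mean()
              halt |= abs(reconstructed[mask].mean() - ref) / ref > mean_tol
          d2_plan = np.percentile(planned[nontarget_mask], 98)
          d2_reco = np.percentile(reconstructed[nontarget_mask], 98)
          halt |= abs(d2_reco - d2_plan) / d2_plan > d2_tol
          return halt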

  11. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    Full Text Available The methodology of system requirements verification presented in this paper proposes a practical procedure for reducing some of the shortcomings of requirements specification. The main problem considered is how to create a complete description of the system requirements without such shortcomings. Verification of the initially defined requirements is based on coloured Petri nets. These nets are useful for testing some properties of system requirements, such as completeness, consistency and optimality. An example of the lift controller is presented.

  12. Numerical simulation of water quality in Yangtze Estuary

    Directory of Open Access Journals (Sweden)

    Xi Li

    2009-12-01

    Full Text Available In order to monitor water quality in the Yangtze Estuary, water samples were collected and field observations of current and velocity stratification were carried out using a shipboard acoustic Doppler current profiler (ADCP). Results for two representative variables, the temporal and spatial variation of a new point-source sewage discharge as manifested by chemical oxygen demand (COD) and the initial water quality distribution as manifested by dissolved oxygen (DO), were obtained by application of the Environmental Fluid Dynamics Code (EFDC) with solutions for hydrodynamics during tides. The numerical results were compared with field data, and the field data provided verification of the numerical application: this numerical model is an effective tool for water quality simulation. For the point-source discharge, the COD concentration was simulated with an initial value in the river of zero. The simulated increments and distribution of COD in the water show acceptable agreement with field data. The concentration of DO is much higher in the North Branch than in the South Branch due to consumption of oxygen in the South Branch resulting from the discharge of sewage from Shanghai. The DO concentration is greater in the surface layer than in the bottom layer. The DO concentration is low in areas with a depth of less than 20 m, and high in areas between the 20-m and 30-m isobaths. It is concluded that the numerical model is valuable for simulation of water quality in the case of a specific point-source pollutant discharge. The EFDC model is also of satisfactory accuracy in water quality simulation of the Yangtze Estuary.
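
    A sketch, standing in for the EFDC solver itself, of the kind of one-dimensional advection-dispersion-decay update that underlies a COD transport calculation; grid, velocity, dispersion and decay values are illustrative assumptions.

      import numpy as np

      nx, dx, dt = 200, 50.0, 10.0    # grid cells, cell size [m], time step [s]
      u, D, k = 0.5, 5.0, 1e-6        # velocity [m/s], dispersion [m^2/s], decay rate [1/s]

      cod = np.zeros(nx)
      cod[0] = 30.0                   # point-source boundary concentration [mg/L]

      for _ in range(5000):
          adv = -u * (cod[1:-1] - cod[:-2]) / dx                   # upwind advection (u > 0)
          disp = D * (cod[2:] - 2 * cod[1:-1] + cod[:-2]) / dx**2  # central dispersion
          cod[1:-1] += dt * (adv + disp - k * cod[1:-1])
          cod[-1] = cod[-2]                                        # open downstream boundary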

  13. Elasto-plastic benchmark calculations. Step 1: verification of the numerical accuracy of the computer programs

    International Nuclear Information System (INIS)

    Corsi, F.

    1985-01-01

    In connection with the design of nuclear reactor components operating at elevated temperature, design criteria need a level of realism in the prediction of inelastic structural behaviour. This concept leads to the necessity of developing nonlinear computer programmes and, as a consequence, to the problems of verification and qualification of these tools. Benchmark calculations allow these two actions to be carried out, providing at the same time an increased level of confidence in the analysis of complex phenomena and in inelastic design calculations. With the financial and programmatic support of the Commission of the European Communities (CEE), a programme of elasto-plastic benchmark calculations relevant to the design of structural components for the LMFBR has been undertaken by those Member States which are developing a fast reactor project. Four principal progressive aims were initially identified, which led to the decision to subdivide the benchmark effort into a series of four sequential calculation steps: steps 1 to 4. The present document summarizes Step 1 of the benchmark exercise, derives some conclusions on Step 1 by comparing the results obtained with the various codes, and offers some concluding comments on this first action. It should be pointed out that even though the work was designed to test the capabilities of the computer codes, another aim was to increase the skill of the users concerned.

  14. Finite Countermodel Based Verification for Program Transformation (A Case Study

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for the verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used for strengthening program transformation techniques. This paper considers the question of how the finite-countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms trying to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of the problems.

  15. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations given by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation.

  16. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations given by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation. Refs, figs, tabs

  17. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Full Text Available Motivated by the key observation that children generally resemble their parents more than other persons with respect to facial appearance, distance metric (similarity) learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, are focused on learning a genetic similarity measure in a batch learning manner, leading to less scalability for practical applications with an ever-growing amount of data. To address this, we propose a new kinship verification approach by learning a sparse similarity measure in an online fashion. Experimental results on the kinship datasets show that our approach is highly competitive with the state-of-the-art alternatives in terms of verification accuracy, yet it is superior in terms of scalability for practical applications.
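
    As a heavily simplified stand-in for the online sparse similarity learning described above (the paper's exact formulation is not reproduced here), the following sketch updates a bilinear similarity s(x, y) = x'My with a hinge loss and soft-thresholds the entries of M to keep it sparse.

      import numpy as np

      def online_sparse_similarity(pairs, labels, dim, lr=0.05, l1=0.001, margin=1.0):
          """pairs: iterable of (x, y) feature-vector pairs; labels: +1 kin, -1 non-kin.
          Learns M in s(x, y) = x.T @ M @ y one sample at a time."""
          M = np.eye(dim)
          for (x, y), lab in zip(pairs, labels):
              x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
              if lab * (x @ M @ y) < margin:                       # hinge-loss violation
                  M += lr * lab * np.outer(x, y)                   # gradient step
              M = np.sign(M) * np.maximum(np.abs(M) - l1, 0.0)     # soft-threshold for sparsity
          return M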

  18. A numerical simulation of pre-big bang cosmology

    CERN Document Server

    Maharana, J P; Veneziano, Gabriele

    1998-01-01

    We analyse numerically the onset of pre-big bang inflation in an inhomogeneous, spherically symmetric Universe. Adding a small dilatonic perturbation to a trivial (Milne) background, we find that suitable regions of space undergo dilaton-driven inflation and quickly become spatially flat ($\Omega \to 1$). Numerical calculations are pushed close enough to the big bang singularity to allow cross-checks against previously proposed analytic asymptotic solutions.

  19. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  20. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  1. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses. Addendum

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguards system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  2. Formal verification of reactor process control software using assertion checking environment

    International Nuclear Information System (INIS)

    Sharma, Babita; Balaji, Sowmya; John, Ajith K.; Bhattacharjee, A.K.; Dhodapkar, S.D.

    2005-01-01

    Assertion Checking Environment (ACE) was developed in-house for carrying out formal (rigorous/mathematical) functional verification of embedded software written in MISRA C. MISRA C is an industrially sponsored safe subset of the C programming language and is well accepted in the automotive and aerospace industries. ACE uses a static assertion checking technique for the verification of MISRA C programs. First, the functional specifications of the program are expressed in the form of pre- and post-conditions for each C function. These pre- and post-conditions are then introduced as assertions (formal comments) in the program code. The annotated C code is then formally verified using ACE. In this paper we present our experience of using ACE for the formal verification of the process control software of a nuclear reactor. The Software Requirements Document (SRD) contained textual specifications of the process control software. The SRD was used by the designers to draw logic diagrams which were given as input to a code generator. The verification of the generated C code was done at two levels, viz. (i) verification against specifications derived from the logic diagrams, and (ii) verification against specifications derived from the SRD. In this work we checked approximately 600 functional specifications of software comprising roughly 15,000 lines of code. (author)
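
    The pre-/post-condition style of specification described above can be illustrated, in a language-neutral way, by attaching runtime checks to a function; this is only an analogue of the static assertions that ACE verifies on MISRA C code, and the controller below is hypothetical.

      def checked(pre, post):
          """Wrap a function with pre- and post-condition assertions."""
          def wrap(f):
              def inner(*args):
                  assert pre(*args), "pre-condition violated"
                  result = f(*args)
                  assert post(result, *args), "post-condition violated"
                  return result
              return inner
          return wrap

      @checked(pre=lambda setpoint, measured: 0.0 <= setpoint <= 100.0,
               post=lambda out, setpoint, measured: -1.0 <= out <= 1.0)
      def control_output(setpoint, measured):
          # hypothetical proportional controller, clamped to the actuator range
          return max(-1.0, min(1.0, 0.01 * (setpoint - measured)))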

  3. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

    This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs and system-on-chip (SoC) buses. Readers will also benefit from the author’s practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. · Discusses the entire life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection; · Provides an in-depth introduction to Verilog from both implementation and verification points of view; · Demonstrates how to use IP in applications such as memory controllers and SoC buses; · Describes a new ver...

  4. Cosmic microwave background bispectrum from recombination.

    Science.gov (United States)

    Huang, Zhiqi; Vernizzi, Filippo

    2013-03-08

    We compute the cosmic microwave background temperature bispectrum generated by nonlinearities at recombination on all scales. We use CosmoLib2nd, a numerical Boltzmann code at second order, to compute cosmic microwave background bispectra on the full sky. We consistently include all effects except gravitational lensing, which can be added to our result using standard methods. The bispectrum is peaked on squeezed triangles and agrees with the analytic approximation in the squeezed limit at the few percent level for all the scales where this is applicable. On smaller scales, we recover previous results on perturbed recombination. For cosmic-variance limited data to l_max = 2000, its signal-to-noise ratio is S/N = 0.47, corresponding to f_NL^eff = -2.79, and it will bias a local signal by f_NL^loc ≈ 0.82.

  5. Multicomponent mass transport model: theory and numerical implementation (discrete-parcel-random-walk version)

    International Nuclear Information System (INIS)

    Ahlstrom, S.W.; Foote, H.P.; Arnett, R.C.; Cole, C.R.; Serne, R.J.

    1977-05-01

    The Multicomponent Mass Transfer (MMT) Model is a generic computer code, currently in its third generation, that was developed to predict the movement of radiocontaminants in the saturated and unsaturated sediments of the Hanford Site. This model was designed to use the water movement patterns produced by the unsaturated and saturated flow models, coupled with dispersion and soil-waste reaction submodels, to predict contaminant transport. This report documents the theoretical foundation and the numerical solution procedure of the current (third) generation of the MMT Model. The present model simulates mass transport processes using an analog referred to as the Discrete-Parcel-Random-Walk (DPRW) algorithm. The basic concepts of this solution technique are described, and the advantages and disadvantages of the DPRW scheme are discussed in relation to more conventional numerical techniques such as the finite-difference and finite-element methods. Verification of the numerical algorithm is demonstrated by comparing model results with known closed-form solutions. A brief error and sensitivity analysis of the algorithm with respect to numerical parameters is also presented. A simulation of the tritium plume beneath the Hanford Site is included to illustrate the use of the model in a typical application. 32 figs
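
    A sketch of a discrete-parcel random-walk step for one-dimensional advection and dispersion with radioactive decay, the kind of analog the DPRW algorithm uses; all parameter values are illustrative, not Hanford Site data.

      import numpy as np

      rng = np.random.default_rng(0)

      n_parcels, dt, n_steps = 10_000, 3600.0, 500     # parcels, time step [s], steps
      u, D = 1e-5, 1e-6                                # velocity [m/s], dispersion [m^2/s]
      half_life = 12.3 * 3.15e7                        # tritium half-life [s]

      x = np.zeros(n_parcels)          # all parcels released at the source
      mass = np.ones(n_parcels)        # unit mass per parcel

      for _ in range(n_steps):
          x += u * dt + rng.normal(0.0, np.sqrt(2.0 * D * dt), n_parcels)  # advection + dispersion
          mass *= 0.5 ** (dt / half_life)                                  # radioactive decay

      # concentration profile recovered by binning parcel masses
      hist, edges = np.histogram(x, bins=50, weights=mass)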

  6. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    International Nuclear Information System (INIS)

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a compatible systems approach to reinforce the need for utilizing compatible components to provide user friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search included present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums

  7. Experimental and Numerical Analysis of Hull Girder Vibrations and Bow Impact of a Large Ship Sailing in Waves

    Directory of Open Access Journals (Sweden)

    Jialong Jiao

    2015-01-01

    Full Text Available It is of great importance to evaluate the hull structural vibration response of large ships in extreme seas. Studies of the hydroelastic response of an ultra-large ship have been conducted, with comparative verification between experimental and numerical methods, in order to estimate the wave load response considering hull vibration and water impact. A segmented self-propelled model with a steel backbone system was designed and the experiments were performed in a tank. Time-domain numerical simulations of the ship were carried out by using three-dimensional nonlinear hydroelasticity theory. The results from the computational analyses have been correlated with those from the model tests.

  8. Verification of industrial x-ray machine: MINTs experience

    International Nuclear Information System (INIS)

    Aziz Amat; Saidi Rajab; Eesan Pasupathi; Saipo Bahari Abdul Ratan; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

    Industrial x-ray equipment is required to meet the Atomic Energy Licensing Board (AELB) radiation and electrical safety guidelines (LEM/TEK/42) at the time of installation, and periodic verification should subsequently be ensured. The purpose of the guide is to explain the requirements for testing industrial x-ray apparatus and certifying compliance with local legislation and regulations. Verification aims to provide safety assurance on electrical requirements and the minimum radiation exposure to the operator. This regulation is applied to new models imported into the Malaysian market. Since June 1997, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved by the AELB to provide verification services to private companies, government and corporate bodies throughout Malaysia. In early January 1997, the AELB made it mandatory that all x-ray equipment for industrial purposes (especially industrial radiography) fulfill certain performance tests based on the LEM/TEK/42 guidelines. MINT, as the third-party verifier, encourages users to improve maintenance of the equipment. MINT's experience in measuring the performance of intermittent- and continuous-duty-rating single-phase industrial x-ray machines in 2004 indicated that all of the irradiating apparatus tested passed and met the requirements of the guideline. From MINT records from 1997 to 2005, three x-ray models did not meet the requirements and thus were not allowed to be used unless the manufacturers were willing to modify them to meet AELB requirements. These verification procedures for the electrical and radiation safety of industrial x-ray equipment have significantly improved maintenance culture and safety awareness in the use of x-ray apparatus in the industrial environment. (Author)

  9. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  10. VBMC: a formal verification tool for VHDL programs

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-01-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into subparts and implementing each subpart in hardware and/or software as appropriate. With the increasing use of programmable devices like FPGAs, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since functional bugs in such hardware subsystems used in safety critical C and I systems have disastrous consequences, it is important to use rigorous reasoning to verify the functionality of the HDL models. This paper describes an indigenously developed software tool named VBMC (VHDL Bounded Model Checker) for mathematically proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts the hardware design as a VHDL program file, a functional property in PSL, and a verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)

  11. VBMC: a formal verification tool for VHDL program

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-08-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into sub-parts and implementing each sub-part in hardware and/or software as appropriate. With increasing use of programmable devices like FPGA, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since the functional bugs in such hardware subsystems used in safety critical C and I systems have serious consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This report describes the design of a software tool named VBMC (VHDL Bounded Model Checker). The capability of this tool is in proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts design as a VHDL program file, functional property in PSL, and verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample providing the reason of violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)
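
    The bounded model checking idea behind VBMC can be sketched on a toy finite transition system rather than VHDL: unroll the transition relation up to the verification bound and search for an input sequence that violates the property. The explicit enumeration below stands in for the symbolic (SAT/SMT-based) search a real bounded model checker would use.

      from itertools import product

      def bounded_check(init, transition, prop, inputs, bound):
          """Explore all input sequences up to `bound` cycles; return a counterexample
          (inputs, state trace) violating `prop`, or None if the property holds."""
          for k in range(1, bound + 1):
              for seq in product(inputs, repeat=k):
                  state, trace = init, [init]
                  for inp in seq:
                      state = transition(state, inp)
                      trace.append(state)
                      if not prop(state):
                          return seq[:len(trace) - 1], trace
          return None

      # toy 2-bit counter; the checked property is that state 3 is never reached
      print(bounded_check(init=0, transition=lambda s, i: (s + i) % 4,
                          prop=lambda s: s != 3, inputs=(0, 1), bound=4))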

  12. Independent verification: operational phase liquid metal breeder reactors

    International Nuclear Information System (INIS)

    Bourne, P.B.

    1981-01-01

    The Fast Flux Test Facility (FFTF) recently achieved 100-percent power and now is in the initial stages of operation as a test reactor. An independent verification program has been established to assist in maintaining stable plant conditions, and to assure the safe operation of the reactor. Independent verification begins with the development of administrative procedures to control all other procedures and changes to the plant configurations. The technical content of the controlling procedures is subject to independent verification. The actual accomplishment of test procedures and operational maneuvers is witnessed by personnel not responsible for operating the plant. Off-normal events are analyzed, problem reports from other operating reactors are evaluated, and these results are used to improve on-line performance. Audits are used to confirm compliance with established practices and to identify areas where individual performance can be improved

  13. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O&M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user friendly system based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify the accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by 'smart' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O&M) drawings

  14. Supplementation of Flow Accelerated Corrosion Prediction Program Using Numerical Analysis Technique

    International Nuclear Information System (INIS)

    Hwang, Kyeong Mo; Jin, Tae Eun; Park, Won; Oh, Dong Hoon

    2010-01-01

    Flow-accelerated corrosion (FAC) leads to thinning of steel pipe walls that are exposed to flowing water or wet steam. Experience shows that FAC damage to piping at fossil and nuclear plants can result in outages that require expensive repairs and can affect plant reliability and safety. CHECWORKS has been utilized in domestic nuclear plants as a predictive tool to assist FAC engineers in planning inspections and evaluating the inspection data so that piping failures caused by FAC can be prevented. However, CHECWORKS may occasionally overlook locally susceptible portions when predicting FAC damage in a group of pipelines after constructing a database for all the secondary-side piping in nuclear plants. This paper describes methodologies that can complement CHECWORKS and verifications of CHECWORKS prediction results using numerical analysis. FAC-susceptible locations determined using CHECWORKS for two pipeline groups of a nuclear plant were compared with those determined using numerical analysis based on FLUENT

  15. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    OpenAIRE

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting requirements set by CATS consortium based on requirements in Euro NCAP AEB protocols regarding accuracy, repeatability and reproducibility using the developed test hardware. For the cases where verification t...

  16. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  17. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  18. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  19. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  20. Time-space modal logic for verification of bit-slice circuits

    Science.gov (United States)

    Hiraishi, Hiromi

    1996-03-01

    The major goal of this paper is to propose a new modal logic aimed at the formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transitions and the other for space transitions. As a verification algorithm, a symbolic model checking algorithm for the new logic is presented. This could be applicable to the verification of bit-slice microprocessors of infinite bit width and 1D systolic arrays of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.
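    The distinguishing feature, two separate transition relations, can be illustrated with a small explicit-state reachability check. The paper's algorithm is symbolic (BDD-based); the sketch below is only an assumed toy version in Python, showing how a property may hold under time transitions alone yet fail once space (bit-slice) transitions are also allowed.

```python
# Explicit-state sketch of checking reachability over two transition relations,
# one for time steps and one for space (bit-slice) steps. The states and
# relations below are invented toy data, not from the paper.
from collections import deque

time_rel  = {("s0", "s1"), ("s1", "s0"), ("s2", "bad")}   # assumed time transitions
space_rel = {("s0", "s2"), ("s1", "s2")}                   # assumed space transitions
init = {"s0"}

def reachable(init_states, relations):
    seen, queue = set(init_states), deque(init_states)
    while queue:
        s = queue.popleft()
        for rel in relations:
            for a, b in rel:
                if a == s and b not in seen:
                    seen.add(b)
                    queue.append(b)
    return seen

# "bad" is unreachable under time steps alone, but becomes reachable once
# space transitions are considered as well.
print("time only:      ", "bad" in reachable(init, [time_rel]))
print("time and space: ", "bad" in reachable(init, [time_rel, space_rel]))
```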

  1. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    International Nuclear Information System (INIS)

    Chung, Jin Ho

    2016-01-01

    The objective of a future verification of a FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing the verification abilities of the Agency in terms of treaty verification and expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing a credible assurance of compliance with a future treaty. In these circumstances, it is necessary for the IAEA to be prepared to play a leading role in FMCT verification as a form of multilateral verification by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and different attitudes and cultures towards verification between NWSs and NNWSs

  2. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Jin Ho [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-05-15

    The objective of a future verification of a FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing the verification abilities of the Agency in terms of treaty verification and expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing a credible assurance of compliance with a future treaty. In these circumstances, it is necessary for the IAEA to be prepared to play a leading role in FMCT verification as a form of multilateral verification by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and different attitudes and cultures towards verification between NWSs and NNWSs.

  3. Development of a computerized portal verification scheme for pelvic treatment fields

    International Nuclear Information System (INIS)

    Nie, K.; Yin, F.-F.; Gao, Q.; Brasacchio, R.

    1996-01-01

    different scales. The result identified for the pelvic brim from the initial, larger searching range might not be desirable due to the influence of noise or the very low contrast at the middle of the pelvic structure. Therefore, the feature identified from the initial snake operation was fit to a polynomial function to generate a new snake template as the second approximation of the pelvic bone. Finally, a feature-weighted Chamfer matching method is applied to correlate anatomical features between reference and portal images in order to provide a quantitative assessment of patient setup. Results: The elliptic Fourier transformation method is shown to be an accurate and efficient method to verify the field setup. For a typical pelvic field edge, various shifts and rotations are simulated. A set of measures is then calculated for each shift or rotation and compared to those calculated from the original field edge. This comparison demonstrated 100% accuracy. The pyramid search using the double-snake model has been tested with both clinical simulation and portal images. Results indicated that the deviation of the computer-identified pelvic brim from the actual one was within 2 mm. The curve fitting method was effective against the strong noise from the background, such as the bladder and rectum markers in the simulation image. The technique has been shown to be very promising for the extraction of pelvic features. Conclusions: A fully computerized portal verification scheme is being developed for the radiation therapy of pelvic fields. The preliminary results show that the techniques developed here are potentially useful for clinical applications

  4. SoC Design Approach Using Convertibility Verification

    Directory of Open Access Journals (Sweden)

    Basu Samik

    2008-01-01

    Full Text Available Abstract Compositional design of systems on chip from preverified components helps to achieve shorter design cycles and time to market. However, the design process is affected by the issue of protocol mismatches, where two components fail to communicate with each other due to protocol differences. Convertibility verification, which involves the automatic generation of a converter to facilitate communication between two mismatched components, is a collection of techniques to address protocol mismatches. We present an approach to convertibility verification using module checking. We use Kripke structures to represent protocols and the temporal logic to describe desired system behavior. A tableau-based converter generation algorithm is presented which is shown to be sound and complete. We have developed a prototype implementation of the proposed algorithm and have used it to verify that it can handle many classical protocol mismatch problems along with SoC problems. The initial idea for -based convertibility verification was presented at SLA++P '07, as described in the work by Roopak Sinha et al. (2008).

  5. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  6. Verification of 3-D generation code package for neutronic calculations of WWERs

    International Nuclear Information System (INIS)

    Sidorenko, V.D.; Aleshin, S.S.; Bolobov, P.A.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Morozov, V.V.; Syslov, A.A.; Tsvetkov, V.M.

    2000-01-01

    Materials on verification of the 3rd-generation code package for WWER neutronic calculations are presented. The package includes: the spectral code TVS-M; the 2-D fine-mesh diffusion code PERMAK-A for 4- or 6-group calculation of WWER core burnup; and the 3-D coarse-mesh diffusion code BIPR-7A for 2-group calculations of quasi-stationary WWER regimes. The materials include both TVS-M verification data and verification data on the PERMAK-A and BIPR-7A codes using constant libraries generated with TVS-M. All materials relate to fuel without Gd. The TVS-M verification materials include results of comparison both with benchmark calculations obtained by other codes and with experiments carried out at the ZR-6 critical facility. The PERMAK-A verification materials contain results of comparison with TVS-M calculations and with ZR-6 experiments. The BIPR-7A materials include comparison with operation data for the Dukovany-2 and Loviisa-1 NPPs (WWER-440) and for Balakovo NPP Unit 4 (WWER-1000). The verification materials demonstrate rather good accuracy of the calculations obtained with the 3rd-generation code package. (Authors)

  7. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
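    The verification-planning entities described above map naturally onto a small data model. The sketch below is a hypothetical paraphrase in Python dataclasses, not the actual LSST SysML profile or Enterprise Architect schema; the field and enum names are assumptions based on the abstract.

```python
# Assumed data model of verification-planning elements: each requirement gets a
# Verification Plan (requirement, success criteria, methods, level, owner),
# methods become Activities linked to PMCS tasks, and Activities are grouped
# into Events that can be executed together.
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Method(Enum):
    INSPECTION = "inspection"
    ANALYSIS = "analysis"
    DEMONSTRATION = "demonstration"
    TEST = "test"

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: List[Method]
    level: str                 # e.g. subsystem vs. integrated system (assumed values)
    owner: str

@dataclass
class VerificationActivity:
    plan: VerificationPlan
    method: Method
    pmcs_task_id: str          # link to the scheduled, costed PMCS (Primavera P6) task

@dataclass
class VerificationEvent:
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)
```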

  8. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    Energy Technology Data Exchange (ETDEWEB)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
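    One of the coupled pieces named above, the point reactor kinetics equations, can be sketched on its own. The fragment below integrates the standard six-delayed-group point kinetics equations with SciPy for a constant reactivity step; all parameter values are illustrative assumptions rather than ACRR data, and none of the thermal-hydraulic feedback that Razorback couples in is included.

```python
# Hedged sketch of six-group point reactor kinetics; parameters are assumed.
import numpy as np
from scipy.integrate import solve_ivp

beta_i = np.array([2.3e-4, 1.5e-3, 1.4e-3, 2.8e-3, 9.2e-4, 3.2e-4])  # delayed fractions (assumed)
lam_i  = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])        # decay constants [1/s] (assumed)
beta   = beta_i.sum()
Lambda = 1.0e-5        # prompt neutron generation time [s] (assumed)
rho    = 0.5 * beta    # constant reactivity insertion for this toy case

def kinetics(t, y):
    n, c = y[0], y[1:]
    dn = (rho - beta) / Lambda * n + np.dot(lam_i, c)
    dc = beta_i / Lambda * n - lam_i * c
    return np.concatenate(([dn], dc))

# start from equilibrium at unit power: c_i = beta_i * n / (Lambda * lambda_i)
y0 = np.concatenate(([1.0], beta_i / (Lambda * lam_i)))
sol = solve_ivp(kinetics, (0.0, 1.0), y0, method="LSODA", rtol=1e-8)
print("relative power after 1 s:", sol.y[0, -1])
```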

  9. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes in compliance with verification and validation requirements

  10. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42 mm miniature replica step gauge developed for optical scanner verification. Error quantification and optimization of the CT system set-up in terms of resolution and measurement accuracy are fundamental for the use of CT scanning...

  11. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  12. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  13. Measurements of Worldwide Radioxenon Backgrounds - The "EU" Project

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, Ted W.; Cooper, Matthew W.; Hayes, James C.; Forrester, Joel B.; Haas, Derek A.; Hansen, Randy R.; Keller, Paul E.; Kirkham, Randy R.; Lidey, Lance S.; McIntyre, Justin I.; Miley, Harry S.; Payne, Rosara F.; Saey, Paul R.; Thompson, Robert C.; Woods, Vincent T.; Williams, Richard M.

    2009-09-24

    Under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), radioactive xenon (radioxenon) measurements are one of the principal techniques used to detect underground nuclear explosions; specifically, the presence of one or more radioxenon isotopes allows one to determine whether a suspected event was a nuclear explosion or originated from an innocent source. During the design of the International Monitoring System (IMS), which serves as the verification mechanism for the Treaty, it was determined that radioxenon measurements should be performed at 40 or more stations worldwide. At the time of the design of the IMS, however, very few details about the background of the xenon isotopes were known, and it is now recognized that the backgrounds were probably evolving in any case. This paper lays out the beginning of a study of the worldwide concentrations of the xenon isotopes that can be used to detect nuclear explosions, as well as of several sources that also release radioxenon and that will have to be accounted for during analysis of atmospheric levels. Although the global concentrations of the xenon isotopes are the subject of a much larger activity that could span several years, this study measures radioxenon concentrations in locations where there was either very little information or a unique opportunity to learn more about emissions from known sources. The locations where radioxenon levels were measured and reported are included.

  14. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of their knowledge bases takes on an important position. The conventional Petri net approach that has recently been studied for verifying knowledge bases is found to be inadequate for the knowledge base of a large and complex system, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
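    The basic idea of casting a rule base as a Petri net and analyzing reachability can be shown in a few lines. The sketch below uses an ordinary 1-safe place/transition net with an invented toy rule base, not the enhanced colored Petri net of the paper; a rule whose preconditions are never satisfied in any reachable marking is flagged as a dead (unfireable) rule, one of the anomalies such verification looks for.

```python
# Toy Petri-net view of a rule base: places model facts, transitions model
# rules. The rule base and initial marking below are assumptions.
rules = {
    "r1": ({"alarm_A"}, {"fault_X"}),
    "r2": ({"alarm_B", "fault_X"}, {"fault_Y"}),
    "r3": ({"alarm_C"}, {"fault_Z"}),          # never enabled from this initial marking
}
initial_marking = frozenset({"alarm_A", "alarm_B"})

def reachable_markings(m0):
    seen, frontier = {m0}, [m0]
    while frontier:
        m = frontier.pop()
        for pre, post in rules.values():
            if pre <= m:                        # transition enabled
                m2 = frozenset((m - pre) | post)
                if m2 not in seen:
                    seen.add(m2)
                    frontier.append(m2)
    return seen

markings = reachable_markings(initial_marking)
dead = [r for r, (pre, _) in rules.items() if not any(pre <= m for m in markings)]
print("reachable markings:", len(markings), "dead rules:", dead)
```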

  15. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of their knowledge bases takes on an important position. The conventional Petri net approach that has recently been studied for verifying knowledge bases is found to be inadequate for the knowledge base of a large and complex system, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  16. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To transform feature difference of an image pair to be discriminative for kinship verification, a linear transformation matrix for feature difference between an image pair...... than the commonly used feature concatenation, leading to a low complexity. Furthermore, there is no positive semi-definite constraint on the transformation matrix, while there is one in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public...... databases show that the proposed method combined with a SVM classification method outperforms or is comparable to state-of-the-art kinship verification methods. © Springer International Publishing AG, Part of Springer Science+Business Media...
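    The pipeline sketched below mirrors the abstract only at a coarse level: a linear map applied to the feature difference of an image pair, followed by an SVM for the kin / not-kin decision. The transformation here is a random stand-in and the data are synthetic; the paper's contribution, learning that transformation discriminatively from neighbor pairs, is not reproduced.

```python
# Rough, assumed sketch: transformed feature difference of an image pair + SVM.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
d, k, n_pairs = 512, 64, 200

# toy data: pairs of d-dimensional face descriptors and kin/not-kin labels
X1 = rng.normal(size=(n_pairs, d))
X2 = X1 + rng.normal(scale=0.3, size=(n_pairs, d))    # first half of pairs are "related"
X2[n_pairs // 2:] = rng.normal(size=(n_pairs // 2, d))
y = np.array([1] * (n_pairs // 2) + [0] * (n_pairs // 2))

W = rng.normal(size=(d, k)) / np.sqrt(d)              # stand-in for the learned transform
Z = np.abs(X1 - X2) @ W                               # transformed feature difference

clf = SVC(kernel="rbf").fit(Z[::2], y[::2])           # train on even-indexed pairs
print("held-out accuracy:", clf.score(Z[1::2], y[1::2]))
```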

  17. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  18. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  19. Is identity per se irrelevant? A contrarian view of self-verification effects

    OpenAIRE

    Gregg, Aiden P.

    2008-01-01

    Self-verification theory (SVT) posits that people who hold negative self-views, such as depressive patients, ironically strive to verify that these self-views are correct, by actively seeking out critical feedback or interaction partners who evaluate them unfavorably. Such verification strivings are allegedly directed towards maximizing subjective perceptions of prediction and control. Nonetheless, verification strivings are also alleged to stabilize maladaptive self-perceptions, and thereby ...

  20. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  1. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    Science.gov (United States)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
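    For orientation, a minimum detectable activity can be illustrated with the standard Currie formulation; this is a generic textbook calculation, not the real-time MDA method implemented for NSCRAD, and every number in the example is assumed.

```python
# Illustrative Currie-style MDA estimate (generic, not the NSCRAD method).
import math

def currie_mda(background_counts, live_time_s, efficiency, branching_ratio):
    """MDA in becquerels for a counting measurement with a paired background."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)   # detection limit in counts
    return l_d / (efficiency * branching_ratio * live_time_s)

# e.g. 400 background counts in a 2 s airborne spectrum, 5% absolute efficiency,
# 85% gamma emission probability for the line of interest (all assumed)
print(f"MDA ~ {currie_mda(400, 2.0, 0.05, 0.85):.0f} Bq")
```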

  2. Core power capability verification for PWR NPP

    International Nuclear Information System (INIS)

    Xian Chunyu; Liu Changwen; Zhang Hong; Liang Wei

    2002-01-01

    The principle and methodology of core power capability verification for reload of pressurized water reactor nuclear power plants are introduced. The radial and axial power distributions for normal operation (category I or condition I) and abnormal operation (category II or condition II) are simulated using a neutronics calculation code. The linear power density margin and DNBR margin for both categories, which reflect core safety, are analyzed from the point of view of reactor physics and T/H, and thus the category I operating domain and category II protection set points are verified. Besides, the verification results for a reference NPP are also given

  3. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  4. A transformation of SDL specifications : a step towards the verification

    NARCIS (Netherlands)

    Ioustinova, N.; Sidorova, N.; Bjorner, D.; Broy, M.; Zamulin, A.

    2001-01-01

    Industrial-size specifications/models (whose state space is often infinite) cannot be model checked in a direct way; a verification model of a system is model checked instead. Program transformation is a way to build a finite-state verification model that can be submitted to a model checker.

  5. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    Science.gov (United States)

    2012-07-16

    also require considerable manual effort. For example, the verification of the seL4 operating system [45] required several man-years of effort. [45] ...Winwood. seL4: formal verification of an OS kernel. In Proc. of SOSP, 2009. [46] K. Kortchinsky. Cloudburst: A VMware guest to host escape story

  6. A Proof-checked Verification of a Real-Time Communication Protocol

    NARCIS (Netherlands)

    Polak, I.

    We present an analysis of a protocol developed by Philips to connect several components of an audio-system. The verification of the protocol is carried out using the timed I/O-automata model of Lynch and Vaandrager. The verification has been partially proof-checked with the interactive proof

  7. Modular Verification of Linked Lists with Views via Separation Logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2011-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for .NET. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since...

  8. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with larger completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road data base verification can be designed.
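    The Dempster-Shafer combination step mentioned above can be sketched directly. The snippet below combines two modules' basic mass assignments over the frame {correct, incorrect}, with "unknown" represented as mass on the whole frame; the per-module masses are invented for illustration and the module-applicability distribution of the paper is not modelled.

```python
# Dempster's rule of combination over the frame {correct, incorrect}.
from itertools import product

FRAME = frozenset({"correct", "incorrect"})

def combine(m1, m2):
    """Dempster's rule for two basic mass assignments keyed by frozensets."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# two road-verification modules, each spreading belief over correct / incorrect / unknown
module_a = {frozenset({"correct"}): 0.6, frozenset({"incorrect"}): 0.1, FRAME: 0.3}
module_b = {frozenset({"correct"}): 0.5, frozenset({"incorrect"}): 0.2, FRAME: 0.3}
print(combine(module_a, module_b))
```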

  9. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    Full Text Available With the development of wireless technology, much data communication and processing has been conducted on mobile devices with wireless connections. Mobile devices will always be resource-poor relative to static ones, even as their absolute capabilities improve; therefore, they cannot perform some expensive computational tasks due to their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices can outsource some expensive computation to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable, especially when the message includes secret information. In this paper, we mainly study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, both of which are proved secure.

  10. China's numerical management system for reducing national energy intensity

    International Nuclear Information System (INIS)

    Li, Huimin; Zhao, Xiaofan; Yu, Yuqing; Wu, Tong; Qi, Ye

    2016-01-01

    In China, the national target for energy intensity reduction, when integrated with target disaggregation and information feedback systems, constitutes a numerical management system, which is a hallmark of modern governance. This paper points out the technical weaknesses of China's current numerical management system. In the process of target disaggregation, the national target cannot be fully disaggregated to local governments, sectors and enterprises without omissions. At the same time, governments at lower levels face pressure for reducing energy intensity that exceeds their respective jurisdictions. In the process of information feedback, information failure is inevitable due to statistical inaccuracy. Furthermore, the monitoring system is unable to correct all errors, and data verification plays a limited role in the examination system. To address these problems, we recommend that the government: use total energy consumption as the primary indicator of energy management; reform the accounting and reporting of energy statistics toward greater consistency, timeliness and transparency; clearly define the responsibility of the higher levels of government. - Highlights: •We assess drawbacks of China's numerical management system for energy intensity. •The national energy intensity target cannot be fully disaggregated without omissions. •Data distortion is due to failures in statistics, monitoring and examination system. •Lower-level governments’ ability to meet energy target is weaker than their pressure. •We provide three policy recommendations for China's policy-makers.

  11. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  12. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available The broad use of handwritten signatures for personal verification in financial institutions creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which a signature verification tool is judged in banking and other financial institutions.
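    A minimal sketch of the feature-matching stage might look like the following, using OpenCV's SIFT implementation and a simple distance-ratio score. The file names, ratio test, and decision threshold are assumptions, and the paper's full pipeline (filtering, edge detection, SURF features, per-customer database) is not reproduced.

```python
# Assumed sketch of SIFT-based matching between a reference and a questioned
# signature image; requires opencv-python >= 4.4 and assumes both images yield
# descriptors.
import cv2

ref = cv2.imread("reference_signature.png", cv2.IMREAD_GRAYSCALE)   # hypothetical files
qst = cv2.imread("questioned_signature.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(ref, None)
kp2, des2 = sift.detectAndCompute(qst, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]    # Lowe ratio test

score = len(good) / max(len(kp1), 1)
print("genuine" if score > 0.2 else "suspect", f"(match ratio {score:.2f})")
```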

  13. Patient set-up verification by infrared optical localization and body surface sensing in breast radiation therapy

    International Nuclear Information System (INIS)

    Spadea, Maria Francesca; Baroni, Guido; Riboldi, Marco; Orecchia, Roberto; Pedotti, Antonio; Tagaste, Barbara; Garibaldi, Cristina

    2006-01-01

    Background and purpose: The aim of the study was to investigate the clinical application of a technique for patient set-up verification in breast cancer radiotherapy, based on the 3D localization of a hybrid configuration of surface control points. Materials and methods: An infrared optical tracker provided the 3D positions of two passive markers and 10 laser spots placed around and within the irradiation field on nine patients. A fast iterative constrained minimization procedure was applied to detect and compensate for patient set-up errors, through registration of the control points with reference data coming from the treatment plan (marker reference positions, CT-based surface model). Results: The application of the corrective spatial transformation estimated by the registration procedure led to significant improvement of patient set-up. The median value of 3D errors affecting three additional verification markers within the irradiation field decreased from 5.7 to 3.5 mm. Error variability (25-75%) decreased from 3.2 to 2.1 mm. Registration of the laser spots on the reference surface model was documented to contribute substantially to set-up error compensation. Conclusions: Patient set-up verification through a hybrid set of control points and a constrained surface minimization algorithm was confirmed to be feasible in clinical practice and to provide valuable information for the improvement of the quality of patient set-up, with minimal requirement for operator-dependent procedures. The technique conveniently combines the advantages of passive-marker-based methods and surface registration techniques, featuring immediate and robust estimation of set-up accuracy from a redundant dataset
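    The core geometric step behind this kind of set-up correction, rigidly registering measured control points to their planned reference positions, can be sketched with the classical Kabsch/Procrustes solution. This is a generic least-squares registration in NumPy, not the constrained surface-minimization procedure of the paper, and the point coordinates are assumed.

```python
# Rigid registration (rotation R, translation t) of measured points onto
# reference points via the Kabsch algorithm; all coordinates are invented.
import numpy as np

def rigid_register(measured, reference):
    """Return R, t minimizing || R @ measured_i + t - reference_i || in the least-squares sense."""
    p_c = measured - measured.mean(axis=0)
    q_c = reference - reference.mean(axis=0)
    u, _, vt = np.linalg.svd(p_c.T @ q_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))            # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = reference.mean(axis=0) - r @ measured.mean(axis=0)
    return r, t

# toy example: a few control-point positions in mm, with a pure set-up translation error
ref = np.array([[0, 0, 0], [50, 0, 0], [0, 60, 10], [20, 30, 5], [40, 10, 8]], float)
meas = ref + np.array([3.0, -2.0, 1.0])
R, t = rigid_register(meas, ref)
print("estimated translational correction (mm):", np.round(t, 1))
```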

  14. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  15. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    Science.gov (United States)

    Dartevelle, S.

    2006-12-01

    Large-scale volcanic eruptions are inherently hazardous events and cannot be described by detailed and accurate in situ measurements; hence, volcanic explosive phenomenology is inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to Verify and Validate computer codes developed to model these geophysical events as a whole. However, code Verification and Validation remains a necessary step, particularly as volcanologists increasingly use numerical results for the mitigation of volcanic hazards. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is rather simple to achieve formally, while, in the 'real world' explosive volcanism context, the second step, Validation, is nearly impossible. Hence, instead of validating computer codes against the whole large-scale, unconstrained volcanic phenomenology, we rather suggest focusing on the key physics that control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments, which uniquely and unambiguously represent these two key phenomenologies separately. Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase CFD FORTRAN codes, which have been recently redeveloped to meet the strict

  16. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... how this model can be refined to target both verification and implementation....

  17. Natural radioactivity in the Dutch outdoor environment. The explanation of uncomprehended variations in the background

    International Nuclear Information System (INIS)

    Blaauboer, R.O.; Smetsers, R.C.G.M.

    1996-01-01

    In the Netherlands and many other countries, research in the field of natural radioactivity is focused on the prevention of radon in the indoor environment. However, the occurrence of natural radioactivity in the outdoor environment is also an interesting subject of study. The natural background radiation in the outdoor environment, in particular its variations, hinders the verification of standards for radiation levels caused by human activities. An analysis of the data from the Dutch National Monitoring Network for Radioactivity (LMR) provided more insight into those variations. This article is a summary of the authors' thesis on the subject. 5 figs., 8 refs

  18. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (computer-based system). These software modules can be either in-house designs or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan

  19. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to benefit the operations lock and tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S PETROTAC

    Science.gov (United States)

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S TECHSUPPRESS

    Science.gov (United States)

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  2. Parallel verification of dynamic systems with rich configurations

    OpenAIRE

    Pessoa, Eduardo José Dias

    2016-01-01

    Master's dissertation in Informatics Engineering (specialization in Informatics). Model checking is a technique used to automatically verify a model which represents the specification of some system. To ensure the correctness of the system, the verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the respective verification is performed by its model checker. Most modeling frameworks are not...

  3. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been recognized as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it otherwise requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and waveguide bends based on the SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the substantial time of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurements of the fabricated devices have been compared to the derived models and show very good agreement. The matching can reach 100% by calibrating certain parameters in the model.

  4. Dosimetric verification of the dynamic intensity modulated radiotherapy (IMR) of 21 patients

    International Nuclear Information System (INIS)

    Tsai, J.-S.; Engler, Mark J.; Ling, Marilyn N.; Wu, Julian; Kramer, Bradley; Fagundes, Marcio; Dipetrillo, Thomas; Wazer, David E.

    1996-01-01

    exposed to known doses, and a high-speed, 300 dots-per-inch scanner driven by Photoshop software. Film data were analyzed with NIH Image software. Absolute dose verification was achieved with TLD in the anthropomorphic phantom and diodes and ion chambers in calibration phantom slabs. Phantom setup closely simulated the patient's CT and treatment setups. Interslab spaces for films and phantom position were chosen to best sample conformity of the tumor prescription dose and compliance of the maximum measured dose in normal tissues with the doses entered as constraints. Verifications applied to commission the system consisted of annealing the cost function for simulated targets in the anthropomorphic phantom, and then comparing planned with measured doses. Subsequently, a 'hybrid' verification was performed where the beam set obtained from patient geometry was detached from the patient anatomic files and applied to calculate doses in the phantoms, followed by a comparison of measured with planned doses. Phantom slabs and positions were carefully selected to obtain an average TMR to the gantry isocenter in the phantom within 2% of the average within the patient. In vivo dosimetry was obtained with TLD under 1 cm of bolus at the location of the maximum skin surface dose. Plans were reoptimized including the contour of the added bolus to improve the accuracy of the measurement. The average leakage dose was assumed to be 0.4% of the total monitor units of the treatment. Results: Verification of planned dose distributions simulated in phantom indicated agreement of planned with measured doses within ±5% throughout numerous transverse-plane films for 18 of 21 patients. In three patients with unusually large and complex-shaped tumors, planned monitor units were altered to compensate for verifications indicating up to 10% differences between planned and measured doses. TLD in the phantom indicated improved agreement of absolute dose within ±5%. However, the accuracy of initial 'hybrid' verifications of patient

  5. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: Field programmable gate array (FPGA), Triple Modular Redundancy (TMR), Verification, Trust, Reliability.
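    As a toy illustration of the property such a verification must ultimately confirm, the sketch below triplicates a small combinational function behind a 2-of-3 majority voter and checks exhaustively that any single-copy fault is masked. It is a behavioural Python model, not the netlist-level search-detect-and-verify flow the manuscript proposes.

```python
# Behavioural sketch: a triplicated function with a majority voter masks any
# single-copy fault. The protected function and fault model are assumptions.
from itertools import product

def voter(a, b, c):
    """Bitwise 2-of-3 majority."""
    return (a & b) | (a & c) | (b & c)

def logic(x, y):                 # the protected combinational function (example)
    return x ^ y

def tmr(x, y, fault_copy=None):
    outs = [logic(x, y) for _ in range(3)]
    if fault_copy is not None:
        outs[fault_copy] ^= 1    # inject a single-event upset into one copy
    return voter(*outs)

# exhaustive check: every single-copy fault is masked for every input
assert all(tmr(x, y, f) == logic(x, y)
           for x, y in product((0, 1), repeat=2) for f in (None, 0, 1, 2))
print("single faults masked for all inputs")
```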

  6. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.; Ryabov, Yu.

    2003-01-01

    Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators, who are not aware of system functionality details. It is therefore necessary to help the operator to control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. The test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about component hierarchy and dependencies, and allowing the operator to verify the fun...
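
    The sequencing idea described above, ordering component tests using knowledge of the component hierarchy and dependencies, amounts to a topological ordering of a dependency graph. The sketch below is a minimal Python illustration with hypothetical component names; it is not the ATLAS framework itself.

```python
from graphlib import TopologicalSorter   # Python 3.9+

# Hypothetical component dependency graph: each component lists what must be
# up before its own test can run.
dependencies = {
    "readout_node": {"network_switch"},
    "event_builder": {"readout_node", "network_switch"},
    "hlt_farm": {"event_builder"},
}

def run_test(component: str) -> bool:
    """Placeholder low-level test; a real framework would spawn test processes."""
    print(f"testing {component} ...")
    return True

def verify_all(deps: dict) -> bool:
    """Run per-component tests in dependency order; stop at the first failure."""
    for component in TopologicalSorter(deps).static_order():
        if not run_test(component):
            print(f"diagnosis needed: {component} failed")
            return False
    return True

if __name__ == "__main__":
    verify_all(dependencies)
```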

  7. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric.  Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  8. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  9. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  10. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    Science.gov (United States)

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  11. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The relatively new pervasive computing paradigm has changed the way people use computing devices. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, or may impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  12. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.
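
    Code Contracts itself targets .NET languages, so the snippet below is only a Python analogue of the runtime contract checking idea: a decorator that evaluates a precondition before and a postcondition after the call. The function name and conditions are illustrative assumptions, not part of the Microsoft tooling.

```python
import functools

def contract(pre=None, post=None):
    """Attach a precondition and a postcondition to a function (runtime checks)."""
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {func.__name__} violated"
            result = func(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition of {func.__name__} violated"
            return result
        return wrapper
    return decorate

@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def integer_sqrt(x: int) -> int:
    """Largest n with n*n <= x."""
    n = int(x ** 0.5)
    while n * n > x:          # guard against floating-point rounding
        n -= 1
    return n

print(integer_sqrt(17))  # 4
```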

  13. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems mainly as operator supporting systems is becoming increasingly popular as the control algorithms of system become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems are developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practices have produced numerous methods for expert system verification and validation (V and V) that suggest traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for validation process to show that the reasoning path is correct

  14. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

    Full Text Available Experimental verifications of two optimized adaptive digital signal processing algorithms implemented in two preset time count rate meters were performed according to appropriate standards. The random pulse generator, realized using a personal computer, was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Then measurement results for background radiation levels were obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurement results, obtained without and with radioisotopes for the specified errors of 10% and 5%, agreed well with theoretical predictions.
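
    The statistics underlying a preset-time count rate meter are easy to sketch: for Poisson counting, the relative statistical error of N accumulated counts is 1/sqrt(N), so the preset time fixes the achievable error for a given rate. The snippet below illustrates that relation for the 10% and 5% error levels mentioned above; it is a toy Poisson simulation, not the adaptive algorithms evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def preset_time_measurement(true_rate_cps: float, preset_time_s: float):
    """Count Poisson-distributed pulses for a fixed (preset) time."""
    counts = rng.poisson(true_rate_cps * preset_time_s)
    rate = counts / preset_time_s
    rel_error = 1.0 / np.sqrt(counts) if counts > 0 else float("inf")
    return rate, rel_error

def time_for_relative_error(true_rate_cps: float, rel_error: float) -> float:
    """Preset time needed so that 1/sqrt(N) reaches the specified error."""
    return 1.0 / (rel_error ** 2 * true_rate_cps)

if __name__ == "__main__":
    for target in (0.10, 0.05):      # the 10% and 5% error levels
        t = time_for_relative_error(true_rate_cps=50.0, rel_error=target)
        rate, err = preset_time_measurement(50.0, t)
        print(f"target {target:.0%}: preset time {t:.1f} s, "
              f"measured {rate:.1f} cps, estimated error {err:.1%}")
```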

  15. Detailed numerical simulations of laser cooling processes

    Science.gov (United States)

    Ramirez-Serrano, J.; Kohel, J.; Thompson, R.; Yu, N.

    2001-01-01

    We developed a detailed semiclassical numerical code of the forces applied to atoms in optical and magnetic fields, to increase the understanding of the different roles that light, atomic collisions, background pressure, and the number of particles play in experiments with laser-cooled and trapped atoms.

  16. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the US. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  17. Numerical investigation of high temperature synthesis gas premixed combustion via ANSYS Fluent

    Directory of Open Access Journals (Sweden)

    Pashchenko Dmitry

    2018-01-01

    Full Text Available A numerical model of synthesis gas premixed combustion is developed. The research was carried out with the ANSYS Fluent software, and the numerical results were verified against experimental data. A visual comparison of the flame contours obtained from synthesis gas combustion at Re = 600, 800, and 1000 was performed. A comparison of the wall temperature of the combustion chamber obtained with the developed model against the results of a physical experiment was also presented. For all cases, good agreement of the results is observed. It is established that a change in the temperature of the syngas/air mixture at the inlet to the combustion chamber does not significantly affect the temperature of the combustion products, due to the dissociation of the H2O and CO2 molecules. The obtained results are of practical importance for the design of heat engineering plants with thermochemical heat recovery.

  18. Experience in non-proliferation verification: The Treaty of Raratonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Raratonga are subdivided into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty of Raratonga is relevant to IAEA safeguards in its support of the continued effectiveness of the international non-proliferation system based on the Non-Proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well.

  19. Experimental verification of radial magnetic levitation force on the cylindrical magnets in ferrofluid dampers

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Wenming, E-mail: wenming_y@126.com [School of Mechanical Engineering, University of Science and Technology Beijing, Beijing 100083 (China); Wang, Pengkai [School of Mechanical Engineering, University of Science and Technology Beijing, Beijing 100083 (China); Hao, Ruican [School of Mechanical Engineering, Beijing Polytechnic, Beijing 100176 (China); Ma, Buchuan [Beijing Institute of Aerospace Control Devices, Beijing 100854 (China)

    2017-03-15

    Analytical and numerical calculation methods for the radial magnetic levitation force on cylindrical magnets in cylindrical vessels filled with ferrofluid were reviewed. An experimental apparatus to measure this force was designed and built, which could measure forces in a range of 0–2.0 N with an accuracy of 0.001 N. After calibration, this apparatus was used to study the radial magnetic levitation force experimentally. The results showed that the numerical method overestimates this force, while the analytical ones underestimate it. The maximum deviation between the numerical results and the experimental ones was 18.5%, while that between the experimental results and the analytical ones reached 68.5%. The latter deviation narrowed with the lengthening of the magnets. With the aid of the experimental verification of the radial magnetic levitation force, the effect of the eccentric distance of the magnets on the viscous energy dissipation in ferrofluid dampers could be assessed. It was shown that ignoring the eccentricity of the magnets in the estimation could overestimate the viscous dissipation in ferrofluid dampers. - Highlights: • An experimental method for measuring the magnetic levitation force in ferrofluid was studied. • A simple but effective apparatus was designed and built. • The apparatus can measure forces in a range of 0–2.0 N with an accuracy of 0.001 N. • Existing methods for calculating the magnetic levitation force were verified experimentally.

  20. Experimental verification of radial magnetic levitation force on the cylindrical magnets in ferrofluid dampers

    International Nuclear Information System (INIS)

    Yang, Wenming; Wang, Pengkai; Hao, Ruican; Ma, Buchuan

    2017-01-01

    Analytical and numerical calculation methods for the radial magnetic levitation force on cylindrical magnets in cylindrical vessels filled with ferrofluid were reviewed. An experimental apparatus to measure this force was designed and built, which could measure forces in a range of 0–2.0 N with an accuracy of 0.001 N. After calibration, this apparatus was used to study the radial magnetic levitation force experimentally. The results showed that the numerical method overestimates this force, while the analytical ones underestimate it. The maximum deviation between the numerical results and the experimental ones was 18.5%, while that between the experimental results and the analytical ones reached 68.5%. The latter deviation narrowed with the lengthening of the magnets. With the aid of the experimental verification of the radial magnetic levitation force, the effect of the eccentric distance of the magnets on the viscous energy dissipation in ferrofluid dampers could be assessed. It was shown that ignoring the eccentricity of the magnets in the estimation could overestimate the viscous dissipation in ferrofluid dampers. - Highlights: • An experimental method for measuring the magnetic levitation force in ferrofluid was studied. • A simple but effective apparatus was designed and built. • The apparatus can measure forces in a range of 0–2.0 N with an accuracy of 0.001 N. • Existing methods for calculating the magnetic levitation force were verified experimentally.

  1. Experimental verification of boundary conditions for numerical simulation of airflow in a benchmark ventilation channel

    Directory of Open Access Journals (Sweden)

    Lizal Frantisek

    2016-01-01

    Full Text Available Correct definition of boundary conditions is crucial for the appropriate simulation of a flow. It is a common practice that simulation of sufficiently long upstream entrance section is performed instead of experimental investigation of the actual conditions at the boundary of the examined area, in the case that the measurement is either impossible or extremely demanding. We focused on the case of a benchmark channel with ventilation outlet, which models a regular automotive ventilation system. At first, measurements of air velocity and turbulence intensity were performed at the boundary of the examined area, i.e. in the rectangular channel 272.5 mm upstream the ventilation outlet. Then, the experimentally acquired results were compared with results obtained by numerical simulation of further upstream entrance section defined according to generally approved theoretical suggestions. The comparison showed that despite the simple geometry and general agreement of average axial velocity, certain difference was found in the shape of the velocity profile. The difference was attributed to the simplifications of the numerical model and the isotropic turbulence assumption of the used turbulence model. The appropriate recommendations were stated for the future work.

  2. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
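
    As a concrete illustration of the fine-grained testing advocated here, the hypothetical pytest example below exercises a single small numerical kernel in isolation, with an explicit floating-point tolerance, instead of relying on a full simulation run. The kernel and its reference values are invented for the example.

```python
import numpy as np
import pytest

def column_mean_temperature(temps_k, layer_masses_kg):
    """Mass-weighted mean temperature of an atmospheric column (toy kernel)."""
    temps = np.asarray(temps_k, dtype=float)
    masses = np.asarray(layer_masses_kg, dtype=float)
    return float(np.sum(temps * masses) / np.sum(masses))

def test_uniform_column_is_exact():
    # A uniform column must return the uniform value regardless of layer masses.
    assert column_mean_temperature([250.0] * 5, [1.0, 2.0, 3.0, 4.0, 5.0]) == pytest.approx(250.0)

def test_against_independent_reference():
    # Compare against an independently computed reference with a tight tolerance.
    temps = [210.0, 230.0, 260.0, 280.0]
    masses = [0.5, 1.0, 2.0, 1.5]
    reference = (210 * 0.5 + 230 * 1.0 + 260 * 2.0 + 280 * 1.5) / 5.0
    assert column_mean_temperature(temps, masses) == pytest.approx(reference, rel=1e-12)
```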

  3. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process algebraic language for the specification and verification of distributed systems. µCRL allows one to describe temporal properties of distributed systems, but it has no explicit reference to time. In this work we propose a manner of introducing discrete time without extending the language.

  4. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  5. Numerical analysis of electromagnetic fields

    CERN Document Server

    Zhou Pei Bai

    1993-01-01

    Numerical methods for solving boundary value problems have developed rapidly. Knowledge of these methods is important both for engineers and scientists. There are many books published that deal with various approximate methods such as the finite element method, the boundary element method and so on. However, there is no textbook that includes all of these methods. This book is intended to fill this gap. The book is designed to be suitable for graduate students in engineering science, for senior undergraduate students as well as for scientists and engineers who are interested in electromagnetic fields. Objective Numerical calculation is the combination of mathematical methods and field theory. A great number of mathematical concepts, principles and techniques are discussed and many computational techniques are considered in dealing with practical problems. The purpose of this book is to provide students with a solid background in numerical analysis of the field problems. The book emphasizes the basic theories ...
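
    A minimal example of the kind of field computation such numerical methods address is a finite-difference solution of Laplace's equation for an electrostatic potential. The sketch below uses plain Jacobi iteration on a square region with illustrative boundary potentials; it is not an example taken from the book.

```python
import numpy as np

def solve_laplace(n=50, v_top=100.0, tol=1e-4, max_iter=20000):
    """Jacobi iteration for Laplace's equation on an n x n grid.

    Dirichlet boundary conditions: the top edge is held at v_top volts,
    the other three edges at 0 V.  Returns the converged potential array.
    """
    v = np.zeros((n, n))
    v[0, :] = v_top                       # fixed potential on the top edge
    for _ in range(max_iter):
        v_new = v.copy()
        # Interior update: average of the four nearest neighbours
        v_new[1:-1, 1:-1] = 0.25 * (v[:-2, 1:-1] + v[2:, 1:-1] +
                                    v[1:-1, :-2] + v[1:-1, 2:])
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

potential = solve_laplace()
print(potential[25, 25])   # potential at the centre of the region
```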

  6. Entanglement verification and its applications in quantum communication; Verschraenkungsnachweise mit Anwendungen in der Quantenkommunikation

    Energy Technology Data Exchange (ETDEWEB)

    Haeseler, Hauke

    2010-02-16

    In this thesis, we investigate the uses of entanglement and its verification in quantum communication. The main object here is to develop a verification procedure which is adaptable to a wide range of applications, and whose implementation has low requirements on experimental resources. We present such a procedure in the form of the Expectation Value Matrix. The structure of this thesis is as follows: Chapters 1 and 2 give a short introduction and background information on quantum theory and the quantum states of light. In particular, we discuss the basic postulates of quantum mechanics, quantum state discrimination, the description of quantum light and the homodyne detector. Chapter 3 gives a brief introduction to quantum information and in particular to entanglement, and we discuss the basics of quantum key distribution and teleportation. The general framework of the Expectation Value Matrix is introduced. The main matter of this thesis is contained in the subsequent three chapters, which describe different quantum communication protocols and the corresponding adaptation of the entanglement verification method. The subject of Chapter 4 is quantum key distribution, where the detection of entanglement is a means of excluding intercept-resend attacks, and the presence of quantum correlations in the raw data is a necessary precondition for the generation of secret key. We investigate a continuous-variable version of the two-state protocol and develop the Expectation Value Matrix method for such qubit-mode systems. Furthermore, we analyse the role of the phase reference with respect to the security of the protocol and raise awareness of a corresponding security threat. For this, we adapt the verification method to different settings of Stokes operator measurements. In Chapter 5, we investigate quantum memory channels and propose a fundamental benchmark for these based on the verification of entanglement. After describing some physical effects which can be used for the
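
    The Expectation Value Matrix procedure itself is not reproduced here, but the basic task of entanglement verification can be illustrated with the standard Peres-Horodecki (PPT) criterion for a two-qubit state: a negative eigenvalue of the partially transposed density matrix certifies entanglement. The sketch below applies that criterion to a Bell state and to a maximally mixed state.

```python
import numpy as np

def partial_transpose(rho: np.ndarray) -> np.ndarray:
    """Partial transpose over the second qubit of a 4x4 two-qubit density matrix."""
    r = rho.reshape(2, 2, 2, 2)            # indices (a, b, a', b')
    return r.transpose(0, 3, 2, 1).reshape(4, 4)

def is_entangled_ppt(rho: np.ndarray) -> bool:
    """Peres-Horodecki test: a negative eigenvalue of rho^T_B proves entanglement."""
    eigenvalues = np.linalg.eigvalsh(partial_transpose(rho))
    return bool(np.min(eigenvalues) < -1e-12)

# Maximally entangled Bell state (|00> + |11>)/sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
bell = np.outer(psi, psi)

# Maximally mixed (separable) state for comparison
mixed = np.eye(4) / 4.0

print(is_entangled_ppt(bell))   # True
print(is_entangled_ppt(mixed))  # False
```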

  7. BEval: A Plug-in to Extend Atelier B with Current Verification Technologies

    Directory of Open Access Journals (Sweden)

    Valério Medeiros Jr.

    2014-01-01

    Full Text Available This paper presents BEval, an extension of Atelier B to improve automation in the verification activities in the B method or Event-B. It combines a tool for managing and verifying software projects (Atelier B and a model checker/animator (ProB so that the verification conditions generated in the former are evaluated with the latter. In our experiments, the two main verification strategies (manual and automatic showed significant improvement as ProB's evaluator proves complementary to Atelier B built-in provers. We conducted experiments with the B model of a micro-controller instruction set; several verification conditions, that we were not able to discharge automatically or manually with AtelierB's provers, were automatically verified using BEval.

  8. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  9. The joint verification experiments as a global non-proliferation exercise

    International Nuclear Information System (INIS)

    Shaner, J.W.

    1998-01-01

    This conference commemorates the 10th anniversary of the second of two Joint Verification Experiments conducted by the Soviet Union and the US. These two experiments, one at the Nevada test site in the US, and the second here at the Semipalatinsk test site were designed to test the verification of a nuclear testing treaty limiting the size underground explosions to 150 kilotons. By building trust and technical respect between the weapons scientists of the two most powerful adversaries, the Joint Verification Experiment (JVE) had the unanticipated result of initiating a suite of cooperative projects and programs aimed at reducing the Cold War threats and preventing the proliferation of weapons of mass destruction

  10. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    International Nuclear Information System (INIS)

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).

  11. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Crowell, Michael W [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
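
    The comparison step in such an automated installation verification reduces to re-running reference problems and checking the new results against stored values within tolerances. The sketch below illustrates only that comparison step, with hypothetical benchmark names and values; it does not use the COMSOL API or LiveLink, on which the work above relies.

```python
import json

# Hypothetical reference values recorded from a known-good installation,
# e.g. peak temperatures from a set of benchmark models (units: K).
REFERENCE = {"benchmark_conduction": 364.2, "benchmark_convection": 351.7}

def verify_installation(results: dict, reference: dict, rel_tol: float = 1e-3) -> bool:
    """Compare newly computed results against stored references within a tolerance."""
    all_ok = True
    for name, expected in reference.items():
        computed = results.get(name)
        if computed is None:
            print(f"FAIL {name}: no result produced")
            all_ok = False
            continue
        rel_err = abs(computed - expected) / abs(expected)
        status = "PASS" if rel_err <= rel_tol else "FAIL"
        all_ok &= status == "PASS"
        print(f"{status} {name}: computed {computed:.4g}, expected {expected:.4g}, "
              f"relative error {rel_err:.2e}")
    return all_ok

if __name__ == "__main__":
    # In practice the solver would be driven programmatically and its results
    # collected here; the dictionary below stands in for that step.
    new_results = {"benchmark_conduction": 364.25, "benchmark_convection": 351.4}
    ok = verify_installation(new_results, REFERENCE)
    print(json.dumps({"installation_verified": ok}))
```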

  12. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    Science.gov (United States)

    1989-01-01

    The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined, and these requirements are correlated to the development demonstrations that provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  13. A simple and rational numerical method of two-phase flow with volume-junction model. 2. The numerical method for general condition of two-phase flow in non-equilibrium states

    International Nuclear Information System (INIS)

    Okazaki, Motoaki

    1997-11-01

    In the previous report, the usefulness of a new numerical method to achieve a rigorous numerical calculation using a simple explicit method with the volume-junction model was presented with the verification calculation for the depressurization of a saturated two-phase mixture. In this report, on the basis of solution method above, a numerical method for general condition of two-phase flow in non-equilibrium states is presented. In general condition of two-phase flow, the combinations of saturated and non-saturated conditions of each phase are considered in the each flow of volume and junction. Numerical evaluation programs are separately prepared for each combination of flow condition. Several numerical calculations of various kinds of non-equilibrium two-phase flow are made to examine the validity of the numerical method. Calculated results showed that the thermodynamic states obtained in different solution schemes were consistent with each other. In the first scheme, the states are determined by using the steam table as a function of pressure and specific enthalpy which are obtained as the solutions of simultaneous equations. In the second scheme, density and specific enthalpy of each phase are directly calculated by using conservation equations of mass and enthalpy of each phase, respectively. Further, no accumulation of error in mass and energy was found. As for the specific enthalpy, two cases of using energy equations for the volume are examined. The first case uses total energy conservation equation and the second case uses the type of the first law of thermodynamics. The results of both cases agreed well. (author)

  14. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by the two indices for the scale and the localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially-localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object" oriented verification methods, as the latter tend to represent strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards a goal of identifying a weak physical process contributing to forecast error, are also pointed out.
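
    The per-scale error attribution described above can be sketched with an off-the-shelf 2D wavelet transform. The snippet below (assuming the PyWavelets package, and using a Haar wavelet and a toy displaced rain field rather than the COAMPS data) splits the forecast-minus-observation field into wavelet levels and reports the mean-square error carried by each scale.

```python
import numpy as np
import pywt   # PyWavelets, assumed available

def scale_decomposed_error(forecast: np.ndarray, observed: np.ndarray,
                           wavelet: str = "haar", level: int = 3):
    """Mean-square forecast error attributed to each wavelet scale."""
    diff = forecast - observed
    coeffs = pywt.wavedec2(diff, wavelet=wavelet, level=level)
    energies = {"approximation": float(np.mean(coeffs[0] ** 2))}
    # coeffs[1:] run from the coarsest detail level down to the finest
    for k, details in enumerate(coeffs[1:], start=1):
        scale = level - k + 1
        energies[f"detail_level_{scale}"] = float(
            np.mean([np.mean(d ** 2) for d in details]))
    return energies

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    observed = rng.gamma(shape=0.3, scale=4.0, size=(64, 64))   # toy rain field
    forecast = np.roll(observed, shift=4, axis=1)               # displaced forecast
    for name, energy in scale_decomposed_error(forecast, observed).items():
        print(f"{name}: {energy:.3f}")
```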

  15. Societal Verification: Intellectual Game or International Game-Changer

    International Nuclear Information System (INIS)

    Hartigan, Kelsey; Hinderstein, Corey

    2013-01-01

    Within the nuclear nonproliferation and arms control field, there is an increasing appreciation for the potential of open source information technologies to supplement existing verification and compliance regimes. While clearly not a substitute for on-site inspections or national technical means, it may be possible to better leverage information gleaned from commercial satellite imagery, international trade records and the vast amount of data being exchanged online and between publics (including social media) so as to develop a more comprehensive set of tools and practices for monitoring and verifying a state’s nuclear activities and helping judge compliance with international obligations. The next generation “toolkit” for monitoring and verifying items, facility operations and activities will likely include a more diverse set of analytical tools and technologies than are currently used internationally. To explore these and other issues, the Nuclear Threat Initiative has launched an effort that examines, in part, the role that emerging technologies and “citizen scientists” might play in future verification regimes. This paper will include an assessment of past proliferation and security “events” and whether emerging tools and technologies would have provided indicators concurrently or in advance of these actions. Such case studies will be instrumental in understanding the reliability of these technologies and practices and in thinking through the requirements of a 21st century verification regime. Keywords: Verification, social media, open-source information, arms control, disarmament.

  16. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
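
    Classical Guyan Reduction, the baseline method whose limitations MGR and HR address, condenses the stiffness and mass matrices onto a retained set of master degrees of freedom through a static constraint. The numpy sketch below applies that condensation to a small generic spring-mass chain; the matrices are illustrative, not SLS data.

```python
import numpy as np

def guyan_reduction(K: np.ndarray, M: np.ndarray, master: list):
    """Classical Guyan (static) condensation onto the 'master' DOF set.

    Builds T = [I; -Kss^{-1} Ksm] and returns (T' K T, T' M T).
    """
    n = K.shape[0]
    slave = [i for i in range(n) if i not in master]
    Kss = K[np.ix_(slave, slave)]
    Ksm = K[np.ix_(slave, master)]
    # Static constraint: slave DOFs follow the masters through the stiffness path
    phi = -np.linalg.solve(Kss, Ksm)
    T = np.zeros((n, len(master)))
    T[master, :] = np.eye(len(master))
    T[slave, :] = phi
    return T.T @ K @ T, T.T @ M @ T

if __name__ == "__main__":
    # Three-DOF spring-mass chain, retaining DOFs 0 and 2 as masters.
    K = np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  2.0]])
    M = np.eye(3)
    K_red, M_red = guyan_reduction(K, M, master=[0, 2])
    # Reduced eigenvalues approximate the lowest eigenvalues of the full model.
    eigs = np.linalg.eigvals(np.linalg.solve(M_red, K_red))
    print(np.sort(eigs.real))
```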

  17. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project aimed at the construction of a portable test system for verifying the functioning conditions of neutron area monitors. This device will allow users to verify that the calibration of their instruments is maintained at the installations where they are used, avoiding the use of equipment with an inadequate response to the neutron beam.

  18. Neutron spectrometric methods for core inventory verification in research reactors

    International Nuclear Information System (INIS)

    Ellinger, A.; Filges, U.; Hansen, W.; Knorr, J.; Schneider, R.

    2002-01-01

    As a consequence of Non-Proliferation Treaty safeguards, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate. The inspection methods are continually being improved. In this context, the Core Inventory Verification method is being developed as an indirect method for verifying the core inventory and checking the declared operation of research reactors.

  19. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivation...... compare the results with other state of the art Horn clause verification tools....

  20. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research for wireless vision sensor networks (WVSNs) still remains in the software simulation stage, and the verification platforms of WVSNs that available for use are very few. This situation seriously restricts the transformation from theory research of WVSNs to practical application. Therefore, it is necessary to study the construction of verification platform of WVSNs. This paper combines wireless transceiver module, visual information acquisition module and power acquisition module, designs a high-performance wireless vision sensor node whose core is ARM11 microprocessor and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that the AdvanWorks can successfully achieve functions of image acquisition, coding, wireless transmission, and obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.