WorldWideScience

Sample records for background numerical verification

  1. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.

  2. On the numerical verification of industrial codes

    International Nuclear Information System (INIS)

    Montan, Sethy Akpemado

    2013-01-01

    Numerical verification of industrial codes, such as those developed at EDF R&D, is required to estimate the precision and quality of computed results, all the more so for codes running in HPC environments where millions of instructions are performed each second. These programs usually use external libraries (MPI, BLACS, BLAS, LAPACK). In this context, a tool that is as non-intrusive as possible is required, to avoid rewriting the original code. In this regard, the CADNA library, which implements Discrete Stochastic Arithmetic, appears to be a promising approach for industrial applications. In the first part of this work, we are interested in an efficient implementation of the BLAS routine DGEMM (General Matrix Multiply) using Discrete Stochastic Arithmetic. A basic implementation of the matrix product using stochastic types leads to an overhead greater than 1000 for 1024 x 1024 matrices compared to the standard version and commercial versions of xGEMM. Here, we detail different solutions to reduce this overhead and the results we have obtained. A new routine, DgemmCADNA, has been designed; it reduces the overhead from 1100 to 35 compared to optimized BLAS implementations (GotoBLAS). We then focus on the numerical verification of Telemac-2D computed results. A numerical validation with the CADNA library shows that more than 30% of the numerical instabilities occurring during an execution come from the dot product function. A more accurate implementation of the dot product with compensated algorithms is presented in this work. We show that implementing these kinds of algorithms to improve the accuracy of computed results does not degrade code performance. (author)
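
    The compensated dot product mentioned above can be illustrated with the classical Ogita-Rump-Oishi Dot2 algorithm, built on error-free transformations; the Python sketch below is only a generic illustration of that family of algorithms, not EDF's Telemac-2D implementation.

    ```python
    def two_sum(a, b):
        """Error-free sum: returns s, e with a + b = s + e exactly (Knuth)."""
        s = a + b
        t = s - a
        return s, (a - (s - t)) + (b - t)

    def split(a):
        """Veltkamp splitting of a double into high and low parts (ignores overflow)."""
        c = 134217729.0 * a          # 2**27 + 1
        hi = c - (c - a)
        return hi, a - hi

    def two_prod(a, b):
        """Error-free product: returns p, e with a * b = p + e exactly (Dekker)."""
        p = a * b
        ah, al = split(a)
        bh, bl = split(b)
        return p, ((ah * bh - p) + ah * bl + al * bh) + al * bl

    def dot2(x, y):
        """Compensated dot product (Dot2): result as accurate as if the dot
        product were computed in twice the working precision."""
        p, s = two_prod(x[0], y[0])
        for xi, yi in zip(x[1:], y[1:]):
            h, r = two_prod(xi, yi)
            p, q = two_sum(p, h)
            s += q + r
        return p + s

    # Ill-conditioned example: the naive sum loses the small contribution entirely
    x, y = [1e16, 1.0, -1e16], [1.0, 1.0, 1.0]
    print(sum(a * b for a, b in zip(x, y)), dot2(x, y))   # 0.0 vs 1.0 with IEEE doubles
    ```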

  3. RECOGNITION AND VERIFICATION OF TOUCHING HANDWRITTEN NUMERALS

    NARCIS (Netherlands)

    Zhou, J.; Kryzak, A.; Suen, C.Y.

    2004-01-01

    In the field of financial document processing, recognition of touching handwritten numerals has been limited by lack of good benchmarking databases and low reliability of algorithms. This paper addresses the efforts toward solving the two problems. Two databases, IRIS-Bell'98 and TNIST, are

  4. Numerical modeling of nitrogen oxide emission and experimental verification

    Directory of Open Access Journals (Sweden)

    Szecowka Lech

    2003-12-01

    The results of nitrogen oxide reduction in the combustion process with the application of primary methods are presented in this paper. The reduction of NOx emission by recirculation of combustion gases, staging of fuel and of air was investigated, and then the reduction of NOx emission by simultaneous use of the above-mentioned primary methods with pulsatory disturbances. The investigation comprises numerical modeling of NOx reduction and experimental verification of the obtained numerical results.

  5. Numerical Verification Methods for Spherical $t$-Designs

    OpenAIRE

    Chen, Xiaojun

    2009-01-01

    The construction of spherical $t$-designs with $(t+1)^2$ points on the unit sphere $S^2$ in $\mathbb{R}^3$ can be reformulated as an underdetermined system of nonlinear equations. This system is highly nonlinear and involves the evaluation of a degree $t$ polynomial in $(t+1)^4$ arguments. This paper reviews numerical verification methods using the Brouwer fixed point theorem and Krawczyk interval operator for solutions of the underdetermined system of nonlinear equations...
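
    The Krawczyk test used in such verification methods can be illustrated in one dimension: if the Krawczyk image K(X) of an interval X lies strictly inside X, a unique zero of f is guaranteed in X. The sketch below uses plain floating point without directed rounding, so it is a didactic illustration rather than a rigorous proof, and is unrelated to the spherical-design system itself.

    ```python
    def imul(a, b):
        """Interval product [a]*[b] (naive endpoints, no outward rounding)."""
        p = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
        return (min(p), max(p))

    def iadd(a, b):
        return (a[0] + b[0], a[1] + b[1])

    def krawczyk(f, df, X):
        """One Krawczyk step: returns K(X) and whether K(X) is strictly inside X."""
        c = 0.5 * (X[0] + X[1])                       # interval midpoint
        Y = 1.0 / df(c)                               # approximate inverse of f'(c)
        dF = (min(df(X[0]), df(X[1])), max(df(X[0]), df(X[1])))  # f' enclosure (f' monotone on X assumed)
        one_m = iadd((1.0, 1.0), imul((-Y, -Y), dF))  # 1 - Y * F'(X)
        K = iadd((c - Y * f(c), c - Y * f(c)), imul(one_m, (X[0] - c, X[1] - c)))
        return K, (X[0] < K[0] and K[1] < X[1])

    # Verify a root of f(x) = x^2 - 2 in [1.3, 1.5]
    K, verified = krawczyk(lambda x: x * x - 2.0, lambda x: 2.0 * x, (1.3, 1.5))
    print(K, verified)   # K is contained in (1.3, 1.5) -> sqrt(2) is verified
    ```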

  6. Incorporating Pass-Phrase Dependent Background Models for Text-Dependent Speaker verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2018-01-01

    In this paper, we propose pass-phrase dependent background models (PBMs) for text-dependent (TD) speaker verification (SV) to integrate the pass-phrase identification process into the conventional TD-SV system, where a PBM is derived from a text-independent background model through adaptation using the utterances of a particular pass-phrase. During training, pass-phrase specific target speaker models are derived from the particular PBM using the training data for the respective target model. While testing, the best PBM is first selected for the test utterance in the maximum likelihood (ML) sense... We show that the proposed method significantly reduces the error rates of text-dependent speaker verification for the non-target types target-wrong and impostor-wrong, while it maintains comparable TD-SV performance when impostors speak a correct utterance, with respect to the conventional system.
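
    The ML selection of the best PBM for a test utterance can be sketched as a simple likelihood comparison; the toy example below assumes GMM-based models (an assumption for illustration, since the paper derives PBMs by adapting a background model) and uses randomly generated stand-in features.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def select_best_pbm(pbms, features):
        """Pick the pass-phrase dependent background model (PBM) with the highest
        average log-likelihood for the test utterance (maximum-likelihood selection)."""
        scores = {pid: gmm.score(features) for pid, gmm in pbms.items()}
        return max(scores, key=scores.get), scores

    # Toy stand-in for adapted PBMs and a test utterance (not real speech features)
    rng = np.random.default_rng(0)
    pbms = {}
    for pid, mean in {"phrase_a": 0.0, "phrase_b": 3.0}.items():
        pbms[pid] = GaussianMixture(n_components=2, random_state=0).fit(
            rng.normal(mean, 1.0, size=(200, 2)))
    best, scores = select_best_pbm(pbms, rng.normal(3.0, 1.0, size=(50, 2)))
    print(best)   # expected to pick "phrase_b"
    ```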

  7. Experimental verification of numerical calculations of railway passenger seats

    Science.gov (United States)

    Ligaj, B.; Wirwicki, M.; Karolewska, K.; Jasińska, A.

    2018-04-01

    The construction of railway seats is based on industry regulations and the requirements of end users, i.e. passengers. The two main documents in this context are UIC 566 (3rd edition, dated 7 January 1994) and EN 12663-1:2010+A1:2014. The aim of the study was to carry out static load tests of passenger seat frames. The paper presents the construction of the test bench and the results of experimental and numerical studies of railway passenger seat frames. The test bench consists of a frame, a transverse beam, two electric cylinders with a force capacity of 6 kN, and a strain gauge amplifier. It has a modular structure that allows for its expansion depending on the structure of the seats. Comparison of the experimental and numerical results at points A and B revealed the existing differences: the numerical calculations give higher stress values, with differences in the range of 0.2 MPa to 35.9 MPa.

  8. Combining Task Execution and Background Knowledge for the Verification of Medical Guidelines

    Science.gov (United States)

    Hommersom, Arjen; Groot, Perry; Lucas, Peter; Balser, Michael; Schmitt, Jonathan

    The use of a medical guideline can be seen as the execution of computational tasks, sequentially or in parallel, in the face of patient data. It has been shown that many such guidelines can be represented as a 'network of tasks', i.e., as a number of steps that have a specific function or goal. To investigate the quality of such guidelines we propose a formalization of criteria for good-practice medicine that a guideline should comply with. We use this theory in conjunction with medical background knowledge to verify the quality of a guideline dealing with diabetes mellitus type 2, using the interactive theorem prover KIV. Verification using task execution and background knowledge is a novel approach to quality checking of medical guidelines.

  9. Analytical and numerical verification of the Nernst theorem for metals

    International Nuclear Information System (INIS)

    Hoeye, Johan S.; Brevik, Iver; Ellingsen, Simen A.; Aarseth, Jan B.

    2007-01-01

    In view of the current discussion on the subject, an effort is made to show very accurately both analytically and numerically how the Drude dispersion model gives consistent results for the Casimir free energy at low temperatures. Specifically, for the free energy near T=0 we find the leading term proportional to $T^2$ and the next-to-leading term proportional to $T^{5/2}$. These terms give rise to zero Casimir entropy as T→0 and are thus in accordance with Nernst's theorem
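
    The connection to the Nernst theorem follows directly from the two quoted terms (a and b below are unspecified coefficients of the expansion):

    ```latex
    F(T) \simeq F(0) + a\,T^{2} + b\,T^{5/2}
    \quad\Longrightarrow\quad
    S(T) = -\frac{\partial F}{\partial T} = -2a\,T - \tfrac{5}{2}\,b\,T^{3/2} \longrightarrow 0
    \quad \text{as } T \to 0,
    ```

    so the Casimir entropy vanishes in the zero-temperature limit, as the theorem requires.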

  10. Calibration and verification of numerical runoff and erosion model

    Directory of Open Access Journals (Sweden)

    Gabrić Ognjen

    2015-01-01

    Alongside field and laboratory measurements, and in step with the development of computational techniques, runoff and erosion models based on equations describing the physics of the process have been developed. Starting from the KINEROS2 model, this paper presents the basic principles of modelling runoff and erosion processes with the Saint-Venant equations. Alternative equations for the friction calculation, for the calculation of source and deposition terms, and for the transport capacity are also shown. Numerical models based on the original and alternative equations are calibrated and verified on a laboratory-scale model. According to the results, a friction calculation based on the analytic solution for laminar flow must be included in all runoff and erosion models.
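
    For orientation, KINEROS-type runoff models typically use the kinematic-wave approximation of the Saint-Venant equations for overland flow; in a commonly quoted form (the paper's exact formulation may differ),

    ```latex
    \frac{\partial h}{\partial t} + \frac{\partial q}{\partial x} = r(x,t) - i(x,t),
    \qquad q = \alpha\, h^{m},
    \qquad \text{laminar flow: } m = 3,\ \ \alpha = \frac{g\,S_0}{3\nu},
    ```

    where h is the flow depth, q the discharge per unit width, r the rainfall rate, i the infiltration rate, S_0 the bed slope and ν the kinematic viscosity; the laminar friction law is the analytic laminar-flow solution referred to above.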

  11. Determination of Solution Accuracy of Numerical Schemes as Part of Code and Calculation Verification

    Energy Technology Data Exchange (ETDEWEB)

    Blottner, F.G.; Lopez, A.R.

    1998-10-01

    This investigation is concerned with the accuracy of numerical schemes for solving partial differential equations used in science and engineering simulation codes. Richardson extrapolation methods for steady and unsteady problems with structured meshes are presented as part of the verification procedure to determine code and calculation accuracy. The local truncation error determination of a numerical difference scheme is shown to be a significant component of the verification procedure as it determines the consistency of the numerical scheme, the order of the numerical scheme, and the restrictions on the mesh variation with a non-uniform mesh. Generation of a series of co-located, refined meshes with the appropriate variation of mesh cell size is investigated and is another important component of the verification procedure. The importance of mesh refinement studies is shown to be more significant than just a procedure to determine solution accuracy. It is suggested that mesh refinement techniques can be developed to determine consistency of numerical schemes and to determine if governing equations are well posed. The present investigation provides further insight into the conditions and procedures required to effectively use Richardson extrapolation with mesh refinement studies to achieve confidence that simulation codes are producing accurate numerical solutions.
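
    For a constant refinement ratio, the observed order of accuracy and a Richardson-extrapolated estimate follow directly from solutions on three systematically refined meshes; the sketch below uses the generic textbook formulas with made-up sample values and is not tied to the codes examined in the paper.

    ```python
    import math

    def richardson(f_fine, f_med, f_coarse, r):
        """Observed order p, extrapolated value and fine-grid convergence index
        from three solutions on meshes of spacing h, r*h and r^2*h."""
        p = math.log(abs((f_coarse - f_med) / (f_med - f_fine))) / math.log(r)
        f_exact = f_fine + (f_fine - f_med) / (r**p - 1.0)          # Richardson extrapolation
        gci = 1.25 * abs((f_fine - f_med) / f_fine) / (r**p - 1.0)  # GCI with safety factor 1.25
        return p, f_exact, gci

    # Hypothetical sample values of an integral quantity on three meshes (r = 2)
    p, f_ext, gci = richardson(f_fine=2.0532, f_med=2.0620, f_coarse=2.0800, r=2.0)
    print(f"observed order p = {p:.2f}, extrapolated value = {f_ext:.4f}, GCI = {gci:.2%}")
    ```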

  12. Numerical method for IR background and clutter simulation

    Science.gov (United States)

    Quaranta, Carlo; Daniele, Gina; Balzarotti, Giorgio

    1997-06-01

    The paper describes a fast and accurate algorithm for generating IR background noise and clutter for use in scene simulations. The process is based on the hypothesis that the background can be modeled as a statistical process in which the signal amplitude obeys a Gaussian distribution and zones of the same scene satisfy a correlation function of exponential form. The algorithm provides an accurate mathematical approximation of the model and also excellent fidelity to reality, as appears from a comparison with images from IR sensors. The proposed method shows advantages with respect to methods based on the filtering of white noise in the time or frequency domain, as it requires a limited number of computations, and, furthermore, it is more accurate than quasi-random processes. The background generation starts from a reticule of a few points and, by means of growing rules, the process is extended to the whole scene at the required dimension and resolution. The statistical properties of the model are properly maintained in the simulation process. The paper gives specific attention to the mathematical aspects of the algorithm and provides a number of simulations and comparisons with real scenes.
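
    The stated statistical model (Gaussian amplitudes with exponential spatial correlation) can be reproduced in a brute-force reference way by factoring the covariance matrix; the sketch below is such a reference construction for small patches only, not the growing-reticule algorithm of the paper.

    ```python
    import numpy as np

    def exp_correlated_field(n, corr_length, sigma=1.0, seed=0):
        """Sample an n x n zero-mean Gaussian field with covariance
        C(d) = sigma^2 * exp(-d / corr_length) between pixels a distance d apart."""
        rng = np.random.default_rng(seed)
        yy, xx = np.mgrid[0:n, 0:n]
        pts = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        C = sigma**2 * np.exp(-d / corr_length)
        L = np.linalg.cholesky(C + 1e-10 * np.eye(n * n))   # small jitter for stability
        return (L @ rng.standard_normal(n * n)).reshape(n, n)

    clutter_patch = exp_correlated_field(n=32, corr_length=5.0)   # 32 x 32 pixel patch
    ```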

  13. Numerical verification of equilibrium chemistry software within nuclear fuel performance codes

    International Nuclear Information System (INIS)

    Piro, M.H.; Lewis, B.J.; Thompson, W.T.; Simunovic, S.; Besmann, T.M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing transport source terms, material properties, and boundary conditions in heat and mass transport modules. Consequently, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method called the Gibbs Criteria is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes. (author)
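
    A common way to check an equilibrium result of this kind is to back out element potentials from the species reported as stable and confirm that no candidate species could lower the total Gibbs energy; the sketch below illustrates that dual-feasibility idea in a generic form and should not be read as the Gibbs Criteria algorithm of the paper itself.

    ```python
    import numpy as np

    def check_equilibrium(A, g, stable, tol=1e-8):
        """Generic global-minimum check of an equilibrium solver's output.

        A      : (n_species, n_elements) stoichiometry matrix
        g      : (n_species,) dimensionless molar Gibbs energies (mu_i / RT)
                 evaluated at the computed equilibrium state
        stable : indices of the species the solver reports as stable
        """
        # Element potentials implied by the stable species: A[stable] @ pi ~= g[stable]
        pi, *_ = np.linalg.lstsq(A[stable], g[stable], rcond=None)
        # Driving force of every candidate species; a significantly negative value
        # means the reported state is not a global minimum of the Gibbs energy.
        driving_force = g - A @ pi
        return pi, driving_force, bool(np.all(driving_force >= -tol))
    ```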

  14. Numerical verification of composite rods theory on multi-story buildings analysis

    Science.gov (United States)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

    The article proposes a verification of composite rods theory applied to the structural analysis of the skeletons of high-rise buildings. A test design model was formed in which the horizontal elements are represented by a multilayer cantilever beam working in transverse bending, with the slabs connected by moment-non-transferring connections, while multilayer columns represent the vertical elements. These connections are sufficient for the shearing action to be approximated by a certain shear-force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the behaviour of the multilayer rods and is solved numerically by the method of successive approximations. The proposed methodology is intended for preliminary calculations, when the rigidity characteristics of the structure need to be determined, and for a qualitative assessment of results obtained by other methods when calculations are performed for verification purposes.

  15. Numerical verification of the theory of coupled reactors for a deuterium critical assembly using MCNP5

    International Nuclear Information System (INIS)

    Hussein, M.S.; Bonin, H.W.; Lewis, B.J.

    2013-01-01

    The theory of multipoint coupled reactors developed from multi-group transport theory is verified using the probabilistic transport code MCNP5. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly (DCA). The variations of the criticality factors and the coupling coefficients were investigated by changing the water levels in the inner and outer cores. The numerical results of the model developed with the MCNP5 code were validated and verified against published results and the mathematical model based on coupled reactor theory. (author)

  16. Numerical verification of the theory of coupled reactors for a deuterium critical assembly using MCNP5

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, M.S.; Bonin, H.W.; Lewis, B.J., E-mail: mohamed.hussein@rmc.ca, E-mail: bonin-h@rmc.ca, E-mail: lewis-b@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada)

    2013-07-01

    The theory of multipoint coupled reactors developed from multi-group transport theory is verified using the probabilistic transport code MCNP5. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly (DCA). The variations of the criticality factors and the coupling coefficients were investigated by changing the water levels in the inner and outer cores. The numerical results of the model developed with the MCNP5 code were validated and verified against published results and the mathematical model based on coupled reactor theory. (author)

  17. Numerical simulation code for combustion of sodium liquid droplet and its verification

    International Nuclear Information System (INIS)

    Okano, Yasushi

    1997-11-01

    Computer programs for sodium leak and burning phenomena have been developed based on a mechanistic approach. A direct numerical simulation code for sodium liquid droplet burning has been developed for the numerical analysis of droplet combustion in a forced-convection air flow. Distributions of heat generation and temperature and the reaction rates of chemical products, such as sodium oxide and hydroxide, are calculated and evaluated using this numerical code. The extended MAC method coupled with a higher-order upwind scheme had previously been used for combustion simulation of methane-air mixtures. In the numerical simulation code for combustion of a sodium liquid droplet, a chemical reaction model for sodium was coupled with the extended MAC method. Combustion of a single sodium liquid droplet was simulated in this report for the verification of the developed numerical simulation code. The changes of burning rate and reaction products with droplet diameter and inlet wind velocity were investigated. The calculated results conform qualitatively and quantitatively to experimental and computational observations reported in combustion engineering. It was confirmed that the numerical simulation code is suitable for the calculation of sodium liquid droplet burning. (author)

  18. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    International Nuclear Information System (INIS)

    Looe, H.K.; Uphoff, Y.; Poppe, B.; Carl von Ossietzky Univ., Oldenburg; Harder, D.; Willborn, K.C.

    2012-01-01

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal image device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are briefly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)
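
    Deblurring with a known convolution kernel K(x,y) is commonly carried out in the Fourier domain; the Wiener-type sketch below is a generic illustration of such a 2D deconvolution (with a hypothetical Gaussian kernel standing in for the measured one) and is not the algorithm implemented in the portal imaging device described above.

    ```python
    import numpy as np

    def wiener_deconvolve(blurred, psf, reg=1e-3):
        """Restore an image blurred by a known kernel (PSF):
        F_restored = conj(H) * F_blurred / (|H|^2 + reg)."""
        psf_pad = np.zeros_like(blurred, dtype=float)
        psf_pad[:psf.shape[0], :psf.shape[1]] = psf / psf.sum()
        # shift the kernel so that its centre sits at pixel (0, 0), as the FFT expects
        psf_pad = np.roll(psf_pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
        H = np.fft.fft2(psf_pad)
        F = np.fft.fft2(blurred)
        return np.fft.ifft2(np.conj(H) * F / (np.abs(H) ** 2 + reg)).real

    # Hypothetical Gaussian blur kernel standing in for the measured K(x,y)
    y, x = np.mgrid[-7:8, -7:8]
    psf = np.exp(-(x**2 + y**2) / (2.0 * 2.0**2))
    # restored = wiener_deconvolve(portal_image, psf)   # portal_image: 2D numpy array
    ```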

  19. Numerical deconvolution to enhance sharpness and contrast of portal images for radiotherapy patient positioning verification

    Energy Technology Data Exchange (ETDEWEB)

    Looe, H.K.; Uphoff, Y.; Poppe, B. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy; Carl von Ossietzky Univ., Oldenburg (Germany). WG Medical Radiation Physics; Harder, D. [Georg August Univ., Goettingen (Germany). Medical Physics and Biophysics; Willborn, K.C. [Pius Hospital, Oldenburg (Germany). Clinic for Radiation Therapy

    2012-02-15

    The quality of megavoltage clinical portal images is impaired by physical and geometrical effects. This image blurring can be corrected by a fast numerical two-dimensional (2D) deconvolution algorithm implemented in the electronic portal image device. We present some clinical examples of deconvolved portal images and evaluate the clinical advantages achieved by the improved sharpness and contrast. The principle of numerical 2D image deconvolution and the enhancement of sharpness and contrast thereby achieved are briefly explained. The key concept is the convolution kernel K(x,y), the mathematical equivalent of the smearing or blurring of a picture, and the computer-based elimination of this influence. Enhancements of sharpness and contrast were observed in all clinical portal images investigated. The images of fine bone structures were restored. The identification of organ boundaries and anatomical landmarks was improved, thereby permitting a more accurate comparison with the x-ray simulator radiographs. The visibility of prostate gold markers is also shown to be enhanced by deconvolution. The blurring effects of clinical portal images were eliminated by a numerical deconvolution algorithm that leads to better image sharpness and contrast. The fast algorithm permits the image blurring correction to be performed in real time, so that patient positioning verification with increased accuracy can be achieved in clinical practice. (orig.)

  20. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  1. Numerical simulation and experimental verification of gas flow through packed beds

    International Nuclear Information System (INIS)

    Natarajan, S.; Zhang, C.; Briens, C.

    2003-01-01

    This work is concerned with finding an effective way of eliminating oxygen from a packed bed of monomer particles. This process finds application in industries involved in the manufacture of Nylon 12. In the manufacture of the polymer Nylon 12, the polymerization reaction is hindered by the presence of oxygen. Therefore, the main objective of this study is to remove the oxygen by injecting nitrogen to displace it from the voids between the monomer particles before they are introduced into the polymerization reactor. This work involves the numerical simulation and experimental verification of the flow in a packed bed. In addition, a parametric study is carried out on parameters such as the number of injectors, the radial position of the injectors, and the position of the injectors along the circumference of the packed bed, to find the best possible combination for effective elimination of the oxygen. Nitrogen does not interact with the monomer particles and hence there is no chemical reaction involved in this process. The nitrogen is introduced into the packed bed at a flow rate that keeps the superficial velocity well below the minimum fluidization velocity of the monomer particles. The packed bed is modeled using a porous medium approach available in the commercial computational fluid dynamics (CFD) software FLUENT. The fluid flow inside the packed bed is a multicomponent gas flow through a porous medium. The simulation results are validated by comparison with the experimental results. (author)
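
    The requirement that the superficial gas velocity stay well below minimum fluidization can be checked with a standard correlation; the paper does not state which one was used, so the Wen-Yu form below, with purely illustrative property values, is only one common choice.

    ```python
    def u_mf_wen_yu(d_p, rho_p, rho_g, mu_g, g=9.81):
        """Minimum fluidization velocity (m/s) from the Wen-Yu correlation."""
        Ar = rho_g * (rho_p - rho_g) * g * d_p**3 / mu_g**2      # Archimedes number
        Re_mf = (33.7**2 + 0.0408 * Ar) ** 0.5 - 33.7             # Wen-Yu (1966)
        return Re_mf * mu_g / (rho_g * d_p)

    # Illustrative values for monomer particles in nitrogen (not taken from the paper)
    u_mf = u_mf_wen_yu(d_p=500e-6, rho_p=1000.0, rho_g=1.16, mu_g=1.76e-5)
    print(f"u_mf ~ {u_mf:.3f} m/s; keep the superficial velocity well below this value")
    ```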

  2. Numerical modeling of plasma plume evolution against ambient background gas in laser blow off experiments

    International Nuclear Information System (INIS)

    Patel, Bhavesh G.; Das, Amita; Kaw, Predhiman; Singh, Rajesh; Kumar, Ajai

    2012-01-01

    Two dimensional numerical modelling based on simplified hydrodynamic evolution for an expanding plasma plume (created by laser blow off) against an ambient background gas has been carried out. A comparison with experimental observations shows that these simulations capture most features of the plasma plume expansion. The plume location and other gross features are reproduced as per the experimental observation in quantitative detail. The plume shape evolution and its dependence on the ambient background gas are in good qualitative agreement with the experiment. This suggests that a simplified hydrodynamic expansion model is adequate for the description of plasma plume expansion.

  3. Design and verification of controllers for longitudinal oscillations using optimal control theory and numerical simulation: Predictions for PEP-II

    International Nuclear Information System (INIS)

    Hindi, H.; Prabhakar, S.; Fox, J.; Teytelman, D.

    1997-12-01

    The authors present a technique for the design and verification of efficient bunch-by-bunch controllers for damping longitudinal multibunch instabilities. The controllers attempt to optimize the use of available feedback amplifier power--one of the most expensive components of a feedback system--and define the limits of closed loop system performance. The design technique alternates between analytic computation of single bunch optimal controllers and verification on a multibunch numerical simulator. The simulator identifies unstable coupled bunch modes and predicts their growth and damping rates. The results from the simulator are shown to be in reasonable agreement with analytical calculations based on the single bunch model. The technique is then used to evaluate the performance of a variety of controllers proposed for PEP-II

  4. Verification and validation of a numeric procedure for flow simulation of a 2x2 PWR rod bundle

    International Nuclear Information System (INIS)

    Santos, Andre A.C.; Barros Filho, Jose Afonso; Navarro, Moyses A.

    2011-01-01

    Before Computational Fluid Dynamics (CFD) can be considered a reliable tool for the analysis of flow through rod bundles, there is a need to establish the credibility of the numerical results. Procedures must be defined to evaluate the error and uncertainty due to aspects such as mesh refinement, turbulence model, wall treatment and appropriate definition of boundary conditions. These procedures are referred to as Verification and Validation (V and V) processes. In 2009 a standard was published by the American Society of Mechanical Engineers (ASME) establishing detailed procedures for V and V of CFD simulations. This paper presents a V and V evaluation of a numerical methodology applied to the simulation of a PWR rod bundle segment with a split-vane spacer grid, based on ASME's standard. In this study six progressively refined meshes were generated to evaluate the numerical uncertainty through the verification procedure. Experimental and analytical results available in the literature were used in this study for validation purposes. The results show that the ASME verification procedure can give highly variable predictions of uncertainty depending on the mesh triplet used for the evaluation. However, the procedure can give good insight towards optimization of the mesh size and overall result quality. Although the experimental results used for the validation were not ideal, through the validation procedure the deficiencies and strengths of the presented modeling could be detected and reasonably evaluated. Even though it is difficult to obtain reliable estimates of the uncertainty of flow quantities in turbulent flow, this study shows that the V and V process is a necessary step in a CFD analysis of a spacer grid design. (author)

  5. Fuzzy Verification of Lower Dimensional Information in a Numerical Simulation of Sea Ice

    Science.gov (United States)

    Sulsky, D.; Levy, G.

    2010-12-01

    Ideally, a verification and validation scheme should be able to evaluate and incorporate lower dimensional features (e.g., discontinuities) contained within a bulk simulation even when not directly observed or represented by model variables. Nonetheless, lower dimensional features are often ignored. Conversely, models that resolve such features and the associated physics well, yet imprecisely are penalized by traditional validation schemes. This can lead to (perceived or real) poor model performance and predictability and can become deleterious in model improvements when observations are sparse, fuzzy, or irregular. We present novel algorithms and a general framework for using information from available satellite data through fuzzy verification that efficiently and effectively remedy the known problems mentioned above. As a proof of concept, we use a sea-ice model with remotely sensed observations of leads in a one-step initialization cycle. Using the new scheme in a sixteen day simulation experiment introduces model skill (against persistence) several days earlier than in the control run, improves the overall model skill and delays its drop off at later stages of the simulation. Although sea-ice models are currently a weak link in climate models, the appropriate choice of data to use, and the fuzzy verification and evaluation of a system’s skill in reproducing lower dimensional features are important beyond the initial application to sea ice. Our strategy and framework for fuzzy verification, selective use of information, and feature extraction could be extended globally and to other disciplines. It can be incorporated in and complement existing verification and validation schemes, increasing their computational efficiency and the information they use. It can be used for model development and improvements, upscaling/downscaling models, and for modeling processes not directly represented by model variables or direct observations. Finally, if successful, it can

  6. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    International Nuclear Information System (INIS)

    Hussein, M.S; Lewis, B.J.; Bonin, H.W.

    2013-01-01

    The theory of multipoint coupled reactors developed from multi-group transport theory is verified using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation code Serpent. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly (DCA). The multiplication factors k_eff calculated numerically and independently from simulations of the DCA with the MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated from coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from coupled reactor theory. This analysis demonstrates that the Serpent code is valid for multipoint coupled reactor calculations. (author)
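
    In one common two-point statement of coupled-reactor theory, the system multiplication factor is the dominant eigenvalue of the coupling matrix whose entry k_ij gives the fission neutrons produced in region i per fission neutron born in region j; the sketch below uses hypothetical coefficients, not the DCA values of the paper.

    ```python
    import numpy as np

    # Hypothetical coupling matrix for a two-region core:
    # K[i, j] = fission neutrons produced in region i per fission neutron born in region j
    K = np.array([[0.95, 0.04],
                  [0.03, 0.96]])

    k_eff = np.max(np.linalg.eigvals(K).real)   # dominant eigenvalue = system k_eff
    print(k_eff)
    ```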

  7. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, M.S, E-mail: mohamed.hussein@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada); Lewis, B.J., E-mail: Brent.Lewis@uoit.ca [Univ. of Ontario Inst. of Technology, Faculty of Energy Systems and Nuclear Science, Oshawa, Ontario (Canada); Bonin, H.W., E-mail: bonin-h@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada)

    2013-07-01

    The theory of multipoint coupled reactors developed from multi-group transport theory is verified using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation code Serpent. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly (DCA). The multiplication factors k_eff calculated numerically and independently from simulations of the DCA with the MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated from coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from coupled reactor theory. This analysis demonstrates that the Serpent code is valid for multipoint coupled reactor calculations. (author)

  8. Numerical simulation and experimental verification of microstructure evolution in large forged pipe used for AP1000 nuclear power plants

    International Nuclear Information System (INIS)

    Wang, Shenglong; Yang, Bin; Zhang, Mingxian; Wu, Huanchun; Peng, Jintao; Gao, Yang

    2016-01-01

    Highlights: • Establish systematically the database of 316LN stainless steel for Deform-3D. • Simulate the microstructure evolution during forging of the AP1000 primary coolant pipe. • Carry out a full-scale forging experiment for verification in engineering practice. • Obtain the desired grain size in both simulation and experiment. • The variation trends of grain size in simulation and experiment are consistent. - Abstract: The AP1000 primary coolant pipe is a large special-shaped forged pipe made of 316LN stainless steel. Due to the non-uniform temperature and deformation during its forging, coarse and fine grains usually coexist in the forged pipe, resulting in a heterogeneous microstructure and anisotropic performance. To investigate the microstructure evolution during the entire forging process, in the present research, a database for the 316LN stainless steel was established and a numerical simulation was performed. The results indicate that the middle body section of the forged pipe has an extremely uniform average grain size, with values smaller than 30 μm. The grain sizes at the ends of the body section ranged from 30 μm to 60 μm. The boss sections have a relatively homogeneous microstructure with average grain sizes of 30 μm to 44 μm. Furthermore, a full-scale hot forging was carried out for verification. Comparison of the theoretical and experimental results showed good agreement and hence demonstrated the capabilities of the numerical simulation presented here. It is noteworthy that all grains in the workpiece were confirmed to be smaller than 180 μm, which meets the designer's demands.

  9. Doppler Radar and Analysis for Climate Model Verification and Numerical Weather Prediction

    National Research Council Canada - National Science Library

    Xu, Qin

    1998-01-01

    ... (Qiu and Xu, 1996, Mon. Wea. Rev., 1132-1144). The LS method was further upgraded to include background wind fields and was used to improve the initial condition for the ARPS model's short-term prediction...

  10. Elasto-plastic benchmark calculations. Step 1: verification of the numerical accuracy of the computer programs

    International Nuclear Information System (INIS)

    Corsi, F.

    1985-01-01

    In connection with the design of nuclear reactor components operating at elevated temperature, design criteria require a level of realism in the prediction of inelastic structural behaviour. This leads to the necessity of developing non-linear computer programmes and, as a consequence, to the problems of verification and qualification of these tools. Benchmark calculations make it possible to carry out these two actions, while at the same time increasing the level of confidence in the analysis of complex phenomena and in inelastic design calculations. With the financial and programmatic support of the Commission of the European Communities (CEE), a programme of elasto-plastic benchmark calculations relevant to the design of structural components for LMFBR has been undertaken by those Member States which are developing a fast reactor project. Four principal progressive aims were initially identified, which led to the decision to subdivide the benchmark effort into a series of four sequential calculation steps: Steps 1 to 4. The present document summarizes Step 1 of the benchmark exercise, derives some conclusions on Step 1 by comparing the results obtained with the various codes, and offers some concluding comments on this first action. It should be pointed out that even though the work was designed to test the capabilities of the computer codes, another aim was to increase the skill of the users concerned

  11. Verification of a novel innovative blade root design for wind turbines using a hybrid numerical method

    DEFF Research Database (Denmark)

    Zhu, Wei Jun; Shen, Wen Zhong; Sørensen, Jens Nørkær

    2017-01-01

    captured at the outer part of the blades, where the relative wind speed is high. To assess the impact of this novel design idea, a hybrid numerical technique, based on solving the Reynolds-averaged Navier-Stokes equations, is utilized to determine the aerodynamic performance. The in-house developed Ellip...

  12. Numerical Analysis and Experimental Verification of Stresses Building up in Microelectronics Packaging

    NARCIS (Netherlands)

    Rezaie Adli, A.R.

    2017-01-01

    This thesis comprises a thorough study of the microelectronics packaging process by means of various experimental and numerical methods to estimate the process induced residual stresses. The main objective of the packaging is to encapsulate the die, interconnections and the other exposed internal

  13. Numerical model for verification of constitutive laws of blood vessel wall

    Czech Academy of Sciences Publication Activity Database

    Macková, H.; Chlup, Hynek; Žitný, R.

    -, 2/1 (2007), s. 66-66 ISSN 1880-9863 Institutional research plan: CEZ:AV0Z20760514 Keywords : constitutive law * numerical model * pulse wave velocity Subject RIV: BK - Fluid Dynamics http://www.jstage.jst.go.jp/browse/jbse/2/Suppl.1/_contents

  14. Study of natural convection heat transfer characteristics. (2) Verification for numerical simulation

    International Nuclear Information System (INIS)

    Ikeda, Hiroshi; Nakada, Kotaro; Ikeda, Tatsumi; Wakamatsu, Mitsuo; Iwaki, Chikako; Morooka, Shinichi; Masaki, Yoshikazu

    2008-01-01

    In a natural cooling system for waste storage, it is important to evaluate whether the flow induced by natural draft is sufficient to remove the decay heat from the waste. In this study, we carried out a fundamental study of natural convection on a vertical cylindrical heater by experiment and numerical simulation. The test facility is about 4 m high, with a single heater. The heating power was varied in the range of 33-110 W, for which the Rayleigh number is over 10^10. We surveyed the velocity distribution around the heater for several turbulence models, mesh sizes near the heated wall, and turbulent Prandtl numbers. The numerical simulation results for the velocity distribution and the averaged heat transfer coefficient agreed well with the experimental data and references. (author)

  15. Chemical transport in a fissured rock: verification of a numerical model

    International Nuclear Information System (INIS)

    Rasmuson, A.; Narasimham, T.N.; Neretnieks.

    1982-01-01

    Due to the very long-term, high toxicity of some nuclear waste products, models are required to predict, in certain cases, the spatial and temporal distribution of chemical concentrations less than 0.001% of the concentration released from the repository. A numerical model, TRUMP, which solves the advective diffusion equation in general three dimensions, with or without decay and source terms, has been verified. The method is based on an integrated finite difference approach. The studies show that as long as the magnitude of advectance is equal to or less than that of conductance for the closed surface bounding any volume element in the region (that is, the numerical Peclet number ...), ... 10^-3% or less. The realistic input parameters used in the sample calculations suggest that such a range of Peclet numbers is indeed likely to characterize deep groundwater systems in granitic and ancient argillaceous systems. A sensitivity analysis suggests that the errors in prediction introduced by uncertainties in the input parameters are likely to be larger than the computational inaccuracies introduced by the numerical model. Currently, a disadvantage of the TRUMP model is that the iterative method of solving the set of simultaneous equations is rather slow when time constants vary widely over the flow region. Although the iterative solution may be very desirable for large three-dimensional problems in order to minimize computer storage, it seems desirable to use a direct solver technique in conjunction with the mixed explicit-implicit approach whenever possible. Work in this direction is in progress

  16. Numerical simulation of turbulent flow and heat transfer in a parallel channel. Verification of the field synergy principle

    International Nuclear Information System (INIS)

    Tian Wenxi; Su, G.H.; Qiu Suizheng; Jia Dounan

    2004-01-01

    The field synergy principle was proposed by Guo (1998) on the basis of 2-D laminar boundary-layer flow; it resulted from a second look at the mechanism of convective heat transfer. Numerical verification of this principle's validity for turbulent flow has been carried out by very few researchers, and mostly commercial software such as FLUENT and CFX was used in their studies. In this paper, a numerical simulation of turbulent flow with recirculation was developed using the SIMPLE algorithm with a two-equation k-ε model. The computational-region extension method and the wall-function method were adopted to regularize the whole computational region geometrically. With the inlet Reynolds number kept constant at 10000 and the height of the solid obstacle varied, simulations were conducted and the results showed that the wall heat flux decreased as the angle between the velocity vector and the temperature gradient increased. Thus it is validated that the field synergy principle, based on 2-D laminar boundary-layer flow, can also be applied to complex turbulent flow, even with recirculation. (author)
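
    The quantity at the heart of the field synergy principle is the local angle between the velocity vector and the temperature gradient; it can be evaluated directly from simulated fields as in the generic post-processing sketch below (not the authors' code).

    ```python
    import numpy as np

    def synergy_angle(u, v, T, dx, dy):
        """Local field synergy angle (degrees) between the velocity (u, v) and grad T
        on a 2-D grid; smaller angles mean stronger convective heat transfer."""
        dTdy, dTdx = np.gradient(T, dy, dx)                    # gradients along y (axis 0) and x (axis 1)
        dot = u * dTdx + v * dTdy
        norm = np.hypot(u, v) * np.hypot(dTdx, dTdy) + 1e-30   # avoid division by zero
        return np.degrees(np.arccos(np.clip(dot / norm, -1.0, 1.0)))
    ```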

  17. LHCb: Numerical Analysis of Machine Background in the LHCb Experiment for the Early and Nominal Operation of LHC

    CERN Multimedia

    Lieng, M H; Corti, G; Talanov, V

    2010-01-01

    We consider the formation of machine background induced by proton losses in the long straight section of the LHCb experiment at the LHC. Both sources, showering from the tertiary collimators located in the LHCb insertion region and local beam-gas interactions, are taken into account. We present the procedure for, and results of, numerical studies of such background for various conditions. Additionally, the expected impact on the experiment and the signal characteristics are discussed.

  18. Experimental verification of boundary conditions for numerical simulation of airflow in a benchmark ventilation channel

    Directory of Open Access Journals (Sweden)

    Lizal Frantisek

    2016-01-01

    Correct definition of boundary conditions is crucial for the appropriate simulation of a flow. It is common practice to simulate a sufficiently long upstream entrance section instead of experimentally investigating the actual conditions at the boundary of the examined area, in cases where such measurement is either impossible or extremely demanding. We focused on the case of a benchmark channel with a ventilation outlet, which models a regular automotive ventilation system. First, measurements of air velocity and turbulence intensity were performed at the boundary of the examined area, i.e. in the rectangular channel 272.5 mm upstream of the ventilation outlet. Then, the experimentally acquired results were compared with results obtained by numerical simulation of a further upstream entrance section defined according to generally accepted theoretical recommendations. The comparison showed that, despite the simple geometry and general agreement of the average axial velocity, a certain difference was found in the shape of the velocity profile. The difference was attributed to the simplifications of the numerical model and the isotropic turbulence assumption of the turbulence model used. Appropriate recommendations were stated for future work.

  19. The straightforward numerical treatment of the time dependent advection in air pollution problems and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Hinrichsen, K

    1982-01-01

    A very simple Lagrangian finite difference scheme has been developed to calculate the time dependent advection of air pollutants. It is mass conserving and avoids numerical pseudo-diffusion. No condition of numerical stability is required. The Eulerian grid used for the diffusion part of the pollutant transport equation remains unchanged. There are no restrictions on temporally and spatially variable emission rates, production and destruction processes, wind velocity, diffusion coefficients, roughness parameters or inversion heights. The only exception is that the wind field should not be too far from being homogeneous in the horizontal direction (test of D. W. Pepper and P. E. Long, 1978, J. appl. Met. 17, 228-233). Steady state solutions are nearly identical with corresponding analytical solutions. The propagation of a pollutant cloud is simulated more realistically as compared with the advection treatment of E. Runca and F. Sardei (1975, Atmospheric Environment 9, 69-80) and M. Dunst (1980, Z. Met. 30, 47-59). The course of a diffusion experiment is modelled to demonstrate the efficiency of the proposed method. Because of its simplicity, the method is especially suited for use in license processes, for control, and for calculating health risks in relation to industrial and power plant accidents with the goal of organizing efficient protection or evacuation.
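
    A fully Lagrangian treatment of advection can be sketched with mass-carrying particles that are displaced along the local wind and then redeposited on the fixed Eulerian grid used for the diffusion step; the 1-D sketch below conserves mass by construction but is only a generic illustration, not the specific finite difference scheme of the paper.

    ```python
    import numpy as np

    def advect_particles(x_p, m_p, u, dt, x_grid):
        """Advect mass-carrying particles with the local wind and redeposit their
        mass on the Eulerian grid; total mass is conserved by construction."""
        dx = x_grid[1] - x_grid[0]
        u_p = np.interp(x_p, x_grid, u)               # wind interpolated to particle positions
        x_p = x_p + u_p * dt                          # Lagrangian displacement
        conc = np.zeros_like(x_grid)
        idx = np.clip(np.rint((x_p - x_grid[0]) / dx).astype(int), 0, len(x_grid) - 1)
        np.add.at(conc, idx, m_p / dx)                # nearest-cell mass deposition
        return x_p, conc
    ```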

  20. Thermal dynamic behavior during selective laser melting of K418 superalloy: numerical simulation and experimental verification

    Science.gov (United States)

    Chen, Zhen; Xiang, Yu; Wei, Zhengying; Wei, Pei; Lu, Bingheng; Zhang, Lijuan; Du, Jun

    2018-04-01

    During selective laser melting (SLM) of K418 powder, the influence of process parameters, such as the laser power P and scanning speed v, on the dynamic thermal behavior and the morphology of the melted tracks was investigated numerically. A 3D finite difference method was established to predict the dynamic thermal behavior and flow mechanism of K418 powder irradiated by a Gaussian laser beam. A three-dimensional randomly packed powder bed composed of spherical particles was generated by the discrete element method. Powder particle information, including the particle size distribution and packing density, was taken into account. The volume shrinkage and temperature-dependent thermophysical parameters such as thermal conductivity, specific heat, and other physical properties were also considered. The volume of fluid method was applied to reconstruct the free surface of the molten pool during SLM. The geometrical features, continuity boundaries, and irregularities of the molten pool were shown to be largely determined by the laser energy density. The numerical results are in good agreement with the experiments, which proves the model to be reasonable and effective. The results provide in-depth insight into the complex physical behavior during SLM and guide the optimization of process parameters.

  1. Numerical simulation and experimental verification of a flat two-phase thermosyphon

    International Nuclear Information System (INIS)

    Zhang Ming; Liu Zhongliang; Ma Guoyuan; Cheng Shuiyuan

    2009-01-01

    The flat two-phase thermosyphon is placed between the heat source and the heat sink; it can achieve a uniform heat flux distribution and improve the performance of the heat sink. In this paper, a two-dimensional heat and mass transfer model for a disk-shaped flat two-phase thermosyphon is developed. By solving the equations of continuity, momentum and energy numerically, the vapor velocity and temperature distributions of the flat two-phase thermosyphon are obtained. An analysis is also carried out of the ability of the flat two-phase thermosyphon to spread heat and remove hot spots. In order to observe boiling and condensation phenomena, a transparent flat two-phase thermosyphon was manufactured and studied experimentally. The experimental results are compared with the numerical results, which verifies the physical and mathematical model of the flat two-phase thermosyphon. In order to study the main factors affecting the axial thermal resistance of the two-phase thermosyphon, the temperatures inside the flat two-phase thermosyphon are measured and analyzed.

  2. Analytical and Numerical Studies of the Complex Interaction of a Fast Ion Beam Pulse with a Background Plasma

    International Nuclear Information System (INIS)

    Kaganovich, Igor D.; Startsev, Edward A.; Davidson, Ronald C.

    2003-01-01

    Plasma neutralization of an intense ion beam pulse is of interest for many applications, including plasma lenses, heavy ion fusion, high energy physics, etc. Comprehensive analytical, numerical, and experimental studies are underway to investigate the complex interaction of a fast ion beam with a background plasma. The positively charged ion beam attracts plasma electrons, and as a result the plasma electrons have a tendency to neutralize the beam charge and current. A suite of particle-in-cell codes has been developed to study the propagation of an ion beam pulse through the background plasma. For quasi-steady-state propagation of the ion beam pulse, an analytical theory has been developed using the assumption of long charge bunches and conservation of generalized vorticity. The analytical results agree well with the results of the numerical simulations. The visualization of the data obtained in the numerical simulations shows complex collective phenomena during beam entry into and exit from the plasma

  3. Numerical verification of B-WIM system using reaction force signals

    International Nuclear Information System (INIS)

    Chang, Sung Jin; Kim, Nam Sik

    2012-01-01

    Bridges are fundamental road facilities and constitute social overhead capital; they are designed to remain safe throughout their life cycles. However, as time passes, a bridge can be damaged by changes in external forces and traffic environments. Therefore, a bridge should be repaired and maintained to extend its life cycle. The working load on a bridge is one of the most important factors for safety, so it should be calculated accurately. The most important working load is the live load due to vehicles. Thus, the travel characteristics and weight of a vehicle can be useful for bridge maintenance if they are estimated with high reliability. In this study, a B-WIM (bridge weigh-in-motion) system, in which the bridge itself is used as a scale, has been developed for measuring vehicle loads without stopping the vehicles. The vehicle loads can be estimated by the developed B-WIM system from the reaction responses at the supporting points. The algorithm of the developed B-WIM system has been verified by numerical analysis
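
    Axle or gross vehicle weights are commonly recovered from measured bridge responses by a least-squares fit against influence lines (the approach introduced by Moses); the abstract does not detail the developed algorithm, so the sketch below, using a reaction influence line and hypothetical axle spacings, is only a generic illustration.

    ```python
    import numpy as np

    def estimate_axle_weights(response, infl_line, axle_offsets):
        """Least-squares axle weights from a measured reaction-force record.

        response     : measured reaction signal, length N
        infl_line    : reaction influence line sampled at the same rate, length N
        axle_offsets : per-axle delay (in samples) relative to the first axle
        """
        N = len(response)
        M = np.zeros((N, len(axle_offsets)))
        for j, off in enumerate(axle_offsets):
            M[off:, j] = infl_line[:N - off]      # influence line shifted for each axle
        weights, *_ = np.linalg.lstsq(M, response, rcond=None)
        return weights
    ```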

  4. Verification of some numerical models for operationally predicting mesoscale winds aloft

    International Nuclear Information System (INIS)

    Cornett, J.S.; Randerson, D.

    1977-01-01

    Four numerical models are described for predicting mesoscale winds aloft over a 6 h period. These models are all tested statistically against persistence as the control forecast and against predictions made by operational forecasters. Mesoscale winds-aloft data were used to initialize the models and to verify the predictions on an hourly basis. The model yielding the smallest root-mean-square vector errors (RMSVEs) was the one based on the most complete physics, which included advection, ageostrophic acceleration, vertical mixing and friction. Horizontal advection was found to be the most important term in reducing the RMSVEs, followed by ageostrophic acceleration, vertical advection, surface friction and vertical mixing. From a comparison of the mean absolute errors based on up to 72 independent wind-profile predictions made by operational forecasters, by the most complete model, and by persistence, we conclude that the model is the best wind predictor in the free air. In the boundary layer, the results tend to favor the forecaster for direction predictions. The speed predictions showed no overall superiority for any of the three methods

  5. Study of the flow field past dimpled aerodynamic surfaces: numerical simulation and experimental verification

    Science.gov (United States)

    Binci, L.; Clementi, G.; D'Alessandro, V.; Montelpare, S.; Ricci, R.

    2017-11-01

    This work presents a study of the flow field past a dimpled laminar airfoil. The fluid dynamic behaviour of these elements has not yet been studied in depth by the scientific community. Therefore, Computational Fluid Dynamics (CFD) is used here to analyze the flow field induced by dimples on the NACA 64-014A laminar airfoil at Re = 1.75·10^5 and α = 0°. Reynolds-Averaged Navier-Stokes (RANS) equations and Large-Eddy Simulations (LES) were compared with wind tunnel measurements in order to evaluate their effectiveness in modeling this kind of flow field. The LES equations were solved using a specifically developed OpenFOAM solver adopting an L-stable Singly Diagonally Implicit Runge-Kutta (SDIRK) technique with an iterated PISO-like procedure for handling pressure-velocity coupling within each RK stage. The dynamic Smagorinsky subgrid model was employed. The LES results provided good agreement with experimental data, while the RANS equations closed with the $k$-$\omega$-$\gamma$-$Re_{\theta,t}$ approach overestimate the laminar separation bubble (LSB) extension of the dimpled and un-dimpled configurations. Moreover, through skin friction coefficient analysis, we found a different representation of the turbulent zone between the numerical models; indeed, with the RANS model the LSB appears to be divided into two parts, whereas the LES model shows a global reduction of the LSB.

  6. Random distribution of background charge density for numerical simulation of discharge inception

    International Nuclear Information System (INIS)

    Grange, F.; Loiseau, J.F.; Spyrou, N.

    1998-01-01

    The models of electric streamers based on a uniform background density of electrons may appear not to be physical, as the number of electrons in the small active region located in the vicinity of the electrode tip under regular conditions can be less than one. To avoid this, the electron background is modelled by a random density distribution such that, after a certain time lag, at least one electron is present in the grid close to the point electrode. The modelling performed shows that the streamer inception is not very sensitive to the initial location of the charged particles; the ionizing front, however, may be delayed by several tens of nanoseconds, depending on the way the electron has to drift before reaching the anode. (J.U.)

  7. Lightning initiation mechanism based on the development of relativistic runaway electron avalanches triggered by background cosmic radiation: Numerical simulation

    International Nuclear Information System (INIS)

    Babich, L. P.; Bochkov, E. I.; Kutsyk, I. M.

    2011-01-01

    The mechanism of lightning initiation due to electric field enhancement by the polarization of a conducting channel produced by relativistic runaway electron avalanches triggered by background cosmic radiation has been simulated numerically. It is shown that the fields at which the start of a lightning leader is possible, even in the absence of precipitation, are locally realized for realistic thundercloud configurations and charges. The computational results agree with the in-situ observations of penetrating radiation enhancement in thunderclouds.

  8. Lightning initiation mechanism based on the development of relativistic runaway electron avalanches triggered by background cosmic radiation: Numerical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Babich, L. P., E-mail: babich@elph.vniief.ru; Bochkov, E. I.; Kutsyk, I. M. [All-Russian Research Institute of Experimental Physics, Russian Federal Nuclear Center (Russian Federation)

    2011-05-15

    The mechanism of lightning initiation due to electric field enhancement by the polarization of a conducting channel produced by relativistic runaway electron avalanches triggered by background cosmic radiation has been simulated numerically. It is shown that the fields at which the start of a lightning leader is possible, even in the absence of precipitation, are locally realized for realistic thundercloud configurations and charges. The computational results agree with the in-situ observations of penetrating radiation enhancement in thunderclouds.

  9. On the use of advanced numerical models for the evaluation of dosimetric parameters and the verification of exposure limits at workplaces.

    Science.gov (United States)

    Catarinucci, L; Tarricone, L

    2009-12-01

    With the next transposition of the 2004/40/EC Directive, employers will become responsible for the electromagnetic field level at the workplace. To make this task easier, the scientific community is compiling practical guidelines to be followed. This work aims at enriching such guidelines, especially for the dosimetric issues. More specifically, some critical aspects related to the application of numerical dosimetric techniques for the verification of the safety limit compliance have been highlighted. In particular, three different aspects have been considered: the dosimetric parameter dependence on the shape and the inner characterisation of the exposed subject as well as on the numerical algorithm used, and the correlation between reference limits and basic restriction. Results and discussions demonstrate how, even by using sophisticated numerical techniques, in some cases a complex interpretation of the result is mandatory.

  10. On the use of advanced numerical models for the evaluation of dosimetric parameters and the verification of exposure limits at workplaces

    International Nuclear Information System (INIS)

    Catarinucci, L.; Tarricone, L.

    2009-01-01

    With the next transposition of the 2004/40/EC Directive, employers will become responsible for the electromagnetic field level at the workplace. To make this task easier, the scientific community is compiling practical guidelines to be followed. This work aims at enriching such guidelines, especially for the dosimetric issues. More specifically, some critical aspects related to the application of numerical dosimetric techniques for the verification of the safety limit compliance have been highlighted. In particular, three different aspects have been considered: the dosimetric parameter dependence on the shape and the inner characterisation of the exposed subject as well as on the numerical algorithm used, and the correlation between reference limits and basic restriction. Results and discussions demonstrate how, even by using sophisticated numerical techniques, in some cases a complex interpretation of the result is mandatory. (authors)

  11. Poster - 16: Time-resolved diode dosimetry for in vivo proton therapy range verification: calibration through numerical modeling

    Energy Technology Data Exchange (ETDEWEB)

    Toltz, Allison; Hoesl, Michaela; Schuemann, Jan; Seuntjens, Jan; Lu, Hsiao-Ming; Paganetti, Harald [McGill University, Harvard University, Massachusetts General Hospital, McGill University, Massachusetts General Hospital, Massachusetts General Hospital (United States)

    2016-08-15

    Purpose: A method to refine the implementation of an in vivo, adaptive proton therapy range verification methodology was investigated. Simulation experiments and in-phantom measurements were compared to validate the calibration procedure of a time-resolved diode dosimetry technique. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification by correlating properties of the detector signal with the water equivalent path length (WEPL). The implementation of this system requires a set of calibration measurements to establish a beam-specific diode response to WEPL fit for the selected ‘scout’ beam in a solid water phantom. This process is both tedious, as it necessitates a separate set of measurements for every ‘scout’ beam that may be appropriate to the clinical case, and inconvenient, owing to limited access to the clinical beamline. The diode response to WEPL relationship for a given ‘scout’ beam may instead be determined within a simulation environment, facilitating the applicability of this dosimetry technique. Measurements for three ‘scout’ beams were compared against detector response simulated with Monte Carlo methods using the Tool for Particle Simulation (TOPAS). Results: Detector response in water-equivalent plastic was successfully validated against simulation for spread-out Bragg peaks of range 10 cm, 15 cm, and 21 cm (168 MeV, 177 MeV, and 210 MeV) with an adjusted R^2 of 0.998. Conclusion: Feasibility has been shown for performing calibration of detector response for a given ‘scout’ beam through simulation for the time-resolved diode dosimetry technique.
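
    A minimal sketch under assumed data (not the actual beamline calibration) of fitting a diode-signal property to WEPL and reporting the adjusted R^2, the figure of merit quoted above:

```python
# Minimal sketch under assumed data (not the actual calibration): fitting a
# diode-signal property to water-equivalent path length (WEPL) and computing
# the adjusted R^2 of the fit.
import numpy as np

wepl   = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])        # cm, hypothetical
signal = np.array([0.95, 0.80, 0.66, 0.51, 0.37, 0.22])    # a.u., hypothetical

coeffs = np.polyfit(wepl, signal, deg=1)    # linear response assumed
fit = np.polyval(coeffs, wepl)

ss_res = np.sum((signal - fit) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
n, p = len(wepl), 1                         # number of samples, fitted predictors
r2 = 1.0 - ss_res / ss_tot
r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

print("calibration coefficients:", coeffs)
print("adjusted R^2            :", round(r2_adj, 4))
```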

  12. Numerical climate modeling and verification of selected areas for heat waves of Pakistan using ensemble prediction system

    International Nuclear Information System (INIS)

    Amna, S; Samreen, N; Khalid, B; Shamim, A

    2013-01-01

    Depending upon the topography, there is extreme variation in the temperature of Pakistan. Heat waves are weather-related events that have a significant impact on humans, including all socioeconomic activities and health issues, and that vary according to the climatic conditions of the area. Forecasting the climate is of prime importance for being aware of future climatic changes in order to mitigate them. The study used the Ensemble Prediction System (EPS) for the purpose of modeling seasonal weather hind-casts of three selected areas, i.e., Islamabad, Jhelum and Muzaffarabad. This research was purposely carried out in order to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five General Circulation Models, i.e., ECMWF, ERA-40, MPI, Meteo France and UKMO, for the selected areas were acquired from the Pakistan Meteorological Department. The data incorporated constituted the statistical temperature records of 32 years for the months of June, July and August. This study was based on EPS to calculate probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts by using the standard probabilistic measures of Brier Score, Brier Skill Score, Cross Validation and the Relative Operating Characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. The other models gave significant results when particular initial conditions were omitted.
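
    A minimal sketch (hypothetical probabilities and outcomes, not the study's data) of the Brier score and Brier skill score used above to verify the probabilistic forecasts:

```python
# Minimal sketch (hypothetical data): Brier score and Brier skill score for a
# probabilistic event forecast, scored against a climatological reference.
import numpy as np

def brier_score(prob_forecast, occurred):
    """Mean squared difference between forecast probability and outcome (0/1)."""
    p = np.asarray(prob_forecast, dtype=float)
    o = np.asarray(occurred, dtype=float)
    return float(np.mean((p - o) ** 2))

# Hypothetical ensemble-derived probabilities that a heat wave occurs, and outcomes.
probs    = np.array([0.9, 0.2, 0.7, 0.4, 0.8])
outcomes = np.array([1,   0,   1,   0,   0  ])

bs     = brier_score(probs, outcomes)
bs_ref = brier_score(np.full_like(probs, outcomes.mean()), outcomes)  # climatology
bss    = 1.0 - bs / bs_ref

print("Brier score:", bs, " Brier skill score:", bss)
```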

  13. On the kinematic criterion for the inception of breaking in surface gravity waves: Fully nonlinear numerical simulations and experimental verification

    Science.gov (United States)

    Khait, A.; Shemer, L.

    2018-05-01

    The evolution of unidirectional wave trains containing a wave that gradually becomes steep is evaluated experimentally and numerically using the Boundary Element Method (BEM). The boundary conditions for the nonlinear numerical simulations corresponded to the actual movements of the wavemaker paddle as recorded in the physical experiments, allowing direct comparison between the wave-train characteristics measured in the experiments and the numerical predictions. The high level of qualitative and quantitative agreement between the measurements and simulations validated the kinematic criterion for the inception of breaking and the location of the spilling breaker, on the basis of the BEM computations and the associated experiments. Breaking inception is associated with the fluid particle at the crest of the steep wave that has been accelerated to match and surpass the crest velocity. The previously observed significant slow-down of the crest while approaching breaking is verified numerically; both narrow- and broad-banded wave trains are considered. Finally, the relative importance of linear and nonlinear contributions is analyzed.
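
    An illustrative sketch (hypothetical time series, not the BEM output) of the kinematic criterion described above, which flags breaking inception when the crest particle speed reaches the crest propagation speed:

```python
# Illustrative sketch (hypothetical time series): flag breaking inception when the
# horizontal speed of the fluid particle at the crest reaches the crest speed.
import numpy as np

t       = np.linspace(0.0, 2.0, 9)                                     # s
u_crest = np.array([1.0, 1.2, 1.5, 1.9, 2.3, 2.6, 2.9, 3.1, 3.2])      # particle speed, m/s
c_crest = np.array([3.4, 3.3, 3.2, 3.1, 3.0, 2.9, 2.9, 3.0, 3.1])      # crest speed, m/s

ratio = u_crest / c_crest
breaking = np.where(ratio >= 1.0)[0]
if breaking.size:
    print(f"breaking inception at t = {t[breaking[0]]:.2f} s (u/c = {ratio[breaking[0]]:.2f})")
else:
    print("no breaking: max u/c =", ratio.max())
```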

  14. International Benchmark on Numerical Simulations for 1D, Nonlinear Site Response (PRENOLIN) : Verification Phase Based on Canonical Cases

    NARCIS (Netherlands)

    Régnier, Julie; Bonilla, Luis-Fabian; Bard, Pierre-Yves; Bertrand, Etienne; Hollender, Fabrice; Kawase, Hiroshi; Sicilia, Deborah; Arduino, Pedro; Amorosi, Angelo; Asimaki, Dominiki; Pisano, F.

    2016-01-01

    PREdiction of NOn‐LINear soil behavior (PRENOLIN) is an international benchmark aiming to test multiple numerical simulation codes that are capable of predicting nonlinear seismic site response with various constitutive models. One of the objectives of this project is the assessment of the

  15. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which lead to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain-hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  16. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which lead to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain-hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  17. Generation of electrical energy using lead zirconate titanate (PZT-5A) piezoelectric material: Analytical, numerical and experimental verifications

    Energy Technology Data Exchange (ETDEWEB)

    Butt, Zubair; Ahmad, Nasir [Dept. of Mechanical, Mechatronics and Manufacturing Engineering, UET Lahore, Faisalabad Campus, Lahore (Pakistan); Pasha, Riffat Asim; Qayyum, Faisal; Anjum, Zeeshan [Dept. of Mechanical Engineering, University of Engineering and Technology, Taxila (Pakistan); Elahi, Hassan [Northwestern Polytechnical University, Xian (China)

    2016-08-15

    Energy harvesting is the process of capturing energy from external sources and transforming it into usable electrical energy. An analytical model of a piezoelectric energy harvester has been developed to determine the output voltage across an electrical circuit when the harvester is forced to undergo a base excitation. This model gives an easy approach to designing and investigating the behavior of piezoelectric materials. Numerical simulations have been carried out to determine the effect of frequency and loading on a lead zirconate titanate (PZT-5A) piezoelectric material. It has been observed that the output voltage from the harvester increases as loading increases, whereas its resonance frequency decreases. The analytical results were found to be in good agreement with the experimental and numerical simulation results.
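
    A lumped-parameter sketch, with entirely hypothetical parameter values rather than the paper's model constants, of how the steady-state output voltage of a base-excited piezoelectric harvester across a resistive load can be evaluated as a function of excitation frequency:

```python
# Lumped single-degree-of-freedom sketch (hypothetical parameters, not the paper's):
# steady-state voltage across a resistive load for a harmonically base-excited
# piezoelectric harvester.
import numpy as np

m, c, k = 1e-3, 0.05, 4000.0        # kg, N*s/m, N/m (hypothetical SDOF values)
theta   = 1e-4                      # N/V electromechanical coupling (hypothetical)
Cp      = 30e-9                     # F, piezo capacitance (hypothetical)
Y0      = 1e-4                      # m, base displacement amplitude (hypothetical)

def voltage_amplitude(freq_hz, R_load):
    w = 2.0 * np.pi * freq_hz
    Ze = 1j * w * Cp + 1.0 / R_load                    # electrical admittance
    X = m * w**2 * Y0 / (k - m * w**2 + 1j * w * c + 1j * w * theta**2 / Ze)
    V = 1j * w * theta * X / Ze
    return np.abs(V)

f = np.linspace(100.0, 600.0, 501)
for R in (1e4, 1e5, 1e6):                              # load resistance, ohm
    v = voltage_amplitude(f, R)
    print(f"R = {R:>8.0f} ohm -> peak {v.max():.3f} V at {f[np.argmax(v)]:.0f} Hz")
```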

  18. Numerical simulation on temperature field of TIG welding for 0Cr18Ni10Ti steel cladding and experimental verification

    International Nuclear Information System (INIS)

    Luo Hongyi; Tang Xian; Luo Zhifu

    2015-01-01

    Aiming at tungsten inert gas (TIG) welding of 0Cr18Ni10Ti stainless steel cladding for radioactive sources, the welding pool temperature field was calculated numerically using the ANSYS software. A numerical model of the non-steady TIG welding pool shape was established, the enthalpy method and a surface-distributed Gaussian electric arc heat source model were introduced, and the effects of welding current and welding speed on the temperature field distribution were calculated. Comparison of the experimental data and the calculated results under different welding currents and speeds proved the reliability and correctness of the model. The welding process parameters for 0Cr18Ni10Ti stainless steel were optimized based on the calculated results and the welding procedure was established. (authors)
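
    An illustrative sketch (hypothetical arc parameters, not the paper's) of a commonly used surface-distributed Gaussian arc heat source of the kind referred to above, q(r) = 3Q/(pi*r0^2) * exp(-3 r^2 / r0^2):

```python
# Illustrative sketch (hypothetical parameters): a Gaussian surface heat flux often
# used as an arc heat source model, with Q the effective arc power and r0 the
# effective arc radius.
import numpy as np

eta, U, I = 0.7, 12.0, 120.0       # arc efficiency, volts, amps (hypothetical)
Q  = eta * U * I                   # effective arc power, W
r0 = 2.0e-3                        # effective arc radius, m (hypothetical)

def surface_flux(r):
    """Gaussian surface heat flux density (W/m^2) at radius r from the arc centre."""
    return 3.0 * Q / (np.pi * r0**2) * np.exp(-3.0 * r**2 / r0**2)

r = np.linspace(0.0, 3.0 * r0, 7)
for ri, qi in zip(r, surface_flux(r)):
    print(f"r = {ri * 1e3:4.1f} mm   q = {qi:10.3e} W/m^2")
```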

  19. Numerical

    Directory of Open Access Journals (Sweden)

    M. Boumaza

    2015-07-01

    Full Text Available Transient convection heat transfer is of fundamental interest in many industrial and environmental situations, as well as in electronic devices and security of energy systems. Transient fluid flow problems are among the more difficult to analyze and yet are very often encountered in modern day technology. The main objective of this research project is to carry out a theoretical and numerical analysis of transient convective heat transfer in vertical flows, when the thermal field is due to different kinds of variation, in time and space of some boundary conditions, such as wall temperature or wall heat flux. This is achieved by the development of a mathematical model and its resolution by suitable numerical methods, as well as performing various sensitivity analyses. These objectives are achieved through a theoretical investigation of the effects of wall and fluid axial conduction, physical properties and heat capacity of the pipe wall on the transient downward mixed convection in a circular duct experiencing a sudden change in the applied heat flux on the outside surface of a central zone.

  20. Verification of the skill of numerical weather prediction models in forecasting rainfall from U.S. landfalling tropical cyclones

    Science.gov (United States)

    Luitel, Beda; Villarini, Gabriele; Vecchi, Gabriel A.

    2018-01-01

    The goal of this study is the evaluation of the skill of five state-of-the-art numerical weather prediction (NWP) systems [European Centre for Medium-Range Weather Forecasts (ECMWF), UK Met Office (UKMO), National Centers for Environmental Prediction (NCEP), China Meteorological Administration (CMA), and Canadian Meteorological Center (CMC)] in forecasting rainfall from North Atlantic tropical cyclones (TCs). Analyses focus on 15 North Atlantic TCs that made landfall along the U.S. coast over the 2007-2012 period. As reference data we use gridded rainfall provided by the Climate Prediction Center (CPC). We consider forecast lead-times up to five days. To benchmark the skill of these models, we consider rainfall estimates from one radar-based (Stage IV) and four satellite-based [Tropical Rainfall Measuring Mission - Multi-satellite Precipitation Analysis (TMPA, both real-time and research version); Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN); the CPC MORPHing Technique (CMORPH)] rainfall products. Daily and storm total rainfall fields from each of these remote sensing products are compared to the reference data to obtain information about the range of errors we can expect from "observational data." The skill of the NWP models is quantified: (1) by visual examination of the distribution of the errors in storm total rainfall for the different lead-times, and numerical examination of the first three moments of the error distribution; (2) relative to climatology at the daily scale. Considering these skill metrics, we conclude that the NWP models can provide skillful forecasts of TC rainfall with lead-times up to 48 h, without a consistently best or worst NWP model.
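
    A minimal sketch (hypothetical storm totals, not the study's data) of the first three moments of the storm-total rainfall error distribution used above to characterize model skill:

```python
# Minimal sketch (hypothetical numbers): bias, spread and skewness of storm-total
# rainfall errors of a forecast relative to a reference analysis.
import numpy as np

ref_total   = np.array([110.0,  75.0, 230.0,  60.0, 150.0])   # mm, reference
model_total = np.array([ 95.0,  90.0, 180.0,  70.0, 165.0])   # mm, NWP forecast

err  = model_total - ref_total
mean = err.mean()
std  = err.std(ddof=1)
skew = np.mean(((err - mean) / err.std(ddof=0)) ** 3)

print(f"bias = {mean:.1f} mm, spread = {std:.1f} mm, skewness = {skew:.2f}")
```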

  1. Determination of ultra-short laser induced damage threshold of KH2PO4 crystal: Numerical calculation and experimental verification

    Directory of Open Access Journals (Sweden)

    Jian Cheng

    2016-03-01

    Full Text Available Rapid growth and ultra-precision machining of large-size KDP (KH2PO4) crystals with high laser damage resistance are tough challenges in the development of large laser systems. It is of high interest and practical significance to have theoretical models for scientists and manufacturers to determine the laser-induced damage threshold (LIDT) of actually prepared KDP optics. Here, we numerically and experimentally investigate the laser-induced damage on KDP crystals in the ultra-short pulse laser regime. On the basis of the rate equation for free electron generation, a model dedicated to predicting the LIDT is developed by considering the synergistic effect of photoionization, impact ionization and decay of electrons. Laser damage tests are performed to measure the single-pulse LIDT with several testing protocols. The testing results, combined with previously reported experimental data, agree well with those calculated by the model. By taking the light intensification into consideration, the model is successfully applied to quantitatively evaluate the effect of surface flaws inevitably introduced in the preparation processes on the laser damage resistance of KDP crystals. This work can not only contribute to further understanding of the laser damage mechanisms of optical materials, but also provide available models for evaluating the laser damage resistance of exquisitely prepared optical components used in high power laser systems.
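
    A conceptual sketch, with hypothetical coefficients rather than the paper's fitted values, of a free-electron rate equation of the general form described above (photoionization seed, impact-ionization growth, electron decay) integrated over a Gaussian pulse and checked against a critical density:

```python
# Conceptual sketch only (all coefficients hypothetical): integrate
#   dn/dt = W_pi(I) + alpha * I * n - n / tau_d
# over a Gaussian pulse and flag damage if n reaches a critical density.
import numpy as np

tau_p  = 500e-15       # pulse duration, FWHM (s), hypothetical
I_peak = 5e16          # peak intensity (W/m^2), hypothetical
W0     = 1e33          # peak photoionization rate (m^-3 s^-1), hypothetical
kph    = 6             # effective multiphoton order, hypothetical
alpha  = 4e-4          # impact (avalanche) ionization coefficient (m^2/J), hypothetical
tau_d  = 200e-15       # free-electron decay time (s), hypothetical
n_crit = 1e27          # critical electron density for damage (m^-3), hypothetical

t  = np.linspace(-3 * tau_p, 3 * tau_p, 20000)
dt = t[1] - t[0]
I  = I_peak * np.exp(-4.0 * np.log(2.0) * (t / tau_p) ** 2)

n, n_max = 0.0, 0.0
for Ii in I:                                  # simple explicit Euler integration
    dndt = W0 * (Ii / I_peak) ** kph + alpha * Ii * n - n / tau_d
    n = max(n + dndt * dt, 0.0)
    n_max = max(n_max, n)

print("damage predicted" if n_max >= n_crit else f"no damage: peak n = {n_max:.2e} m^-3")
```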

  2. Verification of Exciton Effects in Organic Solar Cells at Low Temperatures Based on a Modified Numerical Model

    Science.gov (United States)

    Xiong, Chun-Hua; Sun, Jiu-Xun; Wang, Dai-Peng; Dong, Yan

    2018-02-01

    There are many models for describing charge transport in semiconductors and improving their performance. Most of them give good descriptions of the experimental data at room temperature, but it is still an open question which model is correct. In this paper, numerical calculations based on three modified versions of a classical model were made and compared with experimental data for typical devices at room and low temperatures. Although their results are very similar to each other at room temperature, only the version considering exciton effects through a hydrogen-like model can give a qualitative description of recent experimental data at low temperatures. Moreover, the mobility was examined in detail by comparing a constant-mobility model and a temperature-dependent mobility model. We found that the performance increases with the mobility of each charge-carrier type, independently of the mobility of the other. This paper provides better insight into the physical mechanism of carrier transport in semiconductors, and the results show that exciton effects should be considered in modeling organic solar cells.

  3. A contribution to the electron-beam surface-melting process of metallic materials. Numerical simulation and experimental verification

    International Nuclear Information System (INIS)

    Bruckner, A.

    1996-08-01

    For the optimization of the surface-melting process it is necessary to perform many different experiments. Simulation of the surface-melting process therefore plays a major role in this optimization. Most of the simulations developed for the laser surface-melting process are not usable for the electron-beam surface-melting process, because of the different energy input and the possibility of high-frequency deflection of the electron beam. In this thesis, a calculation model for electron-beam surface melting is presented. For this numerical simulation a variable volume source is used, which moves in the axial direction into the material with the same velocity as the vapor cavity. With this calculation model the high-frequency deflection of the electron beam can also be taken into account. The electron-beam diameter is measured by drilling holes in thin foils with short electron-beam pulses. The diameter of the holes depends on the pulse length and reaches a maximum value, which is used as the diameter of the volume source in the calculation. The crack formation seen in many treated surfaces is examined with acoustic emission testing. The possibilities of the electron-beam surface-melting process are demonstrated with experiments addressing different requirements on the treated surfaces, such as increasing the hardness, reducing the porosity of a sintered material and alloying tin into an aluminium-silicon surface. (author)

  4. Verification of EPA's "Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Stagich, B. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]

    2017-03-29

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems in obtaining solutions and to ensure that the equations are programmed correctly.

  5. Numerical investigation on the implications of spring temperature and discharge rate with respect to the geothermal background in a fault zone

    Science.gov (United States)

    Jiang, Zhenjiao; Xu, Tianfu; Mariethoz, Gregoire

    2018-04-01

    Geothermal springs are some of the most obvious indicators of the existence of high-temperature geothermal resources in the subsurface. However, geothermal springs can also occur in areas of low average subsurface temperature, which makes it difficult to assess exploitable zones. To address this problem, this study quantitatively analyzes the conditions associated with the formation of geothermal springs in fault zones, and numerically investigates the implications that the outflow temperature and discharge rate of geothermal springs have for the geothermal background in the subsurface. It is concluded that the temperature of geothermal springs in fault zones is mainly controlled by the recharge rate from the country rock and the hydraulic conductivity in the fault damage zone. Importantly, the topography of the fault trace on the land surface plays an important role in determining the spring temperature. In fault zones with a permeability higher than 1 mD and a lateral recharge rate from the country rock higher than 1 m3/day, convection rather than thermal conduction plays the dominant role in heat transport. Geothermal springs do not necessarily occur in places with an abnormal geothermal background (where the temperature at a given depth exceeds the temperature inferred from the global average continental geothermal gradient of 30 °C/km). Assuming a constant temperature (90 °C here, to represent a normal geothermal background in the subsurface at a depth of 3,000 m), the conditions required for the occurrence of geothermal springs were quantitatively determined.

  6. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  7. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  8. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  9. Development of synchronized control method for shaking table with booster device. Verification of the capabilities based on both real facility and numerical simulator

    International Nuclear Information System (INIS)

    Kajii, Shin-ichirou; Yasuda, Chiaki; Yamashita, Toshio; Abe, Hiroshi; Kanki, Hiroshi

    2004-01-01

    In the seismic design of nuclear power plants, the use of probabilistic methods in addition to deterministic methods has recently been considered. The probabilistic approach is called Seismic Probabilistic Safety Assessment (Seismic PSA). In a seismic PSA of some components of a nuclear power plant using a shaking table, tests under limiting conditions with high acceleration levels, such as actual earthquake conditions, are necessary. However, it may be difficult to achieve such test conditions with a conventional shaking table based on a hydraulic power system. Therefore, we have been planning a test method in which a conventional shaking table and an additional shaking table, called a booster device, are applied together. This paper describes the verification test of synchronized control between a conventional shaking table and a booster device. (author)

  10. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  11. Numerical simulation of turbulent flow and heat transfer in parallel channel with an obstacle and verification of the field synergy principle

    Energy Technology Data Exchange (ETDEWEB)

    Tian, W.; Aye, M.; Qiu, S.; Jia, D. [Xi' an Jiaotong Univ., Dept. of Nuclear and Thermal Power Engineering, Xi' an (China)]. E-mail: wxtian_xjtu@163.com

    2004-07-01

    The field synergy principle was proposed by Guo Z. Y. on the basis of 2-D laminar boundary-layer flow, and it resulted from a second look at the mechanism of convective heat transfer. The objective of this paper is to numerically verify the applicability of this theory under turbulent flow and even recirculating flow conditions. (author)
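
    A minimal sketch (hypothetical local fields) of the synergy angle between the velocity vector and the temperature gradient, the quantity at the heart of the field synergy principle verified numerically above:

```python
# Minimal sketch (hypothetical fields): local synergy angle between the velocity
# vector and the temperature gradient; smaller angles indicate better synergy.
import numpy as np

u  = np.array([[1.0, 0.1], [0.8, 0.4], [0.2, 0.9]])        # (u, v) in m/s, hypothetical
gT = np.array([[50.0, 5.0], [10.0, 40.0], [-30.0, 5.0]])   # (dT/dx, dT/dy) in K/m, hypothetical

cos_beta = np.sum(u * gT, axis=1) / (np.linalg.norm(u, axis=1) * np.linalg.norm(gT, axis=1))
beta_deg = np.degrees(np.arccos(np.clip(cos_beta, -1.0, 1.0)))

for i, b in enumerate(beta_deg):
    print(f"point {i}: synergy angle = {b:6.1f} deg")
```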

  12. Verification of RELAP5/MOD3 with theoretical and numerical stability results on single-phase, natural circulation in a simple loop

    International Nuclear Information System (INIS)

    Ferreri, Juan C.; Ambrosini, Walter

    1998-01-01

    The theoretical results given by Pierre Welander are used to test the capability of the RELAP5 series of codes to predict instabilities in single-phase flow. These results are related to the natural circulation in a loop formed by two parallel adiabatic tubes with a point heat sink at the top and a point heat source at the bottom. A stability curve may be defined for laminar flow and was extended to consider turbulent flow. By a suitable selection of the ratio of the total buoyancy force in the loop to the friction resistance, the flow may show instabilities. The solution was useful to test two basic numerical properties of the RELAP5 code, namely: a) convergence to the steady-state flow rate using a 'lumped parameter' approximation to both the heat source and sink and; b) the effect of nodalization in numerically damping the instabilities. It was shown that, using a single volume to lump the heat source and sink, it was not possible to reach convergence to the steady-state flow rate when the heated (cooled) length was diminished and the heat transfer coefficient increased to keep constant the total heat transferred to (and removed from) the fluid. An algebraic justification of these results is presented, showing that this is a limitation inherent to the numerical scheme adopted. Concerning the effect of nodalization on the damping of instabilities, it was shown that a 'reasonably fine' discretization led, as expected, to the damping of the solution. However, the search for convergence of numerical and theoretical results was successful, showing the expected nearly chaotic behavior. This search led to very refined nodalizations. The results obtained have also been verified by the use of simple, ad hoc codes. A procedure to assess the effects of nodalization on the prediction of instability thresholds is outlined in this report. It is based on the experience gained with the aforementioned simpler codes. (author)
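
    An illustrative sketch (not RELAP5) of the numerical-damping effect discussed above: first-order upwind advection of a perturbation around a closed loop, where coarser nodalization adds more numerical diffusion and damps the perturbation more strongly:

```python
# Illustrative sketch (not RELAP5): first-order upwind advection of a narrow
# temperature perturbation once around a closed loop of unit length; the
# numerical diffusion of the scheme grows with cell size, so coarser grids
# damp the perturbation more.
import numpy as np

def advect_once_around(n_cells, courant=0.5):
    x = (np.arange(n_cells) + 0.5) / n_cells
    T = np.exp(-((x - 0.25) / 0.02) ** 2)          # initial perturbation
    peak0 = T.max()
    steps = int(round(n_cells / courant))           # one full loop transit
    for _ in range(steps):
        T = T - courant * (T - np.roll(T, 1))       # first-order upwind, periodic
    return T.max() / peak0                          # surviving peak fraction

for n in (50, 200, 800):
    print(f"{n:4d} cells -> perturbation peak retained: {advect_once_around(n):.3f}")
```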

  13. Development and Experimental Verification of the Numerical Simulation Method for the Quasi-Steady SWR Phenomena in an LMR Steam Generator

    International Nuclear Information System (INIS)

    Eoh, Jae-Hyuk; Jeong, Ji-Young; Kim, Seong-O; Hahn, Dohee; Park, Nam-Cook

    2005-01-01

    A quasi-steady system analysis of the sodium-water reaction (SWR) phenomena in a liquid-metal reactor (LMR) was performed using the Sodium-water reaction Event Later Phase System Transient Analyzer (SELPSTA) computer simulation code. The code has been formulated by implementing various physical assumptions to simplify the complex SWR phenomena, and it adopts the long-term mass and energy transfer (LMET) model developed in the present study. The LMET model is based on the hypothesis that the system transient can be described by the pressure and temperature transient of the cover gas space, and it can be applied only to the reaction period characterized by bulk motion. To evaluate the feasibility of the physical model and its assumptions, a scale-down mock-up test was carried out, and it was demonstrated that the numerical simulation using the LMET model adequately replicates the overall phenomena of the experiment with reasonable understanding. Based on the findings, as a numerical example, the long-term system transient responses during the SWR event of the Korea Advanced LIquid MEtal Reactor (KALIMER) were investigated, and it was found that the long-term dynamic responses are strongly dependent on the design parameters and operational strategies. As a result, the numerical simulation method developed in the present study is practicable; furthermore, the SELPSTA code is useful to resolve the risk for the SWR event

  14. Determination of ultra-short laser induced damage threshold of KH{sub 2}PO{sub 4} crystal: Numerical calculation and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Jian [Center for Precision Engineering, School of Mechatronics Engineering, Harbin Institute of Technology, Harbin 150001 (China); Department of Physics, The Ohio State University, 191 W. Woodruff Ave, Columbus, OH 43210 (United States); Chen, Mingjun, E-mail: chenmj@hit.edu.cn, E-mail: chowdhury.24@osu.edu; Wang, Jinghe; Xiao, Yong [Center for Precision Engineering, School of Mechatronics Engineering, Harbin Institute of Technology, Harbin 150001 (China); Kafka, Kyle; Austin, Drake; Chowdhury, Enam, E-mail: chenmj@hit.edu.cn, E-mail: chowdhury.24@osu.edu [Department of Physics, The Ohio State University, 191 W. Woodruff Ave, Columbus, OH 43210 (United States)

    2016-03-15

    Rapid growth and ultra-precision machining of large-size KDP (KH{sub 2}PO{sub 4}) crystals with high laser damage resistance are tough challenges in the development of large laser systems. It is of high interest and practical significance to have theoretical models for scientists and manufacturers to determine the laser-induced damage threshold (LIDT) of actually prepared KDP optics. Here, we numerically and experimentally investigate the laser-induced damage on KDP crystals in the ultra-short pulse laser regime. On the basis of the rate equation for free electron generation, a model dedicated to predicting the LIDT is developed by considering the synergistic effect of photoionization, impact ionization and decay of electrons. Laser damage tests are performed to measure the single-pulse LIDT with several testing protocols. The testing results, combined with previously reported experimental data, agree well with those calculated by the model. By taking the light intensification into consideration, the model is successfully applied to quantitatively evaluate the effect of surface flaws inevitably introduced in the preparation processes on the laser damage resistance of KDP crystals. This work can not only contribute to further understanding of the laser damage mechanisms of optical materials, but also provide available models for evaluating the laser damage resistance of exquisitely prepared optical components used in high power laser systems.

  15. Numerical relativity

    CERN Document Server

    Shibata, Masaru

    2016-01-01

    This book is composed of two parts. The first part describes the basics of numerical relativity, that is, the formulations and methods for solving Einstein's equation and the general relativistic matter field equations. This part will be helpful for beginners in numerical relativity who would like to understand its content and background. The second part focuses on applications of numerical relativity. A wide variety of scientific numerical results are introduced, focusing in particular on the merger of binary neutron stars and black holes.

  16. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  17. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  18. Numerical implementation and oceanographic application of the thermodynamic potentials of liquid water, water vapour, ice, seawater and humid air – Part 1: Background and equations

    Directory of Open Access Journals (Sweden)

    R. Feistel

    2010-07-01

    Full Text Available A new seawater standard, referred to as the International Thermodynamic Equation of Seawater 2010 (TEOS-10), was adopted in June 2009 by UNESCO/IOC at its 25th General Assembly in Paris, as recommended by the SCOR/IAPSO Working Group 127 (WG127) on Thermodynamics and Equation of State of Seawater. To support the adoption process, WG127 has developed a comprehensive source code library for the thermodynamic properties of liquid water, water vapour, ice, seawater and humid air, referred to as the Sea-Ice-Air (SIA) library. Here we present the background information and equations required for the determination of the properties of single phases and components as well as of phase transitions and composite systems as implemented in the library. All results are based on rigorous mathematical methods applied to the Primary Standards of the constituents, formulated as empirical thermodynamic potential functions and, except for humid air, endorsed as Releases of the International Association for the Properties of Water and Steam (IAPWS). Details of the implementation in the TEOS-10 SIA library are given in a companion paper.

  19. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The reasons why verification of software products is necessary throughout the software life cycle are considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed.

  20. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)]

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear weapon states parties to the NPT are obliged to accept are described. Verification activities/problems in Iraq and North Korea are discussed.

  1. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear weapon states parties to the NPT are obliged to accept are described. Verification activities/problems in Iraq and North Korea are discussed.

  2. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  3. Numerical analysis

    CERN Document Server

    Scott, L Ridgway

    2011-01-01

    Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...

  4. Background Material

    DEFF Research Database (Denmark)

    Zandersen, Marianne; Hyytiäinen, Kari; Saraiva, Sofia

    This document serves as a background material to the BONUS Pilot Scenario Workshop, which aims to develop harmonised regional storylines of socio-ecological futures in the Baltic Sea region in a collaborative effort together with other BONUS projects and stakeholders.

  5. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for the verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models, where a combination of the models is also taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.) [de]

  6. Methods of numerical relativity

    International Nuclear Information System (INIS)

    Piran, T.

    1983-01-01

    Numerical relativity is an alternative to analytical methods for obtaining solutions of the Einstein equations. Numerical methods are particularly useful for studying the generation of gravitational radiation by potentially strong sources. The author reviews the analytical background, the numerical analysis aspects and techniques, and some of the difficulties involved in numerical relativity. (Auth.)

  7. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  8. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted at assessing that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
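
    A minimal sketch (hypothetical grid results) of the Richardson extrapolation step used in the solution verification described above, estimating the observed order of convergence and the fine-grid error from three successively refined grids:

```python
# Minimal sketch (hypothetical numbers): Richardson extrapolation from three grids
# with a constant refinement ratio, giving the observed order of accuracy and an
# estimate of the grid-converged value and the fine-grid numerical error.
import math

# Hypothetical scalar outputs of the same simulation on successively refined grids.
f_coarse, f_medium, f_fine = 1.1200, 1.0550, 1.0385
r = 2.0                                  # grid refinement ratio

p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)

print(f"observed order p        = {p:.2f}")
print(f"extrapolated value      = {f_exact:.4f}")
print(f"fine-grid error estimate = {abs(f_fine - f_exact):.4f}")
```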

  9. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for the evaluation of the measurement system and the determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  10. Background radiation

    International Nuclear Information System (INIS)

    Arnott, D.

    1985-01-01

    The effects of background radiation, whether natural or caused by man's activities, are discussed. The known biological effects of radiation in causing cancers or genetic mutations are explained. The statement that there is a threshold below which there is no risk is examined critically. (U.K.)

  11. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  12. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of FMCT verification provision. This paper will explore the general concerns about FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Some safeguards measures, such as environmental sampling, might eventually be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  13. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  14. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented in a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
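    The dual-correlation, nonlinear-encoding scheme of the paper is not reproduced here, but the underlying phase-only correlation step can be sketched generically. The example below, a single-stage illustration with hypothetical random keys, shows the kind of sharp correlation peak that such verification systems compare against a threshold.

```python
# Generic phase-only correlation (POC) between two 2-D arrays, illustrating
# the correlation peak a verification system thresholds. Single-stage sketch
# only; the paper's dual-correlation and nonlinear encoding are not modeled.
import numpy as np

def phase_only_correlation(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Return the POC surface of two equally sized real arrays."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12   # keep only phase; avoid division by zero
    return np.real(np.fft.ifft2(cross))

rng = np.random.default_rng(0)
key = rng.standard_normal((64, 64))           # stands in for an identity key
poc_match = phase_only_correlation(key, key)  # genuine key: sharp peak near 1
poc_fake = phase_only_correlation(key, rng.standard_normal((64, 64)))

print("peak (authorized key):", poc_match.max())
print("peak (counterfeit key):", poc_fake.max())
```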

  15. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  16. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  17. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  18. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  19. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  20. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  1. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  2. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
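    Two of the listed verification items, carryover and linearity, lend themselves to simple worked checks. The sketch below follows widely used conventions (a three-high/three-low carryover protocol and a regression over a dilution series) with invented example data; the exact formulas and limits recommended in the paper may differ.

```python
# Hedged sketch of two common analyzer verification checks: carryover and
# linearity. Conventions are the usual ones, not necessarily the paper's.
import numpy as np

def carryover_percent(high: list[float], low: list[float]) -> float:
    """Carryover (%) = (low1 - low3) / (high3 - low3) * 100."""
    return (low[0] - low[2]) / (high[2] - low[2]) * 100.0

def linearity(expected: np.ndarray, measured: np.ndarray) -> tuple[float, float]:
    """Return slope and intercept of measured vs expected concentrations."""
    slope, intercept = np.polyfit(expected, measured, 1)
    return slope, intercept

# Illustrative white blood cell counts (10^9/L), three high runs then three low.
high_runs = [98.5, 99.1, 98.8]
low_runs = [0.32, 0.31, 0.30]
print(f"carryover: {carryover_percent(high_runs, low_runs):.3f} %")

dilution = np.array([0.0, 0.25, 0.5, 0.75, 1.0]) * 100.0   # expected values
counts = np.array([0.2, 25.3, 50.1, 74.6, 99.8])           # measured values
slope, intercept = linearity(dilution, counts)
print(f"linearity: slope={slope:.3f}, intercept={intercept:.2f}")
```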

  3. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
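    The Java PathFinder framework itself is not reproduced here, but the core idea of symbolic execution can be shown on a toy example: inputs stay symbolic, each branch contributes a path condition, and every feasible path yields a (condition, symbolic result) pair that can be checked against a method specification. The sketch below is a hedged illustration with a hypothetical postcondition, not the paper's invariant-generation algorithm.

```python
# Toy illustration of symbolic execution (not the Java PathFinder framework).
import sympy as sp

x, y = sp.symbols("x y", integer=True)

def symbolic_abs_diff():
    """Symbolically execute: if x >= y: return x - y else: return y - x."""
    return [
        (x >= y, x - y),   # 'then' branch: path condition and symbolic result
        (x < y, y - x),    # 'else' branch
    ]

# Hypothetical postcondition from a method specification: the result is >= 0.
for cond, result in symbolic_abs_diff():
    print(f"path condition: {cond},  symbolic result: {result}")
    # Cheap check: sample concrete inputs on this path and test the postcondition.
    for vals in [{x: 5, y: 2}, {x: -3, y: -3}, {x: 1, y: 7}]:
        if bool(cond.subs(vals)):
            assert result.subs(vals) >= 0, f"postcondition violated at {vals}"
```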

  4. Tree dimension in verification of constrained Horn clauses

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick; Ganty, Pierre

    2018-01-01

    In this paper, we show how the notion of tree dimension can be used in the verification of constrained Horn clauses (CHCs). The dimension of a tree is a numerical measure of its branching complexity and the concept here applies to Horn clause derivation trees. Derivation trees of dimension zero c...... algorithms using these constructions to decompose a CHC verification problem. One variation of this decomposition considers derivations of successively increasing dimension. The paper includes descriptions of implementations and experimental results....
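    The tree-dimension measure referred to in the abstract is, in the authors' setting, computed recursively over derivation trees: a leaf has dimension zero, and an internal node takes the maximum child dimension, plus one if that maximum is attained by two or more children. The sketch below is a minimal, hedged implementation of that measure on a generic tree data structure, not of the CHC decomposition algorithms themselves.

```python
# Minimal sketch of tree dimension (the Horton-Strahler number) applied to a
# generic tree; Horn-clause derivation trees would use the same recursion.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list["Node"] = field(default_factory=list)

def dimension(node: Node) -> int:
    if not node.children:
        return 0
    dims = [dimension(c) for c in node.children]
    top = max(dims)
    # Bump the dimension only when the maximum is shared by several children.
    return top + 1 if dims.count(top) >= 2 else top

leaf = lambda name: Node(name)
balanced = Node("r", [Node("a", [leaf("x"), leaf("y")]),
                      Node("b", [leaf("u"), leaf("v")])])
chain = Node("r", [Node("a", [Node("b", [leaf("c")])])])
print(dimension(balanced))  # 2: fully branching tree of height 2
print(dimension(chain))     # 0: a linear derivation has dimension zero
```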

  5. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  6. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

    This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and the additional User-Defined meteorological option for accident dispersion.

  7. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  8. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition. The operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  9. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  10. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
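    The Weibull treatment of strength data mentioned above can be illustrated with a short worked example. The sketch below uses the classic linearized two-parameter Weibull fit with a common probability estimator and invented flexural strength data; the guideline's exact post-processing and estimators may differ.

```python
# Hedged sketch of the classic two-parameter Weibull fit used to characterize
# brittle-material strength data from elementary tests.
import numpy as np

def weibull_fit(stresses_mpa: np.ndarray) -> tuple[float, float]:
    """Return (Weibull modulus m, characteristic strength sigma0 in MPa)."""
    s = np.sort(stresses_mpa)
    n = len(s)
    F = (np.arange(1, n + 1) - 0.5) / n        # common probability estimator
    x = np.log(s)
    y = np.log(-np.log(1.0 - F))               # linearized Weibull CDF
    m, c = np.polyfit(x, y, 1)                 # y = m*ln(sigma) - m*ln(sigma0)
    sigma0 = np.exp(-c / m)
    return m, sigma0

# Illustrative flexural strength data (MPa) for a ceramic specimen set.
data = np.array([312., 335., 347., 358., 366., 375., 383., 391., 402., 421.])
m, sigma0 = weibull_fit(data)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")
```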

  11. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  12. Verification of thermo-fluidic CVD reactor model

    International Nuclear Information System (INIS)

    Lisik, Z; Turczynski, M; Ruta, L; Raj, E

    2014-01-01

    This paper describes the numerical model of a CVD (Chemical Vapour Deposition) reactor created in ANSYS CFX, whose main purpose is the evaluation of numerical approaches used for modelling heat and mass transfer inside the reactor chamber. Verification of the developed CVD model has been conducted against measurements under various thermal, pressure and gas flow rate conditions. Good agreement between experimental and numerical results confirms the correctness of the elaborated model.

  13. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
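    The paper's comprehensive statistic sets are not reproduced here, but the widely known accountancy quantities such sets build on can be sketched numerically. The example below computes a material balance (MUF) from operator data and per-item operator-inspector remeasurement differences, using standard textbook definitions and invented numbers rather than the paper's exact formulation.

```python
# Hedged sketch of basic safeguards accountancy quantities (textbook form).

def muf(begin_inventory: float, receipts: float,
        shipments: float, end_inventory: float) -> float:
    """MUF = (beginning inventory + receipts) - (shipments + ending inventory)."""
    return (begin_inventory + receipts) - (shipments + end_inventory)

def operator_inspector_differences(operator: list[float],
                                   inspector: list[float]) -> list[float]:
    """Per-item operator-minus-inspector differences on remeasured items.
    The usual D statistic scales such differences up to the whole stratum."""
    return [o - i for o, i in zip(operator, inspector)]

# Illustrative figures (kg of nuclear material) for one material balance period.
print("MUF =", muf(begin_inventory=120.0, receipts=35.0,
                   shipments=30.0, end_inventory=124.2), "kg")
diffs = operator_inspector_differences([4.98, 5.03, 5.01], [5.00, 5.00, 5.02])
print("mean operator-inspector difference =", sum(diffs) / len(diffs), "kg")
```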

  14. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  15. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code is described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
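    The Richardson extrapolation bookkeeping that such a verification code performs can be shown in a few lines. The sketch below is the textbook procedure for three systematically refined grids with a constant refinement ratio and invented functional values; it is not the VIVID implementation itself.

```python
# Observed order of accuracy and Richardson-extrapolated value from a
# three-grid convergence study with constant refinement ratio r.
import math

def observed_order(f_coarse: float, f_medium: float, f_fine: float, r: float) -> float:
    """p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium: float, f_fine: float, r: float, p: float) -> float:
    """Estimate of the exact solution from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Illustrative functional values on coarse, medium and fine grids (r = 2).
f3, f2, f1 = 0.9200, 0.9713, 0.9842
p = observed_order(f3, f2, f1, r=2.0)
print(f"observed order of accuracy: {p:.2f}")          # near 2 for these data
print(f"extrapolated value: {richardson_extrapolate(f2, f1, r=2.0, p=p):.4f}")
```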

  16. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50...). The verification games were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  17. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  18. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different

  19. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work...... is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  20. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density has been used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  1. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  2. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability and compatibility, etc. These methods vary both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of the static analysis of dependencies, as well as their types, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed. Some kinds of tools that can be applied to the software when using dynamic analysis methods are also considered. Based on this work, a conclusion is drawn which describes the most relevant problems of analysis techniques, methods of their solution and

  3. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  4. Accuracy verification methods theory and algorithms

    CERN Document Server

    Mali, Olli; Repin, Sergey

    2014-01-01

    The importance of accuracy verification methods was understood at the very beginning of the development of numerical analysis. Recent decades have seen a rapid growth of results related to adaptive numerical methods and a posteriori estimates. However, in this important area there often exists a noticeable gap between mathematicians creating the theory and researchers developing applied algorithms that could be used in engineering and scientific computations for guaranteed and efficient error control.   The goals of the book are to (1) give a transparent explanation of the underlying mathematical theory in a style accessible not only to advanced numerical analysts but also to engineers and students; (2) present detailed step-by-step algorithms that follow from a theory; (3) discuss their advantages and drawbacks, areas of applicability, give recommendations and examples.

  5. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  6. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the ... This model generally states the numerical value of knowledge .... procedures found in the field of software engineering should be ...

  7. Numerical relativity

    International Nuclear Information System (INIS)

    Piran, T.

    1982-01-01

    There are many recent developments in numerical relativity, but there remain important unsolved theoretical and practical problems. The author reviews existing numerical approaches to the solution of the exact Einstein equations. A framework for classification and comparison of different numerical schemes is presented. Recent numerical codes are compared using this framework. The discussion focuses on new developments and on currently open questions, excluding a review of numerical techniques. (Auth.)

  8. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  9. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  10. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  11. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  12. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  13. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  14. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code be evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of

  15. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice of the existing open-source model verification engines, the model verification step produces inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; the familiarity with tools, the ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  16. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  17. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher level behavioral models.

  18. Numerical computations with GPUs

    CERN Document Server

    Kindratenko, Volodymyr

    2014-01-01

    This book brings together research on numerical methods adapted for Graphics Processing Units (GPUs). It explains recent efforts to adapt classic numerical methods, including solution of linear equations and FFT, for massively parallel GPU architectures. This volume consolidates recent research and adaptations, covering widely used methods that are at the core of many scientific and engineering computations. Each chapter is written by authors working on a specific group of methods; these leading experts provide mathematical background, parallel algorithms and implementation details leading to

  19. Numerical simulation of the environmental impact of hydraulic fracturing of tight/shale gas reservoirs on near-surface groundwater: Background, base cases, shallow reservoirs, short-term gas, and water transport

    Science.gov (United States)

    Reagan, Matthew T; Moridis, George J; Keen, Noel D; Johnson, Jeffrey N

    2015-01-01

    Hydrocarbon production from unconventional resources and the use of reservoir stimulation techniques, such as hydraulic fracturing, has grown explosively over the last decade. However, concerns have arisen that reservoir stimulation creates significant environmental threats through the creation of permeable pathways connecting the stimulated reservoir with shallower freshwater aquifers, thus resulting in the contamination of potable groundwater by escaping hydrocarbons or other reservoir fluids. This study investigates, by numerical simulation, gas and water transport between a shallow tight-gas reservoir and a shallower overlying freshwater aquifer following hydraulic fracturing operations, if such a connecting pathway has been created. We focus on two general failure scenarios: (1) communication between the reservoir and aquifer via a connecting fracture or fault and (2) communication via a deteriorated, preexisting nearby well. We conclude that the key factors driving short-term transport of gas include high permeability for the connecting pathway and the overall volume of the connecting feature. Production from the reservoir is likely to mitigate release through reduction of available free gas and lowering of reservoir pressure, and not producing may increase the potential for release. We also find that hydrostatic tight-gas reservoirs are unlikely to act as a continuing source of migrating gas, as gas contained within the newly formed hydraulic fracture is the primary source for potential contamination. Such incidents of gas escape are likely to be limited in duration and scope for hydrostatic reservoirs. Reliable field and laboratory data must be acquired to constrain the factors and determine the likelihood of these outcomes. Key Points: Short-term leakage from fractured reservoirs requires high-permeability pathways. Production strategy affects the likelihood and magnitude of gas release. Gas release is likely short-term, without additional driving forces.

  20. Numerical analysis

    CERN Document Server

    Khabaza, I M

    1960-01-01

    Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput

  1. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these software packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost efficient and easy to use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was done of the biosphere modules for dose assessments used in the earlier safety assessment project SR 97. Acquired results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and also against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the
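    The deterministic-plus-probabilistic workflow that Tensit automates on top of Matlab/Simulink and @Risk can be sketched generically. The example below solves a hypothetical two-compartment radionuclide decay/transfer system and wraps it in Monte Carlo sampling of an uncertain transfer rate; the compartments, rates, and sampling distribution are invented for illustration and are not taken from the Tensit biosphere modules.

```python
# Minimal sketch of a deterministic-plus-probabilistic radionuclide calculation.
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA = np.log(2.0) / 30.0   # decay constant, 1/yr (30-year half-life assumed)

def compartments(t, y, k_transfer):
    """y[0]: activity in source compartment, y[1]: activity in receiving compartment."""
    src, rec = y
    return [-(LAMBDA + k_transfer) * src,
            k_transfer * src - LAMBDA * rec]

def peak_receiving_activity(k_transfer: float) -> float:
    sol = solve_ivp(compartments, (0.0, 200.0), [1.0, 0.0],
                    args=(k_transfer,), max_step=1.0)
    return sol.y[1].max()

# Deterministic run with a best-estimate transfer rate (1/yr).
print("deterministic peak:", peak_receiving_activity(0.05))

# Probabilistic run: log-uniform sampling of the uncertain transfer rate.
rng = np.random.default_rng(1)
samples = 10.0 ** rng.uniform(-2.0, -0.5, size=200)
peaks = np.array([peak_receiving_activity(k) for k in samples])
print("95th percentile of peak:", np.percentile(peaks, 95))
```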

  2. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    International Nuclear Information System (INIS)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these software packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost efficient and easy to use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, an implementation was done of the biosphere modules for dose assessments used in the earlier safety assessment project SR 97. Acquired results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and also against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the perspective of

  3. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  4. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.

  5. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is still under development; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data that demonstrates the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.
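
    The finite-volume building block underlying such solvers can be illustrated on a much simpler problem than those in the BIGHORN benchmark suite. The sketch below (not BIGHORN code) advances a first-order upwind finite-volume scheme for linear advection and compares it with the exact translated profile as an elementary verification check; the grid size, CFL number and initial profile are arbitrary.

```python
# Minimal sketch: first-order finite-volume upwind scheme for u_t + a u_x = 0
# on a periodic domain, checked against the exact translated solution.
import numpy as np

a, L, nx, cfl = 1.0, 1.0, 200, 0.5
dx = L / nx
x = (np.arange(nx) + 0.5) * dx               # cell centres
u = np.exp(-200.0 * (x - 0.3) ** 2)          # initial condition
dt = cfl * dx / a
t_end, t = 0.25, 0.0

while t < t_end:
    dt_step = min(dt, t_end - t)
    flux = a * np.roll(u, 1)                 # upwind flux at each cell's left face
    u = u - dt_step / dx * (np.roll(flux, -1) - flux)
    t += dt_step

xi = (x - a * t_end) % L                     # exact solution: translated profile
exact = np.exp(-200.0 * (xi - 0.3) ** 2)
print("L1 error:", dx * np.abs(u - exact).sum())
```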

  6. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    Energy Technology Data Exchange (ETDEWEB)

    Ashcraft, C. Chace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Niederhaus, John Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robinson, Allen C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-29

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
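
    The manufactured-solution convergence check described above can be illustrated on a much simpler model problem. The following sketch (unrelated to ALEGRA) picks an exact solution for a 1-D Poisson problem, derives the source term analytically, solves on two grids and reports the observed order of accuracy, which should be close to the formal second order.

```python
# Minimal sketch of the manufactured-solution idea: choose u_exact, derive the
# source term, solve on successively refined grids, confirm the observed order.
import numpy as np

def solve_poisson(n):
    # -u'' = f on (0,1), u(0)=u(1)=0, with manufactured u_exact = sin(pi x)
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    f = np.pi ** 2 * np.sin(np.pi * x[1:-1])        # analytically derived source
    A = (np.diag(2.0 * np.ones(n - 1)) -
         np.diag(np.ones(n - 2), 1) - np.diag(np.ones(n - 2), -1)) / h ** 2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))    # discretization error

e_coarse, e_fine = solve_poisson(40), solve_poisson(80)
print("observed order of accuracy:", np.log2(e_coarse / e_fine))  # expect ~2
```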

  7. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid spacing) were selected as background gridded data for generating the integrated nowcasts. Seven forecast variables (temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate) are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models across 15 sites, the integrated weighted model was found to produce more accurate forecasts for the seven selected variables, regardless of location. This is based on the multi-category Heidke skill scores for the test period 12 February to 21 March 2010.
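
    For reference, the multi-category Heidke skill score used in this kind of categorical verification can be computed directly from a forecast/observation contingency table, as in the sketch below; the example table is invented, not SNOW-V10 data.

```python
# Minimal sketch of the multi-category Heidke skill score (HSS).
import numpy as np

def heidke_skill_score(table):
    """table[i, j] = number of cases forecast in category i, observed in j."""
    n = table.sum()
    p_correct = np.trace(table) / n                             # hit rate
    p_chance = (table.sum(axis=1) * table.sum(axis=0)).sum() / n ** 2
    return (p_correct - p_chance) / (1.0 - p_chance)

# Example 3-category table (e.g. visibility classes), purely illustrative
table = np.array([[50, 10, 2],
                  [ 8, 30, 6],
                  [ 1,  5, 20]])
print("HSS:", heidke_skill_score(table))
```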

  8. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

    The determination of soil background is one of the most important activities supporting environmental restoration and waste management on the Hanford Site. Background compositions serve as the basis for identifying soil contamination, and also as a baseline in risk assessment processes used to determine soil cleanup and treatment levels. These uses of soil background require an understanding of the extent to which analytes of concern occur naturally in the soils. This report documents the results of sampling and analysis activities designed to characterize the composition of soil background at the Hanford Site, and to evaluate its feasibility for use as Sitewide background. The compositions of naturally occurring soils in the vadose zone have been determined for nonradioactive inorganic and organic analytes and related physical properties. These results confirm that a Sitewide approach to the characterization of soil background is technically sound and is a viable alternative to the determination and use of numerous local or area backgrounds that yield inconsistent definitions of contamination. Sitewide soil background consists of several types of data and is appropriate for use in identifying contamination in all soils in the vadose zone on the Hanford Site. The natural concentrations of nearly every inorganic analyte extend to levels that exceed calculated health-based cleanup limits. The levels of most inorganic analytes, however, are well below these health-based limits. The highest measured background concentrations occur in three volumetrically minor soil types, the most important of which are topsoils adjacent to the Columbia River that are rich in organic carbon. No organic analyte levels above detection were found in any of the soil samples.

  9. Testability of numerical systems

    International Nuclear Information System (INIS)

    Soulas, B.

    1992-01-01

    In order to face up to the growing complexity of systems, the authors undertook to define a new approach to the qualification of systems. This approach is based on the concept of Testability which, supported by system modelization, validation and verification methods and tools, would allow an Integrated Qualification process to be applied throughout the life-span of systems. The general principles of this approach are introduced for the general case of numerical systems; in particular, this presentation points out the difference between the specification activity and the modelization and validation activity. The approach is illustrated first by the study of a global system and then by the case of a communication protocol, from the software point of view. Finally, the MODEL tool which supports this approach is described. MODEL is a commercial tool providing modelization and validation techniques based on Petri Nets with a triple extension: Predicate/Transition, Timed and Stochastic Petri Nets

  10. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  11. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  12. Playing Linear Numerical Board Games Promotes Low-Income Children's Numerical Development

    Science.gov (United States)

    Siegler, Robert S.; Ramani, Geetha B.

    2008-01-01

    The numerical knowledge of children from low-income backgrounds trails behind that of peers from middle-income backgrounds even before the children enter school. This gap may reflect differing prior experience with informal numerical activities, such as numerical board games. Experiment 1 indicated that the numerical magnitude knowledge of…

  13. Numerical Development

    Science.gov (United States)

    Siegler, Robert S.; Braithwaite, David W.

    2016-01-01

    In this review, we attempt to integrate two crucial aspects of numerical development: learning the magnitudes of individual numbers and learning arithmetic. Numerical magnitude development involves gaining increasingly precise knowledge of increasing ranges and types of numbers: from non-symbolic to small symbolic numbers, from smaller to larger…

  14. Hindi Numerals.

    Science.gov (United States)

    Bright, William

    In most languages encountered by linguists, the numerals, considered as a paradigmatic set, constitute a morpho-syntactic problem of only moderate complexity. The Indo-Aryan language family of North India, however, presents a curious contrast. The relatively regular numeral system of Sanskrit, as it has developed historically into the modern…

  15. Numerical analysis

    CERN Document Server

    Rao, G Shanker

    2006-01-01

    About the Book: This book provides an introduction to Numerical Analysis for the students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of Universities of Andhra Pradesh and also the syllabus prescribed in most of the Indian Universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equation Interpolation of Functions Numerical Differentiation and Integration and Numerical Solution of Ordinary Differential Equations The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as number of problems to be solved by the students. This would help in the better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...

  16. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  17. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system guards personnel. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur.
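
    A camera-driven change-detection step of the kind described can be sketched as simple block-wise frame differencing; the images, block size and threshold below are made up and are not the EIVSystem algorithm.

```python
# Minimal sketch of vault change detection: difference the current frame from
# a reference frame and flag blocks whose mean change exceeds a threshold.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.random((240, 320))            # stand-in reference frame
current = reference.copy()
current[100:120, 200:230] += 0.5              # simulate a moved object

diff = np.abs(current - reference)
block = 20
changed = [(i, j)
           for i in range(0, 240, block)
           for j in range(0, 320, block)
           if diff[i:i + block, j:j + block].mean() > 0.1]
print("changed regions (top-left corners):", changed)
```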

  18. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational/inertia-like Wheeler-Feynman radiation reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations or ``Woodward effect'' (W-E). Later, in collaboration with his ex-graduate student T. Mahood, they also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  19. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.

  20. Numerical semigroups and applications

    CERN Document Server

    Assi, Abdallah

    2016-01-01

    This work presents applications of numerical semigroups in Algebraic Geometry, Number Theory, and Coding Theory. Background on numerical semigroups is presented in the first two chapters, which introduce basic notation and fundamental concepts and irreducible numerical semigroups. The focus is in particular on free semigroups, which are irreducible; semigroups associated with planar curves are of this kind. The authors also introduce semigroups associated with irreducible meromorphic series, and show how these are used in order to present the properties of planar curves. Invariants of non-unique factorizations for numerical semigroups are also studied. These invariants are computationally accessible in this setting, and thus this monograph can be used as an introduction to Factorization Theory. Since factorizations and divisibility are strongly connected, the authors show some applications to AG Codes in the final section. The book will be of value for undergraduate students (especially those at a higher leve...

  1. Subsurface barrier verification technologies, informal report

    International Nuclear Information System (INIS)

    Heiser, J.H.

    1994-06-01

    One of the more promising remediation options available to the DOE waste management community is subsurface barriers. Some of the uses of subsurface barriers include surrounding and/or containing buried waste, acting as secondary confinement of underground storage tanks, directing or containing subsurface contaminant plumes, and restricting remediation methods, such as vacuum extraction, to a limited area. To be most effective the barriers should be continuous and, depending on use, have few or no breaches. A breach may form through numerous pathways, including discontinuous grout application, joints between panels, and cracking due to grout curing or wet-dry cycling. The ability to verify barrier integrity is valuable to the DOE, EPA, and the commercial sector, and will be required to gain full public acceptance of subsurface barriers as either primary or secondary confinement at waste sites. It is recognized that no suitable method exists for the verification of an emplaced barrier's integrity. The large size and deep placement of subsurface barriers make detection of leaks challenging. This becomes magnified if the permissible leakage from the site is low. Detection of small cracks (fractions of an inch) at depths of 100 feet or more has not been possible using existing surface geophysical techniques. Compounding the problem of locating flaws in a barrier is the fact that no placement technology can guarantee the completeness or integrity of the emplaced barrier. This report summarizes several commonly used or promising technologies that have been or may be applied to in-situ barrier continuity verification.

  2. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  3. Background sources at PEP

    International Nuclear Information System (INIS)

    Lynch, H.; Schwitters, R.F.; Toner, W.T.

    1988-01-01

    Important sources of background for PEP experiments are studied. Background particles originate from high-energy electrons and positrons which have been lost from stable orbits, γ-rays emitted by the primary beams through bremsstrahlung in the residual gas, and synchrotron radiation x-rays. The effects of these processes on the beam lifetime are calculated, and estimates of background rates at the interaction region are given. Recommendations for the PEP design, aimed at minimizing background, are presented. 7 figs., 4 tabs

  4. Cosmic Microwave Background Timeline

    Science.gov (United States)

    Timeline excerpt (fragmentary): 1934, Richard Tolman shows that blackbody radiation in an expanding universe cools while remaining blackbody; an early expanding-universe model predicts a cosmic microwave background with a temperature of about 5 K; 1955, a measurement by Tigran Shmaonov; the observed anisotropy in the cosmic microwave background strongly supports the big bang model.

  5. A bimodal verification cryptosystem as a framework against spoofing attacks

    OpenAIRE

    Toli, Christina-Angeliki; Preneel, Bart

    2015-01-01

    The exponential growth of the immigration crisis and recent terrorism cases have revealed an increase in fraud, cloning and identity theft, with numerous social, economic and political consequences. The trustworthiness of biometrics during verification processes has been compromised by spoofing attackers who have sprung up to exploit the security gaps. Additionally, cryptography's role in the area is highly important, as it may promote fair assessment procedures and foster public trust by se...

  6. Verification of Monte Carlo transport codes by activation experiments

    OpenAIRE

    Chetvertkova, Vera

    2013-01-01

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago; verification is therefore needed to be sure that they give reasonable results. Present work is...

  7. Numerical Modeling of Ablation Heat Transfer

    Science.gov (United States)

    Ewing, Mark E.; Laker, Travis S.; Walker, David T.

    2013-01-01

    A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.
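
    When an exact solution is not available for such verification exercises, an observed convergence rate can still be estimated from solutions on three systematically refined grids. The following sketch shows the standard three-grid estimate; the sample values are illustrative and are not results from the paper.

```python
# Minimal sketch of estimating an observed convergence order from three grid
# levels (coarse, medium, fine) with a constant refinement ratio.
import math

def observed_order(f_coarse, f_medium, f_fine, refinement_ratio=2.0):
    # p = ln((f_c - f_m) / (f_m - f_f)) / ln(r)
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(refinement_ratio)

# e.g. a peak surface temperature computed at dx, dx/2, dx/4 (made-up values)
print("observed order:", observed_order(1825.0, 1806.0, 1801.2))  # expect ~2
```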

  8. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme that seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of its verification. In 2006 a project was launched in which the objective was to reorganise all the material related to the existing verification cases and place it into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LATEX document preparation system and added into the testing system in such a way that the whole test documentation (this report) can easily be generated in postscript or pdf format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results with analytical, semianalytical and/or other numerical solutions demonstrates the capability of FEFTRA to simulate such problems

  9. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  10. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies, namely the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW), share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States (India, Pakistan and Israel) from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  11. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  12. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  13. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  14. Numerical analysis

    CERN Document Server

    Brezinski, C

    2012-01-01

    Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume project 'Numerical Analysis 2000' of the Journal of Computational and Applied Mathematics.

  15. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  16. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  17. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification.

  18. Development of requirements tracking and verification technology for the NPP software

    International Nuclear Information System (INIS)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-01-01

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification

  19. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  20. Numerical Relativity

    Science.gov (United States)

    Baker, John G.

    2009-01-01

    Recent advances in numerical relativity have fueled an explosion of progress in understanding the predictions of Einstein's theory of gravity, General Relativity, for the strong field dynamics, the gravitational radiation wave forms, and consequently the state of the remnant produced from the merger of compact binary objects. I will review recent results from the field, focusing on mergers of two black holes.

  1. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  2. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  3. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity-modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, of the deployment of the IMRT planning system CORVUS 6.0 and the MIMIC device (multilamellar intensity-modulated collimator), and of the overall process of verifying the created plan. The aim of the verification is in particular good control of the MIMIC functions and an evaluation of the overall reliability of IMRT planning. (author)

  4. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  5. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
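
    A stripped-down flavour of such a forward-chaining monitor, for a single "every request is eventually acknowledged" obligation rather than full FLTL, might look like the following sketch; the event names and rule set are invented for illustration and are not the paper's tool.

```python
# Minimal sketch of a forward-chaining runtime monitor over an expanding trace:
# rules fire on each incoming event and update a set of open obligations.
class ForwardChainingMonitor:
    def __init__(self):
        self.pending = 0              # open obligations (requests awaiting ack)

    def step(self, event):
        # rule 1: request => add an obligation
        if event == "request":
            self.pending += 1
        # rule 2: ack => discharge one obligation, if any
        elif event == "ack" and self.pending > 0:
            self.pending -= 1

    def verdict(self):
        # on a finite (but possibly still expanding) trace, open obligations
        # mean the property is not yet satisfied
        return "satisfied" if self.pending == 0 else "not yet satisfied"

monitor = ForwardChainingMonitor()
for event in ["request", "tick", "ack", "request"]:
    monitor.step(event)
print(monitor.verdict())              # -> "not yet satisfied"
```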

  6. Verification of the MOTIF code version 3.0

    International Nuclear Information System (INIS)

    Chan, T.; Guvanasen, V.; Nakka, B.W.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.

    1996-12-01

    As part of the Canadian Nuclear Fuel Waste Management Program (CNFWMP), AECL has developed a three-dimensional finite-element code, MOTIF (Model Of Transport In Fractured/porous media), for detailed modelling of groundwater flow, heat transport and solute transport in a fractured rock mass. The code solves the transient and steady-state equations of groundwater flow, solute (including one-species radionuclide) transport, and heat transport in variably saturated fractured/porous media. The initial development was completed in 1985 (Guvanasen 1985) and version 3.0 was completed in 1986. This version is documented in detail in Guvanasen and Chan (in preparation). This report describes a series of fourteen verification cases which have been used to test the numerical solution techniques and coding of MOTIF, as well as to demonstrate some of the MOTIF analysis capabilities. For each case the MOTIF solution has been compared with a corresponding analytical or independently developed alternate numerical solution. Several of the verification cases were included in Level 1 of the International Hydrologic Code Intercomparison Project (HYDROCOIN). The MOTIF results for these cases were also described in the HYDROCOIN Secretariat's compilation and comparison of results submitted by the various project teams (Swedish Nuclear Power Inspectorate 1988). It is evident from the graphical comparisons presented that the MOTIF solutions for the fourteen verification cases are generally in excellent agreement with known analytical or numerical solutions obtained from independent sources. This series of verification studies has established the ability of the MOTIF finite-element code to accurately model the groundwater flow and solute and heat transport phenomena for which it is intended. (author). 20 refs., 14 tabs., 32 figs

  7. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  8. Optimal background matching camouflage.

    Science.gov (United States)

    Michalis, Constantine; Scott-Samuel, Nicholas E; Gibson, David P; Cuthill, Innes C

    2017-07-12

    Background matching is the most familiar and widespread camouflage strategy: avoiding detection by having a similar colour and pattern to the background. Optimizing background matching is straightforward in a homogeneous environment, or when the habitat has very distinct sub-types and there is divergent selection leading to polymorphism. However, most backgrounds have continuous variation in colour and texture, so what is the best solution? Not all samples of the background are likely to be equally inconspicuous, and laboratory experiments on birds and humans support this view. Theory suggests that the most probable background sample (in the statistical sense), at the size of the prey, would, on average, be the most cryptic. We present an analysis, based on realistic assumptions about low-level vision, that estimates the distribution of background colours and visual textures, and predicts the best camouflage. We present data from a field experiment that tests and supports our predictions, using artificial moth-like targets under bird predation. Additionally, we present analogous data for humans, under tightly controlled viewing conditions, searching for targets on a computer screen. These data show that, in the absence of predator learning, the best single camouflage pattern for heterogeneous backgrounds is the most probable sample. © 2017 The Authors.
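
    The "most probable background sample" prediction can be sketched numerically: summarize patches of a background image by simple low-level features, estimate the density of those feature vectors, and pick the patch with the highest density. The features, patch size and synthetic image below are arbitrary stand-ins for the paper's visual-texture model.

```python
# Minimal sketch: pick the background patch whose low-level feature vector has
# the highest estimated probability density (predicted to be the most cryptic).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
background = rng.normal(0.5, 0.15, size=(256, 256))     # stand-in grey image
patch = 16

feats, patches = [], []
for i in range(0, 256 - patch, patch):
    for j in range(0, 256 - patch, patch):
        p = background[i:i + patch, j:j + patch]
        feats.append([p.mean(), p.std()])                # brightness + contrast
        patches.append((i, j))

feats = np.array(feats).T                                # shape (2, n_patches)
density = gaussian_kde(feats)(feats)                     # density at each patch
best = patches[int(np.argmax(density))]
print("most probable (predicted most cryptic) patch at:", best)
```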

  9. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
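
    A minimal sketch of likelihood-ratio verification for fixed-length feature vectors follows, assuming (purely for illustration) Gaussian user and background models with made-up parameters; it is not the similarity measure derived in the paper.

```python
# Minimal sketch: accept a probe if the log-likelihood ratio of the user model
# versus the background (impostor) model exceeds a threshold.
import numpy as np
from scipy.stats import multivariate_normal

d = 4
mu_user = np.array([0.2, -0.1, 0.4, 0.0])      # enrolled template (user mean)
cov_within = 0.05 * np.eye(d)                  # within-user variation
mu_bg, cov_bg = np.zeros(d), np.eye(d)         # background (impostor) model

def log_likelihood_ratio(x):
    return (multivariate_normal.logpdf(x, mu_user, cov_within) -
            multivariate_normal.logpdf(x, mu_bg, cov_bg))

threshold = 0.0                                # tune for the desired FAR/FRR trade-off
probe = np.array([0.25, -0.05, 0.35, 0.05])
print("accept" if log_likelihood_ratio(probe) > threshold else "reject")
```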

  10. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  11. Numerical relativity

    CERN Document Server

    Nakamura, T

    1993-01-01

    In GR13 we heard many reports on recent progress as well as future plans for the detection of gravitational waves. According to these reports (see the report of the workshop on the detection of gravitational waves by Paik in this volume), it is highly probable that the sensitivity of detectors such as laser interferometers and ultra-low-temperature resonant bars will reach the level of h ~ 10^-21 by 1998. At this level we may expect the detection of gravitational waves from astrophysical sources such as coalescing binary neutron stars once a year or so. Therefore progress in numerical relativity is urgently required to predict the wave pattern and amplitude of the gravitational waves from realistic astrophysical sources. The time left for numerical relativists is only six years or so, although there are so many difficulties in principle as well as in practice.

  12. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

    uses the 'Actel Libero IDE' (internally, 'Synopsys Synplify Pro') to synthesize a netlist from the Verilog program, and also uses the 'EDIFtoBLIF-MV' tool to translate the netlist into BLIF-MV. The VIS verification system is then used to prove the behavioral equivalence. This paper is organized as follows: Section 2 provides background information. Section 3 explains the developed tool, which translates EDIF to BLIF-MV. A case study with Verilog examples from a Korean nuclear power plant is presented in Section 4, and Section 5 concludes the paper and provides remarks on future research extensions. This paper proposes a formal verification technique which can contribute, in part, to the correctness demonstration of commercial FPGA synthesis processes and tools. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. If the formal verification succeeds, then we can be assured that the synthesis process from Verilog into the netlist worked correctly, at least for that Verilog program. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIF-MV', which translates EDIF into BLIF-MV while preserving behavioral equivalence. The translation from EDIF into BLIF-MV consists of three steps: Parsing, Pre-processing and Translation. We performed the case study with Verilog programs designed for a digital I and C system in Korea.

  13. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    uses the 'Actel Libero IDE' (internally, 'Synopsys Synplify Pro') to synthesize a netlist from the Verilog program, and also uses the 'EDIFtoBLIF-MV' tool to translate the netlist into BLIF-MV. The VIS verification system is then used to prove the behavioral equivalence. This paper is organized as follows: Section 2 provides background information. Section 3 explains the developed tool, which translates EDIF to BLIF-MV. A case study with Verilog examples from a Korean nuclear power plant is presented in Section 4, and Section 5 concludes the paper and provides remarks on future research extensions. This paper proposes a formal verification technique which can contribute, in part, to the correctness demonstration of commercial FPGA synthesis processes and tools. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. If the formal verification succeeds, then we can be assured that the synthesis process from Verilog into the netlist worked correctly, at least for that Verilog program. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIF-MV', which translates EDIF into BLIF-MV while preserving behavioral equivalence. The translation from EDIF into BLIF-MV consists of three steps: Parsing, Pre-processing and Translation. We performed the case study with Verilog programs designed for a digital I and C system in Korea.

  14. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  15. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  16. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  17. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  18. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown Systems No. 1 and 2 (SDS1, 2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL), in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  19. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system, based as it was on nuclear material accountancy, was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are however not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to ensure that the inspectors retain their professional capabilities and provides them with new skills. Over the years, the inspectors and through them the safeguards verification system gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  20. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  1. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Budzien, Joanne Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Harwell, Megan Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hickmann, Kyle Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Israel, Daniel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Magrogan, William Richard III [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Singleton, Jr., Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Srinivasan, Gowri [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walter, Jr, John William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Woods, Charles Nathan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
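
    A typical assessment of the error behavior of an algorithm, as referenced above, is to check the observed order of accuracy against the formal order using solutions on successively refined grids. A minimal sketch is given below, assuming errors measured against a known (e.g. manufactured) exact solution and a uniform refinement ratio r; the numbers are hypothetical and not taken from the milestone report.

        import math

        def observed_order(error_coarse: float, error_fine: float, r: float) -> float:
            """Observed order of accuracy p from errors on two grids whose spacings
            differ by the refinement ratio r (h_coarse = r * h_fine)."""
            return math.log(error_coarse / error_fine) / math.log(r)

        # Hypothetical errors from grids refined by a factor of 2; for a formally
        # second-order scheme the observed order should approach 2.
        p = observed_order(error_coarse=4.0e-3, error_fine=1.0e-3, r=2.0)
        print(f"observed order of accuracy: {p:.2f}")   # -> 2.00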

  2. Cosmic microwave background radiation

    International Nuclear Information System (INIS)

    Wilson, R.W.

    1979-01-01

    The 20-ft horn-reflector antenna at Bell Laboratories is discussed in detail with emphasis on the 7.35 cm radiometer. The circumstances leading to the detection of the cosmic microwave background radiation are explored

  3. Zambia Country Background Report

    DEFF Research Database (Denmark)

    Hampwaye, Godfrey; Jeppesen, Søren; Kragelund, Peter

    This paper provides background data and general information for the Zambia studies focusing on the local food processing sub-sector and the local suppliers to the mines, as part of the SAFIC project (Successful African Firms and Institutional Change).

  4. NEUTRON ALGORITHM VERIFICATION TESTING

    International Nuclear Information System (INIS)

    COWGILL, M.; MOSBY, W.; ARGONNE NATIONAL LABORATORY-WEST

    2000-01-01

    Active well coincidence counter assays have been performed on uranium metal highly enriched in 235U. The data obtained in the present program, together with highly enriched uranium (HEU) metal data obtained in other programs, have been analyzed using two approaches, the standard approach and an alternative approach developed at BNL. Analysis of the data with the standard approach revealed that the form of the relationship between the measured reals and the 235U mass varied, being sometimes linear and sometimes a second-order polynomial. In contrast, application of the BNL algorithm, which takes into consideration the totals, consistently yielded linear relationships between the totals-corrected reals and the 235U mass. The constants in these linear relationships varied with geometric configuration and level of enrichment. This indicates that, when the BNL algorithm is used, calibration curves can be established with fewer data points and with more certainty than if a standard algorithm is used. However, this potential advantage has only been established for assays of HEU metal. In addition, the method is sensitive to the stability of natural background in the measurement facility
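
    In the BNL approach described above, calibration reduces to fitting a straight line between the totals-corrected reals rate and the 235U mass of the standards. The sketch below only illustrates that fit and its inversion for an assay; the totals correction itself is not reproduced here, and the function arguments are hypothetical placeholders.

        import numpy as np

        def fit_linear_calibration(u235_mass_g, corrected_reals):
            """Least-squares fit of corrected_reals = a * mass + b for a set of
            calibration standards; returns the slope a and intercept b."""
            a, b = np.polyfit(np.asarray(u235_mass_g, dtype=float),
                              np.asarray(corrected_reals, dtype=float), deg=1)
            return a, b

        def assay_mass(corrected_reals_measured, a, b):
            """Invert the linear calibration to estimate the 235U mass of an unknown item."""
            return (corrected_reals_measured - b) / a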

  5. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    International Nuclear Information System (INIS)

    Luke, S.J.

    2011-01-01

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  6. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    Energy Technology Data Exchange (ETDEWEB)

    Luke, S J

    2011-12-20

    genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  7. Numerical analysis

    CERN Document Server

    Jacques, Ian

    1987-01-01

    This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...

  8. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  9. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
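
    The iterate-then-verify structure described above can be summarized in a few lines. The sketch below is a generic rendering of that loop, not the actual ASMOV implementation; the similarity scoring, alignment derivation and semantic inconsistency detection are passed in as placeholder callables.

        def iterative_match(onto_a, onto_b, similarity, derive_alignment,
                            find_inconsistencies, max_iters=10):
            """Generic iterate-then-verify ontology matching loop:
            score correspondences, derive an alignment, drop semantically
            inconsistent correspondences, and repeat until a fixed point."""
            alignment = set()
            for _ in range(max_iters):
                scores = similarity(onto_a, onto_b, alignment)      # lexical + structural measures
                candidate = derive_alignment(scores)                # proposed correspondences
                candidate -= find_inconsistencies(candidate, onto_a, onto_b)
                if candidate == alignment:                          # no change: fixed point reached
                    break
                alignment = candidate
            return alignment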

  10. The natural radiation background

    International Nuclear Information System (INIS)

    Duggleby, J.C.

    1982-01-01

    The components of the natural background radiation and their variations are described. Cosmic radiation is a major contributor to the external dose to the human body whilst naturally-occurring radionuclides of primordial and cosmogenic origin contribute to both the external and internal doses, with the primordial radionuclides being the major contributor in both cases. Man has continually modified the radiation dose to which he has been subjected. The two traditional methods of measuring background radiation, ionisation chamber measurements and scintillation counting, are looked at and the prospect of using thermoluminescent dosimetry is considered

  11. Effects of background radiation

    International Nuclear Information System (INIS)

    Knox, E.G.; Stewart, A.M.; Gilman, E.A.; Kneale, G.W.

    1987-01-01

    The primary objective of this investigation is to measure the relationship between exposure to different levels of background gamma radiation in different parts of the country, and different Relative Risks for leukaemias and cancers in children. The investigation is linked to an earlier analysis of the effects of prenatal medical x-rays upon leukaemia and cancer risk; the prior hypothesis on which the background-study was based, is derived from the earlier results. In a third analysis, the authors attempted to measure varying potency of medical x-rays delivered at different stages of gestation and the results supply a link between the other two estimates. (author)

  12. The cosmic microwave background

    International Nuclear Information System (INIS)

    Silk, J.

    1991-01-01

    Recent limits on spectral distortions and angular anisotropies in the cosmic microwave background are reviewed. The various backgrounds are described, and the theoretical implications are assessed. Constraints on inflationary cosmology dominated by cold dark matter (CDM) and on open cosmological models dominated by baryonic dark matter (BDM), with, respectively, primordial random phase scale-invariant curvature fluctuations or non-gaussian isocurvature fluctuations, are described. More exotic theories are addressed, and I conclude with the 'bottom line': what theorists expect experimentalists to be measuring within the next two to three years without having to abandon their most cherished theories. (orig.)

  13. The Cosmic Background Explorer

    Science.gov (United States)

    Gulkis, Samuel; Lubin, Philip M.; Meyer, Stephan S.; Silverberg, Robert F.

    1990-01-01

    The Cosmic Background Explorer (COBE), NASA's cosmological satellite which will observe a radiative relic of the big bang, is discussed. The major questions connected to the big bang theory which may be clarified using COBE are reviewed. The satellite instruments and experiments are described, including the Differential Microwave Radiometer, which measures the difference between microwave radiation emitted from two points on the sky, the Far-Infrared Absolute Spectrophotometer, which compares the spectrum of radiation from the sky at wavelengths from 100 microns to one cm with that from an internal blackbody, and the Diffuse Infrared Background Experiment, which searches for the radiation from the earliest generation of stars.

  14. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  15. Thermal background noise limitations

    Science.gov (United States)

    Gulkis, S.

    1982-01-01

    Modern detection systems are increasingly limited in sensitivity by the background thermal photons which enter the receiving system. Expressions for the fluctuations of detected thermal radiation are derived. Incoherent and heterodyne detection processes are considered. References to the subject of photon detection statistics are given.
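
    For context, the standard expressions behind such derivations (general results, not quoted from this abstract) give the mean occupation and fluctuation of a single thermal mode at frequency \nu and temperature T as

        \bar{n} = \frac{1}{e^{h\nu/kT} - 1}, \qquad \langle (\Delta n)^2 \rangle = \bar{n}\,(\bar{n} + 1),

    so that background-limited (Rayleigh-Jeans) detection is dominated by the \bar{n}^2 term, while the Poisson term \bar{n} dominates at high frequencies.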

  16. Berkeley Low Background Facility

    International Nuclear Information System (INIS)

    Thomas, K. J.; Norman, E. B.; Smith, A. R.; Poon, A. W. P.; Chan, Y. D.; Lesko, K. T.

    2015-01-01

    The Berkeley Low Background Facility (BLBF) at Lawrence Berkeley National Laboratory (LBNL) in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities; locally within a carefully-constructed, low background laboratory on the surface at LBNL and at the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products; active screening via neutron activation analysis for U, Th, and K as well as a variety of stable isotopes; and neutron flux/beam characterization measurements through the use of monitors. A general overview of the facilities, services, and sensitivities will be presented. Recent activities and upgrades will also be described including an overview of the recently installed counting system at SURF (recently relocated from Oroville, CA in 2014), the installation of a second underground counting station at SURF in 2015, and future plans. The BLBF is open to any users for counting services or collaboration on a wide variety of experiments and projects

  17. The Cosmic Microwave Background

    Directory of Open Access Journals (Sweden)

    Jones Aled

    1998-01-01

    We present a brief review of current theory and observations of the cosmic microwave background (CMB). New predictions for cosmological defect theories and an overview of the inflationary theory are discussed. Recent results from various observations of the anisotropies of the microwave background are described and a summary of the proposed experiments is presented. A new analysis technique based on Bayesian statistics that can be used to reconstruct the underlying sky fluctuations is summarised. Current CMB data is used to set some preliminary constraints on the values of the fundamental cosmological parameters $\Omega$ and $H_0$ using the maximum likelihood technique. In addition, secondary anisotropies due to the Sunyaev-Zel'dovich effect are described.

  18. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  19. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Numerical simulation of the temperature field of cooled turbine blades is a required element of the gas turbine engine design process. The verification is usually performed on the basis of tests of a full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten metal thermostat for verification of a thermal model of a cooled blade is proposed in this paper. The method allows obtaining local values of heat flux at each point of the blade surface within a single experiment. The error of determination of local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals the zinc melting point. The verification of the thermal-hydraulic model of a high-pressure turbine blade with cooling allowing asymmetrical heat removal from the pressure and suction sides was carried out using the developed method. An analysis of heat transfer coefficients confirmed the high level of heat transfer at the leading edge, whose value is comparable with jet impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge to the pressure side.
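
    The local heat transfer coefficients referred to above are presumably recovered from the measured local heat flux via Newton's law of cooling; with the blade outer surface held at the zinc melting point, this reads schematically (our notation, for illustration only):

        h = \frac{q''}{T_{gas} - T_{w}},

    where q'' is the locally measured heat flux, T_{gas} is the hot-gas (driving) temperature and T_{w} is the wall temperature fixed by the melting zinc.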

  20. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  1. Verification of Representative Sampling in RI waste

    International Nuclear Information System (INIS)

    Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop

    2009-01-01

    For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the radiochemical assay process. Sampling to characterize RI waste conditions has typically been based on judgment or convenience sampling of individuals or groups. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. In order to obtain representative samples, the sample to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions such as half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample has to be prepared for radiochemical assay and, although the sample should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded into pieces of about 1-2 cm² and a representative aliquot taken for the required analysis. For verification of representative sampling, every classified group is tested for evaluation of 'selection of a representative drum in a group' and 'representative sampling in a drum'

  2. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology, which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process will assure reliable fabrication of the PICs as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs as it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends based on the SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models avoid the large cost of 3D EM simulations and can be easily included in any electronic design automation (EDA) flow, as the equation parameters can be easily extracted from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The results from the measurements of the fabricated devices have been compared to the derived models and show very good agreement. The matching can also reach 100% by calibrating a certain parameter in the model.

  3. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  4. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  5. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  6. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...

  7. Hot-cell verification facility

    International Nuclear Information System (INIS)

    Eschenbaum, R.A.

    1981-01-01

    The Hot Cell Verification Facility (HCVF) was established as the test facility for the Fuels and Materials Examination Facility (FMEF) examination equipment. HCVF provides a prototypic hot cell environment to check the equipment for functional and remote operation. It also provides actual hands-on training for future FMEF Operators. In its two years of operation, HCVF has already provided data to make significant changes in items prior to final fabrication. It will also shorten the startup time in FMEF since the examination equipment will have been debugged and operated in HCVF

  8. Family Background and Entrepreneurship

    DEFF Research Database (Denmark)

    Lindquist, Matthew J.; Sol, Joeri; Van Praag, Mirjam

    Vast amounts of money are currently being spent on policies aimed at promoting entrepreneurship. The success of such policies, however, rests in part on the assumption that individuals are not ‘born entrepreneurs’. In this paper, we assess the importance of family background and neighborhood effects as determinants of entrepreneurship. We start by estimating sibling correlations in entrepreneurship. We find that between 20 and 50 percent of the variance in different entrepreneurial outcomes is explained by factors that siblings share. The average is 28 percent. Allowing for differential... entrepreneurship does play a large role, as do shared genes.

  9. Malaysia; Background Paper

    OpenAIRE

    International Monetary Fund

    1996-01-01

    This Background Paper on Malaysia examines developments and trends in the labor market since the mid-1980s. The paper describes the changes in the employment structure and the labor force. It reviews wages and productivity trends and their effects on unit labor cost. The paper highlights that Malaysia’s rapid growth, sustained since 1987, has had a major impact on the labor market. The paper outlines the major policy measures to address the labor constraints. It also analyzes Malaysia’s recen...

  10. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification, possibilities for improvements in those activities as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group

  11. Phantoms for IMRT dose distribution measurement and treatment verification

    International Nuclear Information System (INIS)

    Low, Daniel A.; Gerber, Russell L.; Mutic, Sasa; Purdy, James A.

    1998-01-01

    Background: The verification of intensity-modulated radiation therapy (IMRT) patient treatment dose distributions is currently based on custom-built or modified dose measurement phantoms. The only commercially available IMRT treatment planning and delivery system (Peacock, NOMOS Corp.) is supplied with a film phantom that allows accurate spatial localization of the dose distribution using radiographic film. However, measurements using other dosimeters are necessary for the thorough verification of IMRT. Methods: We have developed a phantom to enable dose measurements using a cylindrical ionization chamber and the localization of prescription isodose curves using a matrix of thermoluminescent dosimetry (TLD) chips. The external phantom cross-section is identical to that of the commercial phantom, to allow direct comparisons of measurements. A supplementary phantom has been fabricated to verify the IMRT dose distributions for pelvis treatments. Results: To date, this phantom has been used for the verification of IMRT dose distributions for head and neck and prostate cancer treatments. Designs are also presented for a phantom insert to be used with polymerizing gels (e.g., BANG-2) to obtain volumetric dose distribution measurements. Conclusion: The phantoms have proven useful in the quantitative evaluation of IMRT treatments

  12. An international cooperative verification agenda for arms reduction

    International Nuclear Information System (INIS)

    Hinderstein, C.

    2013-01-01

    The biggest challenge to the overall verification and monitoring agenda for future arms reductions may be that posed by uncertainties regarding the quantities of existing stocks of fissile material and nuclear weapons. We must develop strategies to reduce the residual uncertainties regarding completeness of initial declarations as all declared weapons-related inventories go to zero. Establishing this confidence in countries' initial baseline declarations will likely be a key point in all states' decisions to move to very low numbers, much less zero. The author reviews the questions and challenges that need to be addressed if there is to be significant progress in negotiating and implementing a verifiable fissile material cutoff treaty (FMCT) and a policy of nuclear weapon dismantling. In support of greater security as the world works towards the elimination of nuclear weapons, individual States could begin immediately by increasing the transparency of their nuclear activities. The International Verification Project is designed to bring experts from a wide array of related backgrounds together to build capacity for verification internationally in support of arms control goals (and in support of the larger objective of a world without nuclear weapons), build confidence between nuclear and non-nuclear-weapon states, promote freer flow of information among governments and between governments and non-governmental organizations (NGOs) and solve technical problems that could be barriers to progress. The paper is followed by the slides of the presentation. (A.C.)

  13. Background-cross-section-dependent subgroup parameters

    International Nuclear Information System (INIS)

    Yamamoto, Toshihisa

    2003-01-01

    A new set of subgroup parameters was derived that can reproduce the self-shielded cross section over a wide range of background cross sections. The subgroup parameters are expressed as a rational equation whose numerator and denominator are expansion series in the background cross section, so that the background-cross-section dependence is taken into account exactly in the parameters. The advantage of the new subgroup parameters is that they can reproduce the self-shielding effect not only on a group basis but also on a subgroup basis. An adaptive method is also proposed which uses a fitting procedure to evaluate the background-cross-section dependence of the parameters. One simple fitting formula was able to reproduce the self-shielded subgroup cross section to within 1% of the precise evaluation. (author)
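
    The abstract does not reproduce the formula itself, but the rational form it describes can be written schematically (our notation, for illustration only) as

        \sigma_{x}(\sigma_{0}) \approx \frac{\sum_{n=0}^{N} a_{n}\,\sigma_{0}^{\,n}}{\sum_{n=0}^{N} b_{n}\,\sigma_{0}^{\,n}},

    where \sigma_{0} is the background cross section and the coefficients a_{n}, b_{n} play the role of the subgroup parameters fitted to reproduce the self-shielded cross section \sigma_{x} over a wide range of \sigma_{0}.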

  14. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    This paper presents distributed informatics applications and characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed and some metrics are defined for the verification process.

  15. 22 CFR 97.3 - Requirements subject to verification in an outgoing Convention case.

    Science.gov (United States)

    2010-04-01

    Title 22 (Foreign Relations), Section 97.3, DEPARTMENT OF STATE, LEGAL AND RELATED... Requirements subject to verification in an outgoing Convention case: ... background study. An accredited agency, temporarily accredited agency, or public domestic authority must...

  16. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  17. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models

  18. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  19. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  20. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  1. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  2. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  3. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  4. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  5. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  6. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  7. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
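
    The decision rule described above can be written out directly. The sketch below simply encodes the stated criteria (dystrophy area ≥ 25% as the major criterion; long horizontal and long vertical lines as minor criteria) and the qualitative risk levels the authors report; it is an illustrative encoding, not the authors' software.

        def fingerprint_verification_risk(dystrophy_area_pct: float,
                                          long_horizontal_lines: bool,
                                          long_vertical_lines: bool) -> str:
            """Qualitative risk of fingerprint verification failure, following the
            published major/minor criteria."""
            if dystrophy_area_pct >= 25.0:          # major criterion
                return "almost always fails verification"
            minors = int(long_horizontal_lines) + int(long_vertical_lines)
            if minors == 2:
                return "high risk of verification failure"
            if minors == 1:
                return "low risk of verification failure"
            return "almost always passes verification"

        # Example: 10% dystrophy with long horizontal lines only -> low risk
        print(fingerprint_verification_risk(10.0, True, False))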

  8. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  9. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  10. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  11. Backgrounded but not peripheral

    DEFF Research Database (Denmark)

    Hovmark, Henrik

    2013-01-01

    In this paper I take a closer look at the use of the CENTRE-PERIPHERY schema in context. I address two specific issues: first, I show how the CENTRE-PERIPHERY schema, encoded in the DDAs, enters into discourses that conceptualize and characterize a local community as both CENTRE and PERIPHERY, i.e. the schema enters into apparently contradictory constructions of the informants’ local home-base and, possibly, of their identity (cf. Hovmark, 2010). Second, I discuss the status and role of the specific linguistic category in question, i.e. the directional adverbs. On the one hand we claim that the DDAs...; furthermore, the DDAs are backgrounded in discourse. Is it reasonable to claim, rather boldly, that “the informants express their identity in the use of the directional adverb ud ‘out’ etc.”? In the course of this article, however, I suggest that the DDAs in question do contribute to the socio...

  12. OCRWM Backgrounder, January 1987

    International Nuclear Information System (INIS)

    1987-01-01

    The Nuclear Waste Policy Act of 1982 (NWPA) assigns to the US Department of Energy (DOE) responsibility for developing a system to safely and economically transport spent nuclear fuel and high-level radioactive waste from various storage sites to geologic repositories or other facilities that constitute elements of the waste management program. This transportation system will evolve from technologies and capabilities already developed. Shipments of spent fuel to a monitored retrievable storage (MRS) facility could begin as early as 1996 if Congress authorizes its construction. Shipments of spent fuel to a geologic repository are scheduled to begin in 1998. The backgrounder provides an overview of DOE's cask development program. Transportation casks are a major element in the DOE nuclear waste transportation system because they are the primary protection against any potential radiation exposure to the public and transportation workers in the event of an accident

  13. Monitored background radiometer

    International Nuclear Information System (INIS)

    Ruel, C.

    1988-01-01

    A monitored background radiometer is described comprising: a thermally conductive housing; low conductivity support means mounted on the housing; a sensing plate mounted on the low conductivity support means and spaced from the housing so as to be thermally insulated from the housing and having an outwardly facing first surface; the sensing plate being disposed relative to the housing to receive direct electromagnetic radiation from sources exterior to the radiometer upon the first surface only; means for controllably heating the sensing plate; first temperature sensitive means to measure the temperature of the housing; and second temperature sensitive means to measure the temperature of the sensing plate, so that the heat flux at the sensing plate may be determined from the temperatures of the housing and sensing plate after calibration of the radiometer by measuring the temperatures of the housing and sensing plate while controllably heating the sensing plate

  14. Spontaneous Radiation Background Calculation for LCLS

    CERN Document Server

    Reiche, Sven

    2004-01-01

    The intensity of undulator radiation, not amplified by the FEL interaction, can be larger than the maximum FEL signal in the case of an X-ray FEL. In the commissioning of a SASE FEL it is essential to extract an amplified signal early to diagnose possible misalignment of undulator modules or errors in the undulator field strength. We developed a numerical code to calculate the radiation pattern at any position behind a multi-segmented undulator with arbitrary spacing and field profiles. The output can be run through numerical spatial and frequency filters to model the radiation beam transport and diagnostics. In this presentation we estimate the expected background signal for the FEL diagnostic and at what point along the undulator the FEL signal can be separated from the background. We also discuss how much information on the undulator field and alignment can be obtained from the incoherent radiation signal itself.

  15. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1-year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used, with transport theory, to determine the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons-grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)

  16. Low background infrared (LBIR) facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Low background infrared (LBIR) facility was originally designed to calibrate user supplied blackbody sources and to characterize low-background IR detectors and...

  17. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    International Nuclear Information System (INIS)

    Weaver, Phyllis C.

    2012-01-01

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse

  18. Hanford Site background: Part 1, Soil background for nonradioactive analytes

    International Nuclear Information System (INIS)

    1993-04-01

    Volume two contains the following appendices: Description of soil sampling sites; sampling narrative; raw data soil background; background data analysis; sitewide background soil sampling plan; and use of soil background data for the detection of contamination at waste management units on the Hanford Site

  19. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system

  20. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 150 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions, and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  1. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
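    The equal error rates quoted above can be illustrated with a short sketch; this is not the authors' implementation, and the score convention (higher score means more likely authentic) is an assumption.

        # Minimal sketch of computing an equal error rate (EER) from verification scores.
        import numpy as np

        def equal_error_rate(genuine_scores, forgery_scores):
            """Sweep a decision threshold and return the operating point where the
            false accept rate (forgeries accepted) and false reject rate
            (genuine signatures rejected) are closest."""
            genuine = np.asarray(genuine_scores, dtype=float)
            forgery = np.asarray(forgery_scores, dtype=float)
            best_gap, best_eer = np.inf, None
            for t in np.unique(np.concatenate([genuine, forgery])):
                far = np.mean(forgery >= t)   # forgeries wrongly accepted
                frr = np.mean(genuine < t)    # genuine signatures wrongly rejected
                if abs(far - frr) < best_gap:
                    best_gap, best_eer = abs(far - frr), (far + frr) / 2.0
            return best_eer

        # Toy usage with synthetic score distributions:
        rng = np.random.default_rng(0)
        eer = equal_error_rate(rng.normal(0.8, 0.1, 1000), rng.normal(0.5, 0.1, 1000))
        print(f"EER ~ {eer:.3f}")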

  2. The verification of ethnographic data.

    Science.gov (United States)

    Pool, Robert

    2017-09-01

    Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.

  3. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  4. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of existing problems with the reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools and the structure of its realization are described, based on a Simulator with the properties of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tools.

  5. Note on bouncing backgrounds

    Science.gov (United States)

    de Haro, Jaume; Pan, Supriya

    2018-05-01

    The theory of inflation is one of the fundamental and revolutionary developments of modern cosmology, which has been able to explain many issues of the early Universe in the context of the standard cosmological model (SCM). However, the initial singularity of the Universe, where physics is ill-defined, remains obscure in the combined SCM + inflation scenario. An alternative to SCM + inflation without the initial singularity is thus always welcome, and bouncing cosmology is an attempt at that. The current work is thus motivated to investigate the bouncing solutions in modified gravity theories when the background universe is described by the spatially flat Friedmann-Lemaître-Robertson-Walker (FLRW) geometry. We show that the simplest way to obtain bouncing cosmologies in such a spacetime is to consider some kind of Lagrangian whose gravitational sector depends only on the square of the Hubble parameter of the FLRW universe. For these modified Lagrangians, the corresponding Friedmann equation, a constraint on the dynamics of the Universe, depicts a curve in the phase space (H, ρ), where H is the Hubble parameter and ρ is the energy density of the Universe. As a consequence, a bouncing cosmology is obtained when this curve is closed and crosses the axis H = 0 at least twice, the simplest particular example being the ellipse depicting the well-known holonomy-corrected Friedmann equation in loop quantum cosmology (LQC). A crucial point in such theories is sometimes the appearance of the Ostrogradski instability at the perturbative level; fortunately, in the present work, as far as the linear level of perturbations is concerned, this instability does not appear, although it may appear at higher orders of perturbation.
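    For illustration, the holonomy-corrected Friedmann equation of effective LQC mentioned above can be written (this standard form is quoted for context, not verbatim from the abstract) as H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right), where \rho_c is the critical energy density. Completing the square gives H^2 + \frac{8\pi G}{3\rho_c}\left(\rho - \frac{\rho_c}{2}\right)^2 = \frac{2\pi G}{3}\,\rho_c, i.e. an ellipse in the (H, \rho) phase space that crosses H = 0 at \rho = 0 and at \rho = \rho_c, the latter crossing being the bounce.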

  6. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  7. Numerical linear algebra with applications using Matlab

    CERN Document Server

    Ford, William

    2014-01-01

    Designed for those who want to gain a practical knowledge of modern computational techniques for the numerical solution of linear algebra problems, Numerical Linear Algebra with Applications contains all the material necessary for a first year graduate or advanced undergraduate course on numerical linear algebra with numerous applications to engineering and science. With a unified presentation of computation, basic algorithm analysis, and numerical methods to compute solutions, this book is ideal for solving real-world problems. It provides necessary mathematical background information for

  8. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL has been originally designed to connect agent programming to agent theory and we present additional results here that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  9. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  10. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  11. Transitioning Enhanced Land Surface Initialization and Model Verification Capabilities to the Kenya Meteorological Department (KMD)

    Science.gov (United States)

    Case, Jonathan L.; Mungai, John; Sakwa, Vincent; Zavodsky, Bradley T.; Srikishen, Jayanthi; Limaye, Ashutosh; Blankenship, Clay B.

    2016-01-01

    Flooding, severe weather, and drought are key forecasting challenges for the Kenya Meteorological Department (KMD), based in Nairobi, Kenya. Atmospheric processes leading to convection, excessive precipitation and/or prolonged drought can be strongly influenced by land cover, vegetation, and soil moisture content, especially during anomalous conditions and dry/wet seasonal transitions. It is thus important to accurately represent land surface state variables (green vegetation fraction, soil moisture, and soil temperature) in Numerical Weather Prediction (NWP) models. The NASA SERVIR and the Short-term Prediction Research and Transition (SPoRT) programs in Huntsville, AL have established a working partnership with KMD to enhance its regional modeling capabilities. SPoRT and SERVIR are providing experimental land surface initialization datasets and model verification capabilities for capacity building at KMD. To support its forecasting operations, KMD is running experimental configurations of the Weather Research and Forecasting (WRF; Skamarock et al. 2008) model on a 12-km/4-km nested regional domain over eastern Africa, incorporating the land surface datasets provided by NASA SPoRT and SERVIR. SPoRT, SERVIR, and KMD participated in two training sessions in March 2014 and June 2015 to foster the collaboration and use of unique land surface datasets and model verification capabilities. Enhanced regional modeling capabilities have the potential to improve guidance in support of daily operations and high-impact weather and climate outlooks over Eastern Africa. For enhanced land-surface initialization, the NASA Land Information System (LIS) is run over Eastern Africa at 3-km resolution, providing real-time land surface initialization data in place of interpolated global model soil moisture and temperature data available at coarser resolutions. Additionally, real-time green vegetation fraction (GVF) composites from the Suomi-NPP VIIRS instrument are being incorporated

  12. Dose concentration and dose verification for radiotherapy of cancer

    International Nuclear Information System (INIS)

    Maruyama, Koichi

    2005-01-01

    The number of cancer treatments using radiation therapy is increasing. The background of this increase is the accumulated evidence that, owing to improvements in irradiation technology and radiation planning technology, outcomes for some types of cancer are comparable to or even better than those of surgery. This review describes the principles and technology of radiation therapy, its characteristics, particle therapy that improves the dose concentration, its historical background, the importance of dose concentration, the present situation, and future possibilities. There are serious problems that hinder the superior dose concentration of particle therapy. Recent programs and our efforts to solve these problems are described. A new concept is required to satisfy the notion of evidence-based medicine, i.e., one has to develop a method of dose verification, which is not yet available. This review is for researchers, medical doctors and radiation technologists who are developing this field. (author)

  13. Evaluation of DVH-based treatment plan verification in addition to gamma passing rates for head and neck IMRT

    International Nuclear Information System (INIS)

    Visser, Ruurd; Wauben, David J.L.; Groot, Martijn de; Steenbakkers, Roel J.H.M.; Bijl, Henk P.; Godart, Jeremy; Veld, Aart A. van’t; Langendijk, Johannes A.; Korevaar, Erik W.

    2014-01-01

    Background and purpose: Treatment plan verification of intensity modulated radiotherapy (IMRT) is generally performed with the gamma index (GI) evaluation method, which is difficult to extrapolate to clinical implications. Incorporating Dose Volume Histogram (DVH) information can compensate for this. The aim of this study was to evaluate DVH-based treatment plan verification in addition to the GI evaluation method for head and neck IMRT. Materials and methods: Dose verifications of 700 consecutive head and neck cancer IMRT treatment plans were categorised according to gamma and DVH-based action levels. Fractionation-dependent absolute dose limits were chosen. The results of the gamma- and DVH-based evaluations were compared to the decision of the medical physicist and/or radiation oncologist for plan acceptance. Results: Nearly all treatment plans (99.7%) were accepted for treatment according to the GI evaluation combined with DVH-based verification. Two treatment plans were re-planned according to DVH-based verification, which would have been accepted using the GI evaluation alone. DVH-based verification increased insight into dose delivery to patient-specific structures, increasing confidence that the treatment plans were clinically acceptable. Moreover, DVH-based action levels clearly distinguished the roles of the medical physicist and radiation oncologist within the Quality Assurance (QA) procedure. Conclusions: DVH-based treatment plan verification complements the GI evaluation method, improving head and neck IMRT QA

  14. Numerical investigation of the late-time Kerr tails

    Energy Technology Data Exchange (ETDEWEB)

    Racz, Istvan; Toth, Gabor Zs, E-mail: iracz@rmki.kfki.hu, E-mail: tgzs@rmki.kfki.hu [RMKI, H-1121 Budapest, Konkoly Thege Miklos ut 29-33 (Hungary)

    2011-10-07

    The late-time behavior of a scalar field on fixed Kerr background is examined in a numerical framework incorporating the techniques of conformal compactification and hyperbolic initial value formulation. The applied code is 1+(1+2) as it is based on the use of the spectral method in the angular directions while in the time-radial section fourth order finite differencing, along with the method of lines, is applied. The evolution of various types of stationary and non-stationary pure multipole initial states is investigated. The asymptotic decay rates are determined not only in the domain of outer communication but along the event horizon and at future null infinity as well. The decay rates are found to be different for stationary and non-stationary initial data, and they also depend on the fall-off properties of the initial data toward future null infinity. The energy and angular momentum transfers are found to show significantly different behavior in the initial phase of the time evolution. The quasinormal ringing phase and the tail phase are also investigated. In the tail phase, the decay exponents for the energy and angular momentum losses at I⁺ are found to be smaller than at the horizon, which is in accordance with the behavior of the field itself and means that at late times the energy and angular momentum falling into the black hole become negligible in comparison with the energy and angular momentum radiated toward I⁺. The energy and angular momentum balances are used as additional verifications of the reliability of our numerical method.

  15. Numerical investigation of the late-time Kerr tails

    International Nuclear Information System (INIS)

    Racz, Istvan; Toth, Gabor Zs

    2011-01-01

    The late-time behavior of a scalar field on fixed Kerr background is examined in a numerical framework incorporating the techniques of conformal compactification and hyperbolic initial value formulation. The applied code is 1+(1+2) as it is based on the use of the spectral method in the angular directions while in the time-radial section fourth order finite differencing, along with the method of lines, is applied. The evolution of various types of stationary and non-stationary pure multipole initial states is investigated. The asymptotic decay rates are determined not only in the domain of outer communication but along the event horizon and at future null infinity as well. The decay rates are found to be different for stationary and non-stationary initial data, and they also depend on the fall-off properties of the initial data toward future null infinity. The energy and angular momentum transfers are found to show significantly different behavior in the initial phase of the time evolution. The quasinormal ringing phase and the tail phase are also investigated. In the tail phase, the decay exponents for the energy and angular momentum losses at I⁺ are found to be smaller than at the horizon, which is in accordance with the behavior of the field itself and means that at late times the energy and angular momentum falling into the black hole become negligible in comparison with the energy and angular momentum radiated toward I⁺. The energy and angular momentum balances are used as additional verifications of the reliability of our numerical method.
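    A generic way to extract the late-time decay rates discussed above is a log-log fit of the field amplitude against time; the sketch below is illustrative only (it is not the authors' code) and assumes a pure power-law tail |phi(t)| ~ C t^(-n).

        # Illustrative sketch: estimate a late-time power-law decay exponent n
        # from sampled tail data by least-squares fitting log|phi| = log C - n log t.
        import numpy as np

        def tail_exponent(times, amplitudes):
            t = np.asarray(times, dtype=float)
            a = np.abs(np.asarray(amplitudes, dtype=float))
            slope, _ = np.polyfit(np.log(t), np.log(a), 1)
            return -slope

        # Toy usage: synthetic tail with exponent 3 plus 1% noise.
        t = np.linspace(100.0, 1000.0, 200)
        phi = 5.0 * t**-3 * (1 + 0.01 * np.random.default_rng(1).normal(size=t.size))
        print(f"estimated exponent: {tail_exponent(t, phi):.2f}")  # ~3.00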

  16. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    Directory of Open Access Journals (Sweden)

    Page Michel

    2009-12-01

    Full Text Available Abstract Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks.

  17. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  18. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  19. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, specifies the degree of visual observation to be performed, and describes how the results are documented. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  20. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  1. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  2. Executive Summary - Historical background

    International Nuclear Information System (INIS)

    2005-01-01

    matter physics experiments at the High Flux Reactor of The Laue Langevin Institute and the ISIS spallation source at Rutherford-Appleton. Recently, we very actively entered the ICARUS neutrino collaboration and were invited to the PIERRE AUGER collaboration which will search for the highest energies in the Universe. Having close ties with CERN we are very actively engaged in CROSS-GRID, a large computer network project. To better understand the historical background of the INP development, it is necessary to add a few comments on financing of science in Poland. During the 70's and the 80's, research was financed through the so-called Central Research Projects for Science and Technical Development. The advantage of this system was that state-allocated research funds were divided only by a few representatives of the scientific community, which allowed realistic allocation of money to a small number of projects. After 1989 we were able to purchase commercially available equipment, which led to the closure of our large and very experienced electronic workshop. We also considerably reduced our well equipped mechanical shop. During the 90's the reduced state financing of science was accompanied by a newly established Committee of Scientific Research which led to the creation of a system of small research projects. This precluded the development of more ambitious research projects and led to the dispersion of equipment among many smaller laboratories and universities. A large research establishment, such as our Institute, could not develop properly under such conditions. In all, between 1989 and 2004 we reduced our personnel from about 800 to 470 and our infrastructure became seriously undercapitalised. However, with energetic search for research funds, from European rather than national research programs, we hope to improve and modernize our laboratories and their infrastructure in the coming years

  3. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Irith De Baetselier

    2016-10-01

    Full Text Available Background: Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs. Verification of CRRs is time consuming and often requires a statistical background. Objectives: We report on an easy and cost-saving method to verify CRRs. Methods: Using a former method introduced by Sigma Diagnostics, three study sites in sub- Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Results: Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion: To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.
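    A schematic of the verification logic is sketched below; the exact pass/fail criterion of the Sigma Diagnostics method is not reproduced in the abstract, so the threshold used here (at least 8 of 10 healthy screening results inside the range) and the example values are assumptions for illustration only.

        # Hypothetical sketch of reference-range verification against healthy screening results.
        def verify_reference_range(results, lower, upper, min_within=8):
            """Return True if enough results fall inside [lower, upper]; the
            min_within threshold is an assumed pass criterion, not the Sigma rule."""
            within = sum(lower <= x <= upper for x in results)
            return within >= min_within

        # Toy usage: 10 hypothetical creatinine results (umol/L) against an assumed range.
        creatinine = [62, 70, 75, 80, 85, 90, 96, 108, 110, 128]
        print(verify_reference_range(creatinine, 60, 104))  # False -> range would need re-derivation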

  4. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  5. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience, owed to compositionality properties, with effective verification algorithms and tools, owed to Markov properties. Thus far, however, IMC verification has not considered compositionality properties, but only closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation.

  6. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek, Bohumir; Roussel, Jean-Marc; Kwiatkowska, Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  7. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  8. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  9. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  10. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  11. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  12. Tomotherapy: IMRT and tomographic verification

    International Nuclear Information System (INIS)

    Mackie, T.R.

    2000-01-01

    include MLCs, and many clinics use them to replace 90% or more of the field-shaping requirements of conventional radiotherapy. Now, several academic centers are treating patients with IMRT using conventional MLCs to modulate the field. IMRT using conventional MLCs has the advantage that the patient is stationary during the treatment and the MLCs can be used in conventional practice. Nevertheless, tomotherapy using the Peacock system delivers the most conformal dose distributions of any commercial system to date. The biggest limitation with both the NOMOS Peacock tomotherapy system and conventional MLCs for IMRT delivery is the lack of treatment verification. In conventional few-field radiotherapy one relied on portal images to determine whether the patient was set up correctly and the beams were correctly positioned. With IMRT the image contrast is superimposed on the beam intensity variation. Conventional practice allowed for monitor unit calculation checks and point dosimeters placed on the patient's surface to verify that the treatment was properly delivered. With IMRT it is impossible to perform hand calculations of monitor units, and dosimeters placed on the patient's surface are prone to error due to high gradients in the beam intensity. NOMOS has developed a verification phantom that allows multiple sheets of film to be placed in a light-tight box that is irradiated with the same beam pattern used to treat the patient. The optical densities of the films are adjusted, normalized, and calibrated, and then quantitatively compared with the dose calculated for the phantom delivery. However, this process is too laborious to be used for patient-specific QA. If IMRT becomes ubiquitous, and it can be shown that IMRT is useful for most treatment sites, then there is a need to design treatment units dedicated to IMRT delivery and verification. Helical tomotherapy is such a redesign. Helical tomotherapy is the delivery of a rotational fan beam while the patient is

  13. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-01-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials

  14. Dosimetric verification of IMRT plans

    International Nuclear Information System (INIS)

    Bulski, W.; Cheimicski, K.; Rostkowska, J.

    2012-01-01

    Intensity modulated radiotherapy (IMRT) is a complex procedure requiring proper dosimetric verification. IMRT dose distributions are characterized by steep dose gradients, which enable sparing of organs at risk and allow for an escalation of the dose to the tumor. They require a large number of radiation beams (sometimes over 10). Fluence measurements for individual beams are not sufficient to evaluate the total dose distribution or to assure patient safety. The methods used at the Centre of Oncology in Warsaw are presented. In order to measure dose distributions in various cross-sections, film dosimeters were used (radiographic Kodak EDR2 films and radiochromic Gafchromic EBT films). The film characteristics were carefully examined. Several types of tissue-equivalent phantoms were developed. A methodology of comparing measured dose distributions against the distributions calculated by treatment planning systems (TPS) was developed and tested. The tolerance level for this comparison was set at 3% difference in dose and 3 mm in distance to agreement. The so-called gamma formalism was used. The results of these comparisons for a group of over 600 patients are presented. Agreement was found in 87% of cases. This film dosimetry methodology was used as a benchmark to test and validate the performance of commercially available 2D and 3D matrices of detectors (ionization chambers or diodes). The results of these validations are also presented. (authors)
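    A simplified, brute-force sketch of the global gamma evaluation at the 3%/3 mm tolerance mentioned above follows; clinical implementations add dose interpolation and optimized search, which are omitted here, and the two dose grids and pixel spacing are assumed identical.

        # Simplified global 2D gamma evaluation (3% dose difference / 3 mm distance to agreement).
        import numpy as np

        def gamma_index(dose_eval, dose_ref, pixel_mm, dose_tol=0.03, dist_tol_mm=3.0):
            """Return the gamma map of an evaluated dose grid against a reference grid
            of identical shape and pixel spacing (in mm)."""
            ny, nx = dose_ref.shape
            dose_crit = dose_tol * dose_ref.max()               # global dose criterion
            yy, xx = np.mgrid[0:ny, 0:nx]
            gamma = np.full(dose_ref.shape, np.inf)
            search = int(np.ceil(2 * dist_tol_mm / pixel_mm))   # local search window (pixels)
            for j in range(ny):
                for i in range(nx):
                    j0, j1 = max(0, j - search), min(ny, j + search + 1)
                    i0, i1 = max(0, i - search), min(nx, i + search + 1)
                    dist2 = ((yy[j0:j1, i0:i1] - j)**2 + (xx[j0:j1, i0:i1] - i)**2) * pixel_mm**2
                    ddose2 = (dose_eval[j0:j1, i0:i1] - dose_ref[j, i])**2
                    gamma[j, i] = np.sqrt(np.min(dist2 / dist_tol_mm**2 + ddose2 / dose_crit**2))
            return gamma

        # Toy usage: the passing rate is the fraction of points with gamma <= 1.
        ref = np.random.default_rng(2).random((50, 50)) * 2.0   # reference dose (Gy)
        ev = ref * 1.01                                         # evaluated dose, 1% high
        print(f"gamma passing rate: {np.mean(gamma_index(ev, ref, pixel_mm=1.0) <= 1.0):.1%}")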

  15. Background noise levels in Europe

    OpenAIRE

    Gjestland, Truls

    2008-01-01

    This report gives a brief overview of typical background noise levels in Europe, and suggests a procedure for the prediction of background noise levels based on population density. A proposal for the production of background noise maps for Europe is included.

  16. Numerical analysis of electromagnetic fields

    CERN Document Server

    Zhou Pei Bai

    1993-01-01

    Numerical methods for solving boundary value problems have developed rapidly. Knowledge of these methods is important both for engineers and scientists. There are many books published that deal with various approximate methods such as the finite element method, the boundary element method and so on. However, there is no textbook that includes all of these methods. This book is intended to fill this gap. The book is designed to be suitable for graduate students in engineering science, for senior undergraduate students as well as for scientists and engineers who are interested in electromagnetic fields. Objective: Numerical calculation is the combination of mathematical methods and field theory. A great number of mathematical concepts, principles and techniques are discussed and many computational techniques are considered in dealing with practical problems. The purpose of this book is to provide students with a solid background in numerical analysis of field problems. The book emphasizes the basic theories ...

  17. Numerical methods: Analytical benchmarking in transport theory

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1988-01-01

    Numerical methods applied to reactor technology have reached a high degree of maturity. Certainly one- and two-dimensional neutron transport calculations have become routine, with several programs available on personal computer and the most widely used programs adapted to workstation and minicomputer computational environments. With the introduction of massive parallelism and as experience with multitasking increases, even more improvement in the development of transport algorithms can be expected. Benchmarking an algorithm is usually not a very pleasant experience for the code developer. Proper algorithmic verification by benchmarking involves the following considerations: (1) conservation of particles, (2) confirmation of intuitive physical behavior, and (3) reproduction of analytical benchmark results. By using today's computational advantages, new basic numerical methods have been developed that allow a wider class of benchmark problems to be considered

  18. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  19. Verification of RESRAD-RDD. (Version 2.01)

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Flood, Paul E. [Argonne National Lab. (ANL), Argonne, IL (United States); LePoire, David [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    In this report, the results generated by RESRAD-RDD version 2.01 are compared with those produced by RESRAD-RDD version 1.7 for different scenarios with different sets of input parameters. RESRAD-RDD version 1.7 is spreadsheet-driven, performing calculations with Microsoft Excel spreadsheets. RESRAD-RDD version 2.01 revamped version 1.7 by using command-driven programs designed with Visual Basic.NET to direct calculations with data saved in a Microsoft Access database, and by re-facing the graphical user interface (GUI) to provide more flexibility and choices in guideline derivation. Because version 1.7 and version 2.01 perform the same calculations, the comparison of their results serves as verification of both versions. The verification covered calculation results for 11 radionuclides included in both versions: Am-241, Cf-252, Cm-244, Co-60, Cs-137, Ir-192, Po-210, Pu-238, Pu-239, Ra-226, and Sr-90. First, all nuclide-specific data used in both versions were compared to ensure that they are identical. Then generic operational guidelines and measurement-based radiation doses or stay times associated with a specific operational guideline group were calculated with both versions using different sets of input parameters, and the results obtained with the same set of input parameters were compared. A total of 12 sets of input parameters were used for the verification, and the comparison was performed for each operational guideline group, from A to G, sequentially. The verification shows that RESRAD-RDD version 1.7 and RESRAD-RDD version 2.01 generate almost identical results; the slight differences could be attributed to differences in numerical precision between Microsoft Excel and Visual Basic.NET. RESRAD-RDD version 2.01 allows the selection of different units for use in reporting calculation results. Results in SI units were obtained and compared with the base results (in traditional units) used for the comparison with version 1.7. The comparison shows that RESRAD
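    The version-to-version comparison described above amounts to checking that paired results agree within a small relative tolerance; the sketch below is a generic illustration (not the RESRAD-RDD test harness), and the keys, values, and tolerance are hypothetical.

        # Generic sketch: flag paired results that disagree beyond a relative tolerance.
        import numpy as np

        def compare_results(old_results, new_results, rel_tol=1e-6):
            """Return (key, old, new) tuples where results are missing or differ beyond rel_tol."""
            mismatches = []
            for key in sorted(set(old_results) | set(new_results)):
                a, b = old_results.get(key), new_results.get(key)
                if a is None or b is None or not np.isclose(a, b, rtol=rel_tol, atol=0.0):
                    mismatches.append((key, a, b))
            return mismatches

        # Toy usage with hypothetical per-radionuclide guideline values:
        v1_7  = {"Am-241": 1.2345678, "Cs-137": 0.4567890}
        v2_01 = {"Am-241": 1.2345679, "Cs-137": 0.4567890}
        print(compare_results(v1_7, v2_01))  # [] -> differences within numerical precision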

  20. Numerical Optimization in Microfluidics

    DEFF Research Database (Denmark)

    Jensen, Kristian Ejlebjærg

    2017-01-01

    Numerical modelling can illuminate the working mechanism and limitations of microfluidic devices. Such insights are useful in their own right, but one can take advantage of numerical modelling in a systematic way using numerical optimization. In this chapter we will discuss when and how numerical optimization is best used.

  1. Backgrounder

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    Center for Mountain Ecosystem Studies, Kunming Institute of Botany of the Chinese Academy of Sciences, China: $1,526,000 to inform effective water governance in the Asian highlands of China, Nepal, and Pakistan. • Ashoka Trust for Research in Ecology and the Environment (ATREE), India: $1,499,300 for research on ...

  2. BACKGROUNDER

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    demographic trends, socio-economic development pathways, and strong ... knowledge and experience, and encourage innovation. ... choices, and will work with stakeholders in government, business, civil society, and regional economic.

  3. Backgrounder

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    Safe and Inclusive Cities: ... improving urban environments and public spaces might have on reducing the city's high ... violence against women among urban youth of working class neighbourhoods of Islamabad, Rawalpindi, and Karachi,.

  4. BACKGROUNDER

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC CRDI

    CARIAA's research agenda addresses gaps and priorities highlighted in the ... Research focuses on climate risk, institutional and regulatory frameworks, markets, and ... The researchers will identify relevant drivers and trends and use develop ...

  5. BACKGROUNDER

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    achieving long‐term food security in Africa, with a focus on post‐harvest loss, ... nutrition and health, and the socio‐economic factors that affect food supply ... Water use. Agricultural productivity in sub‐Saharan Africa is the lowest in the world.

  6. Emergence of oscillons in an expanding background

    International Nuclear Information System (INIS)

    Farhi, E.; Guth, A. H.; Iqbal, N.; Graham, N.; Rosales, R. R.; Stamatopoulos, N.

    2008-01-01

    We consider a (1+1) dimensional scalar field theory that supports oscillons, which are localized, oscillatory, stable solutions to nonlinear equations of motion. We study this theory in an expanding background and show that oscillons now lose energy, but at a rate that is exponentially small when the expansion rate is slow. We also show numerically that a universe that starts with (almost) thermal initial conditions will cool to a final state where a significant fraction of the energy of the universe--on the order of 50%--is stored in oscillons. If this phenomenon persists in realistic models, oscillons may have cosmological consequences.

  7. Profiled Deck Composite Slab Strength Verification: A Review

    Directory of Open Access Journals (Sweden)

    K. Mohammed

    2017-12-01

    Full Text Available The purpose of this article is to present an overview of alternative profiled deck composite slab (PDCS) strength verification without the expensive and complex laboratory procedures needed to establish its longitudinal shear capacity. Despite the several deterministic research findings that have led to proposals and modifications concerning the complex shear characteristics of PDCS, which define its strength behaviour, laboratory performance testing remains the only accurate means of assessing PDCS strength. The issue is critical and warrants further thought from perspectives other than the deterministic approaches, which are expensive and time consuming. Hence, the development of a rational-based numerical test load function from longitudinal shear capacity considerations is needed to augment previous, unsuccessful attempts at determining the strength of PDCS without costly laboratory procedures.

  8. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  9. Natural background radiation in Saudi Arabia

    International Nuclear Information System (INIS)

    Al-Hussan, K.A.; Al-Suliman, K.M.; Wafa, N.F.

    1993-01-01

    Natural background radiation measurements have been made at numerous locations throughout the world. Little work in this field has been done in developing countries. In this study, the external exposure rates due to natural background radiation sources have been measured for different Saudi Arabian cities. Thermoluminescence dosimeters, CaF₂:Dy (TLD-200), have been used for the field measurements. Exposure-to-response correlations were obtained for each TLD using a ¹³⁷Cs source. A correlation for TLD response fading in a continuous radiation exposure environment was obtained and applied to correct the field measurements. The measurements were taken every two months for a total of six intervals during the whole year. The average outdoor external exposure rate was found to vary between a minimum of 5.29 μR h⁻¹ in Dammam city and a maximum of 11.59 μR h⁻¹ in Al-Khamis city. (1 fig., 1 tab.)

  10. Cosmic microwave background bispectrum from recombination.

    Science.gov (United States)

    Huang, Zhiqi; Vernizzi, Filippo

    2013-03-08

    We compute the cosmic microwave background temperature bispectrum generated by nonlinearities at recombination on all scales. We use CosmoLib2nd, a numerical Boltzmann code at second order, to compute cosmic microwave background bispectra on the full sky. We consistently include all effects except gravitational lensing, which can be added to our result using standard methods. The bispectrum is peaked on squeezed triangles and agrees with the analytic approximation in the squeezed limit at the few percent level for all the scales where this is applicable. On smaller scales, we recover previous results on perturbed recombination. For cosmic-variance limited data to l(max)=2000, its signal-to-noise ratio is S/N=0.47, corresponding to f(NL)(eff)=-2.79, and will bias a local signal by f(NL)(loc) ≈ 0.82.

  11. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that transforms the bytecode into static single assignment form...

  12. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of ²⁴⁰Pu to ²³⁹Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  13. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    Directory of Open Access Journals (Sweden)

    Vickers Andrew J

    2008-11-01

    Full Text Available Background: A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods: A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the area-under-the-curve (AUC) corrected for verification bias, varying both the rate and mechanism of verification. Results: In a single simulated data set, varying false negatives from 0 to 4 led to verification-bias-corrected AUCs ranging from 0.550 to 0.852. Excess variation associated with low numbers of false negatives was confirmed in simulation studies and by analyses of published studies that incorporated verification bias correction. The 2.5th – 97.5th centile range constituted as much as 60% of the possible range of AUCs for some simulations. Conclusion: Screening programs are designed such that there are few false negatives. Standard statistical methods for verification bias correction are inadequate in this circumstance.
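
    The instability described in the results can be reproduced with a few lines of simulation. The sketch below uses illustrative parameters (not those of the paper) and an inverse-verification-probability weighting in the spirit of the Begg-Greenes correction; the corrected sensitivity estimate swings noticeably depending on how many false negatives happen to be verified.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_study(n=20000, prevalence=0.01, sensitivity=0.99, specificity=0.90,
                   p_verify_pos=1.0, p_verify_neg=0.05):
    """Screening study with verification bias: nearly all test-positives are
    verified against the gold standard, but only a few test-negatives are."""
    disease = rng.random(n) < prevalence
    test = np.where(disease, rng.random(n) < sensitivity, rng.random(n) > specificity)
    p_verify = np.where(test, p_verify_pos, p_verify_neg)
    verified = rng.random(n) < p_verify
    return disease, test, verified

def corrected_sensitivity(disease, test, verified):
    """Correct for verification bias by weighting each verified subject by the
    inverse of the verification rate in its test stratum."""
    weights = np.zeros(len(test))
    for t in (False, True):
        stratum = test == t
        rate = verified[stratum].mean()
        weights[stratum & verified] = 1.0 / rate
    tp = np.sum(weights[disease & test])
    fn = np.sum(weights[disease & ~test])
    n_verified_fn = int(np.sum(disease & ~test & verified))
    return tp / (tp + fn), n_verified_fn

disease, test, verified = simulate_study()
sens_hat, n_fn = corrected_sensitivity(disease, test, verified)
print(f"verified false negatives: {n_fn}, corrected sensitivity: {sens_hat:.3f}")
```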

  14. A new approach for the verification of optical systems

    Science.gov (United States)

    Siddique, Umair; Aravantinos, Vincent; Tahar, Sofiène

    2013-09-01

    Optical systems are increasingly used in microsystems, telecommunication, aerospace and laser industry. Due to the complexity and sensitivity of optical systems, their verification poses many challenges to engineers. Traditionally, the analysis of such systems has been carried out by paper-and-pencil based proofs and numerical computations. However, these techniques cannot provide perfectly accurate results due to the risk of human error and inherent approximations of numerical algorithms. In order to overcome these limitations, we propose to use theorem proving (i.e., a computer-based technique that allows us to express mathematical expressions and reason about them by taking into account all the details of mathematical reasoning) as an alternative to computational and numerical approaches to improve optical system analysis in a comprehensive framework. In particular, this paper provides a higher-order logic (a language used to express mathematical theories) formalization of ray optics in the HOL Light theorem prover. Based on the multivariate analysis library of HOL Light, we formalize the notion of light ray and optical system (by defining medium interfaces, mirrors, lenses, etc.), i.e., we express these notions mathematically in the software. This allows us to derive general theorems about the behavior of light in such optical systems. In order to demonstrate the practical effectiveness, we present the stability analysis of a Fabry-Perot resonator.

  15. Diffuse Cosmic Infrared Background Radiation

    Science.gov (United States)

    Dwek, Eli

    2002-01-01

    The diffuse cosmic infrared background (CIB) consists of the cumulative radiant energy released in the processes of structure formation that have occurred since the decoupling of matter and radiation following the Big Bang. In this lecture I will review the observational data that provided the first detections and limits on the CIB, and the theoretical studies explaining the origin of this background. Finally, I will also discuss the relevance of this background to the universe as seen in high energy gamma-rays.

  16. Background current of radioisotope manometer

    International Nuclear Information System (INIS)

    Vydrik, A.A.

    1987-01-01

    The technique for calculating the main component of the background current of radioisotopic manometers, the current from direct collisions of ionizing particles with the collector, is described. The reasons for the appearance of a background photoelectron current are clarified. The most effective way of eliminating background current components is protection of the collector from the source by a screen made of a material with a high gamma-quanta absorption coefficient, such as lead

  17. An Investigation into Solution Verification for CFD-DEM

    Energy Technology Data Exchange (ETDEWEB)

    Fullmer, William D. [National Energy Technology Lab. (NETL), AECOM, Morgantown, WV (United States); Musser, Jordan [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2017-10-01

    This report presents the study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically National Energy Technology Laboratory’s (NETL) open source MFiX code (MFiX-DEM) with a diffusion based particle-to-continuum filtering scheme. In particular, this study focused on determining if the numerical method had a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate “grid-free” solution in the limit of infinite resolution. The results show that the diffusion based scheme does yield a converging solution. However, the convergence is more complicated than encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on top of globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification and VVUQ exercises in general as the difference between it and the reference solution largely determines the numerical uncertainty. By testing
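
    The regression-based extrapolation mentioned in the abstract amounts to fitting the grid-dependent system response quantity to a power-law convergence model and taking the intercept as the grid-free value. A minimal sketch with made-up data (the grid sizes and pressure drops below are illustrative, not NETL results):

```python
# Least-squares fit of f(h) = f0 + a * h**p to results from several grid
# resolutions; f0 approximates the "grid-free" solution in the limit h -> 0.
# Data below are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

h = np.array([4.0, 2.0, 1.0, 0.5, 0.25])       # grid size in particle diameters
dp = np.array([101.3, 98.7, 97.5, 97.1, 96.9])  # pressure drop QoI (made up)

def model(h, f0, a, p):
    return f0 + a * h**p

(f0, a, p), _ = curve_fit(model, h, dp, p0=(dp[-1], 1.0, 1.0))
print(f"grid-free estimate: {f0:.2f}, observed order of convergence: {p:.2f}")
# Numerical uncertainty is then estimated from |finest-grid result - f0|.
print(f"numerical uncertainty estimate: {abs(dp[-1] - f0):.3f}")
```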

  18. Background subtraction theory and practice

    CERN Document Server

    Elgammal, Ahmed

    2014-01-01

    Background subtraction is a widely used concept for detection of moving objects in videos. In the last two decades there has been a lot of development in designing algorithms for background subtraction, as well as wide use of these algorithms in various important applications, such as visual surveillance, sports video analysis, motion capture, etc. Various statistical approaches have been proposed to model scene backgrounds. The concept of background subtraction also has been extended to detect objects from videos captured from moving cameras. This book reviews the concept and practice of background subtraction.
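
    As a minimal illustration of the idea the book surveys, the sketch below maintains a per-pixel running Gaussian model of the background and flags pixels that deviate from it; the learning rate, threshold and synthetic frames are illustrative choices, not a method prescribed by the book.

```python
import numpy as np

class RunningGaussianBackground:
    """Per-pixel running mean/variance background model; pixels further than
    k standard deviations from the mean are flagged as foreground."""
    def __init__(self, first_frame, alpha=0.02, k=2.5):
        self.mean = first_frame.astype(np.float64)
        self.var = np.full(first_frame.shape, 25.0)  # initial variance guess
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        frame = frame.astype(np.float64)
        diff = frame - self.mean
        foreground = diff**2 > (self.k**2) * self.var
        # Update the model only where the pixel looks like background.
        update = ~foreground
        self.mean[update] += self.alpha * diff[update]
        self.var[update] = ((1 - self.alpha) * self.var[update]
                            + self.alpha * diff[update]**2)
        return foreground

# Toy usage on synthetic grayscale frames.
rng = np.random.default_rng(0)
frames = rng.normal(100, 3, size=(50, 60, 80))   # static noisy background
frames[40:, 20:30, 30:40] += 60                  # a "moving object" appears
bg = RunningGaussianBackground(frames[0])
masks = [bg.apply(f) for f in frames[1:]]
print("foreground pixels in last frame:", int(masks[-1].sum()))
```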

  19. Uncertain Portfolio Selection with Background Risk and Liquidity Constraint

    Directory of Open Access Journals (Sweden)

    Jia Zhai

    2017-01-01

    Full Text Available This paper discusses an uncertain portfolio selection problem with consideration of background risk and asset liquidity. In addition, the transaction costs are also considered. The security returns, background asset return, and asset liquidity are estimated by experienced experts instead of historical data. Regarding them as uncertain variables, a mean-risk model with background risk, liquidity, and transaction costs is proposed for portfolio selection and the crisp forms of the model are provided when security returns obey different uncertainty distributions. Moreover, for better understanding of the impact of background risk and liquidity on portfolio selection, some important theorems are proved. Finally, numerical experiments are presented to illustrate the modeling idea.
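
    The ingredients of the model (expected returns, a risk term that includes the background asset, a liquidity floor, and proportional transaction costs) can be illustrated with a much-simplified probabilistic sketch; the numbers are made up and this is not the uncertainty-theory formulation used in the paper.

```python
# A simplified mean-risk portfolio sketch in a probabilistic setting with a
# background asset, a liquidity constraint and proportional transaction costs.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.12, 0.15])        # expected security returns
cov = np.diag([0.02, 0.05, 0.09])        # return covariance (risk proxy)
liquidity = np.array([0.9, 0.6, 0.3])    # asset liquidity scores
x0 = np.array([0.4, 0.3, 0.3])           # current holdings
mu_bg, var_bg = 0.03, 0.01               # background asset return and risk
cost_rate, lam = 0.002, 3.0              # transaction cost rate, risk aversion

def objective(x):
    trade_cost = cost_rate * np.sum(np.abs(x - x0))
    risk = x @ cov @ x + var_bg          # background risk adds to total risk
    return -(x @ mu + mu_bg - lam * risk - trade_cost)

constraints = [
    {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},        # budget
    {"type": "ineq", "fun": lambda x: x @ liquidity - 0.55},   # liquidity floor
]
res = minimize(objective, x0, bounds=[(0, 1)] * 3, constraints=constraints)
print("optimal weights:", np.round(res.x, 3))
```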

  20. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa.

    Science.gov (United States)

    De Baetselier, Irith; Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. We report on an easy and cost-saving method to verify CRRs. Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.
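
    The verification logic itself is straightforward to script. The sketch below assumes a simple acceptance rule (the range is verified if no more than one of the ten healthy-subject results falls outside it) and illustrative ALT values; the exact decision rule of the Sigma Diagnostics method is not spelled out in this record.

```python
import numpy as np

def verify_reference_range(values, low, high, max_outside=1):
    """Step 1: verify an existing reference range against results from a small
    group of clinically-healthy participants (here 10 screening values)."""
    values = np.asarray(values)
    outside = int(np.sum((values < low) | (values > high)))
    return outside <= max_outside, outside

def recalculate_reference_range(values):
    """Step 2 (only if verification fails): derive a new range from a larger
    group (e.g. 40 participants) as the central 95% of observed results."""
    return tuple(np.percentile(values, [2.5, 97.5]))

# Illustrative ALT results (U/L) from 10 healthy participants at screening.
alt_screen = [14, 22, 18, 35, 41, 19, 27, 55, 16, 23]
ok, n_out = verify_reference_range(alt_screen, low=5, high=40)
print(f"CRR verified: {ok} ({n_out} of {len(alt_screen)} results outside range)")
if not ok:
    # In practice the recalculation would use results from 40 participants.
    print("new range:", recalculate_reference_range(alt_screen))
```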

  1. Entanglement verification and its applications in quantum communication

    International Nuclear Information System (INIS)

    Haeseler, Hauke

    2010-01-01

    In this thesis, we investigate the uses of entanglement and its verification in quantum communication. The main object here is to develop a verification procedure which is adaptable to a wide range of applications, and whose implementation has low requirements on experimental resources. We present such a procedure in the form of the Expectation Value Matrix. The structure of this thesis is as follows: Chapters 1 and 2 give a short introduction and background information on quantum theory and the quantum states of light. In particular, we discuss the basic postulates of quantum mechanics, quantum state discrimination, the description of quantum light and the homodyne detector. Chapter 3 gives a brief introduction to quantum information and in particular to entanglement, and we discuss the basics of quantum key distribution and teleportation. The general framework of the Expectation Value Matrix is introduced. The main matter of this thesis is contained in the subsequent three chapters, which describe different quantum communication protocols and the corresponding adaptation of the entanglement verification method. The subject of Chapter 4 is quantum key distribution, where the detection of entanglement is a means of excluding intercept-resend attacks, and the presence of quantum correlations in the raw data is a necessary precondition for the generation of secret key. We investigate a continuous-variable version of the two-state protocol and develop the Expectation Value Matrix method for such qubit-mode systems. Furthermore, we analyse the role of the phase reference with respect to the security of the protocol and raise awareness of a corresponding security threat. For this, we adapt the verification method to different settings of Stokes operator measurements. In Chapter 5, we investigate quantum memory channels and propose a fundamental benchmark for these based on the verification of entanglement. After describing some physical effects which can be used for the

  2. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  3. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  4. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  5. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    OpenAIRE

    Clark, T Justin; ter Riet, Gerben; Coomarasamy, Aravinthan; Khan, Khalid S

    2004-01-01

    Background: To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods: Systematic reviews of all published research on accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 had...

  6. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  7. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  8. Visual search in the real world: Color vision deficiency affects peripheral guidance, but leaves foveal verification largely unaffected

    Directory of Open Access Journals (Sweden)

    Günter eKugler

    2015-12-01

    Full Text Available Background: People with color vision deficiencies report numerous limitations in daily life. However, they use basic color terms systematically and in a similar manner as people with normal color vision. We hypothesize that a possible explanation for this discrepancy between color perception and behavioral consequences might be found in the gaze behavior of people with color vision deficiency. Methods: A group of participants with color vision deficiencies and a control group performed several search tasks in a naturalistic setting on a lawn. Results: Search performance was similar in both groups in a color-unrelated search task as well as in a search for yellow targets. While searching for red targets, color vision deficient participants exhibited a strongly degraded performance. This was closely matched by the number of fixations on red objects shown by the two groups. Importantly, once they fixated a target, participants with color vision deficiencies exhibited only few identification errors. Conclusions: Participants with color vision deficiencies are not able to enhance their search for red targets on a (green) lawn by an efficient guiding mechanism. The data indicate that the impaired guiding is the main influence on search performance, while foveal identification (verification) remains largely unaffected.

  9. Application of verification and validation on safety parameter display systems

    International Nuclear Information System (INIS)

    Thomas, N.C.

    1983-01-01

    Offers some explanation of how verification and validation (V&V) can support development and licensing of the Safety Parameter Display Systems (SPDS). Advocates that V&V can be more readily accepted within the nuclear industry if a better understanding exists of what the objectives of V&V are and should be. Includes a discussion regarding a reasonable balance of costs and benefits of V&V as applied to the SPDS and to other digital systems. Represents the author's perception of the regulator's perspective based on background information and experience, and discussions with regulators about their current concerns and objectives. Suggests that the introduction of the SPDS into the Control Room is a first step towards growing dependency on use of computers

  10. Backgrounds and characteristics of arsonists

    NARCIS (Netherlands)

    Labree, W.; Nijman, H.L.I.; Marle, H.J.C. van; Rassin, E.

    2010-01-01

    The aim of this study was to gain more insight in the backgrounds and characteristics of arsonists. For this, the psychiatric, psychological, personal, and criminal backgrounds of all arsonists (n = 25), sentenced to forced treatment in the maximum security forensic hospital “De Kijvelanden”, were

  11. Measurement of natural background neutron

    CERN Document Server

    Li Jain, Ping; Tang Jin Hua; Tang, E S; Xie Yan Fong

    1982-01-01

    A highly sensitive neutron monitor is described. It has a counting rate of approximately 20 cpm for natural background neutrons. The pulse amplitude resolution, sensitivity and direction dependence of the monitor were determined. This monitor has been used for natural background measurements in the Beijing area. The yearly average dose is given and compared with the results of KEK and CERN.

  12. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  13. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  14. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  15. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  16. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  17. Core power capability verification for PWR NPP

    International Nuclear Information System (INIS)

    Xian Chunyu; Liu Changwen; Zhang Hong; Liang Wei

    2002-01-01

    The principle and methodology of pressurized water reactor nuclear power plant core power capability verification for reload are introduced. The radial and axial power distributions for normal operation (category I or condition I) and abnormal operation (category II or condition II) are simulated by using a neutronics calculation code. The linear power density margin and DNBR margin for both categories, which reflect core safety, are analyzed from the point of view of reactor physics and T/H, and thus the category I operating domain and the category II protection set points are verified. Besides, the verification results of the reference NPP are also given

  18. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  19. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  20. On the Verification of a WiMax Design Using Symbolic Simulation

    Directory of Open Access Journals (Sweden)

    Gabriela Nicolescu

    2013-07-01

    Full Text Available In top-down multi-level design methodologies, design descriptions at higher levels of abstraction are incrementally refined to the final realizations. Simulation based techniques have traditionally been used to verify that such model refinements do not change the design functionality. Unfortunately, with computer simulations it is not possible to completely check that a design transformation is correct in a reasonable amount of time, as the number of test patterns required to do so increases exponentially with the number of system state variables. In this paper, we propose a methodology for the verification of conformance of models generated at higher levels of abstraction in the design process to the design specifications. We model the system behavior using sequence of recurrence equations. We then use symbolic simulation together with equivalence checking and property checking techniques for design verification. Using our proposed method, we have verified the equivalence of three WiMax system models at different levels of design abstraction, and the correctness of various system properties on those models. Our symbolic modeling and verification experiments show that the proposed verification methodology provides a performance advantage over its numerical counterpart.
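
    The advantage of symbolic simulation is that a single run with symbolic inputs covers all numeric test patterns. A toy sketch (unrelated to the WiMax design) using SymPy to check that a refined model of a multiply-accumulate step matches its specification for every input:

```python
# Toy symbolic-simulation equivalence check with SymPy: run both models on
# symbolic inputs for a few steps and verify the outputs are identical for
# all input values, instead of enumerating numeric test patterns.
import sympy as sp

def spec_model(x, acc):
    """Specification: acc' = acc + 3*x."""
    return acc + 3 * x

def refined_model(x, acc):
    """Refinement: implements 3*x as a shift-and-add, i.e. 2*x + x."""
    return acc + (2 * x + x)

inputs = sp.symbols("x0 x1 x2")
acc_spec = acc_impl = sp.Integer(0)
for x in inputs:                       # symbolic simulation over 3 steps
    acc_spec = spec_model(x, acc_spec)
    acc_impl = refined_model(x, acc_impl)

equivalent = sp.simplify(acc_spec - acc_impl) == 0
print("models equivalent for all inputs:", equivalent)
```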

  1. Calibration And Performance Verification Of LSC Packard 1900TR AFTER REPAIRING

    International Nuclear Information System (INIS)

    Satrio; Evarista-Ristin; Syafalni; Alip

    2003-01-01

    Calibration and repeated verification of the LSC Packard 1900TR at the Hydrology Section-P3TlR have been carried out. In the period from mid-1997 to July 2000, the counting system of the instrument was damaged and repaired several times. After repair, the system was recalibrated and then verified. The calibration and verification were conducted using unquenched ³H and ¹⁴C standards and background. The calibration results show that the background count rates for ³H and ¹⁴C are 12.3 ± 0.79 cpm and 18.24 ± 0.69 cpm, respectively; the FOM for ³H and ¹⁴C is 285.03 ± 15.95 and 641.06 ± 16.45, respectively; and the ³H and ¹⁴C counting efficiencies are 59.13 ± 0.28% and 95.09 ± 0.31%, respectively. The verification data show that the SIS and tSIE parameters for ¹⁴C are within the acceptance limits, the ³H and ¹⁴C efficiencies are still above the minimum limits, and the background fluctuations are still normal. It can be concluded that the performance of the LSC Packard 1900TR remains good and the instrument can be used for counting. (author)
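
    The quoted performance figures are tied together by two simple relations: counting efficiency E = 100 x (gross count rate - background count rate) / standard activity, and figure of merit FOM = E^2 / B. A small sketch using the tritium numbers from this record; the standard activity is an assumed illustrative value, and the computed FOM approximately reproduces the quoted ³H figure.

```python
# Counting efficiency and figure of merit for an LSC, using the relations
# E = 100 * (gross_cpm - background_cpm) / dpm_standard  and  FOM = E**2 / B.
# The standard activity (dpm) below is an assumed illustrative value.

def efficiency_percent(gross_cpm, background_cpm, standard_dpm):
    return 100.0 * (gross_cpm - background_cpm) / standard_dpm

def figure_of_merit(eff_percent, background_cpm):
    return eff_percent**2 / background_cpm

std_dpm = 100_000                           # assumed activity of the 3H standard
bkg_cpm = 12.3                              # background count rate from this record
gross_cpm = bkg_cpm + 0.5913 * std_dpm      # implies the quoted 59.13 % efficiency

eff = efficiency_percent(gross_cpm, bkg_cpm, std_dpm)
print(f"3H efficiency: {eff:.2f} %")
print(f"3H figure of merit: {figure_of_merit(eff, bkg_cpm):.0f}")  # ~285
```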

  2. Aluminum as a source of background in low background experiments

    Energy Technology Data Exchange (ETDEWEB)

    Majorovits, B., E-mail: bela@mppmu.mpg.de [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany); Abt, I. [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany); Laubenstein, M. [Laboratori Nazionali del Gran Sasso, INFN, S.S.17/bis, km 18 plus 910, I-67100 Assergi (Italy); Volynets, O. [MPI fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2011-08-11

    Neutrinoless double beta decay would be a key to understanding the nature of neutrino masses. The next generation of High Purity Germanium experiments will have to be operated with a background rate of better than 10⁻⁵ counts/(kg y keV) in the region of interest around the Q-value of the decay. Therefore, so far irrelevant sources of background have to be considered. The metalization of the surface of germanium detectors is in general done with aluminum. The background from the decays of ²²Na, ²⁶Al, ²²⁶Ra and ²²⁸Th introduced by this metalization is discussed. It is shown that only a special selection of aluminum can keep these background contributions acceptable.

  3. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  4. JEM-X background models

    DEFF Research Database (Denmark)

    Huovelin, J.; Maisala, S.; Schultz, J.

    2003-01-01

    Background and determination of its components for the JEM-X X-ray telescope on INTEGRAL are discussed. A part of the first background observations by JEM-X are analysed and results are compared to predictions. The observations are based on extensive imaging of background near the Crab Nebula on revolution 41 of INTEGRAL. Total observing time used for the analysis was 216 502 s, with an average of 25 cps of background for each of the two JEM-X telescopes. JEM-X1 showed slightly higher average background intensity than JEM-X2. The detectors were stable during the long exposures, and weak orbital ... background was enhanced in the central area of a detector, and it decreased radially towards the edge, with a clear vignetting effect for both JEM-X units. The instrument background was weakest in the central area of a detector and showed a steep increase at the very edges of both JEM-X detectors...

  5. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  6. PORFLOW TESTING AND VERIFICATION DOCUMENT

    International Nuclear Information System (INIS)

    Aleman, S

    2007-01-01

    The PORFLOW software package is a comprehensive mathematical model for simulation of multi-phase fluid flow, heat transfer and mass transport in variably saturated porous and fractured media. PORFLOW can simulate transient or steady-state problems in Cartesian or cylindrical geometry. The porous medium may be anisotropic and heterogeneous and may contain discrete fractures or boreholes within the porous matrix. The theoretical models within the code provide a unified treatment of concepts relevant to fluid flow and transport. The main features of PORFLOW that are relevant to Performance Assessment modeling at the Savannah River National Laboratory (SRNL) include variably saturated flow and transport of parent and progeny radionuclides. This document involves testing a relevant sample of problems in PORFLOW and comparing the outcome of the simulations to analytical solutions or to results from other commercial codes. The testing consists of the following four groups. Group 1: Groundwater Flow; Group 2: Contaminant Transport; Group 3: Numerical Dispersion; and Group 4: Keyword Commands
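
    For the contaminant-transport group, verification typically means comparing the code's computed concentration profile with a closed-form solution. A minimal sketch, independent of PORFLOW itself, that evaluates the classical Ogata-Banks solution of 1-D advection-dispersion with a constant-concentration inlet (all parameters illustrative):

```python
# Analytical Ogata-Banks solution for 1-D advection-dispersion with a constant
# concentration inlet; a computed profile (e.g. from PORFLOW) could be compared
# against it point by point. Parameters below are illustrative only.
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0=1.0):
    a = (x - v * t) / (2.0 * np.sqrt(D * t))
    b = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

x = np.linspace(0.0, 10.0, 11)        # m
c = ogata_banks(x, t=5.0, v=1.0, D=0.5)
for xi, ci in zip(x, c):
    print(f"x = {xi:4.1f} m   C/C0 = {ci:.4f}")
# A verification test would then report e.g. the maximum relative difference
# between these values and the code's computed profile.
```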

  7. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  8. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  9. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process algebraic language for specification and verification of distributed systems. µCRL allows one to describe temporal properties of distributed systems but it has no explicit reference to time. In this work we propose a manner of introducing discrete time without extending the language.

  10. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  11. Verification of Software Components: Addressing Unbounded Parallelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Roč. 8, č. 2 (2007), s. 300-309 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware ; Software

  12. A Comparison of Modular Verification Techniques

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Staunstrup, Jørgen; Maretti, Niels

    1997-01-01

    This paper presents and compares three techniques for mechanized verification of state oriented design descriptions. One is a traditional forward generation of a fixed point characterizing the reachable states. The two others can utilize a modular structure provided by the designer. One requires...

  13. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    The problem of validation and verification of correctness of present day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  14. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  15. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01


  16. Behaviour Protocols Verification: Fighting State Explosion

    Czech Academy of Sciences Publication Activity Database

    Mach, M.; Plášil, František; Kofroň, Jan

    2005-01-01

    Roč. 6, č. 2 (2005), s. 22-30 ISSN 1525-9293 R&D Projects: GA ČR(CZ) GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords: formal verification * software components * state explosion * behavior protocols * parse trees Subject RIV: JC - Computer Hardware ; Software

  17. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...

  18. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  19. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  20. A Typical Verification Challenge for the GRID

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Bal, H. E.; Brim, L.; Leucker, M.

    2008-01-01

    A typical verification challenge for the GRID community is presented. The concrete challenge is to implement a simple recursive algorithm for finding the strongly connected components in a graph. The graph is typically stored in the collective memory of a number of computers, so a distributed

  1. Zero leakage quantization scheme for biometric verification

    NARCIS (Netherlands)

    Groot, de J.A.; Linnartz, J.P.M.G.

    2011-01-01

    Biometrics gain increasing interest as a solution for many security issues, but privacy risks exist in case we do not protect the stored templates well. This paper presents a new verification scheme, which protects the secrets of the enrolled users. We will show that zero leakage is achieved if

  2. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    Science.gov (United States)

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on the sheet of paper, which encloses the cylinder completely or partially filled with water from below, where the hydrostatic pressure of the water column acts against the atmospheric pressure. First of all this paper solves a theoretical…

  3. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the 'Decontamination Verification Test FY 2011' projects of the Ministry of the Environment and of the Cabinet Office, and carried out verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Tokyo Electric Power Company's Fukushima Daiichi Nuclear Power Plant. As a result of the verification tests, the wet blasting technology achieved a decontamination rate of 60-80% for concrete paving, interlocking blocks and dense-graded asphalt pavement when applied to roads. When applied to rubble, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off the decontaminated object, and this sludge could be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less of that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  4. Using timing information in speaker verification

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2005-11-01

    Full Text Available This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker’s utterances is investigated, both with regard to improving the robustness of modern...

  5. A new verification film system for routine quality control of radiation fields: Kodak EC-L

    International Nuclear Information System (INIS)

    Hermann, A.; Bratengeier, K.; Priske, A.; Flentje, M.

    2000-01-01

    Background: The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. Material and Methods: For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. Results: In this investigation 68% of 175 Kodak EC-L ap/pa films were judged 'good', 18% 'moderate' and only 14% 'poor', whereas only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be 'good'. Conclusions: The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. They could be read more accurately and the detection of set-up deviations was facilitated. (orig.) [de]

  6. Stochastic backgrounds of gravitational waves

    International Nuclear Information System (INIS)

    Maggiore, M.

    2001-01-01

    We review the motivations for the search for stochastic backgrounds of gravitational waves and we compare the experimental sensitivities that can be reached in the near future with the existing bounds and with the theoretical predictions. (author)

  7. Berkeley Low Background Counting Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Sensitive low background assay detectors and sample analysis are available for non-destructive direct gamma-ray assay of samples. Neutron activation analysis is also...

  8. Spectral characterization of natural backgrounds

    Science.gov (United States)

    Winkelmann, Max

    2017-10-01

    As the distribution and use of hyperspectral sensors is constantly increasing, the exploitation of spectral features is a threat for camouflaged objects. To improve camouflage materials, the spectral behavior of backgrounds must first be known so that the spectral reflectance of the camouflage can be adjusted and optimized. In an international effort, the NATO CSO working group SCI-295 "Development of Methods for Measurements and Evaluation of Natural Background EO Signatures" is developing a method for how this characterization of backgrounds should be done. It is obvious that the spectral characterization of a background will be quite an effort. To compare and exchange data internationally, the measurements will have to be done in a similar way. To test and further improve this method, an international field trial has been performed in Storkow, Germany. In the following we present first impressions and lessons learned from this field campaign and describe the data that have been measured.

  9. Cosmic microwave background, where next?

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Ground-based, balloon-borne and space-based experiments will observe the Cosmic Microwave Background in greater detail to address open questions about the origin and the evolution of the Universe. In particular, detailed observations of the polarization pattern of the Cosmic Microwave Background radiation have the potential to directly probe physics at the GUT scale and illuminate aspects of the physics of the very early Universe.

  10. Numerical analysis of the Anderson localization

    International Nuclear Information System (INIS)

    Markos, P.

    2006-01-01

    The aim of this paper is to demonstrate, by simple numerical simulations, the main transport properties of disordered electron systems. These systems undergo the metal-insulator transition when either the Fermi energy crosses the mobility edge or the strength of the disorder increases above a critical value. We study how disorder affects the energy spectrum and spatial distribution of electronic eigenstates in the diffusive and insulating regime, as well as in the critical region of the metal-insulator transition. Then, we introduce the transfer matrix and conductance, and we discuss how the quantum character of the electron propagation influences the transport properties of disordered samples. In the weakly disordered systems, the weak localization and anti-localization as well as the universal conductance fluctuations are numerically simulated and discussed. The localization in the one-dimensional system is described and interpreted as a purely quantum effect. Statistical properties of the conductance in the critical and localized regimes are demonstrated. Special attention is given to the numerical study of the transport properties of the critical regime and to the numerical verification of the single parameter scaling theory of localization. Numerical data for the critical exponent in the orthogonal models in dimension 2 < d ≤ 5 are compared with theoretical predictions. We argue that the discrepancy between the theory and numerical data is due to the absence of the self-averaging of transmission quantities. This complicates the analytical analysis of the disordered systems. Finally, theoretical methods of description of weakly disordered systems are explained and their possible generalization to the localized regime is discussed. Since we concentrate on the one-electron propagation at zero temperature, no effects of electron-electron interaction and incoherent scattering are discussed in the paper (Author)
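
    To illustrate the transfer-matrix approach referred to above, the following minimal Python sketch estimates the localization length of the one-dimensional Anderson model with diagonal disorder. The system size, disorder strength W and energy E are illustrative choices, not values taken from the paper.

    ```python
    import numpy as np

    def localization_length(N=100_000, W=2.0, E=0.0, seed=0):
        """Estimate the 1D Anderson localization length via 2x2 transfer matrices.

        The tight-binding equation psi[n+1] = (E - eps[n]) psi[n] - psi[n-1]
        (hopping t = 1) is iterated; the inverse of the Lyapunov exponent of the
        matrix product gives the localization length in lattice units.
        """
        rng = np.random.default_rng(seed)
        v = np.array([1.0, 0.0])            # (psi[n], psi[n-1])
        log_norm = 0.0
        for _ in range(N):
            eps = rng.uniform(-W / 2, W / 2)            # diagonal disorder
            T = np.array([[E - eps, -1.0],
                          [1.0,      0.0]])
            v = T @ v
            norm = np.linalg.norm(v)
            log_norm += np.log(norm)                    # accumulate Lyapunov sum
            v /= norm                                   # renormalize to avoid overflow
        gamma = log_norm / N                            # Lyapunov exponent
        return 1.0 / gamma

    if __name__ == "__main__":
        for W in (1.0, 2.0, 4.0):
            print(f"W = {W}: xi ~ {localization_length(W=W):.1f} sites")
    ```

    In this toy setting the localization length shrinks rapidly with increasing disorder, which is the purely quantum one-dimensional localization effect described in the abstract.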

  11. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  12. Looking for Cosmic Neutrino Background

    Directory of Open Access Journals (Sweden)

    Chiaki eYanagisawa

    2014-06-01

    Full Text Available Since the discovery of neutrino oscillation in atmospheric neutrinos by the Super-Kamiokande experiment in 1998, the study of neutrinos has been one of the most exciting fields in high-energy physics. All the mixing angles have been measured. Quests for (1) measurements of the remaining parameters (the lightest neutrino mass, the CP violating phase(s), and the sign of the mass splitting between the mass eigenstates m3 and m1) and (2) better measurements to determine whether the mixing angle theta23 is less than pi/4 are in progress in a well-controlled manner. Determining the nature of neutrinos, whether they are Dirac or Majorana particles, is also in progress with continuous improvement. On the other hand, although ideas for detecting the cosmic neutrino background have been discussed since the 1960s, there has not been a serious concerted effort to achieve this goal. One of the reasons is that it is extremely difficult to detect such low energy neutrinos from the Big Bang. While there has been a tremendous accumulation of information on the Cosmic Microwave Background since its discovery in 1965, there is no direct evidence for the Cosmic Neutrino Background. The importance of detecting the Cosmic Neutrino Background is that, although detailed studies of Big Bang Nucleosynthesis and the Cosmic Microwave Background give information on the early Universe at ~a few minutes old and ~300,000 years old, respectively, observation of the Cosmic Neutrino Background allows us to study the early Universe at ~1 sec old. This article reviews progress made in the past 50 years on detection methods for the Cosmic Neutrino Background.

  13. Average-case analysis of numerical problems

    CERN Document Server

    2000-01-01

    The average-case analysis of numerical problems is the counterpart of the more traditional worst-case approach. The analysis of average error and cost leads to new insight on numerical problems as well as to new algorithms. The book provides a survey of results that were mainly obtained during the last 10 years and also contains new results. The problems under consideration include approximation/optimal recovery and numerical integration of univariate and multivariate functions as well as zero-finding and global optimization. Background material, e.g. on reproducing kernel Hilbert spaces and random fields, is provided.

  14. Industrial hardware and software verification with ACL2.

    Science.gov (United States)

    Hunt, Warren A; Kaufmann, Matt; Moore, J Strother; Slobodova, Anna

    2017-10-13

    The ACL2 theorem prover has seen sustained industrial use since the mid-1990s. Companies that have used ACL2 regularly include AMD, Centaur Technology, IBM, Intel, Kestrel Institute, Motorola/Freescale, Oracle and Rockwell Collins. This paper introduces ACL2 and focuses on how and why ACL2 is used in industry. ACL2 is well-suited to its industrial application to numerous software and hardware systems, because it is an integrated programming/proof environment supporting a subset of the ANSI standard Common Lisp programming language. As a programming language ACL2 permits the coding of efficient and robust programs; as a prover ACL2 can be fully automatic but provides many features permitting domain-specific human-supplied guidance at various levels of abstraction. ACL2 specifications and models often serve as efficient execution engines for the modelled artefacts while permitting formal analysis and proof of properties. Crucially, ACL2 also provides support for the development and verification of other formal analysis tools. However, ACL2 did not find its way into industrial use merely because of its technical features. The core ACL2 user/development community has a shared vision of making mechanized verification routine when appropriate and has been committed to this vision for the quarter century since the Computational Logic, Inc., Verified Stack. The community has focused on demonstrating the viability of the tool by taking on industrial projects (often at the expense of not being able to publish much). This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).

  15. Improved features of MARS 1.4 and verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Don; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-09-01

    MARS 1.4 code has been developed as a basic code frame for multi-dimensional thermal-hydraulic analysis of light water reactor transients. This report describes the newly improved features of MARS 1.4 and their verification results. The new features of MARS 1.4 include the implementation of a point kinetics model in the 3D module, the coupled heat structure model, the extension of control functions and input check functions in the 3D module, the implementation of new features of the RELAP5/MOD3.2.2 version, the addition of an automatic initialization function for fuel 3-D analysis and the unification of material properties and forcing functions, etc. These features have been implemented in the code in order to extend the code modeling capability and to enhance user friendliness. Among these features, this report describes the implementation of new features of the RELAP5/MOD3.3.3 version, such as the reflood model and critical heat flux models, the automatic initialization function, the unification of material properties and forcing functions and the other code improvements and error corrections, which were not reported in the previous report. Through the verification calculations, the new features of MARS 1.4 have been verified to be well implemented in the code. In conclusion, the MARS 1.4 code has been developed and verified as a multi-dimensional system thermal-hydraulic analysis tool, and it can play its role as a basic code frame for the future development of a multi-purpose consolidated code, MARS 2.x, for coupled analysis of multi-dimensional system thermal hydraulics, 3D core kinetics, core CHF and containment, as well as for further improvement of thermal-hydraulic and numerical models. 4 refs., 10 figs. (Author)

  16. Measurement techniques for the verification of excess weapons materials

    International Nuclear Information System (INIS)

    Tape, J.W.; Eccleston, G.W.; Yates, M.A.

    1998-01-01

    The end of the superpower arms race has resulted in an unprecedented reduction in stockpiles of deployed nuclear weapons. Numerous proposals have been put forward and actions have been taken to ensure the irreversibility of nuclear arms reductions, including unilateral initiatives such as those made by President Clinton in September 1993 to place fissile materials no longer needed for a deterrent under international inspection, and bilateral and multilateral measures currently being negotiated. For the technologist, there is a unique opportunity to develop the technical means to monitor nuclear materials that have been declared excess to nuclear weapons programs, to provide confidence that reductions are taking place and that the released materials are not being used again for nuclear explosive programs. However, because of the sensitive nature of these materials, a fundamental conflict exists between the desire to know that the bulk materials or weapon components in fact represent evidence of warhead reductions, and treaty commitments and national laws that require the protection of weapons design information. This conflict presents a unique challenge to technologists. The flow of excess weapons materials, from deployed warheads through storage, disassembly, component storage, conversion to bulk forms, and disposition, will be described in general terms. Measurement approaches based on the detection of passive or induced radiation will be discussed along with the requirement to protect sensitive information from release to unauthorized parties. Possible uses of measurement methods to assist in the verification of arms reductions will be described. The concept of measuring attributes of items rather than quantitative mass-based inventory verification will be discussed along with associated information-barrier concepts required to protect sensitive information

  17. Verification of Monte Carlo transport codes by activation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera

    2012-12-18

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of an excessive activation of the accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago, therefore verification is needed to be sure that they give reasonable results. The present work is focused on obtaining experimental data on the activation of targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as the interaction of nitrogen and argon beams with copper targets, was studied. After the irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, a γ-spectroscopy analysis was done: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, and their amounts and depth distributions were determined. The obtained experimental results were compared with the results of Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out, and the origin of the discrepancies is discussed. The obtained results allow for a better verification of the Monte Carlo transport codes and also provide information for their further development. The necessity of activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA simulations were done to determine the materials most preferable, from the radiation protection point of view, for use in accelerator components.

  18. Experimental quantum verification in the presence of temporally correlated noise

    Science.gov (United States)

    Mavadia, S.; Edmunds, C. L.; Hempel, C.; Ball, H.; Roy, F.; Stace, T. M.; Biercuk, M. J.

    2018-02-01

    Growth in the capabilities of quantum information hardware mandates access to techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). Our analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped 171Yb+ ion-qubit and inject engineered noise (∝ σ_z) to probe protocol performance. Experiments on RB validate predictions that measured fidelities over sequences are described by a gamma distribution varying between approximately Gaussian, and a broad, highly skewed distribution for rapidly and slowly varying noise, respectively. Similarly we find a strong gate set dependence of default experimental GST procedures in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated σ_z errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated σ_x or σ_y errors or depolarising noise processes, highlighting the impact of the critical interplay of selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.
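
    The contrast between quasi-DC and rapidly varying noise can be pictured with a toy single-qubit model in which every gate over-rotates by a small angle: if the error angle is redrawn for every gate (white-noise limit) the sequence fidelities cluster tightly, whereas an angle held fixed over a whole sequence (quasi-DC limit) yields a broad, skewed distribution. The model and parameters below are illustrative assumptions, not the experiment's noise injection or analysis.

    ```python
    import numpy as np

    def sequence_fidelities(n_seq=2000, m_gates=100, sigma=0.01, quasi_dc=True, seed=0):
        """Toy model: fidelity ~ cos^2(Phi/2), with Phi the accumulated over-rotation angle."""
        rng = np.random.default_rng(seed)
        if quasi_dc:
            # One miscalibration angle per sequence; errors add coherently over the sequence.
            phi = m_gates * rng.normal(0.0, sigma, size=n_seq)
        else:
            # Independent error each gate; errors accumulate as a random walk.
            phi = rng.normal(0.0, sigma, size=(n_seq, m_gates)).sum(axis=1)
        return np.cos(phi / 2.0) ** 2

    for label, dc in [("white-noise-like", False), ("quasi-DC", True)]:
        f = sequence_fidelities(quasi_dc=dc)
        skew = ((f - f.mean()) ** 3).mean() / f.std() ** 3
        print(f"{label:17s} mean={f.mean():.4f}  std={f.std():.4f}  skew={skew:+.2f}")
    ```

    Even this crude model reproduces the qualitative trend reported above: rapidly varying noise gives a narrow fidelity distribution, while slowly varying noise gives a much wider and more skewed one.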

  19. Numerical linear algebra theory and applications

    CERN Document Server

    Beilina, Larisa; Karchevskii, Mikhail

    2017-01-01

    This book combines a solid theoretical background in linear algebra with practical algorithms for numerical solution of linear algebra problems. Developed from a number of courses taught repeatedly by the authors, the material covers topics like matrix algebra, theory for linear systems of equations, spectral theory, vector and matrix norms combined with main direct and iterative numerical methods, least squares problems, and eigen problems. Numerical algorithms illustrated by computer programs written in MATLAB® are also provided as supplementary material on SpringerLink to give the reader a better understanding of professional numerical software for the solution of real-life problems. Perfect for a one- or two-semester course on numerical linear algebra, matrix computation, and large sparse matrices, this text will interest students at the advanced undergraduate or graduate level.

  20. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    Directory of Open Access Journals (Sweden)

    Coomarasamy Aravinthan

    2004-05-01

    Full Text Available Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 had verification delayed > 24 hrs after testing. The effect of delay in verification of diagnosis on estimates of accuracy was evaluated using meta-regression with diagnostic odds ratio (dOR) as the accuracy measure. This analysis was adjusted for study quality and type of test (miniature endometrial biopsy or endometrial ultrasound). Results Compared to studies with immediate verification of diagnosis (dOR 67.2, 95% CI 21.7–208.8), those with delayed verification (dOR 16.2, 95% CI 8.6–30.5) underestimated the diagnostic accuracy by 74% (95% CI 7%–99%; P value = 0.048). Conclusion Among studies of miniature endometrial biopsy and endometrial ultrasound, diagnostic accuracy is considerably underestimated if there is a delay in histological verification of diagnosis.
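
    The diagnostic odds ratio used as the accuracy measure above can be reproduced from a 2x2 table of test results against the verified diagnosis; the following small sketch uses hypothetical counts, not data from the review, and shows how a relative dOR between immediate- and delayed-verification study groups quantifies the underestimation.

    ```python
    def diagnostic_odds_ratio(tp, fp, fn, tn):
        """dOR = (TP/FN) / (FP/TN); a 0.5 continuity correction guards against zero cells."""
        if 0 in (tp, fp, fn, tn):
            tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
        return (tp / fn) / (fp / tn)

    # Hypothetical pooled 2x2 tables (counts are illustrative only).
    dor_immediate = diagnostic_odds_ratio(tp=90, fp=15, fn=10, tn=185)
    dor_delayed   = diagnostic_odds_ratio(tp=70, fp=40, fn=30, tn=160)

    relative_dor = dor_delayed / dor_immediate
    print(f"Relative dOR (delayed vs immediate): {relative_dor:.2f}")
    print(f"Apparent underestimation of accuracy: {(1 - relative_dor):.0%}")
    ```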

  1. Neutron background estimates in GESA

    Directory of Open Access Journals (Sweden)

    Fernandes A.C.

    2014-01-01

    Full Text Available The SIMPLE project looks for nuclear recoil events generated by rare dark matter scattering interactions. Nuclear recoils are also produced by more prevalent cosmogenic neutron interactions. While the rock overburden shields against (μ,n) neutrons to below 10^-8 cm^-2 s^-1, it itself contributes via radio-impurities. Additional shielding of these is similar, both suppressing and contributing neutrons. We report on the Monte Carlo (MCNP) estimation of the on-detector neutron backgrounds for the SIMPLE experiment located in the GESA facility of the Laboratoire Souterrain à Bas Bruit, and its use in defining additional shielding for measurements which have led to a reduction in the extrinsic neutron background to ~5 × 10^-3 evts/kgd. The calculated event rate induced by the neutron background is ~0.3 evts/kgd, with a dominant contribution from the detector container.

  2. LOFT gamma densitometer background fluxes

    International Nuclear Information System (INIS)

    Grimesey, R.A.; McCracken, R.T.

    1978-01-01

    Background gamma-ray fluxes were calculated at the location of the γ densitometers without integral shielding at both the hot-leg and cold-leg primary piping locations. The principal sources of background radiation at the γ densitometers are 16N activity from the primary piping H2O and γ radiation from reactor internal sources. The background radiation was calculated by the point-kernel codes QAD-BSA and QAD-P5A. Reasonable assumptions were required to convert the response functions calculated by point-kernel procedures into the gamma-ray spectrum from reactor internal sources. A brief summary of point-kernel equations and theory is included.
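
    The point-kernel idea referred to above reduces, for a single point source, to an attenuated inverse-square flux multiplied by a buildup factor. The sketch below uses a simple linear buildup approximation and illustrative source and attenuation values; it is not the QAD input actually used.

    ```python
    import math

    def point_kernel_flux(source_strength, distance_cm, mu_cm, buildup_coeff=1.0):
        """Buildup-corrected gamma flux from an isotropic point source.

        flux = B(mu*r) * S * exp(-mu*r) / (4*pi*r^2), with the crude linear
        buildup approximation B(x) = 1 + b*x.
        """
        mfp = mu_cm * distance_cm                      # path length in mean free paths
        buildup = 1.0 + buildup_coeff * mfp            # linear buildup factor
        return buildup * source_strength * math.exp(-mfp) / (4.0 * math.pi * distance_cm ** 2)

    # Illustrative numbers only: a 1e9 photon/s source viewed through 100 cm of water.
    mu_water_6mev = 0.028    # cm^-1, rough attenuation coefficient near 6 MeV (16N line)
    print(point_kernel_flux(1.0e9, distance_cm=100.0, mu_cm=mu_water_6mev), "photons/cm^2/s")
    ```

    Codes such as QAD integrate kernels of this kind over distributed sources and use tabulated buildup factors rather than the linear form assumed here.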

  3. A definition of background independence

    International Nuclear Information System (INIS)

    Gryb, Sean

    2010-01-01

    We propose a definition for background (in)/dependence in dynamical theories of the evolution of configurations that have a continuous symmetry and test this definition on particle models and on gravity. Our definition draws from Barbour's best matching framework developed for the purpose of implementing spatial and temporal relationalism. Among other interesting theories, general relativity can be derived within this framework in novel ways. We study the detailed canonical structure of a wide range of best matching theories and show that their actions must have a local gauge symmetry. When gauge theory is derived in this way, we obtain at the same time a conceptual framework for distinguishing between background-dependent and -independent theories. Gauge invariant observables satisfying Kuchar's criterion are identified and, in simple cases, explicitly computed. We propose a procedure for inserting a global background time into temporally relational theories. Interestingly, using this procedure in general relativity leads to unimodular gravity.

  4. Generative electronic background music system

    Energy Technology Data Exchange (ETDEWEB)

    Mazurowski, Lukasz [Faculty of Computer Science, West Pomeranian University of Technology in Szczecin, Zolnierska Street 49, Szczecin, PL (Poland)

    2015-03-10

    In this short paper (extended abstract), a new approach to the generation of electronic background music is presented. The Generative Electronic Background Music System (GEBMS) has been located between other related approaches within the musical algorithm positioning framework proposed by Woller et al. The music composition process is performed by a number of mini-models parameterized by further described properties. The mini-models generate fragments of musical patterns used in the output composition. Musical pattern and output generation are controlled by a container for the mini-models - a host-model. The general mechanism is presented, including an example of the synthesized output compositions.
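
    The mini-model/host-model split described above can be pictured with a very small sketch: each mini-model is a parameterized pattern generator, and the host model decides which mini-models contribute fragments to the output sequence. The class names and parameters below are illustrative, not the GEBMS implementation.

    ```python
    import random

    class MiniModel:
        """A tiny pattern generator parameterized by a scale, an octave and a fragment length."""
        def __init__(self, scale, octave=4, length=8, seed=None):
            self.scale, self.octave, self.length = scale, octave, length
            self.rng = random.Random(seed)

        def fragment(self):
            # Return a list of (midi_note, duration) pairs drawn from the scale.
            base = 12 * (self.octave + 1)
            return [(base + self.rng.choice(self.scale), self.rng.choice([0.25, 0.5, 1.0]))
                    for _ in range(self.length)]

    class HostModel:
        """Controls which mini-models contribute fragments and concatenates the result."""
        def __init__(self, mini_models):
            self.mini_models = mini_models

        def compose(self, n_fragments=4):
            composition = []
            for i in range(n_fragments):
                composition.extend(self.mini_models[i % len(self.mini_models)].fragment())
            return composition

    if __name__ == "__main__":
        c_major = [0, 2, 4, 5, 7, 9, 11]
        host = HostModel([MiniModel(c_major, octave=4, seed=1),
                          MiniModel(c_major, octave=5, length=4, seed=2)])
        print(host.compose())
    ```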

  5. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^(4n) in SO(n) supergravity

  6. Background music and cognitive performance.

    Science.gov (United States)

    Angel, Leslie A; Polzella, Donald J; Elvers, Greg C

    2010-06-01

    The present experiment employed standardized test batteries to assess the effects of fast-tempo music on cognitive performance among 56 male and female university students. A linguistic processing task and a spatial processing task were selected from the Criterion Task Set developed to assess verbal and nonverbal performance. Ten excerpts from Mozart's music matched for tempo were selected. Background music increased the speed of spatial processing and the accuracy of linguistic processing. The findings suggest that background music can have predictable effects on cognitive performance.

  7. Children of ethnic minority backgrounds

    DEFF Research Database (Denmark)

    Johansen, Stine Liv

    2010-01-01

    Children of ethnic minority background balance their everyday life between a cultural background rooted in their ethnic origin and a daily life in day care, schools and with peers that is founded in a majority culture. This means, among other things, that they often will have access to different media products and toys, just as they will have knowledge of different media texts, play genres, rhymes etc. This has consequences for their ability to access social settings, for instance in play. New research in this field will focus on how children themselves make sense of this balancing of cultures...

  8. Generative electronic background music system

    International Nuclear Information System (INIS)

    Mazurowski, Lukasz

    2015-01-01

    In this short paper (extended abstract), a new approach to the generation of electronic background music is presented. The Generative Electronic Background Music System (GEBMS) has been located between other related approaches within the musical algorithm positioning framework proposed by Woller et al. The music composition process is performed by a number of mini-models parameterized by further described properties. The mini-models generate fragments of musical patterns used in the output composition. Musical pattern and output generation are controlled by a container for the mini-models - a host-model. The general mechanism is presented, including an example of the synthesized output compositions.

  9. Multiphoton amplitude in a constant background field

    Science.gov (United States)

    Ahmad, Aftab; Ahmadiniaz, Naser; Corradini, Olindo; Kim, Sang Pyo; Schubert, Christian

    2018-01-01

    In this contribution, we present our recent compact master formulas for the multiphoton amplitudes of a scalar propagator in a constant background field using the worldline formulation of quantum field theory. The constant field has been included nonperturbatively, which is crucial for strong external fields. A possible application is the scattering of photons by electrons in a strong magnetic field, a process that has been a subject of great interest since the discovery of astrophysical objects like radio pulsars, which provide evidence that magnetic fields of the order of 10^12 G are present in nature. The presence of a strong external field leads to a strong deviation from the classical scattering amplitudes. We explicitly work out the Compton scattering amplitude in a magnetic field, which is a process of potential relevance for astrophysics. Our final result is compact and suitable for numerical integration.

  10. Numerical simulation of MHD flows in inhomogeneous and instationary magnetic fields

    International Nuclear Information System (INIS)

    Ehrhard, Sebastian

    2016-01-01

    In this work, I develop a numerical model for magnetohydrodynamic flows in unsteady and inhomogeneous magnetic fields. The model is implemented in the finite-volume based CFD code OpenFOAM. Verification and validation tests are performed on several standard problems of magnetohydrodynamics. Finally, an electromagnetic flowmeter is successfully modelled with the code.

  11. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    Science.gov (United States)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These Algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases, and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard Digital Relief Maps (DRMs) will be used to determine the vertical width of the telemetry band about the signal. University of Texas-Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
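
    The onboard signal-finding step described above (histogramming photon event heights within DEM-derived limits and keeping bins that are statistically significant against a flat noise background) can be sketched as follows. The bin width, threshold and synthetic data are illustrative assumptions, not the flight algorithm parameters.

    ```python
    import numpy as np

    def find_signal_band(heights_m, dem_min, dem_max, bin_m=5.0, sigma=5.0):
        """Histogram photon heights within DEM bounds and flag statistically significant bins.

        Under a uniform noise background the counts per bin are roughly Poisson distributed;
        bins exceeding median + sigma*sqrt(median) are treated as surface signal.
        """
        edges = np.arange(dem_min, dem_max + bin_m, bin_m)
        counts, _ = np.histogram(heights_m, bins=edges)
        mean_noise = np.median(counts)                      # robust background estimate
        threshold = mean_noise + sigma * np.sqrt(max(mean_noise, 1.0))
        hot = np.where(counts > threshold)[0]
        if hot.size == 0:
            return None                                     # no signal found in this segment
        lo, hi = edges[hot.min()], edges[hot.max() + 1]
        return lo, hi                                       # telemetry band around the signal

    # Synthetic test: uniform noise photons plus a correlated surface return near 1200 m.
    rng = np.random.default_rng(0)
    photons = np.concatenate([rng.uniform(1000, 1500, 2000),
                              rng.normal(1200, 1.5, 400)])
    print(find_signal_band(photons, dem_min=1000, dem_max=1500))
    ```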

  12. Numerical methods using Matlab

    CERN Document Server

    Lindfield, George

    2012-01-01

    Numerical Methods using MATLAB, 3e, is an extensive reference offering hundreds of useful and important numerical algorithms that can be implemented into MATLAB for a graphical interpretation to help researchers analyze a particular outcome. Many worked examples are given together with exercises and solutions to illustrate how numerical methods can be used to study problems that have applications in the biosciences, chaos, optimization, engineering and science across the board.

  13. Nonlinear Dynamics of the Cosmic Neutrino Background

    Science.gov (United States)

    Inman, Derek

    At least two of the three neutrino species are known to be massive, but their exact masses are currently unknown. Cosmic neutrinos decoupled from the rest of the primordial plasma early on when the Universe was over a billion times hotter than it is today. These relic particles, which have cooled and are now non-relativistic, constitute the Cosmic Neutrino Background and permeate the Universe. While they are not observable directly, their presence can be inferred by measuring the suppression of the matter power spectrum. This suppression is a linear effect caused by the large thermal velocities of neutrinos, which prevent them from collapsing gravitationally on small scales. Unfortunately, it is difficult to measure because of degeneracies with other cosmological parameters and biases arising from the fact that we typically observe point-like galaxies rather than a continuous matter field. It is therefore important to look for new effects beyond linear suppression that may be more sensitive to neutrinos. This thesis contributes to the understanding of the nonlinear dynamics of the cosmological neutrino background in the following ways: (i) the development of a new injection scheme for neutrinos in cosmological N-body simulations which circumvents many issues associated with simulating neutrinos at large redshifts, (ii) the numerical study of the relative velocity field between cold dark matter and neutrinos including its reconstruction from density fields, (iii) the theoretical description of neutrinos as a dispersive fluid and its use in modelling the nonlinear evolution of the neutrino density power spectrum, (iv) the derivation of the dipole correlation function using linear response which allows for the Fermi-Dirac velocity distribution to be properly included, and (v) the numerical study and detection of the dipole correlation function in the TianNu simulation. In totality, this thesis is a comprehensive study of neutrino density and velocity fields that may

  14. Point splitting in a curved space-time background

    International Nuclear Information System (INIS)

    Liggatt, P.A.J.; Macfarlane, A.J.

    1979-01-01

    A prescription is given for point splitting in a curved space-time background which is a natural generalization of that familiar in quantum electrodynamics and Yang-Mills theory. It is applied (to establish its validity) to the verification of the gravitational anomaly in the divergence of a fermion axial current. Notable features of the prescription are that it defines a point-split current that can be differentiated straightforwardly, and that it involves a natural way of averaging (four-dimensionally) over the directions of point splitting. The method can extend directly from the spin-1/2 fermion case treated to other cases, e.g., to spin-3/2 Rarita-Schwinger fermions. (author)

  15. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    Science.gov (United States)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
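
    NSCRAD-style anomaly detection compares counts in broad, overlapping energy regions of interest against a running estimate of the background. The toy sketch below forms simple region ratios and flags spectra whose ratios deviate by more than a chosen number of standard deviations; the region definitions, threshold and synthetic data are illustrative assumptions, not the ARES implementation.

    ```python
    import numpy as np

    REGIONS = [(0, 30), (20, 60), (50, 100), (80, 128)]   # overlapping channel windows (illustrative)

    def region_ratios(spectrum):
        """Counts in each region of interest, normalized by the total counts."""
        total = max(spectrum.sum(), 1.0)
        return np.array([spectrum[lo:hi].sum() / total for lo, hi in REGIONS])

    def is_anomalous(spectrum, bkg_spectra, n_sigma=4.0):
        """Compare a new spectrum's region ratios with the background ratio distribution."""
        bkg_ratios = np.array([region_ratios(s) for s in bkg_spectra])
        mu, sigma = bkg_ratios.mean(axis=0), bkg_ratios.std(axis=0) + 1e-9
        z = np.abs((region_ratios(spectrum) - mu) / sigma)
        return bool((z > n_sigma).any()), z

    # Synthetic example: Poisson background spectra plus one spectrum with a low-energy source line.
    rng = np.random.default_rng(1)
    background = rng.poisson(50, size=(200, 128)).astype(float)
    test = rng.poisson(50, size=128).astype(float)
    test[10:14] += 400                                    # injected photopeak
    print(is_anomalous(test, background))
    ```

    A minimum detectable activity estimate then amounts to asking how large an injected peak must be, for a given background level and measurement time, before a statistic of this kind reliably crosses the threshold.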

  16. Low Background Micromegas in CAST

    DEFF Research Database (Denmark)

    Garza, J G; Aune, S.; Aznar, F.

    2014-01-01

    Solar axions could be converted into x-rays inside the strong magnetic field of an axion helioscope, triggering the detection of this elusive particle. Low background x-ray detectors are an essential component for the sensitivity of these searches. We report on the latest developments of the Micr...

  17. Teaching about Natural Background Radiation

    Science.gov (United States)

    Al-Azmi, Darwish; Karunakara, N.; Mustapha, Amidu O.

    2013-01-01

    Ambient gamma dose rates in air were measured at different locations (indoors and outdoors) to demonstrate the ubiquitous nature of natural background radiation in the environment and to show that levels vary from one location to another, depending on the underlying geology. The effect of a lead shield on a gamma radiation field was also…

  18. Educational Choice. A Background Paper.

    Science.gov (United States)

    Quality Education for Minorities Network, Washington, DC.

    This paper addresses school choice, one proposal to address parental involvement concerns, focusing on historical background, definitions, rationale for advocating choice, implementation strategies, and implications for minorities and low-income families. In the past, transfer payment programs such as tuition tax credits and vouchers were…

  19. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.

  20. Low Background Micromegas in CAST

    CERN Document Server

    Garza, J.G.; Aznar, F.; Calvet, D.; Castel, J.F.; Christensen, F.E.; Dafni, T.; Davenport, M.; Decker, T.; Ferrer-Ribas, E.; Galán, J.; García, J.A.; Giomataris, I.; Hill, R.M.; Iguaz, F.J.; Irastorza, I.G.; Jakobsen, A.C.; Jourde, D.; Mirallas, H.; Ortega, I.; Papaevangelou, T.; Pivovaroff, M.J.; Ruz, J.; Tomás, A.; Vafeiadis, T.; Vogel, J.K.

    2015-11-16

    Solar axions could be converted into x-rays inside the strong magnetic field of an axion helioscope, triggering the detection of this elusive particle. Low background x-ray detectors are an essential component for the sensitivity of these searches. We report on the latest developments of the Micromegas detectors for the CERN Axion Solar Telescope (CAST), including technological pathfinder activities for the future International Axion Observatory (IAXO). The use of low background techniques and the application of discrimination algorithms based on the high granularity of the readout have led to background levels below 10^-6 counts/keV/cm^2/s, more than a factor 100 lower than the first generation of Micromegas detectors. The best levels achieved at the Canfranc Underground Laboratory (LSC) are as low as 10^-7 counts/keV/cm^2/s, showing good prospects for the application of this technology in IAXO. The current background model, based on underground and surface measurements, is presented, as well as ...

  1. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and there are no available experimental data regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole reactor problem using the HTTR safety test data such as control rod withdrawal tests and loss-of-forced convection tests.
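
    Verification against conceptual problems, as described above, typically means comparing the discretized heat-conduction solution with a case that has an exact answer. The sketch below solves steady 1D conduction with uniform volumetric heating by finite volumes and checks it against the analytic parabolic profile; the geometry and material values are illustrative, not HENDEL or AGREE data.

    ```python
    import numpy as np

    def fv_conduction(n=50, L=0.1, k=30.0, q=1.0e6, t_wall=600.0):
        """Steady 1D slab, uniform heat source q [W/m^3], both faces held at t_wall [K]."""
        dx = L / n
        x = (np.arange(n) + 0.5) * dx                 # cell centres
        A = np.zeros((n, n))
        b = np.full(n, -q * dx * dx / k)
        for i in range(n):
            A[i, i] = -2.0
            if i > 0:
                A[i, i - 1] = 1.0
            if i < n - 1:
                A[i, i + 1] = 1.0
        # Dirichlet walls enter through the half-cell next to each face.
        A[0, 0] -= 1.0
        b[0] -= 2.0 * t_wall
        A[-1, -1] -= 1.0
        b[-1] -= 2.0 * t_wall
        T = np.linalg.solve(A, b)
        T_exact = t_wall + q / (2.0 * k) * x * (L - x)   # analytic parabolic profile
        return np.abs(T - T_exact).max()

    print("max |T_numerical - T_exact| =", fv_conduction(), "K")
    ```

    A verification study of a production code follows the same pattern, but over a suite of such problems and against independent codes (here CORONA and GAMMA+) rather than a single analytic case.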

  2. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is completed

  3. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  4. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  5. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  6. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site`s waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site`s waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  7. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized, search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
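
    The cooperating branch-and-bound variant described above can be pictured with a few worker processes exploring a weighted graph by randomized depth-first search while sharing the best cost found so far. The graph, search strategy and process count below are illustrative assumptions, not Uppaal's engine.

    ```python
    import random
    from multiprocessing import Process, Value

    # Toy "timed" reachability problem: weighted edges, find a minimal-duration path 0 -> GOAL.
    EDGES = {0: [(1, 3), (2, 1)], 1: [(3, 4), (2, 1)], 2: [(3, 7), (4, 2)],
             3: [(5, 2)], 4: [(5, 8), (3, 1)], 5: []}
    GOAL = 5

    def worker(seed, best):
        rng = random.Random(seed)
        def dfs(node, cost):
            if cost >= best.value:          # branch-and-bound pruning on the shared bound
                return
            if node == GOAL:
                with best.get_lock():
                    if cost < best.value:
                        best.value = cost
                return
            successors = EDGES[node][:]
            rng.shuffle(successors)         # each swarm member uses a different random order
            for nxt, w in successors:
                dfs(nxt, cost + w)
        for _ in range(100):                # restarts with different randomized orders
            dfs(0, 0)

    if __name__ == "__main__":
        best = Value("i", 10**9)            # shared upper bound on the accumulated duration
        procs = [Process(target=worker, args=(s, best)) for s in range(4)]
        for p in procs: p.start()
        for p in procs: p.join()
        print("minimal accumulated duration:", best.value)
    ```

    The shared bound is what makes the swarm cooperative: as soon as one worker finds a cheaper schedule, every other worker prunes any branch whose accumulated duration already exceeds it.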

  8. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  9. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code is an integral part of this process. This document identifies the work performed and documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals the validation and verification effort for the GRIMHX code is completed

  10. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm
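
    A minimal way to picture the fusion step described above is score-level fusion: each modality (hand, face, ear, voice) produces a match score, the scores are converted to log-likelihood ratios under genuine and impostor models, and their sum is compared with a threshold. The Gaussian score models and numbers below are illustrative assumptions, not the system described in the report.

    ```python
    import math

    # Illustrative per-modality score statistics (mean, std) for genuine users and impostors.
    MODALITY_MODELS = {
        "hand":  ((0.80, 0.10), (0.40, 0.15)),
        "face":  ((0.75, 0.12), (0.35, 0.15)),
        "ear":   ((0.70, 0.15), (0.30, 0.15)),
        "voice": ((0.85, 0.08), (0.45, 0.20)),
    }

    def log_gauss(x, mu, sigma):
        return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

    def fused_llr(scores):
        """Sum of per-modality log-likelihood ratios (assumes independent modalities)."""
        llr = 0.0
        for name, s in scores.items():
            (mu_g, sd_g), (mu_i, sd_i) = MODALITY_MODELS[name]
            llr += log_gauss(s, mu_g, sd_g) - log_gauss(s, mu_i, sd_i)
        return llr

    def verify(scores, threshold=0.0):
        return fused_llr(scores) > threshold

    print(verify({"hand": 0.78, "face": 0.70, "ear": 0.55, "voice": 0.88}))   # likely accept
    print(verify({"hand": 0.45, "face": 0.40, "ear": 0.35, "voice": 0.50}))   # likely reject
    ```

    Fusing several coarse modalities in this way is what allows the overall decision to remain reliable even when any single feature is noisy or deliberately degraded.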

  11. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are widely used devices used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of the safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models, which represent the system and the formalization of requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) to different modeling languages of verification tools. This approach has been applied to CERN PLC programs validating the methodology.

  12. ESTRO ACROP guidelines for positioning, immobilisation and position verification of head and neck patients for radiation therapists

    Directory of Open Access Journals (Sweden)

    Michelle Leech

    2017-03-01

    Full Text Available Background and purpose: Over the last decade, the management of locally advanced head and neck cancers (HNCs) has seen a substantial increase in the use of chemoradiation. These guidelines have been developed to assist Radiation TherapisTs (RTTs) in positioning, immobilisation and position verification for head and neck cancer patients. Materials and methods: A critical review of the literature was undertaken by the writing committee. Based on the literature review, a survey was developed to ascertain the current positioning, immobilisation and position verification methods for head and neck radiation therapy across Europe. The survey was translated into Italian, German, Greek, Portuguese, Russian, Croatian, French and Spanish. Guidelines were subsequently developed by the writing committee. Results: Results from the survey indicated that a wide variety of treatment practices and treatment verification protocols are currently in operation for head and neck cancer patients across Europe. The guidelines developed are based on the experience and expertise of the writing committee, remaining cognisant of the variations in imaging and immobilisation techniques used currently in Europe. Conclusions: These guidelines have been developed to provide RTTs with guidance on positioning, immobilisation and position verification of HNC patients. The guidelines will also provide RTTs with the means to critically reflect on their own daily clinical practice with this patient group. Keywords: Head and neck, Immobilisation, Positioning, Verification

  13. The Cosmic Infrared Background Experiment

    Science.gov (United States)

    Bock, James; Battle, J.; Cooray, A.; Hristov, V.; Kawada, M.; Keating, B.; Lee, D.; Matsumoto, T.; Matsuura, S.; Nam, U.; Renbarger, T.; Sullivan, I.; Tsumura, K.; Wada, T.; Zemcov, M.

    2009-01-01

    We are developing the Cosmic Infrared Background ExpeRiment (CIBER) to search for signatures of first-light galaxy emission in the extragalactic background. The first generation of stars produce characteristic signatures in the near-infrared extragalactic background, including a redshifted Lyman cutoff feature and a characteristic fluctuation power spectrum, that may be detectable with a specialized instrument. CIBER consists of two wide-field cameras to measure the fluctuation power spectrum, and a low-resolution and a narrow-band spectrometer to measure the absolute background. The cameras will search for fluctuations on angular scales from 7 arcseconds to 2 degrees, where the first-light galaxy spatial power spectrum peaks. The cameras have the necessary combination of sensitivity, wide field of view, spatial resolution, and multiple bands to make a definitive measurement. CIBER will determine if the fluctuations reported by Spitzer arise from first-light galaxies. The cameras observe in a single wide field of view, eliminating systematic errors associated with mosaicing. Two bands are chosen to maximize the first-light signal contrast, at 1.6 um near the expected spectral maximum, and at 1.0 um; the combination is a powerful discriminant against fluctuations arising from local sources. We will observe regions of the sky surveyed by Spitzer and Akari. The low-resolution spectrometer will search for the redshifted Lyman cutoff feature in the 0.7 - 1.8 um spectral region. The narrow-band spectrometer will measure the absolute Zodiacal brightness using the scattered 854.2 nm Ca II Fraunhofer line. The spectrometers will test whether the reported diffuse extragalactic background in the 1 - 2 um band continues into the optical or is caused by an underestimation of the Zodiacal foreground. We report performance of the assembled and tested instrument as we prepare for a first sounding rocket flight in early 2009. CIBER is funded by the NASA/APRA sub-orbital program.

  14. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms has steadily improved over the past few years, with methods such as MLPs and, more recently, SVMs. In this paper, we compare two successful discriminant machine learning algorithms applied to the problem of face verification: MLP and SVM. The two algorithms are tested on a benchmark database, namely XM2VTS. Results show that an MLP is better than an SVM on this particular task.
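
    A hedged sketch of such a comparison is given below. It uses scikit-learn on a synthetic two-class problem standing in for face-verification features; the dataset, network size and SVM settings are assumptions for illustration and do not reproduce the XM2VTS protocol of the paper.

      # Sketch: comparing an MLP and an SVM on a synthetic verification task.
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.svm import SVC
      from sklearn.metrics import accuracy_score

      # Stand-in for extracted face features (dimensions are assumed values).
      X, y = make_classification(n_samples=2000, n_features=40, n_informative=20,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      for name, clf in [("MLP", MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                                              random_state=0)),
                        ("SVM", SVC(kernel="rbf", C=1.0))]:
          clf.fit(X_tr, y_tr)
          print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))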

  15. Verification tests for CANDU advanced fuel

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1997-07-01

    For the development of a CANDU advanced fuel, the CANFLEX-NU fuel bundles were tested under reactor operating conditions at the CANDU-Hot test loop. This report describes test results and test methods in the performance verification tests for the CANFLEX-NU bundle design. The main items described in the report are as follows. - Fuel bundle cross-flow test - Endurance fretting/vibration test - Freon CHF test - Production of technical document. (author). 25 refs., 45 tabs., 46 figs

  16. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by "smart" drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user data base application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings.

  17. Verification of the SLC wake potentials

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-01-01

    The accurate knowledge of the monopole, dipole, and quadrupole wake potentials is essential for SLC. These wake potentials were previously computed by the modal method. The time domain code TBCI allows independent verification of these results. This comparison shows that the two methods agree to within 10% for bunch lengths down to 1 mm. TBCI results also indicate that rounding the irises gives at least a 10% reduction in the wake potentials

  18. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Vol. 18, No. 6 (2012), pp. 572-587. ISSN 0947-3580. R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020. Institutional research plan: CEZ:AV0Z10300504. Keywords: model checking * hybrid systems * formal verification. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.250, year: 2012

  19. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, little work has been done on automatically verifying the authenticity of stamps. In this paper, an approach for verification of color stamps … and copied stamps. Sensitivity and specificity of up to 95% could be obtained on a data set that is publicly available…

  20. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  1. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  2. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
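
    A heavily simplified sketch of the likelihood-ratio decision described above is given below. It assumes independent Gaussian genuine and imposter score models (the paper fits a much richer joint model to programme data) and picks the acceptance threshold by Monte Carlo so that imposters are accepted at roughly a target false accept rate; all distribution parameters are hypothetical.

      # Sketch: likelihood-ratio verification from a few similarity scores,
      # with an acceptance threshold tuned to an approximate target FAR.
      import numpy as np
      from scipy.stats import norm

      # Hypothetical per-modality similarity-score models.
      mu_g, sd_g = 0.80, 0.10      # genuine scores
      mu_i, sd_i = 0.40, 0.15      # imposter scores

      def log_lr(scores):
          s = np.asarray(scores, dtype=float)
          return float(np.sum(norm.logpdf(s, mu_g, sd_g) - norm.logpdf(s, mu_i, sd_i)))

      def threshold_for_far(n_scores, far=1e-3, n_sim=100_000, seed=0):
          """Monte-Carlo threshold so imposters are accepted at roughly the target FAR."""
          rng = np.random.default_rng(seed)
          imp = rng.normal(mu_i, sd_i, size=(n_sim, n_scores))
          llrs = np.sort(np.sum(norm.logpdf(imp, mu_g, sd_g)
                                - norm.logpdf(imp, mu_i, sd_i), axis=1))
          return llrs[int((1.0 - far) * n_sim)]

      tau = threshold_for_far(n_scores=3)
      print("accept:", log_lr([0.78, 0.83, 0.75]) > tau)   # decision for three acquired scores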

  3. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  4. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating-point computations integrated within the core, designed to reduce cost and complexity. The 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single-precision floating-point multiplier and a floating-point adder/subtractor for floating-point operations, and a 32 x 32 Booth's multiplier added to the integer core of the ARM7. The binary representati...

  5. Detailed numerical simulations of laser cooling processes

    Science.gov (United States)

    Ramirez-Serrano, J.; Kohel, J.; Thompson, R.; Yu, N.

    2001-01-01

    We developed a detailed semiclassical numerical code modeling the forces applied to atoms in optical and magnetic fields, to increase the understanding of the different roles that light, atomic collisions, background pressure, and the number of particles play in experiments with laser-cooled and trapped atoms.

  6. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
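
    As an illustration of the dose/distance-to-agreement criterion mentioned in the conclusions, the following one-dimensional sketch computes a gamma-index-style pass rate for a measured profile against a planned one using a 3%/3 mm criterion. The dose profiles are synthetic and the implementation is a simplification, not the quantitative parameter defined in the paper.

      # Simplified 1-D dose-difference / distance-to-agreement (gamma-style) check.
      import numpy as np

      def gamma_pass_rate(x, measured, planned, dose_tol=0.03, dta_mm=3.0):
          """x: positions in mm; doses normalised to the prescription dose."""
          gammas = []
          for xi, mi in zip(x, measured):
              dd = (mi - planned) / dose_tol      # dose-difference term
              dx = (xi - x) / dta_mm              # distance term
              gammas.append(np.min(np.sqrt(dd**2 + dx**2)))
          gammas = np.asarray(gammas)
          return np.mean(gammas <= 1.0), gammas   # fraction of points passing

      x = np.linspace(0, 100, 201)                # 0.5 mm grid
      planned = np.exp(-((x - 50) / 20) ** 2)     # toy dose profile
      measured = planned * 1.02                   # 2% systematic difference
      rate, _ = gamma_pass_rate(x, measured, planned)
      print(f"gamma pass rate: {rate:.1%}")       # passes everywhere for this toy case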

  7. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  8. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  9. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single-field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%; however, the spiral phantom QA technique provides a more complete dosimetric verification while being less time-consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan.

  10. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine if protocols were followed and that the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed as are aspects for data management and analyses methods required for the large amount of data collected during these surveys. Recommendations were made for implementation of future surveys and reporting the data from those surveys in order to ensure that remediation was complete. (authors)

  11. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Verification is a task to check whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not make any assumption that identically and independently distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.

  12. Chameleon scalar fields in relativistic gravitational backgrounds

    International Nuclear Information System (INIS)

    Tsujikawa, Shinji; Tamaki, Takashi; Tavakol, Reza

    2009-01-01

    We study the field profile of a scalar field φ that couples to a matter fluid (dubbed a chameleon field) in the relativistic gravitational background of a spherically symmetric spacetime. Employing a linear expansion in terms of the gravitational potential Φ_c at the surface of a compact object with a constant density, we derive the thin-shell field profile both inside and outside the object, as well as the resulting effective coupling with matter, analytically. We also carry out numerical simulations for the class of inverse power-law potentials V(φ) = M^(4+n) φ^(−n) by employing the information provided by our analytical solutions to set the boundary conditions around the centre of the object and show that thin-shell solutions in fact exist if the gravitational potential Φ_c is smaller than 0.3, which marginally covers the case of neutron stars. Thus the chameleon mechanism is present in the relativistic gravitational backgrounds, capable of reducing the effective coupling. Since thin-shell solutions are sensitive to the choice of boundary conditions, our analytic field profile is very helpful to provide appropriate boundary conditions for Φ_c ≲ O(0.1)

  13. Chameleon scalar fields in relativistic gravitational backgrounds

    Energy Technology Data Exchange (ETDEWEB)

    Tsujikawa, Shinji [Department of Physics, Faculty of Science, Tokyo University of Science, 1-3, Kagurazaka, Shinjuku-ku, Tokyo 162-8601 (Japan); Tamaki, Takashi [Department of Physics, Waseda University, Okubo 3-4-1, Tokyo 169-8555 (Japan); Tavakol, Reza, E-mail: shinji@rs.kagu.tus.ac.jp, E-mail: tamaki@gravity.phys.waseda.ac.jp, E-mail: r.tavakol@qmul.ac.uk [Astronomy Unit, School of Mathematical Sciences, Queen Mary University of London, London E1 4NS (United Kingdom)

    2009-05-15

    We study the field profile of a scalar field φ that couples to a matter fluid (dubbed a chameleon field) in the relativistic gravitational background of a spherically symmetric spacetime. Employing a linear expansion in terms of the gravitational potential Φ_c at the surface of a compact object with a constant density, we derive the thin-shell field profile both inside and outside the object, as well as the resulting effective coupling with matter, analytically. We also carry out numerical simulations for the class of inverse power-law potentials V(φ) = M^(4+n) φ^(−n) by employing the information provided by our analytical solutions to set the boundary conditions around the centre of the object and show that thin-shell solutions in fact exist if the gravitational potential Φ_c is smaller than 0.3, which marginally covers the case of neutron stars. Thus the chameleon mechanism is present in the relativistic gravitational backgrounds, capable of reducing the effective coupling. Since thin-shell solutions are sensitive to the choice of boundary conditions, our analytic field profile is very helpful to provide appropriate boundary conditions for Φ_c ≲ O(0.1)

  14. Bayesian Analysis of the Cosmic Microwave Background

    Science.gov (United States)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP 1 and 3 year temperature and polarization data). Development is continuing for Planck; the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to the total uncertainty in cosmological parameters.
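
    A toy illustration of the Gibbs-sampling idea is sketched below: per-pixel data d = s + n with known noise variance, where the sampler alternates between drawing the signal map and drawing its variance (a one-parameter stand-in for the power spectrum). The pixel count, variances and prior are assumed values, and no sky geometry, beams or foregrounds are modelled.

      # Toy Gibbs sampler: data d = s + n per pixel, noise variance N known,
      # alternately sampling the signal map s and its variance C.
      import numpy as np
      from scipy.stats import invgamma

      rng = np.random.default_rng(1)
      npix, C_true, N = 5000, 4.0, 1.0
      d = rng.normal(0, np.sqrt(C_true), npix) + rng.normal(0, np.sqrt(N), npix)

      C = 1.0                                   # initial guess for the signal variance
      samples = []
      for it in range(2000):
          # 1) signal map conditional:  s | C, d  (Gaussian, pixel by pixel)
          var = 1.0 / (1.0 / C + 1.0 / N)
          s = rng.normal(var * d / N, np.sqrt(var))
          # 2) variance conditional:    C | s    (inverse gamma, Jeffreys prior)
          C = invgamma.rvs(a=npix / 2.0, scale=np.sum(s**2) / 2.0, random_state=rng)
          samples.append(C)

      print("posterior mean of C:", np.mean(samples[500:]))   # typically close to C_true = 4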

  15. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  16. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  17. Background radioactivity in environmental materials

    International Nuclear Information System (INIS)

    Maul, P.R.; O'Hara, J.P.

    1989-01-01

    This paper presents the results of a literature search to identify information on concentrations of 'background' radioactivity in foodstuffs and other commonly available environmental materials. The review has concentrated on naturally occurring radioactivity in foods and on UK data, although results from other countries have also been considered where appropriate. The data are compared with established definitions of a 'radioactive' substance and radionuclides which do not appear to be adequately covered in the literature are noted. (author)

  18. Background paper on aquaculture research

    OpenAIRE

    Wenblad, Axel; Jokumsen, Alfred; Eskelinen, Unto; Torrissen, Ole

    2013-01-01

    The Board of MISTRA established in 2012 a Working Group (WG) on Aquaculture to provide the Board with background information for its upcoming decision on whether the foundation should invest in aquaculture research. The WG included Senior Advisor Axel Wenblad, Sweden (Chairman), Professor Ole Torrissen, Norway, Senior Advisory Scientist Unto Eskelinen, Finland and Senior Advisory Scientist Alfred Jokumsen, Denmark. The WG performed an investigation of the Swedish aquaculture sector including ...

  19. The isotropic radio background revisited

    Energy Technology Data Exchange (ETDEWEB)

    Fornengo, Nicolao; Regis, Marco [Dipartimento di Fisica Teorica, Università di Torino, via P. Giuria 1, I–10125 Torino (Italy); Lineros, Roberto A. [Instituto de Física Corpuscular – CSIC/U. Valencia, Parc Científic, calle Catedrático José Beltrán, 2, E-46980 Paterna (Spain); Taoso, Marco, E-mail: fornengo@to.infn.it, E-mail: rlineros@ific.uv.es, E-mail: regis@to.infn.it, E-mail: taoso@cea.fr [Institut de Physique Théorique, CEA/Saclay, F-91191 Gif-sur-Yvette Cédex (France)

    2014-04-01

    We present an extensive analysis on the determination of the isotropic radio background. We consider six different radio maps, ranging from 22 MHz to 2.3 GHz and covering a large fraction of the sky. The large scale emission is modeled as a linear combination of an isotropic component plus the Galactic synchrotron radiation and thermal bremsstrahlung. Point-like and extended sources are either masked or accounted for by means of a template. We find a robust estimate of the isotropic radio background, with limited scatter among different Galactic models. The level of the isotropic background lies significantly above the contribution obtained by integrating the number counts of observed extragalactic sources. Since the isotropic component dominates at high latitudes, thus making the profile of the total emission flat, a Galactic origin for such excess appears unlikely. We conclude that, unless a systematic offset is present in the maps, and provided that our current understanding of the Galactic synchrotron emission is reasonable, extragalactic sources well below the current experimental threshold seem to account for the majority of the brightness of the extragalactic radio sky.

  20. The isotropic radio background revisited

    International Nuclear Information System (INIS)

    Fornengo, Nicolao; Regis, Marco; Lineros, Roberto A.; Taoso, Marco

    2014-01-01

    We present an extensive analysis on the determination of the isotropic radio background. We consider six different radio maps, ranging from 22 MHz to 2.3 GHz and covering a large fraction of the sky. The large scale emission is modeled as a linear combination of an isotropic component plus the Galactic synchrotron radiation and thermal bremsstrahlung. Point-like and extended sources are either masked or accounted for by means of a template. We find a robust estimate of the isotropic radio background, with limited scatter among different Galactic models. The level of the isotropic background lies significantly above the contribution obtained by integrating the number counts of observed extragalactic sources. Since the isotropic component dominates at high latitudes, thus making the profile of the total emission flat, a Galactic origin for such excess appears unlikely. We conclude that, unless a systematic offset is present in the maps, and provided that our current understanding of the Galactic synchrotron emission is reasonable, extragalactic sources well below the current experimental threshold seem to account for the majority of the brightness of the extragalactic radio sky

  1. Backgrounds and characteristics of arsonists.

    Science.gov (United States)

    Labree, Wim; Nijman, Henk; van Marle, Hjalmar; Rassin, Eric

    2010-01-01

    The aim of this study was to gain more insight in the backgrounds and characteristics of arsonists. For this, the psychiatric, psychological, personal, and criminal backgrounds of all arsonists (n=25), sentenced to forced treatment in the maximum security forensic hospital "De Kijvelanden", were compared to the characteristics of a control group of patients (n=50), incarcerated at the same institution for other severe crimes. Apart from DSM-IV Axis I and Axis II disorders, family backgrounds, level of education, treatment history, intelligence (WAIS scores), and PCL-R scores were included in the comparisons. Furthermore, the apparent motives for the arson offences were explored. It was found that arsonists had more often received psychiatric treatment, prior to committing their index offence, and had a history of severe alcohol abuse more often in comparison to the controls. The arsonists turned out to be less likely to suffer from a major psychotic disorder. Both groups did not differ significantly on the other variables, among which the PCL-R total scores and factor scores. Exploratory analyses however, did suggest that arsonists may differentiate from non-arsonists on three items of the PCL-R, namely impulsivity (higher scores), superficial charm (lower scores), and juvenile delinquency (lower scores). Although the number of arsonists with a major psychotic disorder was relatively low (28%), delusional thinking of some form was judged to play a role in causing arson crimes in about half of the cases (52%).

  2. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model-Driven Engineering to automate the Verification and Validation of on-board satellite software. The process is applied to the software control unit of the energetic particle detector, which is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as a set of constraints expressed as finite timed automata. When the system is deployed on the target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
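
    The fragment below is a hedged sketch of the kind of check implied here, not the paper's timed-automata machinery: task activation and completion timestamps, as they might be extracted from instrumentation points, are compared against assumed period, jitter and deadline constraints. The TaskSpec fields and the numbers are illustrative.

      # Sketch: checking period and deadline constraints against an instrumented trace.
      from dataclasses import dataclass

      @dataclass
      class TaskSpec:
          period_ms: float     # expected activation period
          deadline_ms: float   # relative deadline
          jitter_ms: float     # tolerated activation jitter

      def check_trace(spec, activations, completions):
          """activations/completions: lists of timestamps in ms, one pair per job."""
          violations = []
          for i, (a, c) in enumerate(zip(activations, completions)):
              if c - a > spec.deadline_ms:
                  violations.append((i, "deadline", c - a))
              if i > 0:
                  gap = a - activations[i - 1]
                  if abs(gap - spec.period_ms) > spec.jitter_ms:
                      violations.append((i, "period", gap))
          return violations

      spec = TaskSpec(period_ms=10.0, deadline_ms=4.0, jitter_ms=0.5)
      acts = [0.0, 10.1, 20.0, 30.4]
      comps = [3.2, 13.0, 24.8, 33.9]
      print(check_trace(spec, acts, comps))   # flags the late completion of job 2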

  3. Verification on spray simulation of a pintle injector for liquid rocket engine

    Science.gov (United States)

    Son, Min; Yu, Kijeong; Radhakrishnan, Kanmaniraja; Shin, Bongchul; Koo, Jaye

    2016-02-01

    The pintle injector used in liquid rocket engines is an injection system that has recently attracted renewed attention, known for its wide throttling capability with high efficiency. The pintle injector has many variations with complex inner structures due to its moving parts. In order to study the rotating flow near the injector tip, which was observed in a cold-flow experiment using water and air, a numerical simulation was adopted and a verification of the numerical model was later conducted. For the verification process, three types of experimental data, including velocity distributions of the gas flow, spray angles and liquid distribution, were compared with the simulated results. The numerical simulation was performed using a commercial simulation program with the Eulerian multiphase model and axisymmetric two-dimensional grids. The maximum and minimum gas velocities were within the acceptable range of agreement; however, the spray angles showed up to 25% error when the momentum ratios were increased. The spray density distributions were quantitatively measured and showed good agreement. As a result of this study, it was concluded that the simulation method was properly constructed to study specific flow characteristics of the pintle injector, despite the limitations of two-dimensional and coarse grids.

  4. The Bohm Criterion for Radiofrequency Discharges - a Numerical Verification Based on Poisson Equation

    NARCIS (Netherlands)

    Meijer, P. M.; W. J. Goedheer,

    1993-01-01

    Recently it was shown that, by using the analysis of electrostatic waves entering the plasma-sheath edge, the direct-current (dc) Bohm criterion also holds for discharges under radio-frequency (rf) conditions. In this paper, the influence of Bohm's criterion on the sheath characteristics for

  5. Application of 2DOF controller for reactor power control. Verification by numerical simulation

    International Nuclear Information System (INIS)

    Ishikawa, Nobuyuki; Suzuki, Katsuo

    1996-09-01

    In this report, the usefulness of two-degree-of-freedom (2DOF) control is discussed for improving the reference response characteristics and robustness of a reactor power control system. The 2DOF controller consists of feedforward and feedback elements. The feedforward element was designed by the model matching method and the feedback element by solving the mixed sensitivity problem of H∞ control. The 2DOF control gives good performance in both reference response and robustness to disturbance and plant perturbation. The simulation of reactor power control was performed by digitizing the 2DOF controller with a digital control period of 10 ms. It is found that a control period of 10 ms is short enough that digitization does not degrade the control performance. (author)
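
    A minimal numerical sketch of the 2DOF structure is given below: a feedforward term shapes the reference response through an assumed reference model and nominal plant inverse, while a PI feedback term rejects a load disturbance. The first-order plant, the gains, the reference model and the 10 ms step are all assumptions for illustration; the sketch does not reproduce the model-matching or H∞ design of the report.

      # Sketch: discrete-time 2DOF control = feedforward (reference shaping) + feedback (PI).
      import numpy as np

      dt, tau_p = 0.01, 2.0          # 10 ms control period, assumed plant time constant [s]
      tau_ref = 0.5                  # desired closed-loop time constant [s]
      kp, ki = 4.0, 2.0              # assumed feedback (PI) gains

      t = np.arange(0.0, 10.0, dt)
      r = np.where(t >= 1.0, 1.0, 0.0)            # step change in power demand
      y, yd, integ = 0.0, 0.0, 0.0
      out = []
      for k in range(len(t)):
          yd_prev = yd
          yd += dt * (r[k] - yd) / tau_ref        # reference model (desired response)
          u_ff = tau_p * (yd - yd_prev) / dt + yd # feedforward: nominal plant inverse
          e = yd - y
          integ += e * dt
          u = u_ff + kp * e + ki * integ          # 2DOF law: feedforward + feedback
          d = 0.2 if t[k] >= 6.0 else 0.0         # load disturbance at t = 6 s
          y += dt * (-y + u + d) / tau_p          # first-order plant
          out.append(y)

      # Both values should be close to the demanded power of 1.0.
      print("power at t=3 s:", round(out[int(3 / dt)], 3), " at t=9 s:", round(out[-1], 3))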

  6. Numerical Verification of the Power Transfer and Wakefield Coupling in the CLIC Two-Beam Accelerator

    CERN Document Server

    Candel, Arno; NG, C; Rawat, V; Schussman, G; Ko, K; Syratchev, I; Grudiev, A; Wuensch, W

    2011-01-01

    The Compact Linear Collider (CLIC) provides a path to a multi-TeV accelerator to explore the energy frontier of High Energy Physics. Its two-beam accelerator (TBA) concept envisions complex 3D structures, which must be modeled to high accuracy so that simulation results can be directly used to prepare CAD drawings for machining. The required simulations include not only the fundamental mode properties of the accelerating structures but also the Power Extraction and Transfer Structure (PETS), as well as the coupling between the two systems. Time-domain simulations will be performed to understand pulse formation, wakefield damping, fundamental power transfer and wakefield coupling in these structures. Applying SLAC’s parallel finite element code suite, these large-scale problems will be solved on some of the largest supercomputers available. The results will help to identify potential issues and provide new insights on the design, leading to further improvements on the novel two-beam accelerator scheme.

  7. Numerical and experimental verification of a new model for fatigue life

    International Nuclear Information System (INIS)

    Svensson, T.; Holmgren, M.

    1991-01-01

    A new model for fatigue life prediction has been investigated in this report. The model is based on the Palmgren-Miner rule in combination with level crossings. Data from the literature and experimental data generated in this project have been compared with fatigue life predictions made with the new model. The data have also been compared with traditional fatigue life estimations based on the rain-flow counting (RFC) method. The fatigue life predicted with the new model often agrees better with the actual life than predictions made with the RFC method. This is especially pronounced when the loading sequence is very irregular. The new method is both fast and simple to use. (au)
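
    For orientation, the sketch below shows only the Palmgren-Miner damage summation that the new model builds on, applied to a hypothetical stress-range histogram with a Basquin-type S-N curve; the level-crossing ingredient of the proposed model and real material constants are not reproduced here.

      # Sketch: Palmgren-Miner cumulative damage for a stress-range histogram.
      def cycles_to_failure(stress_range, C=1e12, m=3.0):
          """Basquin-type S-N curve (hypothetical constants): N = C * S^(-m)."""
          return C * stress_range ** (-m)

      def miner_damage(histogram):
          """histogram: list of (stress_range_MPa, cycle_count) pairs."""
          return sum(n / cycles_to_failure(s) for s, n in histogram)

      load_spectrum = [(200.0, 5_000), (120.0, 50_000), (80.0, 400_000)]  # assumed counts
      D = miner_damage(load_spectrum)
      print(f"accumulated damage D = {D:.3f}  (failure predicted when D reaches 1)")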

  8. Numerical simulation and experimental verification of oil recovery by macro-emulsion floods

    Energy Technology Data Exchange (ETDEWEB)

    Khamharatana, F. [Chulalongkorn Univ., Bangkok (Thailand); Thomas, S.; Farouq Ali, S. M. [Alberta Univ., Edmonton, AB (Canada)

    1997-08-01

    The process of emulsion flooding as an enhanced oil recovery method was described. The process involves several mechanisms that occur at the same time during displacement, therefore, simulation by emulsion flooding requires a good understanding of flow mechanics of emulsions in porous media. This paper provides a description of the process and its mathematical representation. Emulsion rheology, droplet capture and surfactant adsorption are represented mathematically and incorporated into a one-dimensional, three-phase mathematical model to account for interactions of surfactant, oil, water and the rock matrix. The simulator was validated by comparing simulation results with the results from linear core floods performed in the laboratory. Best match was achieved by a multi-phase non-Newtonian rheological model of an emulsion with interfacial tension-dependent relative permeabilities and time-dependent capture. 13 refs., 1 tab., 42 figs.

  9. Numerical distance protection

    CERN Document Server

    Ziegler, Gerhard

    2011-01-01

    Distance protection provides the basis for network protection in transmission systems and meshed distribution systems. This book covers the fundamentals of distance protection and the special features of numerical technology. The emphasis is placed on the application of numerical distance relays in distribution and transmission systems.This book is aimed at students and engineers who wish to familiarise themselves with the subject of power system protection, as well as the experienced user, entering the area of numerical distance protection. Furthermore it serves as a reference guide for s

  10. Numerical problems in physics

    CERN Document Server

    Singh, Devraj

    2015-01-01

    Numerical Problems in Physics, Volume 1 is intended to serve the needs of students pursuing graduate and postgraduate courses in universities with Physics and Materials Science as a subject, including those appearing in engineering, medical, and civil services entrance examinations. KEY FEATURES: * 29 chapters on Optics, Waves & Oscillations, Electromagnetic Field Theory, Solid State Physics & Modern Physics * 540 solved numerical problems from various universities and competitive examinations * 523 multiple choice questions for quick and clear understanding of subject matter * 567 unsolved numerical problems for grasping concepts of the various topics in Physics * 49 figures for understanding problems and concepts

  11. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for the iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  12. Modeling and experimental verification of laser self-mixing interference phenomenon with the structure of two-external-cavity feedback

    Science.gov (United States)

    Chen, Peng; Liu, Yuwei; Gao, Bingkun; Jiang, Chunlei

    2018-03-01

    A semiconductor laser with a two-external-cavity feedback structure for the laser self-mixing interference (SMI) phenomenon is investigated and analyzed. An SMI model with two feedback directions, based on the Fabry-Perot cavity, is derived, and numerical simulation and experimental verification are conducted. Experimental results show that, under weak optical feedback, the SMI signal with the two-external-cavity feedback structure is similar to the sum of two single-cavity SMI signals.
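
    The sketch below illustrates the quoted observation numerically: under weak feedback, the two-external-cavity SMI signal is approximated as a constant power plus the sum of two conventional single-cavity SMI modulation terms. The wavelength, cavity lengths, vibration amplitudes/frequencies and modulation indices are assumed values, not those of the experiment.

      # Sketch: weak-feedback two-cavity SMI signal as the sum of two single-cavity terms.
      import numpy as np

      lam = 650e-9                        # assumed laser wavelength [m]
      t = np.linspace(0, 1e-2, 5000)      # 10 ms observation window

      def smi_term(t, L0, amp, freq, m):
          """Weak-feedback SMI modulation for one external cavity of length L(t)."""
          L = L0 + amp * np.sin(2 * np.pi * freq * t)   # vibrating external target
          return m * np.cos(4 * np.pi * L / lam)        # phase = 2*k*L, k = 2*pi/lam

      p = 1.0 + smi_term(t, 0.10, 2e-6, 200, 0.05) + smi_term(t, 0.15, 1e-6, 350, 0.03)
      print("normalised power range:", p.min(), "-", p.max())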

  13. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  14. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study … of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference…

  15. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provide assistance to the subject...

  16. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  17. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi…

  18. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. A delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
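
    The sketch below is a hedged illustration of the kind of autoverification rule set surveyed here, combining reference limits with a delta check against the patient's previous result; the limits and analytes shown are illustrative values, not the consensus figures reported in the paper.

      # Sketch: rule-based autoverification with verification limits and a delta check.
      VERIFICATION_LIMITS = {              # analyte: (low, high) in the units shown (assumed)
          "glucose_mmol_L": (2.5, 25.0),
          "potassium_mmol_L": (2.8, 6.5),
          "creatinine_umol_L": (30.0, 800.0),
      }
      DELTA_CHECK = {"potassium_mmol_L": 1.0}   # max allowed change vs. previous result

      def autoverify(analyte, value, previous=None):
          low, high = VERIFICATION_LIMITS[analyte]
          if not (low <= value <= high):
              return False, "outside verification limits"
          if previous is not None and analyte in DELTA_CHECK:
              if abs(value - previous) > DELTA_CHECK[analyte]:
                  return False, "delta check failed"
          return True, "released automatically"

      print(autoverify("potassium_mmol_L", 5.9, previous=4.2))   # delta check flags this result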

  19. Remarks on numerical semigroups

    International Nuclear Information System (INIS)

    Torres, F.

    1995-12-01

    We extend results on Weierstrass semigroups at ramified points of double coverings of curves to any numerical semigroup whose genus is large enough. As an application, we strengthen the properties concerning Weierstrass weights stated in [To]. (author). 25 refs

  20. Advances in Numerical Methods

    CERN Document Server

    Mastorakis, Nikos E

    2009-01-01

    Features contributions that are focused on significant aspects of current numerical methods and computational mathematics. This book contains chapters on advanced methods and various variations of known techniques that can solve difficult scientific problems efficiently.

  1. Introductory numerical analysis

    CERN Document Server

    Pettofrezzo, Anthony J

    2006-01-01

    Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.

  2. Introduction to numerical analysis

    CERN Document Server

    Hildebrand, F B

    1987-01-01

    Well-known, respected introduction, updated to integrate concepts and procedures associated with computers. Computation, approximation, interpolation, numerical differentiation and integration, smoothing of data, other topics in lucid presentation. Includes 150 additional problems in this edition. Bibliography.

  3. Family Background and Educational Choices

    DEFF Research Database (Denmark)

    McIntosh, James; D. Munk, Martin

    We examine the participation in secondary and tertiary education of five cohorts of Danish males and females who were aged twenty starting in 1982 and ending in 2002. We find that the large expansion of secondary education in this period was characterized by a phenomenal increase in gymnasium enrollments, especially for females. Not only did the educational opportunities for individuals with disadvantaged backgrounds improve absolutely, but their relative position also improved. A similarly dramatic increase in attendance at university for the period 1985-2005 was found for these cohorts when…

  4. Numerical analysis of bifurcations

    International Nuclear Information System (INIS)

    Guckenheimer, J.

    1996-01-01

    This paper is a brief survey of numerical methods for computing bifurcations of generic families of dynamical systems. Emphasis is placed upon algorithms that reflect the structure of the underlying mathematical theory while retaining numerical efficiency. Significant improvements in the computational analysis of dynamical systems are to be expected from greater reliance on geometric insight coming from dynamical systems theory. copyright 1996 American Institute of Physics

  5. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
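
    As a concrete illustration of the fine-grained testing advocated here (the kernel and tolerance are hypothetical, not taken from any particular climate model), the sketch below checks a single numerical routine, a centred finite-difference derivative, against an analytic answer in isolation rather than through a full-model regression run.

      # Sketch: a unit test for one numerical kernel, checked against an analytic result.
      import numpy as np

      def ddx_centred(f, dx):
          """Second-order centred derivative with one-sided differences at the ends."""
          d = np.empty_like(f)
          d[1:-1] = (f[2:] - f[:-2]) / (2.0 * dx)
          d[0] = (f[1] - f[0]) / dx
          d[-1] = (f[-1] - f[-2]) / dx
          return d

      def test_ddx_centred_matches_analytic():
          x = np.linspace(0.0, 2.0 * np.pi, 2001)
          dx = x[1] - x[0]
          num = ddx_centred(np.sin(x), dx)
          assert np.max(np.abs(num[1:-1] - np.cos(x)[1:-1])) < 1e-5

      test_ddx_centred_matches_analytic()   # run with pytest in practice
      print("finite-difference kernel test passed")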

  6. System verification and validation report for the TMAD code

    International Nuclear Information System (INIS)

    Finfrock, S.H.

    1995-01-01

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR Code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including test procedures, used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests
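
    A minimal sketch of the kind of closest-match lookup the report describes: given measured responses from two or more detectors, search a benchmark library for the anomaly-type/anomaly-size/moisture-content combination that best reproduces them. The library entries, field names and least-squares misfit below are hypothetical illustrations, not the TMAD implementation.

        # Hypothetical benchmark library: each entry maps an
        # (anomaly type, anomaly size, moisture content) combination to the
        # predicted response of each detector type.
        LIBRARY = [
            {"anomaly": "void", "size_cm": 5.0, "moisture_pct": 2.0,
             "responses": {"neutron": 120.0, "gamma": 45.0}},
            {"anomaly": "void", "size_cm": 10.0, "moisture_pct": 2.0,
             "responses": {"neutron": 150.0, "gamma": 40.0}},
            {"anomaly": "dense", "size_cm": 5.0, "moisture_pct": 8.0,
             "responses": {"neutron": 210.0, "gamma": 70.0}},
        ]

        def closest_match(measured):
            """Return the library entry whose detector responses best match
            the measured responses (least squares over the shared detectors)."""
            def misfit(entry):
                return sum((entry["responses"][d] - r) ** 2
                           for d, r in measured.items())
            return min(LIBRARY, key=misfit)

        # Example: responses measured by two detectors in the field.
        best = closest_match({"neutron": 145.0, "gamma": 42.0})
        print(best["anomaly"], best["size_cm"], best["moisture_pct"])

    A real library would be dense enough in anomaly size and moisture content to support interpolation between entries rather than a simple nearest match.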

  7. Background radiation map of Thailand

    International Nuclear Information System (INIS)

    Angsuwathana, P.; Chotikanatis, P.

    1997-01-01

    The radioelement concentration in the natural environment, as well as the radiation exposure to man in day-to-day life, is now a topic of great interest. Natural radiation is frequently referred to as a standard for comparing additional sources of man-made radiation such as atomic weapon fallout, nuclear power generation, radioactive waste disposal, etc. The Department of Mineral Resources commenced a five-year project of nationwide airborne geophysical survey, awarded to Kenting Earth Sciences International Limited in 1984. The original purpose of the survey was to support mineral exploration and geological mapping. Subsequently, the data have proved suitable for deriving natural radiation information. In 1993 the Department of Mineral Resources, with the assistance of the IAEA, published a Background Radiation Map of Thailand at the scale of 1:1,000,000 from the existing airborne radiometric digital data. The production of the Background Radiation Map of Thailand is the result of a data compilation and correction procedure developed over the Canadian Shield. This end product will be used as a base map in environmental applications not only for Thailand but also for the Southeast Asia region. (author)

  8. [Cosmic Microwave Background (CMB) Anisotropies

    Science.gov (United States)

    Silk, Joseph

    1998-01-01

    One of the main areas of research is the theory of cosmic microwave background (CMB) anisotropies and analysis of CMB data. Using the four year COBE data we were able to improve existing constraints on global shear and vorticity. We found that, in the flat case (which allows for greatest anisotropy), (omega/H)_0 < 10^(-7), where omega is the vorticity and H is the Hubble constant. This is two orders of magnitude lower than the tightest previous constraint. We have defined a new set of statistics which quantify the amount of non-Gaussianity in small field cosmic microwave background maps. By looking at the distribution of power around rings in Fourier space, and at the correlations between adjacent rings, one can identify non-Gaussian features which are masked by large scale Gaussian fluctuations. This may be particularly useful for identifying unresolved localized sources and line-like discontinuities. Levin and collaborators devised a method to determine the global geometry of the universe through observations of patterns in the hot and cold spots of the CMB. We have derived properties of the peaks (maxima) of the CMB anisotropies expected in flat and open CDM models. We present results for angular resolutions ranging from 5 arcmin to 20 arcmin (antenna FWHM), scales that are relevant for the MAP and COBRA/SAMBA space missions and for ground-based interferometers. Results related to galaxy formation and evolution are also discussed.

  9. Optical polarization: background and camouflage

    Science.gov (United States)

    Škerlind, Christina; Hallberg, Tomas; Eriksson, Johan; Kariis, Hans; Bergström, David

    2017-10-01

    Polarimetric imaging sensors in the electro-optical region, already available to military and commercial users in both the visual and infrared, show enhanced capabilities for advanced target detection and recognition. These capabilities arise from the ability to discriminate between man-made and natural background surfaces using the polarization information of light. In the development of materials for signature management in the visible and infrared wavelength regions, different criteria need to be met to fulfil the requirements for good camouflage against modern sensors. In conventional camouflage design, the aim is to spectrally match or adapt the surface properties of an object to a background, thereby minimizing the contrast seen by a specific threat sensor. Examples will be shown from measurements of some relevant materials and how they in different ways affect the polarimetric signature. Properties that drive the design of an optical camouflage from a polarimetric perspective, such as the degree of polarization, the viewing or incident angle, and the amount of diffuse reflection, mainly in the infrared region, will be discussed.

  10. The Cosmic Microwave Background Anisotropy

    Science.gov (United States)

    Bennett, C. L.

    1994-12-01

    The properties of the cosmic microwave background radiation provide unique constraints on the history and evolution of the universe. The first detection of anisotropy of the microwave radiation was reported by the COBE Team in 1992, based on the first year of flight data. The latest analyses of the first two years of COBE data are reviewed in this talk, including the amplitude of the microwave anisotropy as a function of angular scale and the statistical nature of the fluctuations. The two-year results are generally consistent with the earlier first year results, but the additional data allow for a better determination of the key cosmological parameters. In this talk the COBE results are compared with other observational anisotropy results and directions for future cosmic microwave anisotropy observations will be discussed. The National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC) is responsible for the design, development, and operation of the Cosmic Background Explorer (COBE). Scientific guidance is provided by the COBE Science Working Group.

  11. First Images from VLT Science Verification Programme

    Science.gov (United States)

    1998-09-01

    Two Weeks of Intensive Observations Successfully Concluded After a period of technical commissioning tests, the first 8.2-m telescope of the ESO VLT (UT1) has successfully performed an extensive series of "real science" observations , yielding nearly 100 hours of precious data. They concern all possible types of astronomical objects, from distant galaxies and quasars to pulsars, star clusters and solar system objects. This intensive Science Verification (SV) Programme took place as planned from August 17 to September 1, 1998, and was conducted by the ESO SV Team at the VLT Observatory on Paranal (Chile) and at the ESO Headquarters in Garching (Germany). The new giant telescope lived fully up to the high expectations and worked with spectacular efficiency and performance through the entire period. All data will be released by September 30 via the VLT archive and the web (with some access restrictions - see below). The Science Verification period Just before the beginning of the SV period, the 8.2-m primary mirror in its cell was temporarily removed in order to install the "M3 tower" with the tertiary mirror [1]. The reassembly began on August 15 and included re-installation at the Cassegrain focus of the VLT Test Camera that was also used for the "First Light" images in May 1998. After careful optical alignment and various system tests, the UT1 was handed over to the SV Team on August 17 at midnight local time. The first SV observations began immediately thereafter and the SV Team was active 24 hours a day throughout the two-week period. Video-conferences between Garching and Paranal took place every day at about noon Garching time (6 o'clock in the morning on Paranal). Then, while the Paranal observers were sleeping, data from the previous night were inspected and reduced in Garching, with feedback on what was best to do during the following night being emailed to Paranal several hours in advance of the beginning of the observations. The campaign ended in the

  12. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems that combine two kinds of components: discrete components and continuous components. The continuous components are usually called plants; they are subject to disturbances, and their state variables change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on the cooperation between two disciplines, control engineering and computer science, and divides the design into control loops and decision loops. The external behavior of the control loops is specified in a notation which is understandable by the two disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of system requirements can be done by computing scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as the formal notation for specifying the control loops and designing the decision loops
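
    The division into a continuous control loop and a discrete decision loop can be illustrated with a toy simulation of a linearized inverted pendulum: the control loop applies PD feedback, while the decision loop supervises the plant and switches mode when a safety requirement is violated. The plant model, gains, time step and threshold below are illustrative assumptions, not taken from the paper.

        import math

        G, L_ROD, DT = 9.81, 1.0, 0.01          # plant parameters and time step (assumed)
        K_P, K_D = 40.0, 10.0                    # continuous control-loop gains (assumed)
        SAFE_ANGLE = math.radians(30.0)          # discrete decision-loop threshold (assumed)

        def plant_step(theta, omega, u):
            """Continuous plant, linearized inverted pendulum: theta'' = (g/L)*theta + u."""
            alpha = (G / L_ROD) * theta + u
            return theta + DT * omega, omega + DT * alpha

        def control_loop(theta, omega):
            """Continuous component: PD feedback designed by control engineers."""
            return -K_P * theta - K_D * omega

        def decision_loop(theta, mode):
            """Discrete component: supervisor that selects and supervises the controller."""
            if abs(theta) > SAFE_ANGLE:
                return "shutdown"                # requirement: never let the rod fall over
            return mode

        theta, omega, mode = math.radians(5.0), 0.0, "balance"
        for step in range(500):
            mode = decision_loop(theta, mode)
            u = control_loop(theta, omega) if mode == "balance" else 0.0
            theta, omega = plant_step(theta, omega, u)
        print(mode, round(math.degrees(theta), 3))

    In the methodology of the paper, the correctness of the PD loop would be established by control engineers, while properties of the supervisory decision loop would be verified formally, e.g. in the Mean Value Calculus.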

  13. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we approach the challenges in spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
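
    A minimal sketch of the second, smoothing approach described above: each simulated earthquake's contribution is spread over the whole test-region grid with a rate that decays as a power law of epicentral distance. The kernel exponent, smoothing distance and grid are illustrative assumptions rather than the authors' exact implementation.

        import numpy as np

        # Hypothetical simulated earthquakes confined to fault elements: (x, y) in km.
        simulated_epicenters = np.array([[10.0, 12.0], [10.5, 11.0], [40.0, 35.0]])

        # Test-region grid (assumed 1 km cells over a 50 km x 50 km region).
        xs, ys = np.meshgrid(np.arange(0.5, 50.0), np.arange(0.5, 50.0))

        def smoothed_rate_map(epicenters, q=1.5, d=1.0):
            """Distribute each event over the grid with a power-law kernel
            ~ (r^2 + d^2)**(-q), loosely following ETAS-style spatial decay."""
            rate = np.zeros_like(xs)
            for ex, ey in epicenters:
                r2 = (xs - ex) ** 2 + (ys - ey) ** 2
                kernel = (r2 + d ** 2) ** (-q)
                rate += kernel / kernel.sum()      # each event contributes unit rate
            return rate

        rate_map = smoothed_rate_map(simulated_epicenters)
        print(rate_map.sum())  # ~ number of simulated events, here 3.0

    The resulting rate map can then be scored against observed epicenters, for example with the receiver operating characteristic comparison mentioned in the abstract.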

  14. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self powered detector design to perform core design verification after a core reload before power operation. A vanadium self powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core for traditional physics testing programs. This program also eliminates the need for special rod maneuvers which are infrequently performed by plant operators during typical core design verification testing and allows for safer startup activities. (authors)

  15. The analytical and numerical study of the fluorination of uranium dioxide particles

    International Nuclear Information System (INIS)

    Sazhin, S.S.

    1997-01-01

    A detailed analytical study of the equations describing the fluorination of UO2 particles is presented for some limiting cases, assuming that the mass flowrate of these particles is so small that they do not affect the state of the gas. The analytical solutions obtained can be used for approximate estimates of the effect of fluorination on particle diameter and temperature, but their major application is probably in the verification of self-consistent numerical solutions. Computational results are presented and discussed for a self-consistent problem in which both the effects of gas on particles and of particles on gas are accounted for. It has been shown that in the limiting cases for which analytical solutions have been obtained, the agreement between numerical and analytical results is almost exact. This can be considered as a verification of both the analytical and numerical solutions. (orig.)
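
    The verification strategy described here, checking a numerical solver against a closed-form solution that is valid in a limiting case, can be sketched with a generic shrinking-particle model. The rate law, constants and units below are illustrative assumptions and are not the equations of the cited study.

        import math

        K, D0, DT = 1.0e-3, 1.0e-1, 1.0e-3   # rate constant, initial diameter, time step (assumed)

        def analytic_diameter(t):
            """Closed-form solution of dD/dt = -K/D for a fixed gas state:
            D(t) = sqrt(D0**2 - 2*K*t)."""
            return math.sqrt(D0 ** 2 - 2.0 * K * t)

        def numerical_diameter(t_end):
            """Explicit Euler integration of the same rate law; in a full model the
            gas state would be updated self-consistently at every step."""
            d, t = D0, 0.0
            while t < t_end:
                d -= DT * K / d
                t += DT
            return d

        t_end = 2.0
        print(analytic_diameter(t_end), numerical_diameter(t_end))
        # The two values should agree closely in this limiting case; a large
        # discrepancy would flag a coding or discretization error in the scheme.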

  16. The SCEC/USGS dynamic earthquake rupture code verification exercise

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed—a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished.Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but hereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches.The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events.To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  17. MODELS CONCERNING PREVENTIVE VERIFICATION OF TECHNICAL EQUIPMENT

    Directory of Open Access Journals (Sweden)

    CÂRLAN M.

    2016-12-01

    Full Text Available The paper presents three operative models whose purpose is to improve the practice of preventive maintenance for a wide range of technical installations. Although the calculation criteria are different, the goal is the same: to determine the optimum time between two consecutive preventive interventions. The optimum criteria of these models are: the maximum share of technical entity operating probabilities, in the case of the Ackoff-Sasieni [1] method; the optimum time interval for preventive verification depending on the preventive-corrective maintenance costs imposed by the deciding factor, for the AsturioBaldin [2] model; and the minimum number of renewals – preventive and/or corrective maintenance operations [3
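
    The three models themselves are not reproduced in the abstract, but the shared goal, choosing the interval between preventive interventions that minimizes long-run cost, can be sketched with a standard periodic-replacement calculation. The Weibull failure model, cost figures and search grid below are assumptions for illustration only, not the models of the paper.

        # Periodic preventive intervention with minimal repair between interventions:
        # expected cost per unit time  C(T) = (c_p + c_f * H(T)) / T,
        # where H(T) = (T / eta)**beta is the Weibull cumulative failure intensity.
        C_PREVENTIVE = 100.0     # cost of one planned intervention (assumed)
        C_FAILURE = 900.0        # cost of one failure / corrective repair (assumed)
        ETA, BETA = 1000.0, 2.5  # Weibull scale [hours] and shape (assumed, beta > 1)

        def cost_rate(T):
            expected_failures = (T / ETA) ** BETA
            return (C_PREVENTIVE + C_FAILURE * expected_failures) / T

        # Scan candidate intervals and keep the one with the lowest cost rate.
        candidates = range(50, 3000, 10)
        best_T = min(candidates, key=cost_rate)
        print(best_T, round(cost_rate(best_T), 4))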

  18. Verification report for SIMREP 1.1

    International Nuclear Information System (INIS)

    Tarapore, P.S.

    1987-06-01

    SIMREP 1.1 is a discrete event computer simulation of repository operations in the surface waste-handling facility. The logic for this model is provided by Fluor Technology, Inc., the Architect/Engineer of the salt repository. The verification methods included a line-by-line review of the code, a detailed examination of a generated trace of all simulated events over a given period of operations, and a comparison of the simulation output results with expected values. SIMREP 1.1 performs in the required manner under the given range of input conditions

  19. Turf Conversion Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M and V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings as a result of water conservation measures (WCMs) in energy performance contracts associated with converting turfgrass or other water-intensive plantings to water-wise and sustainable landscapes. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M and V plan, and details the procedures to use to determine water savings.

  20. Outdoor Irrigation Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with outdoor irrigation efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.

  1. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

    Process models were developed to characterize the waste vitrification at West Valley, in terms of process operating constraints and glass compositions achievable. The need for verification of compliance with the proposed Waste Acceptance Preliminary Specification criteria led to development of product models, the most critical one being a glass durability model. Both process and product models were used in developing a target composition for the waste glass. This target composition is designed to ensure that glasses made to this target will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs

  2. CIT photoheliograph functional verification unit test program

    Science.gov (United States)

    1973-01-01

    Tests of the 2/3-meter photoheliograph functional verification unit (FVU) were performed with the FVU installed in its Big Bear Solar Observatory vacuum chamber. Interferometric tests were run both in Newtonian (f/3.85) and Gregorian (f/50) configurations. Tests were run in both configurations with optical axis horizontal, vertical, and at 45 deg to attempt to determine any gravity effects on the system. Gravity effects, if present, were masked by scatter in the data associated with the system wavefront error of 0.16 lambda rms (lambda = 6328 Å), apparently due to problems in the primary mirror. Tests showed that the redesigned secondary mirror assembly works well.

  3. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  4. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform bit shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
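
    A toy model of the idea in this abstract: loading a simulated register bit-by-bit through a serial shift path, versus writing its value in a single direct register access. The register model and widths below are hypothetical and are not tied to any specific verification environment.

        class SimulatedRegister:
            """Toy stand-in for a design register inside a simulation model."""
            def __init__(self, width):
                self.width = width
                self.value = 0

            def shift_in_bit(self, bit):
                # Serial path: one simulated cycle per bit, so 'width' cycles per load.
                self.value = ((self.value << 1) | (bit & 1)) & ((1 << self.width) - 1)

            def direct_write(self, value):
                # Direct register access: the whole value in one simulated operation.
                self.value = value & ((1 << self.width) - 1)

        def serial_load(reg, value):
            for i in reversed(range(reg.width)):
                reg.shift_in_bit((value >> i) & 1)

        reg_a, reg_b = SimulatedRegister(32), SimulatedRegister(32)
        serial_load(reg_a, 0xDEADBEEF)     # 32 simulated shift operations
        reg_b.direct_write(0xDEADBEEF)     # 1 simulated operation
        assert reg_a.value == reg_b.value == 0xDEADBEEF

    Replacing the serial path with the direct write, while blocking the serial operation from touching the register, preserves the final state but removes the per-bit simulation cost.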

  5. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company

  6. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on: compliance with new standards and regulations, and lessons learned from operating experience. This goal can be more effectively achieved on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of the operating data. The general backfitting problem, the verification process and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. The focus is on management rather than hardware problems. Some general conclusions are then presented as final results of the whole work

  7. Redistribution of energetic particles by background turbulence

    International Nuclear Information System (INIS)

    Hauff, T.; Jenko, F.

    2007-01-01

    The quest to understand the turbulent transport of particles, momentum and energy in magnetized plasmas remains a key challenge in fusion research. A basic issue that is still relatively poorly understood is the turbulent ExB advection of charged test particles with large gyroradii. The interaction of alpha particles or impurities with the background turbulence is of particular interest. In order to understand the dependence of the particle diffusivity on the interaction mechanisms between FLR effects and the special structure of a certain type of turbulence, direct numerical simulations are done in artificially created two-dimensional turbulent electrostatic fields, assuming a constant magnetic field. Finite gyroradius effects are introduced using the gyrokinetic approximation which means that the gyrating particle is simply replaced by a charged ring. Starting from an idealized isotropic potential with Gaussian autocorrelation function, numerous test particle simulations are done varying both the gyroradius and the Kubo number of the potential. It is found that for Kubo numbers larger than about unity, the particle diffusivity is almost independent of the gyroradius as long as the latter does not exceed the correlation length of the electrostatic potential, whereas for small Kubo numbers the diffusivity is monotonically reduced. The underlying physical mechanisms of this behavior are identified and an analytic approach is developed which favorably agrees with the simulation results. The investigations are extended by introducing anisotropic structures like streamers and zonal flows into the artificial potential, leading to quantitative modulations of the gyroradius dependence of the diffusion coefficient. Analytic models are used to explain these various effects. After having developed a general overview on the behavior in simplified artificial potentials, test particle simulations in realistic turbulence created by the gyrokinetic turbulence code GENE are

  8. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    International Nuclear Information System (INIS)

    Azmy, Yousry; Wang, Yaqi

    2013-01-01

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code's numerical solution and its reference counterpart. The latter is either analytic or very fine- mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory's Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.
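
    The mesh-refinement phase described above reduces to estimating an observed convergence rate from integral error norms at successive mesh sizes. The following sketch shows that calculation on made-up numbers; the error values are hypothetical and are not output of the actual transport code.

        import math

        def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
            """Observed order of accuracy p from error norms on two meshes:
            err ~ C * h**p  =>  p = log(err_coarse / err_fine) / log(ratio)."""
            return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

        # Hypothetical integral error norms from three successively refined meshes.
        errors = [4.0e-3, 1.1e-3, 2.9e-4]
        for coarse, fine in zip(errors, errors[1:]):
            print(round(observed_order(coarse, fine), 2))
        # Values approaching the theoretical order of the discretization (e.g. ~2 for
        # a second-order scheme) indicate convergence to the reference solution.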

  9. Experimental verification of active IR stealth technology by controlling the surface temperature using a thermoelectric element

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Geon; Han, Kuk Il; Choi, Jun Hyuk; Kim, Tae Kuk [Dept. of Mechanical Engineering, Chung Ang University, Seoul (Korea, Republic of)

    2016-10-15

    In this paper, we propose a technique for IR low-observability that uses an active IR signal tuning through the real time control of the object surface temperature according to the varying background environment. This is achieved by applying the proper object surface temperature obtained to result in the minimum radiance difference between the object and the background. Experimental verification by using the thermoelectric temperature control element shows that the IR radiance contrast between the object and the background can be reduced up to 99% during the night and up to 95% during the day time as compared to the un-tuned original radiance contrast values. The stealth technology demonstrated in this paper may be applied for many military systems needed for the IR stealth performance when a suitable temperature control unit is developed.
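
    A rough sketch of the control objective: choose the plate temperature whose emitted radiance best matches the measured background radiance. The Stefan-Boltzmann broadband approximation, the assumed emissivity and the search range are simplifications for illustration; the actual system must also account for the sensor band, reflected radiance and detector response.

        SIGMA = 5.670e-8          # Stefan-Boltzmann constant [W m^-2 K^-4]
        EPS_OBJECT = 0.90         # assumed surface emissivity of the controlled plate

        def radiance(temperature_k, emissivity):
            """Crude broadband radiance proxy (real systems integrate over the sensor band)."""
            return emissivity * SIGMA * temperature_k ** 4

        def temperature_setpoint(background_radiance, t_min=260.0, t_max=340.0, step=0.1):
            """Choose the plate temperature minimizing the object/background radiance contrast."""
            best_t, best_diff = t_min, float("inf")
            t = t_min
            while t <= t_max:
                diff = abs(radiance(t, EPS_OBJECT) - background_radiance)
                if diff < best_diff:
                    best_t, best_diff = t, diff
                t += step
            return best_t

        # Example: the background behaves like a blackbody at 288 K (assumed).
        target = temperature_setpoint(radiance(288.0, 1.0))
        print(round(target, 1))   # slightly above 288 K to compensate for emissivity < 1

    In the experiment described above, the thermoelectric element would drive the surface toward such a setpoint in real time as the background radiance varies.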

  10. Experimental verification of active IR stealth technology by controlling the surface temperature using a thermoelectric element

    International Nuclear Information System (INIS)

    Kim, Dong Geon; Han, Kuk Il; Choi, Jun Hyuk; Kim, Tae Kuk

    2016-01-01

    In this paper, we propose a technique for IR low-observability that uses an active IR signal tuning through the real time control of the object surface temperature according to the varying background environment. This is achieved by applying the proper object surface temperature obtained to result in the minimum radiance difference between the object and the background. Experimental verification by using the thermoelectric temperature control element shows that the IR radiance contrast between the object and the background can be reduced up to 99% during the night and up to 95% during the day time as compared to the un-tuned original radiance contrast values. The stealth technology demonstrated in this paper may be applied for many military systems needed for the IR stealth performance when a suitable temperature control unit is developed

  11. Multicentre validation of IMRT pre-treatment verification: Comparison of in-house and external audit

    International Nuclear Information System (INIS)

    Jornet, Núria; Carrasco, Pablo; Beltrán, Mercè; Calvo, Juan Francisco; Escudé, Lluís; Hernández, Victor; Quera, Jaume; Sáez, Jordi

    2014-01-01

    Background and purpose: We performed a multicentre intercomparison of IMRT optimisation and dose planning and IMRT pre-treatment verification methods and results. The aims were to check consistency between dose plans and to validate whether in-house pre-treatment verification results agreed with those of an external audit. Materials and methods: Participating centres used two mock cases (prostate and head and neck) for the intercomparison and audit. Compliance to dosimetric goals and total number of MU per plan were collected. A simple quality index to compare the different plans was proposed. We compared gamma index pass rates using the centre’s equipment and methodology to those of an external audit. Results: While for the prostate case, all centres fulfilled the dosimetric goals and plan quality was homogeneous, that was not the case for the head and neck case. The number of MU did not correlate with the plan quality index. Pre-treatment verifications results of the external audit did not agree with those of the in-house measurements for two centres: being within tolerance for in-house measurements and unacceptable for the audit or the other way round. Conclusions: Although all plans fulfilled dosimetric constraints, plan quality is highly dependent on the planner expertise. External audits are an excellent tool to detect errors in IMRT implementation and cannot be replaced by intercomparison using results obtained by centres
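
    The pass rates discussed in this record come from the gamma-index comparison widely used in IMRT pre-treatment QA. The following 1-D sketch (assumed 3%/3 mm global criteria and toy dose profiles) illustrates the quantity being computed; it is not the centres' or the audit's actual software.

        import numpy as np

        def gamma_pass_rate(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol_mm=3.0):
            """1-D global gamma index: for each reference point take the minimum over
            evaluated points of sqrt((dD/dose_tol)^2 + (dx/dist_tol)^2); gamma <= 1 passes."""
            d_max = ref_dose.max()
            gammas = []
            for x_r, d_r in zip(positions, ref_dose):
                dd = (eval_dose - d_r) / (dose_tol * d_max)
                dx = (positions - x_r) / dist_tol_mm
                gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
            return 100.0 * np.mean(np.array(gammas) <= 1.0)

        x = np.arange(0.0, 100.0, 1.0)                       # positions in mm (assumed grid)
        planned = np.exp(-((x - 50.0) / 20.0) ** 2)          # toy planned dose profile
        measured = 1.02 * np.exp(-((x - 51.0) / 20.0) ** 2)  # toy measured profile (2%, 1 mm off)
        print(round(gamma_pass_rate(planned, measured, x), 1))  # expected close to 100%

    Differences in how centres normalize doses, grid the data and set the criteria are one reason in-house pass rates and audit pass rates can disagree, as observed above.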

  12. Low background aspects of GERDA

    International Nuclear Information System (INIS)

    Simgen, Hardy

    2011-01-01

    The GERDA experiment operates bare Germanium diodes enriched in 76 Ge in an environment of pure liquid argon to search for neutrinoless double beta decay. A very low radioactive background is essential for the success of the experiment. We present here the research done in order to remove radio-impurities coming from the liquid argon, the stainless steel cryostat and the front-end electronics. We found that liquid argon can be purified efficiently from 222 Rn. The main source of 222 Rn in GERDA is the cryostat which emanates about 55 mBq. A thin copper shroud in the center of the cryostat was implemented to prevent radon from approaching the diodes. Gamma ray screening of radio-pure components for front-end electronics resulted in the development of a pre-amplifier with a total activity of less than 1 mBq 228 Th.

  13. The cosmic microwave background radiation

    International Nuclear Information System (INIS)

    Wilson, R.W.

    1980-01-01

    The history is described of the discovery of microwave radiation of the cosmic background using the 20-foot horn antenna at the Bell Laboratories back in 1965. Ruby masers with travelling wave were used, featuring the lowest noise in the world. The measurement proceeded on 7 cm. In measuring microwave radiation from the regions outside the Milky Way continuous noise was discovered whose temperature exceeded the calculated contributions of the individual detection system elements by 3 K. A comparison with the theory showed that relict radiation from the Big Bang period was the source of the noise. The discovery was verified by measurements on the 20.1 cm wavelength and by other authors' measurements on 0.5 mm to 74 cm, and by optical measurements of the interstellar molecule spectrum. (Ha)

  14. Polarization of Cosmic Microwave Background

    International Nuclear Information System (INIS)

    Buzzelli, A; Cabella, P; De Gasperis, G; Vittorio, N

    2016-01-01

    In this work we present an extension of the ROMA map-making code for data analysis of Cosmic Microwave Background polarization, with particular attention given to the inflationary polarization B-modes. The new algorithm takes into account a possible cross-correlated noise component among the different detectors of a CMB experiment. We tested the code on the observational data of the BOOMERanG (2003) experiment and we show that we are provided with a better estimate of the power spectra; in particular, the error bars of the BB spectrum are smaller by up to 20% for low multipoles. We point out the general validity of the new method. A possible future application is the LSPE balloon experiment, devoted to the observation of polarization at large angular scales. (paper)

  15. Weak Lensing by Galaxy Troughs in DES Science Verification Data

    Energy Technology Data Exchange (ETDEWEB)

    Gruen, D. [Ludwig Maximilian Univ., Munich (Germany); Max Planck Inst. for Extraterrestrial Physics, Garching (Germany). et al.

    2015-09-29

    We measure the weak lensing shear around galaxy troughs, i.e. the radial alignment of background galaxies relative to underdensities in projections of the foreground galaxy field over a wide range of redshift in Science Verification data from the Dark Energy Survey. Our detection of the shear signal is highly significant (10σ–15σ for the smallest angular scales) for troughs with the redshift range z ϵ [0.2, 0.5] of the projected galaxy field and angular diameters of 10 arcmin…1°. These measurements probe the connection between the galaxy, matter density, and convergence fields. By assuming galaxies are biased tracers of the matter density with Poissonian noise, we find agreement of our measurements with predictions in a fiducial Λ cold dark matter model. Furthermore, the prediction for the lensing signal on large trough scales is virtually independent of the details of the underlying model for the connection of galaxies and matter. Our comparison of the shear around troughs with that around cylinders with large galaxy counts is consistent with a symmetry between galaxy and matter over- and underdensities. In addition, we measure the two-point angular correlation of troughs with galaxies which, in contrast to the lensing signal, is sensitive to galaxy bias on all scales. Finally, the lensing signal of troughs and their clustering with galaxies is therefore a promising probe of the statistical properties of matter underdensities and their connection to the galaxy field.

  16. Determining the Accuracy of Crowdsourced Tweet Verification for Auroral Research

    Directory of Open Access Journals (Sweden)

    Nathan A. Case

    2016-12-01

    Full Text Available The Aurorasaurus project harnesses volunteer crowdsourcing to identify sightings of an aurora (the “northern/southern lights”) posted by citizen scientists on Twitter. Previous studies have demonstrated that aurora sightings can be mined from Twitter with the caveat that there is a large background level of non-sighting tweets, especially during periods of low auroral activity. Aurorasaurus attempts to mitigate this, and thus increase the quality of its Twitter sighting data, by using volunteers to sift through a pre-filtered list of geolocated tweets to verify real-time aurora sightings. In this study, the current implementation of this crowdsourced verification system, including the process of geolocating tweets, is described and its accuracy (which, overall, is found to be 68.4%) is determined. The findings suggest that citizen science volunteers are able to accurately filter out unrelated, spam-like, Twitter data but struggle when filtering out somewhat related, yet undesired, data. The citizen scientists particularly struggle with determining the real-time nature of the sightings, so care must be taken when relying on crowdsourced identification.

  17. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

    Full Text Available Experimental verifications of two optimized adaptive digital signal processing algorithms implemented in two preset time count rate meters were performed according to appropriate standards. The random pulse generator, realized using a personal computer, was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Then measurement results for background radiation levels were obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurement results, obtained without and with radioisotopes for the specified errors of 10% and 5%, showed good agreement with theoretical predictions.
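
    A small illustration of the counting-statistics relationship that underlies specified errors such as the 10% and 5% figures above: for Poisson counting, the relative standard error of a count-rate estimate is roughly 1/sqrt(N), so a preset error implies a minimum number of counts and hence a measurement time. The count rate used in the example is an arbitrary assumption.

        import math

        def required_counts(relative_error):
            """Counts needed so that 1/sqrt(N) <= the specified relative error (Poisson)."""
            return math.ceil(1.0 / relative_error ** 2)

        def preset_time(count_rate_cps, relative_error):
            """Measurement time needed at a given count rate for the specified error."""
            return required_counts(relative_error) / count_rate_cps

        for err in (0.10, 0.05):
            print(err, required_counts(err), round(preset_time(50.0, err), 1), "s at 50 cps")
        # A 10% error needs ~100 counts and a 5% error ~400 counts, i.e. preset times
        # of about 2 s and 8 s at an assumed rate of 50 counts per second.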

  18. Measurements of Worldwide Radioxenon Backgrounds - The "EU" Project

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, Ted W.; Cooper, Matthew W.; Hayes, James C.; Forrester, Joel B.; Haas, Derek A.; Hansen, Randy R.; Keller, Paul E.; Kirkham, Randy R.; Lidey, Lance S.; McIntyre, Justin I.; Miley, Harry S.; Payne, Rosara F.; Saey, Paul R.; Thompson, Robert C.; Woods, Vincent T.; Williams, Richard M.

    2009-09-24

    Under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), radioactive xenon (radioxenon) measurements are one of the principal techniques used to detect underground nuclear explosions, and specifically, the presence of one or more radioxenon isotopes allows one to determine whether a suspected event was a nuclear explosion or originated from an innocent source. During the design of the International Monitoring System (IMS), which was designed as the verification mechanism for the Treaty, it was determined that radioxenon measurements should be performed at 40 or more stations worldwide. At the time of the design of the IMS, however, very few details about the background of the xenon isotopes were known, and it is now recognized that the backgrounds were probably evolving anyhow. This paper lays out the beginning of a study of the worldwide concentrations of xenon isotopes that can be used to detect nuclear explosions and several sources that also release radioxenons, and will have to be accounted for during analysis of atmospheric levels. Although the global concentrations of the xenon isotopes are the subject of a much larger activity that could span over several years, this study measures radioxenon concentrations in locations where there was either very little information or there was a unique opportunity to learn more about emissions from known sources. The locations where radioxenon levels were measured and reported are included.

  19. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high speed modules were fabricated and tested in a state of the art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.

  20. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.