WorldWideScience

Sample records for carlo verification system

  1. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate the various quantities required in the backend field. For verification of the code system, analyses were performed using the nuclide compositions of fuel rods measured under the Actinide Research in a Nuclear Element (ARIANE) program, taken from fuel assemblies irradiated in a commercial BWR in the Netherlands. The code system developed in this work has been verified through analyses of MOX and UO2 fuel rods. This system makes it possible to reduce the large margin assumed in present criticality analyses of LWR spent fuel. (J.P.N.)

  2. Monte Carlo systems used for treatment planning and dose verification

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany)]; Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama)]; Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)]

    2017-04-15

    General-purpose radiation transport Monte Carlo codes have been used to estimate the absorbed dose distribution in external photon and electron beam radiotherapy patients for several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To bring radiation transport Monte Carlo codes into routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)

  3. SU-E-T-256: Development of a Monte Carlo-Based Dose-Calculation System in a Cloud Environment for IMRT and VMAT Dosimetric Verification

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, Y [Tokai University School of Medicine, Isehara, Kanagawa (Japan)]

    2015-06-15

    Purpose: Intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) are techniques that are widely used for treating cancer due to better target coverage and critical structure sparing. The increasing complexity of IMRT and VMAT plans leads to decreases in dose calculation accuracy. Monte Carlo simulation is the most accurate method for determining dose distributions in patients, but the simulation settings needed to model a treatment head accurately are complex and time consuming to prepare. The purpose of this work is to report our implementation of a simple Monte Carlo simulation system in a cloud-computing environment for dosimetric verification of IMRT and VMAT plans. Methods: Monte Carlo simulations of a Varian Clinac linear accelerator were performed using the BEAMnrc code, and dose distributions were calculated using the DOSXYZnrc code. Input files for the simulations were automatically generated from DICOM RT files by the developed web application. Users therefore only need to upload the DICOM RT files through the web interface, and the simulations run in the cloud. The calculated dose distributions were exported to RT Dose files that can be downloaded through the web interface. The accuracy of the calculated dose distributions was verified by dose measurements. Results: IMRT and VMAT simulations were performed, and good agreement was observed between measured and MC doses. Gamma analysis with 3% dose-difference and 3 mm DTA criteria showed a mean gamma passing rate of 95% for the studied cases. Conclusion: A Monte Carlo-based dose calculation system has been successfully implemented in a cloud environment. The developed system can be used for independent dose verification of IMRT and VMAT plans in routine clinical practice. The system will also be helpful for improving the accuracy of beam modeling and dose calculation in treatment planning systems. This work was supported by JSPS KAKENHI Grant Number 25861057.

  4. SU-E-T-256: Development of a Monte Carlo-Based Dose-Calculation System in a Cloud Environment for IMRT and VMAT Dosimetric Verification

    International Nuclear Information System (INIS)

    Fujita, Y

    2015-01-01

    Purpose: Intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) are techniques that are widely used for treating cancer due to better target coverage and critical structure sparing. The increasing complexity of IMRT and VMAT plans leads to decreases in dose calculation accuracy. Monte Carlo simulation is the most accurate method for determining dose distributions in patients, but the simulation settings needed to model a treatment head accurately are complex and time consuming to prepare. The purpose of this work is to report our implementation of a simple Monte Carlo simulation system in a cloud-computing environment for dosimetric verification of IMRT and VMAT plans. Methods: Monte Carlo simulations of a Varian Clinac linear accelerator were performed using the BEAMnrc code, and dose distributions were calculated using the DOSXYZnrc code. Input files for the simulations were automatically generated from DICOM RT files by the developed web application. Users therefore only need to upload the DICOM RT files through the web interface, and the simulations run in the cloud. The calculated dose distributions were exported to RT Dose files that can be downloaded through the web interface. The accuracy of the calculated dose distributions was verified by dose measurements. Results: IMRT and VMAT simulations were performed, and good agreement was observed between measured and MC doses. Gamma analysis with 3% dose-difference and 3 mm DTA criteria showed a mean gamma passing rate of 95% for the studied cases. Conclusion: A Monte Carlo-based dose calculation system has been successfully implemented in a cloud environment. The developed system can be used for independent dose verification of IMRT and VMAT plans in routine clinical practice. The system will also be helpful for improving the accuracy of beam modeling and dose calculation in treatment planning systems. This work was supported by JSPS KAKENHI Grant Number 25861057.
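
    The 3%/3 mm gamma analysis cited in the two records above can be illustrated with a minimal 1D sketch of the gamma index (Low et al. style, global dose normalization). The toy profiles and the discrete minimum over evaluated points are assumptions for illustration, not part of the original work:

```python
import numpy as np

def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.03, dta_mm=3.0):
    """1D global gamma index: for each reference point, minimize the combined
    dose-difference/distance metric over all evaluated points.
    dose_crit is a fraction of the reference maximum (global normalization)."""
    dmax = d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dta_mm) ** 2
        dose2 = ((d_eval - dr) / (dose_crit * dmax)) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

# Toy profiles: a "measured" curve and a slightly shifted/scaled calculation.
x = np.linspace(-50, 50, 201)                          # positions in mm
measured = np.exp(-(x / 20.0) ** 2)                    # arbitrary dose units
calculated = 1.01 * np.exp(-((x - 0.5) / 20.0) ** 2)   # 1% scale, 0.5 mm shift
g = gamma_index_1d(x, measured, x, calculated)
print(f"gamma passing rate (3%/3 mm): {100 * np.mean(g <= 1):.1f}%")
```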

  5. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Verification

    Science.gov (United States)

    Hanson, John M.; Beard, Bernard B.

    2010-01-01

    This paper is focused on applying Monte Carlo simulation to probabilistic launch vehicle design and requirements verification. The approaches developed in this paper can be applied to other complex design efforts as well. Typically the verification must show that requirement "x" is met for at least "y" % of cases, with, say, 10% consumer risk or 90% confidence. Two particular aspects of making these runs for requirements verification will be explored in this paper. First, there are several types of uncertainties that should be handled in different ways, depending on when they become known (or not). The paper describes how to handle different types of uncertainties and how to develop vehicle models that can be used to examine their characteristics. This includes items that are not known exactly during the design phase but that will be known for each assembled vehicle (can be used to determine the payload capability and overall behavior of that vehicle), other items that become known before or on flight day (can be used for flight day trajectory design and go/no go decision), and items that remain unknown on flight day. Second, this paper explains a method (order statistics) for determining whether certain probabilistic requirements are met or not and enables the user to determine how many Monte Carlo samples are required. Order statistics is not new, but may not be known in general to the GN&C community. The methods also apply to determining the design values of parameters of interest in driving the vehicle design. The paper briefly discusses when it is desirable to fit a distribution to the experimental Monte Carlo results rather than using order statistics.
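
    As a rough illustration of the order-statistics argument in the record above, the following sketch (assuming SciPy is available) computes the smallest number of Monte Carlo runs needed to demonstrate that a requirement is met in at least 99% of cases with 90% confidence; the parameter values are examples, not taken from the paper:

```python
from scipy.stats import binom

def required_samples(p_req=0.99, confidence=0.90, failures_allowed=0):
    """Smallest n such that observing <= failures_allowed failures demonstrates
    P(requirement met) >= p_req at the given confidence: if the true failure
    probability were 1 - p_req, seeing so few failures in n runs would have
    probability at most 1 - confidence."""
    q = 1.0 - p_req
    n = failures_allowed + 1
    while binom.cdf(failures_allowed, n, q) > 1.0 - confidence:
        n += 1
    return n

print(required_samples())                     # 230 runs, zero failures allowed
print(required_samples(failures_allowed=1))   # 388 runs if one failure is allowed
```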

  6. Verification of Monte Carlo transport codes by activation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera

    2012-12-18

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of these codes were released approximately a decade ago, so verification is needed to ensure that they give reasonable results. The present work focuses on obtaining experimental data on the activation of targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as the interaction of nitrogen and argon beams with copper targets, was studied. After irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, γ-spectroscopy analysis was performed: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, and their amounts and depth distributions were determined. The experimental results were compared with the results of Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out, and the origin of the discrepancies is discussed. The obtained results allow better verification of the Monte Carlo transport codes and provide information for their further development. The necessity of activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA simulations were performed to determine the materials most preferable, from the radiation protection point of view, for use in accelerator components.
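
    A minimal sketch of the nuclide-identification step described above: matching measured γ-line energies against a library of known lines. The three-nuclide library (with standard line energies) and the 2 keV matching tolerance are illustrative assumptions:

```python
# Small illustrative line library (energies in keV are standard values).
LINES_KEV = {
    "Na-22": [1274.5],
    "Na-24": [1368.6, 2754.0],
    "Be-7":  [477.6],
}

def identify(peaks_kev, tol_kev=2.0):
    """Return nuclides with at least one known gamma line among the
    measured peak energies, together with the matched lines."""
    hits = {}
    for nuclide, lines in LINES_KEV.items():
        matched = [e for e in lines
                   if any(abs(e - p) <= tol_kev for p in peaks_kev)]
        if matched:
            hits[nuclide] = matched
    return hits

print(identify([477.2, 1368.9, 2753.5]))   # -> Na-24 and Be-7 candidates
```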

  7. Verification of Monte Carlo transport codes by activation experiments

    OpenAIRE

    Chetvertkova, Vera

    2013-01-01

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of these codes were released approximately a decade ago, so verification is needed to ensure that they give reasonable results. Present work is...

  8. Monte Carlo investigation of collapsed versus rotated IMRT plan verification.

    Science.gov (United States)

    Conneely, Elaine; Alexander, Andrew; Ruo, Russell; Chung, Eunah; Seuntjens, Jan; Foley, Mark J

    2014-05-08

    IMRT QA requires, among other tests, a time-consuming process of measuring the absorbed dose, at least at a point, in a high-dose, low-dose-gradient region. Some clinics measure this dose with all beams delivered at a single gantry angle (collapsed delivery), as opposed to delivering the beams at the planned gantry angles (rotated delivery). We examined, established, and optimized Monte Carlo simulations of the dosimetry for IMRT verification of treatment plans for these two delivery modes (collapsed versus rotated). The results of the simulations were compared to the treatment planning system dose calculations for the two delivery modes, as well as to measurements, in order to investigate the validity of the collapsed delivery technique for IMRT QA. The BEAMnrc, DOSXYZnrc, and egs_chamber codes were used for the Monte Carlo simulations, along with the MMCTP system. A number of plan complexity metrics were also used in the analysis of the dose distributions, to help explain why verification in a collapsed delivery may or may not be optimal for IMRT QA. Following the Alfonso et al. formalism, the k_Qclin,Qref^fclin,fref correction factor was calculated to correct for the deviation of small fields from the reference conditions used for beam calibration. We report on the results obtained for a cohort of 20 patients. The plan complexity was investigated for each plan using the complexity metrics of homogeneity index, conformity index, modulation complexity score, and the fraction of beams from a particular plan that intersect the chamber when performing the QA. Rotated QA gives more consistent results than the collapsed QA technique. The k_Qclin,Qref^fclin,fref factor deviates less from unity for rotated QA than for collapsed QA. If the homogeneity index is less than 0.05, then the k_Qclin,Qref^fclin,fref factor does not deviate from unity by more than 1%. A value this low for the homogeneity index can only be obtained

  9. Monte Carlo simulations to replace film dosimetry in IMRT verification

    International Nuclear Information System (INIS)

    Goetzfried, Thomas; Trautwein, Marius; Koelbl, Oliver; Bogner, Ludwig; Rickhey, Mark

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program, with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam-based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of the diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and the MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. The corresponding value was significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for the two verification procedures is comparable; however, the MC-based procedure is by far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to eclipse film dosimetry more and more in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations, and owing to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended, at least in the IMRT introduction phase. (orig.)
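
    A dose-difference summary of the kind quoted above (e.g. 0.7 ± 2.7% in the high-dose region) can be sketched as follows; the synthetic data and the 50% dose threshold defining the high-dose region are assumptions for illustration:

```python
import numpy as np

def dose_difference_stats(measured, calculated, threshold=0.5):
    """Mean and standard deviation of the per-point percent deviation of the
    calculation from the measurement, restricted to the high-dose region
    (points above `threshold` of the measured maximum)."""
    mask = measured >= threshold * measured.max()
    dev = 100.0 * (calculated[mask] - measured[mask]) / measured[mask]
    return dev.mean(), dev.std()

rng = np.random.default_rng(0)
meas = rng.uniform(1.0, 2.0, size=500)                  # arbitrary dose values
calc = meas * (1.0 + rng.normal(0.007, 0.027, 500))     # ~0.7% +/- 2.7% deviation
mean, std = dose_difference_stats(meas, calc)
print(f"deviation: {mean:.1f} +/- {std:.1f} %")
```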

  10. Simulation of digital pixel readout chip architectures with the RD53 SystemVerilog-UVM verification environment using Monte Carlo physics data

    International Nuclear Information System (INIS)

    Conti, E.; Marconi, S.; Christiansen, J.; Placidi, P.; Hemperek, T.

    2016-01-01

    The simulation and verification framework developed by the RD53 collaboration is a powerful tool for global architecture optimization and design verification of next-generation hybrid pixel readout chips. In this paper the framework is used for studying digital pixel chip architectures at the behavioral level. This is carried out by simulating a dedicated, highly parameterized pixel chip description, which makes it possible to investigate different grouping strategies between pixels and different latency buffering and arbitration schemes. The pixel hit information used as simulation input can be either generated internally in the framework or imported from external Monte Carlo detector simulation data. The latter have been provided by both the CMS and ATLAS experiments, featuring HL-LHC operating conditions and the specifications related to the Phase 2 upgrade. Pixel regions and double columns were simulated using such Monte Carlo data as inputs: the performance of different latency buffering architectures was compared, and the compliance of different link speeds with the expected column data rate was verified.

  11. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    Science.gov (United States)

    Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.

    2014-06-01

    A great breakthrough in proton therapy has happened in the new century: several tens of dedicated centers are now operated throughout the world, and their number increases every year. An important component of proton therapy is the treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this treatment modality to be exploited to the full extent. Predictions by the Monte Carlo (MC) method are the "gold standard" for the verification of calculations with these systems. At the Institute of Experimental and Theoretical Physics (ITEP), one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.

  12. A virtual linear accelerator for verification of treatment planning systems

    International Nuclear Information System (INIS)

    Wieslander, Elinore

    2000-01-01

    A virtual linear accelerator is implemented in a commercial pencil-beam-based treatment planning system (TPS) with the purpose of investigating the possibility of verifying the system using a Monte Carlo method. The characterization set for the TPS includes depth doses, profiles and output factors, all generated by Monte Carlo simulations. The advantage of this method over conventional measurements is that variations in accelerator output are eliminated and more complicated geometries can be used to study the performance of a TPS. The difference between Monte Carlo simulated and TPS calculated profiles and depth doses in the characterization geometry is less than ±2%, except for the build-up region. This is of the same order as previously reported results based on measurements. In an inhomogeneous, mediastinum-like case, the deviations between TPS and simulations are small in the unit-density regions. In low-density regions, the TPS overestimates the dose, and the overestimation increases with increasing energy, from 3.5% for 6 MV to 9.5% for 18 MV. This result points out the widely known fact that the pencil beam concept handles neither changes in lateral electron transport nor changes in scatter due to lateral inhomogeneities. It is concluded that verification of a pencil-beam-based TPS with a Monte Carlo based virtual accelerator is possible, which facilitates the verification procedure. (author)
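
    A minimal sketch of the kind of tolerance check implied by the ±2% figure above, comparing a TPS depth-dose curve against a Monte Carlo reference while excluding the build-up region; the toy curves and the 15 mm build-up cut-off are assumptions:

```python
import numpy as np

def outside_tolerance(depth_mm, tps, mc, tol_pct=2.0, buildup_mm=15.0):
    """Return depths where the TPS depth dose deviates from the Monte Carlo
    reference by more than tol_pct, ignoring the build-up region."""
    mask = depth_mm > buildup_mm
    diff_pct = 100.0 * (tps[mask] - mc[mask]) / mc[mask]
    return depth_mm[mask][np.abs(diff_pct) > tol_pct]

depth = np.linspace(0.0, 300.0, 61)                 # mm
mc_dd = np.exp(-depth / 180.0)                      # toy exponential falloff
tps_dd = mc_dd * (1.0 + 0.01 * np.sin(depth / 40.0))  # toy ~1% wobble
print("points outside +/-2%:", outside_tolerance(depth, tps_dd, mc_dd))
```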

  13. Verification of the VEF photon beam model for dose calculations by the voxel-Monte-Carlo-algorithm

    International Nuclear Information System (INIS)

    Kriesen, S.; Fippel, M.

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tuebingen to determine the primary fluence for calculations of dose distributions in patients by the voxel Monte Carlo algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; time-consuming Monte Carlo simulations of the linac head therefore become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models simulate the primary fluence very well. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte Carlo simulations with photons was established, especially for use in IMRT planning. (orig.)

  14. [Verification of the VEF photon beam model for dose calculations by the Voxel-Monte-Carlo-Algorithm].

    Science.gov (United States)

    Kriesen, Stephan; Fippel, Matthias

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tübingen to determine the primary fluence for calculations of dose distributions in patients by the voxel Monte Carlo algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; time-consuming Monte Carlo simulations of the linac head therefore become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models simulate the primary fluence very well. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte Carlo simulations with photons was established, especially for use in IMRT planning.

  15. Neoclassical toroidal viscosity calculations in tokamaks using a δf Monte Carlo simulation and their verifications.

    Science.gov (United States)

    Satake, S; Park, J-K; Sugama, H; Kanno, R

    2011-07-29

    Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified with a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV since the complexities in guiding-center orbits of particles and their collisions cannot be fully investigated by any means of analytic theories alone. Results yielded the details of the complex NTV dependency on particle precessions and collisions, which were predicted roughly in a combined analytic theory. Both numerical and analytic methods can be utilized and extended based on these successful verifications.

  16. EDITORIAL: International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification

    Science.gov (United States)

    Verhaegen, Frank; Seuntjens, Jan

    2008-03-01

    Monte Carlo particle transport techniques offer exciting tools for radiotherapy research, where they play an increasingly important role. Topics of research related to clinical applications range from treatment planning, motion and registration studies, and brachytherapy to verification imaging and dosimetry. The International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification took place in a hotel in Montreal in French Canada, from 29 May-1 June 2007, and was the third workshop to be held on a related topic, which now seems to have become a triennial event. About one hundred workers from many different countries participated in the four-day meeting. Seventeen experts in the field were invited to review topics and present their latest work. About half of the audience was made up of young graduate students. In a very full program, 57 papers were presented and 10 posters were on display during most of the meeting. On the evening of the third day a boat trip around the island of Montreal allowed participants to enjoy the city views, and to sample the local cuisine. The topics covered at the workshop included the latest developments in the most popular Monte Carlo transport algorithms, fast Monte Carlo, statistical issues, source modeling, MC treatment planning, modeling of imaging devices for treatment verification, registration and deformation of images, and a sizeable number of contributions on brachytherapy. In this volume you will find 27 short papers resulting from the workshop on a variety of topics, some of them on very new subjects such as graphics processing units for fast computing, PET modeling, dual-energy CT, calculations in dynamic phantoms, tomotherapy devices, . . . . We acknowledge the financial support of the National Cancer Institute of Canada, the Institute of Cancer Research of the Canadian Institutes of Health Research, the Association Québécoise des Physicien(ne)s Médicaux Clinique, the Institute of Physics, and Medical

  17. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: A predicted-PET-image approach based on analytical filtering for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against the GATE/GEANT4 Monte Carlo simulation codes. Methods: We performed two experiments to validate the β+-isotope yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of predicting β+-yields as a function of irradiated proton energy. In the second experiment, we simulated homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The β+-yield distributions filtered by the analytical model were compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: The comparison of the filtered and MC-simulated β+-yield distributions under different conditions shows, first, that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range difference between filtered β+-yields and MC-simulated β+-yields at the distal fall-off region is within 1.5 mm for all materials used. These findings validate the usefulness of the analytical filtering model for range verification of proton therapy with GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields with the GATE code, especially in the proximal region; this discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies between the MC-simulated and predicted β+-yield distributions were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using

  18. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    Science.gov (United States)

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation, and the potential mismatch between calculated control points and the detection grid in the verification process, were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH in the patient CT was discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  19. Adjoint electron Monte Carlo calculations

    International Nuclear Information System (INIS)

    Jordan, T.M.

    1986-01-01

    Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment

  20. SU-F-T-575: Verification of a Monte-Carlo Small Field SRS/SBRT Dose Calculation System

    International Nuclear Information System (INIS)

    Sudhyadhom, A; McGuinness, C; Descovich, M

    2016-01-01

    Purpose: To develop a methodology for validation of a Monte Carlo dose calculation model for robotic small-field SRS/SBRT deliveries. Methods: In a robotic treatment planning system, a Monte Carlo model was iteratively optimized to match beam data. A two-part analysis was developed to verify this model. 1) The Monte Carlo model was validated in a simulated water phantom against a Ray-Tracing calculation on a single-beam, collimator-by-collimator basis. 2) The Monte Carlo model was validated in the most challenging situation, lung, by acquiring in-phantom measurements. A plan was created and delivered in a CIRS lung phantom with a film insert. Separately, plans were delivered in an in-house lung phantom with a PinPoint chamber insert within lung-simulating material. For medium to large collimator sizes, a single beam was delivered to the phantom. For small collimators (10, 12.5, and 15 mm), a robotically delivered plan was created to generate a uniform dose field of irradiation over a 2×2 cm² area. Results: Dose differences in simulated water between Ray-Tracing and Monte Carlo were all within 1% at dmax and deeper. Maximum dose differences occurred before dmax but were all within 3%. Film measurements in the lung phantom show high correspondence, with over 95% gamma passing at the 2%/2 mm level for Monte Carlo. Ion chamber measurements for collimator sizes of 12.5 mm and above were within 3% of Monte Carlo calculated values. Uniform irradiation with the 10 mm collimator resulted in a dose difference of ∼8% for both Monte Carlo and Ray-Tracing, indicating that there may be limitations in the dose calculation. Conclusion: We have developed a methodology to validate a Monte Carlo model by verifying that it matches measurements in water and, separately, that it corresponds well in lung-simulating materials. The Monte Carlo model and algorithm tested may have more limited accuracy for 10 mm fields and smaller.

  21. Verification of IMRT dose distributions using a water beam imaging system

    International Nuclear Information System (INIS)

    Li, J.S.; Boyer, Arthur L.; Ma, C.-M.

    2001-01-01

    A water beam imaging system (WBIS) has been developed and used to verify dose distributions for intensity-modulated radiotherapy delivered with a dynamic multileaf collimator. The system consists of a water container, a scintillator screen, a charge-coupled device camera, and a portable personal computer. The scintillation image is captured by the camera, and the pixel values in this image indicate the dose in the scintillation screen. Images of radiation fields with known spatial distributions were used to calibrate the device. The verification was performed by comparing the image acquired from the measurement with the dose distribution from the IMRT plan. Because of light scattering in the scintillator screen, the image is blurred. A correction for this was developed by recognizing that the blur function can be fitted by a sum of Gaussians. The blur function was computed using the measured image of a 10 cm × 10 cm x-ray beam and the dose distribution calculated using the Monte Carlo method. Based on the blur function derived in this way, an iterative reconstruction algorithm was applied to recover the dose distribution of an IMRT plan from the measured WBIS image. The reconstructed dose distribution was compared with the Monte Carlo simulation result, and reasonable agreement was obtained. The proposed approach makes it possible to carry out a real-time comparison of the dose distribution in a transverse plane between the measurement and the reference during IMRT dose verification
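
    The blur correction described above can be sketched as an iterative deconvolution with a two-Gaussian kernel. This Van Cittert-style iteration is a stand-in for the paper's (unspecified) reconstruction algorithm, and all kernel parameters and field sizes are invented for the demo:

```python
import numpy as np

def gaussian(x, sigma):
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def van_cittert(blurred, kernel, iterations=50, relax=1.0):
    """Iterative deblurring: d_{k+1} = d_k + relax * (blurred - kernel (*) d_k)."""
    est = blurred.copy()
    for _ in range(iterations):
        est = est + relax * (blurred - np.convolve(est, kernel, mode="same"))
        est = np.clip(est, 0.0, None)           # dose cannot be negative
    return est

# Two-Gaussian blur kernel (weights and widths are illustrative).
xk = np.arange(-30.0, 31.0)                     # kernel support, mm
kernel = 0.8 * gaussian(xk, 1.5) + 0.2 * gaussian(xk, 6.0)

x = np.arange(-100.0, 101.0)                    # profile positions, mm
truth = (np.abs(x) <= 50.0).astype(float)       # ideal 10 cm field profile
image = np.convolve(truth, kernel, mode="same") # what the camera would record
recovered = van_cittert(image, kernel)
print("max edge error before/after:",
      np.abs(image - truth).max(), np.abs(recovered - truth).max())
```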

  22. Generation and Verification of ENDF/B-VII.0 Cross Section Libraries for Monte Carlo Calculations

    International Nuclear Information System (INIS)

    Park, Ho Jin; Kwak, Min Su; Joo, Han Gyu; Kim, Chang Hyo

    2007-01-01

    For Monte Carlo neutronics calculations, a continuous-energy nuclear data library is needed. It can be generated from evaluated nuclear data files such as ENDF/B using the ACER routine of the NJOY code, after a series of prior processing steps involving various other NJOY routines. Recently, a utility code named ANJOYMC, which generates the NJOY input decks in an automated mode, became available. The use of this code greatly reduces the user's effort and the possibility of input errors. In December 2006, the initial version of the ENDF/B-VII nuclear data library was released. It was reported that the new files contain much better data, reducing the errors noted in the previous versions. It is thus worthwhile to examine the performance of the new data files, particularly using an independent Monte Carlo code, MCCARD, and the ANJOYMC utility code. The verification of the newly generated library can be readily performed by analyzing numerous standard criticality benchmark problems.

  23. Orion GN&C Fault Management System Verification: Scope And Methodology

    Science.gov (United States)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.

  24. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  25. Application of a Monte Carlo linac model in routine verifications of dose calculations

    International Nuclear Information System (INIS)

    Linares Rosales, H. M.; Alfonso Laguardia, R.; Lara Mas, E.; Popescu, T.

    2015-01-01

    The analysis of some parameters of interest in radiotherapy medical physics, based on an experimentally validated Monte Carlo model of an Elekta Precise linear accelerator, was performed for 6 and 15 MV photon beams. The simulations were performed using the EGSnrc code, with the previously obtained optimal beam parameter values (energy and FWHM) used as the reference. Deposited dose calculations were carried out in water phantoms, in the typical complex geometries commonly used in acceptance and quality control tests, such as irregular and asymmetric fields. Parameters such as MLC scatter, maximum opening or closing position, and the separation between them were analyzed from the calculations in water. Simulations were likewise performed on phantoms obtained from CT studies of real patients, comparing the dose distribution calculated with EGSnrc against the dose distribution obtained from the computerized treatment planning systems (TPS) used in routine clinical plans. All the results showed good agreement with measurements, all of them falling within tolerance limits. These results open the possibility of using the developed model as a robust verification tool for validating calculations in very complex situations, where the accuracy of the available TPS could be questionable. (Author)

  26. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  27. Monte Carlo calculations supporting patient plan verification in proton therapy

    Directory of Open Access Journals (Sweden)

    Thiago Viana Miranda Lima

    2016-03-01

    Patient treatment plan verification consumes a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed, and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed previously published data (Molinelli et al. 2013), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modelling (Treatment Planning System (TPS) vs MC), limitations of the dose delivery system or detector mispositioning was originally explored, but other factors such as the geometric description of the detectors were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with different MC simulation results. Some physical effects introduced by this new approach, for example inter-detector interference and the delta-ray threshold, were also studied. The simulations accounting for a detailed geometry are typically superior (statistical difference: p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases, while in other cases position uncertainty remains the dominant uncertainty. The inter-chamber disturbance was not detected for therapeutic proton energies, and the results from the current delta threshold are

  28. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)]

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1-year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons-grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
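
    A back-of-the-envelope sketch of the drive-by time-gating consideration described above: the expected net counts accumulated during the dwell time at a given speed, compared against a Currie-type decision threshold on the background. All numerical inputs are invented for illustration and are not taken from the paper:

```python
import math

# Illustrative numbers only; not from the paper.
speed_mps = 0.894            # ~2 mph
window_m = 0.5               # collimator field of view at the storage shelf
source_rate = 2.0e4          # photons/s reaching the detector solid angle
efficiency = 0.05            # intrinsic detection efficiency
background_cps = 150.0       # ambient background count rate

dwell = window_m / speed_mps                  # time the source is in view
signal = source_rate * efficiency * dwell     # expected net counts
background = background_cps * dwell           # expected background counts
l_c = 2.33 * math.sqrt(background)            # Currie decision threshold (paired blank)
print(f"dwell {dwell:.2f} s, net {signal:.0f} counts, "
      f"threshold {l_c:.0f} counts -> detect: {signal > l_c}")
```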

  29. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  30. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  31. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  32. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  33. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  34. Monte Carlo simulation and experimental verification of radiotherapy electron beams

    International Nuclear Information System (INIS)

    Griffin, J.; Deloar, H. M.

    2007-01-01

    Based on fundamental physics and statistics, the Monte Carlo technique is generally accepted as the most accurate method for modelling radiation therapy treatments. A Monte Carlo simulation system has been installed, and models of linear accelerators in the more commonly used electron beam modes have been built and commissioned. A novel technique for radiation dosimetry is also being investigated. Combining the advantages of both water-tank and solid-phantom dosimetry, a hollow, thin-walled shell or mask is filled with water and then raised above the natural water surface to produce a volume of water with the desired irregular shape.

  35. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  36. Verification of Monte Carlo calculations of the neutron flux in the carousel channels of the TRIGA Mark II reactor, Ljubljana

    International Nuclear Information System (INIS)

    Jacimovic, R.; Maucec, M.; Trkov, A.

    2002-01-01

    In this work, experimental verification of Monte Carlo neutron flux calculations in the carousel facility (CF) of the 250 kW TRIGA Mark II reactor at the Jozef Stefan Institute is presented. Simulations were carried out using the Monte Carlo radiation-transport code MCNP4B. The objective of the work was to model and verify experimentally the azimuthal variation of the neutron flux in the CF for core No. 176, set up in April 2002. The ¹⁹⁸Au activities of Al-Au(0.1%) disks irradiated in 11 channels of the CF, covering 180° around the perimeter of the core, were measured. The comparison between MCNP calculation and measurement shows relatively good agreement and demonstrates the overall accuracy with which the detailed spectral characteristics can be predicted by calculations. (author)
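
    The flux inference underlying such an activation measurement can be sketched from the measured ¹⁹⁸Au activity using the standard activation equation φ = A_EOI / (N σ (1 − e^(−λ t_irr))); the monitor mass, activity and timing values below are illustrative, while the cross section and half-life are standard constants:

```python
import math

SIGMA_AU197 = 98.65e-24       # cm^2, thermal (n,gamma) cross section of 197Au
HALF_LIFE_S = 2.6941 * 86400  # 198Au half-life in seconds
N_A = 6.02214e23              # Avogadro's number

def thermal_flux(a_meas_bq, m_au_g, t_irr_s, t_cool_s):
    """Thermal neutron flux from the measured 198Au activity of an irradiated
    Au monitor (assumes an infinitely dilute monitor, no self-shielding)."""
    lam = math.log(2.0) / HALF_LIFE_S
    n_au = m_au_g / 196.97 * N_A                  # number of 197Au atoms
    saturation = 1.0 - math.exp(-lam * t_irr_s)   # build-up during irradiation
    a_eoi = a_meas_bq * math.exp(lam * t_cool_s)  # decay-correct to end of irradiation
    return a_eoi / (n_au * SIGMA_AU197 * saturation)

# e.g. 1 mg of Au in an Al-Au(0.1%) disk, 1 h irradiation, 1 day cooling
print(f"{thermal_flux(5.0e4, 1e-3, 3600.0, 86400.0):.3e} n cm^-2 s^-1")
```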

  37. Monte Carlo modeling of neutron and gamma-ray imaging systems

    International Nuclear Information System (INIS)

    Hall, J.

    1996-04-01

    Detailed numerical prototypes are essential to the design of efficient and cost-effective neutron and gamma-ray imaging systems. We have exploited the unique capabilities of an LLNL-developed radiation transport code (COG) to develop code modules capable of simulating the performance of neutron and gamma-ray imaging systems over a wide range of source energies. COG allows us to simulate complex, energy-, angle-, and time-dependent radiation sources, model 3-dimensional system geometries with 'real world' complexity, specify detailed elemental and isotopic distributions, and predict the responses of various types of imaging detectors with full Monte Carlo accuracy. COG references detailed, evaluated nuclear interaction databases, allowing users to account for multiple scattering, energy straggling, and secondary particle production phenomena which may significantly affect the performance of an imaging system but may be difficult or even impossible to estimate using simple analytical models. This work presents examples illustrating the use of these routines in the analysis of industrial radiographic systems for thick target inspection, nonintrusive luggage- and cargo-scanning systems, and international treaty verification

  38. SU-F-T-619: Dose Evaluation of Specific Patient Plans Based On Monte Carlo Algorithm for a CyberKnife Stereotactic Radiosurgery System

    Energy Technology Data Exchange (ETDEWEB)

    Piao, J [PLA General Hospital, Beijing (China); PLA 302 Hospital, Beijing (China)]; Xu, S [PLA General Hospital, Beijing (China); Tsinghua University, Beijing (China)]; Wu, Z; Liu, Y [Tsinghua University, Beijing (China)]; Li, Y [Beihang University, Beijing (China)]; Qu, B [PLA General Hospital, Beijing (China)]; Duan, X [PLA 302 Hospital, Beijing (China)]

    2016-06-15

    Purpose: This study uses Monte Carlo simulation of the CyberKnife system and intends to develop a third-party tool to evaluate the dose verification of specific patient plans in the TPS. Methods: The treatment head was simulated using the BEAMnrc and DOSXYZnrc software, and calculated and measured data were compared to determine the beam parameters. The dose distributions calculated with the Ray-tracing and Monte Carlo algorithms of the TPS (MultiPlan Ver. 4.0.2) and with the in-house Monte Carlo simulation method were analyzed for 30 patient plans, comprising 10 head, 10 lung and 10 liver cases. γ analysis with the combined 3 mm/3% criteria was introduced to quantitatively evaluate the difference in accuracy between the three algorithms. Results: More than 90% of the global error points were less than 2% in the comparison of the PDD and OAR curves after determining the mean energy and FWHM, so a relatively ideal Monte Carlo beam model was established. Based on the quantitative evaluation of dose accuracy for the three algorithms, the γ analysis shows that the passing rates of the PTV in the 30 plans between the Monte Carlo simulation and the TPS Monte Carlo algorithm were good (84.88 ± 9.67% for head, 98.83 ± 1.05% for liver, 98.26 ± 1.87% for lung). The passing rates of the PTV in the head and liver plans between the Monte Carlo simulation and the TPS Ray-tracing algorithm were also good (95.93 ± 3.12% and 99.84 ± 0.33%, respectively). However, the difference in the DVHs of the lung plans between the Monte Carlo simulation and the Ray-tracing algorithm was obvious, and the γ passing rate (51.263 ± 38.964%) was not good. It is feasible to use Monte Carlo simulation to verify the dose distributions of patient plans. Conclusion: The Monte Carlo simulation algorithm developed for the CyberKnife system in this study can be used as a third-party tool, which plays an important role in the dose verification of patient plans. This work was supported in part by the grant

  19. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  20. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  1. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  2. Experimental verification of lung dose with radiochromic film: comparison with Monte Carlo simulations and commercially available treatment planning systems

    International Nuclear Information System (INIS)

    Paelinck, L; Reynaert, N; Thierens, H; Neve, W De; Wagter, C de

    2005-01-01

    The purpose of this study was to assess the absorbed dose in and around lung tissue by performing radiochromic film measurements, Monte Carlo simulations and calculations with superposition convolution algorithms. We considered a layered polystyrene phantom of 12 × 12 × 12 cm3 containing a central cavity of 6 × 6 × 6 cm3 filled with Gammex RMI lung-equivalent material. Two field configurations were investigated, a small 1 × 10 cm2 field and a larger 10 × 10 cm2 field. First, we performed Monte Carlo simulations to investigate the influence of radiochromic film itself on the measured dose distribution when the film intersects a lung-equivalent region and is oriented parallel to the central beam axis. To that end, the film and the lung-equivalent materials were modelled in detail, taking into account their specific composition. Next, measurements were performed with the film oriented both parallel and perpendicular to the central beam axis to verify the results of our Monte Carlo simulations. Finally, we digitized the phantom in two commercially available treatment planning systems, Helax-TMS version 6.1A and Pinnacle version 6.2b, and calculated the absorbed dose in the phantom with their incorporated superposition convolution algorithms to compare with the Monte Carlo simulations. Comparing Monte Carlo simulations with measurements reveals that radiochromic film is a reliable dosimeter in and around lung-equivalent regions when the film is positioned perpendicular to the central beam axis. Radiochromic film is also able to predict the absorbed dose accurately when the film is positioned parallel to the central beam axis through the lung-equivalent region. However, attention must be paid when the film is not positioned along the central beam axis, in which case the film gradually attenuates the beam and decreases the dose measured behind the cavity. This underdosage disappears by offsetting the film a few centimetres. We find deviations of about 3.6% between

  3. Experimental verification of lung dose with radiochromic film: comparison with Monte Carlo simulations and commercially available treatment planning systems

    Science.gov (United States)

    Paelinck, L.; Reynaert, N.; Thierens, H.; DeNeve, W.; DeWagter, C.

    2005-05-01

    The purpose of this study was to assess the absorbed dose in and around lung tissue by performing radiochromic film measurements, Monte Carlo simulations and calculations with superposition convolution algorithms. We considered a layered polystyrene phantom of 12 × 12 × 12 cm3 containing a central cavity of 6 × 6 × 6 cm3 filled with Gammex RMI lung-equivalent material. Two field configurations were investigated, a small 1 × 10 cm2 field and a larger 10 × 10 cm2 field. First, we performed Monte Carlo simulations to investigate the influence of radiochromic film itself on the measured dose distribution when the film intersects a lung-equivalent region and is oriented parallel to the central beam axis. To that end, the film and the lung-equivalent materials were modelled in detail, taking into account their specific composition. Next, measurements were performed with the film oriented both parallel and perpendicular to the central beam axis to verify the results of our Monte Carlo simulations. Finally, we digitized the phantom in two commercially available treatment planning systems, Helax-TMS version 6.1A and Pinnacle version 6.2b, and calculated the absorbed dose in the phantom with their incorporated superposition convolution algorithms to compare with the Monte Carlo simulations. Comparing Monte Carlo simulations with measurements reveals that radiochromic film is a reliable dosimeter in and around lung-equivalent regions when the film is positioned perpendicular to the central beam axis. Radiochromic film is also able to predict the absorbed dose accurately when the film is positioned parallel to the central beam axis through the lung-equivalent region. However, attention must be paid when the film is not positioned along the central beam axis, in which case the film gradually attenuates the beam and decreases the dose measured behind the cavity. This underdosage disappears by offsetting the film a few centimetres. We find deviations of about 3.6% between

  4. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that the LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  5. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  6. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work began to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single-shell tanks, i.e. the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by ''smart'' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  7. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  8. Experimental and Monte Carlo study of a fibre-optic hodoscope prototype for high-resolution applications; Estudio experimental y por Monte Carlo de un prototipo de hodoscopio de fibras opticas para aplicaciones de alta resolucion

    Energy Technology Data Exchange (ETDEWEB)

    Granero, D.; Blasco, J. M.; Sanchis, E.; Gonzalez, V.; Martin, J. D.; Ballester, F.; Sanchis, E.

    2013-07-01

    The purpose of this work is to test the response of a system composed of 21 scintillating fibres and its electronics, as a proof of the validity of the system. To this end, the test system was irradiated with a Sr-90 verification source. In addition, Monte Carlo simulations of the system were performed, and the simulation results were compared with those obtained experimentally. Furthermore, the behaviour of a hodoscope composed of 100 scintillating fibres, crossed transversely with respect to one another, was approximated for proton therapy by means of different Monte Carlo simulations. (Author)

  9. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for the sequence diagrams of power plants has been developed. The system's main function is to verify the correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (Interlock Block Diagrams), in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatch between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)

  10. Verification of three dimensional triangular prismatic discrete ordinates transport code ENSEMBLE-TRIZ by comparison with Monte Carlo code GMVP

    International Nuclear Information System (INIS)

    Homma, Y.; Moriwaki, H.; Ikeda, K.; Ohdi, S.

    2013-01-01

    This paper deals with the verification of the three-dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP for a large fast breeder reactor. The reactor is a 750 MWe sodium-cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of an initial core and at the beginning and end of cycle of an equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity. (authors)

  11. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo experiment; it also encourages the proper design of Monte Carlo experiments and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab, or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language

  12. Development of film dosimetric measurement system for verification of RTP

    International Nuclear Information System (INIS)

    Chen Yong; Bao Shanglian; Ji Changguo; Zhang Xin; Wu Hao; Han Shukui; Xiao Guiping

    2007-01-01

    Objective: To develop a novel film dosimetry system based on a general-purpose laser scanner in order to verify patient-specific radiotherapy treatment plans (RTPs) in three-dimensional adaptable radiotherapy (3D ART) and intensity modulated radiotherapy (IMRT). Methods: Several advanced methods, including film saturated development, wavelet filtering with multi-resolution thresholds and discrete Fourier reconstruction, are employed in this system to reduce the artifacts, noise and distortion induced by film digitization with a general scanner; a set of coefficients derived from Monte Carlo (MC) simulation is adopted to correct the film over-response to low-energy scattered photons; and a set of newly emerging criteria, including the γ index and the Normalized Agreement Test (NAT) method, is employed to quantitatively evaluate the agreement of 2D dose distributions between the results measured by the films and those calculated by the treatment planning system (TPS), so as to obtain straightforward presentations, displays and results with high accuracy and reliability. Results: Radiotherapy doses measured by the developed system agree within 2% with those measured by an ionization chamber and the VeriSoft Film Dosimetry System, and the quantitative evaluation indexes are within 3%. Conclusions: The developed system can be used to accurately measure the radiotherapy dose and to make reliable quantitative evaluations for RTP dose verification. (authors)
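
    The artifact- and noise-reduction step can be illustrated in spirit with a simple frequency-domain filter. The sketch below is only a low-pass Fourier filter on a 2D scan array; the published system uses wavelet filtering with multi-resolution thresholds, which this sketch does not reproduce:

    ```python
    # Low-pass Fourier filtering of a scanned film image (illustrative only).
    import numpy as np

    def fourier_lowpass(img, keep=0.1):
        f = np.fft.fftshift(np.fft.fft2(img))      # centre the spectrum
        ny, nx = img.shape
        y, x = np.ogrid[:ny, :nx]
        # elliptical mask keeping roughly the lowest `keep` fraction of frequencies
        r2 = ((y - ny / 2) / (keep * ny / 2)) ** 2 + ((x - nx / 2) / (keep * nx / 2)) ** 2
        return np.real(np.fft.ifft2(np.fft.ifftshift(f * (r2 <= 1.0))))
    ```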

  13. Quantum Monte Carlo approaches for correlated systems

    CERN Document Server

    Becca, Federico

    2017-01-01

    Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It provides a clear overview of variational wave functions and features a detailed presentation of stochastic samplings, including Markov chains and Langevin dynamics, which are developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...

  14. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    Full Text Available The methodology of system requirements verification presented in this paper is a proposal for a practical procedure for reducing some negatives of the specification of requirements. The main problem considered is how to create a complete description of the system requirements without any negatives. Verification of the initially defined requirements is based on coloured Petri nets. Those nets are useful for testing some properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.

  15. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modelling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference

  16. Experimental and Monte Carlo study of a fibre-optic hodoscope prototype for high-resolution applications

    International Nuclear Information System (INIS)

    Granero, D.; Blasco, J. M.; Sanchis, E.; Gonzalez, V.; Martin, J. D.; Ballester, F.; Sanchis, E.

    2013-01-01

    The purpose of this work is to test the response of a system composed of 21 scintillating fibres and its electronics, as a proof of the validity of the system. To this end, the test system was irradiated with a Sr-90 verification source. In addition, Monte Carlo simulations of the system were performed, and the simulation results were compared with those obtained experimentally. Furthermore, the behaviour of a hodoscope composed of 100 scintillating fibres, crossed transversely with respect to one another, was approximated for proton therapy by means of different Monte Carlo simulations. (Author)

  17. Time-resolved diode dosimetry calibration through Monte Carlo modeling for in vivo passive scattered proton therapy range verification.

    Science.gov (United States)

    Toltz, Allison; Hoesl, Michaela; Schuemann, Jan; Seuntjens, Jan; Lu, Hsiao-Ming; Paganetti, Harald

    2017-11-01

    Our group previously introduced an in vivo proton range verification methodology in which a silicon diode array system is used to correlate the dose-rate profile per range modulation wheel cycle of the detector signal to the water-equivalent path length (WEPL) for passively scattered proton beam delivery. The implementation of this system requires a set of calibration data to establish a beam-specific response-to-WEPL fit for the selected 'scout' beam (a 1 cm overshoot of the predicted detector depth with a dose of 4 cGy) in water-equivalent plastic. This necessitates a separate set of measurements for every 'scout' beam that may be appropriate to the clinical case. The current study demonstrates the use of Monte Carlo simulations for the calibration of the time-resolved diode dosimetry technique. Measurements for three 'scout' beams were compared against detector responses simulated with Monte Carlo methods using the Tool for Particle Simulation (TOPAS). The 'scout' beams were then applied in the simulation environment to simulated water-equivalent plastic, a CT of water-equivalent plastic, and a patient CT data set to assess uncertainty. The simulated detector response in water-equivalent plastic was validated against measurements for 'scout' spread-out Bragg peaks of range 10 cm, 15 cm, and 21 cm (168 MeV, 177 MeV, and 210 MeV) to within 3.4 mm for all beams, and to within 1 mm in the region where the detector is expected to lie. Feasibility has been shown for performing the calibration of the detector response for three 'scout' beams through simulation for the time-resolved diode dosimetry technique in passively scattered proton delivery. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
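
    To make the calibration step concrete, the sketch below fits a hypothetical scalar detector feature against known WEPL values and inverts the fit for range lookup; all numbers are illustrative, not data from the study:

    ```python
    # Toy response-to-WEPL calibration: fit a smooth curve, then invert by grid search.
    import numpy as np

    wepl_mm = np.array([50.0, 70.0, 90.0, 110.0, 130.0])   # known WEPLs (mm)
    feature = np.array([0.82, 0.67, 0.49, 0.30, 0.12])     # detector feature (a.u.)

    coeff = np.polyfit(wepl_mm, feature, deg=2)            # beam-specific fit

    def wepl_from_feature(f, lo=50.0, hi=130.0, n=2001):
        grid = np.linspace(lo, hi, n)
        return grid[np.argmin(np.abs(np.polyval(coeff, grid) - f))]

    print(wepl_from_feature(0.5))   # ~89 mm for this toy data
    ```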

  18. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  19. Development of prompt gamma measurement system for in vivo proton beam range verification

    International Nuclear Information System (INIS)

    Min, Chul Hee

    2011-02-01

    In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered one of the most promising radiation therapy methods owing to the physical characteristics of its dose distribution, delivering most of the dose just before the protons come to rest, at the so-called Bragg peak; that is, proton therapy allows a very high radiation dose to the tumor volume while effectively sparing adjacent critical organs. However, uncertainty in the location of the Bragg peak, arising not only from uncertainty in the beam delivery system and the treatment planning method but also from anatomical changes and organ motion of a patient, can be a critical problem in proton therapy. In spite of the importance of in vivo dose verification to prevent misapplication of the Bragg peak and to guarantee both successful treatment and patient safety, there is no practical methodology to monitor the in vivo dose distribution; only a few attempts have been made so far. The present dissertation proposes the prompt gamma measurement method for monitoring the in vivo proton dose distribution during treatment. As a key part of establishing the utility of this method, the clear relationship between the prompt gamma distribution and the proton dose distribution was verified by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly by proton-induced nuclear interactions and are then emitted isotropically within less than 10⁻⁹ s at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of a human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the

  20. Monte Carlo Treatment Planning for Advanced Radiotherapy

    DEFF Research Database (Denmark)

    Cronholm, Rickard

    This Ph.D. project describes the development of a workflow for Monte Carlo treatment planning for clinical radiotherapy plans. The workflow may be utilized to perform an independent dose verification of treatment plans. Modern radiotherapy treatment delivery is often conducted by dynamically modulating the intensity of the field during the irradiation. The workflow described has the potential to fully model the dynamic delivery of modern radiotherapy, including gantry rotation during irradiation. Three cornerstones of Monte Carlo treatment planning are identified: building, commissioning and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc) to a Monte Carlo input file (iii). A protocol

  1. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced-transitions Monte Carlo method, and the direct and weighted statistical estimation Monte Carlo methods are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest calculational efficiency
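
    For orientation, the analog (direct) Monte Carlo estimator that these variance-reduced methods improve upon can be sketched as follows; the failure probability is a made-up value used only to generate synthetic trials:

    ```python
    # Analog Monte Carlo estimate of unreliability for a rare-failure system.
    import numpy as np

    rng = np.random.default_rng(1)
    p_true = 1e-4                         # hypothetical failure probability
    n = 10**6                             # number of analog trials

    fails = rng.random(n) < p_true
    p_hat = fails.mean()                  # unreliability estimate
    sigma = np.sqrt(p_hat * (1 - p_hat) / n)   # binomial estimator std. dev.

    print(p_hat, sigma)
    # Relative error ~ 1/sqrt(n * p): for highly reliable systems this is
    # why weighted / forced-transition estimators are far more efficient.
    ```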

  2. Evaluation of tomographic-image based geometries with PENELOPE Monte Carlo

    International Nuclear Information System (INIS)

    Kakoi, A.A.Y.; Galina, A.C.; Nicolucci, P.

    2009-01-01

    The Monte Carlo method can be used to evaluate treatment planning systems or to determine dose distributions in radiotherapy planning due to its accuracy and precision. In the Monte Carlo simulation packages typically used in radiotherapy, however, a realistic representation of the geometry of the patient cannot be used, which compromises the accuracy of the results. In this work, an algorithm for the description of geometries based on CT images of patients, developed to be used with the Monte Carlo simulation package PENELOPE, is tested by simulating the dose distribution produced by a 10 MV photon beam. The simulated geometry was based on CT images from a prostate cancer treatment plan. The volumes of interest in the treatment were adequately represented in the simulation geometry, allowing the algorithm to be used in the verification of doses in radiotherapy treatments. (author)

  3. Parallel verification of dynamic systems with rich configurations

    OpenAIRE

    Pessoa, Eduardo José Dias

    2016-01-01

    Master's dissertation in Informatics Engineering (specialization area: Informatics). Model checking is a technique used to automatically verify a model which represents the specification of some system. To ensure the correctness of the system, the verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the respective verification is made by its model checker. Most modeling frameworks are not...

  4. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    Science.gov (United States)

    1996-09-01

    Computer Science: A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199. ... implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  5. MCOR - Monte Carlo depletion code for reference LWR calculations

    Energy Technology Data Exchange (ETDEWEB)

    Puente Espel, Federico, E-mail: fup104@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Tippayakul, Chanatip, E-mail: cut110@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Ivanov, Kostadin, E-mail: kni1@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Misu, Stefan, E-mail: Stefan.Misu@areva.com [AREVA, AREVA NP GmbH, Erlangen (Germany)

    2011-04-15

    Research highlights: > Introduction of a reference Monte Carlo based depletion code with extended capabilities. > Verification and validation results for MCOR. > Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability by coupling the LANL Monte Carlo code MCNP5 with the AREVA NP depletion code KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similarly sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes, with usage of the KORIGEN libraries for typical PWR and BWR spectra for the remaining isotopes. Besides the capabilities just mentioned, the newest enhancements of the MCOR code focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, just to name the most important. The article describes the capabilities of the MCOR code system, from its design and development to its latest improvements and further ameliorations. Additionally
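
    The predictor-corrector depletion algorithm mentioned in (a) can be sketched generically as below; `transport()` and `deplete()` are placeholder callables standing in for the transport and depletion solvers, not the MCNP5/KORIGEN interfaces:

    ```python
    # Generic predictor-corrector burnup step over a composition dict
    # {isotope: number density}.
    def burnup_step(n0, dt, transport, deplete):
        rates0 = transport(n0)             # predictor: reaction rates at step start
        n_pred = deplete(n0, rates0, dt)   # deplete with start-of-step rates
        rates1 = transport(n_pred)         # corrector: rates at predicted end state
        n_corr = deplete(n0, rates1, dt)   # deplete again with end-of-step rates
        # averaging damps the error of evaluating rates at only one end of the step
        return {iso: 0.5 * (n_pred[iso] + n_corr[iso]) for iso in n0}
    ```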

  6. MCOR - Monte Carlo depletion code for reference LWR calculations

    International Nuclear Information System (INIS)

    Puente Espel, Federico; Tippayakul, Chanatip; Ivanov, Kostadin; Misu, Stefan

    2011-01-01

    Research highlights: → Introduction of a reference Monte Carlo based depletion code with extended capabilities. → Verification and validation results for MCOR. → Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. - Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability by coupling the LANL Monte Carlo code MCNP5 with the AREVA NP depletion code KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similarly sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes, with usage of the KORIGEN libraries for typical PWR and BWR spectra for the remaining isotopes. Besides the capabilities just mentioned, the newest enhancements of the MCOR code focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, just to name the most important. The article describes the capabilities of the MCOR code system, from its design and development to its latest improvements and further ameliorations

  7. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  8. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe

  9. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for the knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates a document analysis method and an ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently

  10. Quantum statistical Monte Carlo methods and applications to spin systems

    International Nuclear Information System (INIS)

    Suzuki, M.

    1986-01-01

    A short review is given concerning the quantum statistical Monte Carlo method based on the equivalence theorem that d-dimensional quantum systems are mapped onto (d+1)-dimensional classical systems. The convergence property of this approximate transformation is discussed in detail. Some applications of this general approach to quantum spin systems are reviewed. A new Monte Carlo method, the ''thermo field Monte Carlo method,'' is presented, which is an extension of the projection Monte Carlo method at zero temperature to that at finite temperatures
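
    The equivalence theorem referred to rests on the Suzuki-Trotter decomposition; for a Hamiltonian split into two (generally non-commuting) parts, H = H₁ + H₂, the partition function can be written as

    ```latex
    Z = \operatorname{Tr} e^{-\beta (H_1 + H_2)}
      = \lim_{n \to \infty} \operatorname{Tr} \left( e^{-\beta H_1 / n} \, e^{-\beta H_2 / n} \right)^{n}
    ```

    Truncating at a finite Trotter number n yields the (d+1)-dimensional classical system, the extra dimension having n slices; the quality of this truncation is exactly the convergence property discussed in the record.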

  11. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems which intermix two kinds of components: discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances which cause the state variables of the system to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on cooperation between two disciplines, control engineering and computer science, and divides the design into control loops and decision loops. The external behavior of the control loops is specified in a notation which is understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of the control loops. The verification of the system requirements can be done by computing scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as the formal notation for specifying the control loops and designing the decision loops

  12. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer-based system). These software modules can be either own-design systems or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, together with the advantages of the preliminary validation and verification plan

  13. Development of a practical Monte Carlo based fuel management system for the Penn State University Breazeale Research Reactor (PSBR)

    International Nuclear Information System (INIS)

    Tippayakul, Chanatip; Ivanov, Kostadin; Frederick Sears, C.

    2008-01-01

    A practical fuel management system for the Pennsylvania State University Breazeale Research Reactor (PSBR), based on an advanced Monte Carlo methodology, was developed from the existing fuel management tool in this research. Several modeling improvements were implemented in the old system. The improved fuel management system can now utilize burnup-dependent cross section libraries generated specifically for PSBR fuel, and it is also able to update the cross sections of these libraries automatically through the Monte Carlo calculation. Considerations were given to balancing the computation time and the accuracy of the cross section update; thus, only a limited number of isotopes of certain types, which are considered 'important', are calculated and updated by the scheme. Moreover, the depletion algorithm of the existing fuel management tool was changed from a predictor-only to a predictor-corrector depletion scheme to account more accurately for burnup spectrum changes during the burnup step. An intermediate verification of the fuel management system was performed to assess the correctness of the newly implemented schemes against HELIOS. It was found that the agreement of both codes is good when the same energy released per fission (Q values) is used. Furthermore, to be able to model the reactor at various temperatures, the fuel management tool automatically utilizes the continuous cross sections generated at different temperatures. Other useful capabilities were also added to the fuel management tool to make it easy to use and practical. As part of the development, a hybrid nodal diffusion/Monte Carlo calculation was devised to speed up the Monte Carlo calculation by providing a more converged initial source distribution for the Monte Carlo calculation from the nodal diffusion calculation. Finally, the fuel management system was validated against measured data using several actual PSBR core loadings. The agreement of the predicted core

  14. Verification and Validation Issues in Systems of Systems

    Directory of Open Access Journals (Sweden)

    Eric Honour

    2013-11-01

    Full Text Available The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  15. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  16. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans, which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communications in Medicine) based toolbox, developed for the evaluation and verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore, the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.
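
    The toolbox itself is not reproduced here, but reading the DICOM-RT objects such a tool operates on can be sketched with the pydicom library; the file names are hypothetical:

    ```python
    # Read an RT Plan and an RT Dose object with pydicom (illustrative sketch).
    import pydicom

    plan = pydicom.dcmread("rtplan.dcm")
    for beam in plan.BeamSequence:                   # one entry per treatment beam
        print(beam.BeamName, beam.NumberOfControlPoints)

    dose = pydicom.dcmread("rtdose.dcm")
    dose_gy = dose.pixel_array * float(dose.DoseGridScaling)   # scale grid to Gy
    print(dose_gy.shape, dose_gy.max())
    ```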

  17. A virtual-accelerator-based verification of a Monte Carlo dose calculation algorithm for electron beam treatment planning in homogeneous phantoms

    International Nuclear Information System (INIS)

    Wieslander, Elinore; Knoeoes, Tommy

    2006-01-01

    By introducing Monte Carlo (MC) techniques to the verification procedure of dose calculation algorithms in treatment planning systems (TPSs), problems associated with conventional measurements can be avoided and properties that are considered unmeasurable can be studied. The aim of the study is to implement a virtual accelerator, based on MC simulations, to evaluate the performance of a dose calculation algorithm for electron beams in a commercial TPS. The TPS algorithm is MC based and the virtual accelerator is used to study the accuracy of the algorithm in water phantoms. The basic test of the implementation of the virtual accelerator is successful for 6 and 12 MeV (γ < 1.0, 0.02 Gy/2 mm). For 18 MeV, there are problems in the profile data for some of the applicators, where the TPS underestimates the dose. For fields equipped with patient-specific inserts, the agreement is generally good. The exception is 6 MeV where there are slightly larger deviations. The concept of the virtual accelerator is shown to be feasible and has the potential to be a powerful tool for vendors and users

  18. NPP Temelin instrumentation and control system upgrade and verification

    International Nuclear Information System (INIS)

    Ubra, O.; Petrlik, J.

    1998-01-01

    Two units of the VVER 1000 type at the Czech nuclear power plant Temelin, which are under construction, are being upgraded with the latest instrumentation and control system delivered by WEC. To confirm that the functional designs of the new Reactor Control and Limitation System, Turbine Control System and Plant Control System are in compliance with the Czech customer's requirements, and that these requirements are compatible with the upgraded NPP Temelin technology, verification of the control systems has been performed. The method of transient analysis has been applied. Some details of the NPP Temelin Reactor Control and Limitation System verification are presented. (author)

  19. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  20. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  1. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering that automates the verification and validation of on-board satellite software. The process has been implemented in the software control unit of an energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.

  2. SU-E-J-82: Intra-Fraction Proton Beam-Range Verification with PET Imaging: Feasibility Studies with Monte Carlo Simulations and Statistical Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Lou, K [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Rice University, Houston, TX (United States); Mirkovic, D; Sun, X; Zhu, X; Poenisch, F; Grosshans, D; Shao, Y [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Clark, J [Rice University, Houston, TX (United States)

    2014-06-01

    Purpose: To study the feasibility of intra-fraction proton beam-range verification with PET imaging. Methods: Two homogeneous cylindrical PMMA phantoms (290 mm axial length; 38 mm and 200 mm diameter, respectively) were studied using PET imaging: a small phantom using a mouse-sized PET scanner (61 mm diameter field of view (FOV)) and a larger phantom using a human brain-sized PET scanner (300 mm FOV). Monte Carlo (MC) simulations (MCNPX and GATE) were used to simulate 179.2 MeV proton pencil beams irradiating the two phantoms and being imaged by the two PET systems. A total of 50 simulations were conducted to generate 50 positron activity distributions and, correspondingly, 50 measured activity ranges. The accuracy and precision of these activity ranges were calculated under different conditions (including count statistics and other factors, such as crystal cross-section). Separately from the MC simulations, an activity distribution measured from a simulated PET image was modeled as a noiseless positron activity distribution corrupted by Poisson counting noise. The results from these two approaches were compared to assess the impact of count statistics on the accuracy and precision of activity-range calculations. Results: MC simulations show that the accuracy and precision of an activity range are dominated by the number (N) of coincidence events in the reconstructed image. They improve with the number of events, the uncertainty decreasing in proportion to 1/sqrt(N), which can be understood from the statistical modeling. MC simulations also indicate that the coincidence events acquired within the first 60 seconds with 10⁹ protons (small phantom) and 10¹⁰ protons (large phantom) are sufficient to achieve both sub-millimeter accuracy and precision. Conclusion: Under the current MC simulation conditions, this initial study indicates that the accuracy and precision of beam-range verification are dominated by count statistics, and intra-fraction PET image-based beam-range verification is
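
    The 1/sqrt(N) behaviour can be reproduced with a toy version of the statistical model described: corrupt a noiseless 1D activity profile with Poisson noise and look at the spread of an estimated falloff position over repeated trials (all shapes and numbers are illustrative):

    ```python
    # Precision of a 50%-falloff range estimate vs. number of counts.
    import numpy as np

    rng = np.random.default_rng(0)
    z = np.linspace(0.0, 200.0, 401)                     # depth (mm)
    profile = 1.0 / (1.0 + np.exp((z - 150.0) / 2.0))    # falloff near 150 mm

    def range_est(counts):
        s = np.convolve(counts, np.ones(9) / 9, mode="same")  # light smoothing
        half = 0.5 * s[:200].mean()                           # half the plateau level
        return z[np.argmin(np.abs(s - half))]

    for n_events in (1e4, 1e6):
        scale = n_events / profile.sum()
        ranges = [range_est(rng.poisson(profile * scale)) for _ in range(200)]
        print(n_events, np.std(ranges))   # spread shrinks roughly as 1/sqrt(N)
    ```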

  3. Beam intensity scanner system for three dimensional dose verification of IMRT

    International Nuclear Information System (INIS)

    Vahc, Young W.; Kwon, Ohyun; Park, Kwangyl; Park, Kyung R.; Yi, Byung Y.; Kim, Keun M.

    2003-01-01

    Patient dose verification is clinically one of the most important parts of treatment delivery in radiation therapy. The three-dimensional (3D) reconstruction of the dose distribution delivered to the target volume helps to verify the patient dose and determine the physical characteristics of the beams used in IMRT. Here we present the beam intensity scanner (BInS) system for the pre-treatment dosimetric verification of two-dimensional photon intensity. The BInS is a radiation detector with custom-made software for the dose conversion of fluorescence signals from a scintillator. The scintillator is used to produce fluorescence from the irradiation of 6 MV photons on a Varian Clinac 21EX. The digitized fluoroscopic signals obtained by the digital video camera-based scintillator (DVCS) are processed by our custom-made software to reproduce 3D relative dose distributions. For the intensity modulated beam (IMB), the BInS calculates the absorbed dose in absolute beam fluence, which is used for the patient dose distribution. Using the BInS, we performed various measurements related to IMRT and found the following: (1) The 3D dose profiles of the IMBs measured by the BInS demonstrate good agreement with radiographic film, a pin-type ionization chamber and Monte Carlo simulation. (2) The delivered beam intensity is altered by the mechanical and dosimetric properties of the collimation of the dynamic and/or step MLC system. This is mostly due to leaf transmission, leaf penumbra, photons scattered from the rounded edges of the leaves, and the leaf geometry. (3) The delivered dose depends on the operational details of how the multileaf opening is formed. These phenomena result in a fluence distribution that can be substantially different from the initial, calculated intensity modulation and should therefore be taken into account in treatment planning for accurate calculation of the dose delivered to the target volume in IMRT. (author)

  4. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    International Nuclear Information System (INIS)

    Merillat, P.D.

    1979-03-01

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared to a central computer driven system are detailed
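
    A modern analogue of the badge scheme (hedged: the 1979 design, its cipher and its field layout are not detailed in the record, so everything below is an invented illustration of the idea) stores the holder's identity together with a keyed message authentication code on the stripe; any stand-alone terminal holding the secret key can then verify the badge offline:

        import hmac, hashlib

        TERMINAL_KEY = b"key-provisioned-to-each-terminal"   # hypothetical shared secret

        def issue_badge(holder_id: str) -> str:
            tag = hmac.new(TERMINAL_KEY, holder_id.encode(), hashlib.sha256).hexdigest()
            return f"{holder_id}|{tag}"                      # data written to the stripe

        def verify_badge(stripe_data: str) -> bool:
            holder_id, tag = stripe_data.rsplit("|", 1)
            expected = hmac.new(TERMINAL_KEY, holder_id.encode(), hashlib.sha256).hexdigest()
            return hmac.compare_digest(tag, expected)        # no central computer required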

  5. Towards Verification of Constituent Systems through Automated Proof

    DEFF Research Database (Denmark)

    Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R

    2014-01-01

    This paper explores verification of constituent systems within the context of the Symphony tool platform for Systems of Systems (SoS). Our SoS modelling language, CML, supports various contractual specification elements, such as state invariants and operation preconditions, which can be used to specify contractual obligations on the constituent systems of a SoS. To support verification of these obligations we have developed a proof obligation generator and theorem prover plugin for Symphony. The latter uses the Isabelle/HOL theorem prover to automatically discharge the proof obligations arising from a CML model. Our hope is that the resulting proofs can then be used to formally verify the conformance of each constituent system, which in turn would result in a dependable SoS.

  6. Spectral history model in DYN3D: Verification against coupled Monte-Carlo thermal-hydraulic code BGCore

    International Nuclear Information System (INIS)

    Bilodid, Y.; Kotlyar, D.; Margulis, M.; Fridman, E.; Shwageraus, E.

    2015-01-01

    Highlights: • A Pu-239 based spectral history method was tested on a 3D BWR single-assembly case. • Burnup of a BWR fuel assembly was performed with the nodal code DYN3D. • The reference solution was obtained with the coupled Monte Carlo thermal-hydraulic code BGCore. • The proposed method accurately reproduces the moderator density history effect for the BWR test case. - Abstract: This research focuses on the verification of a recently developed methodology accounting for spectral history effects in 3D full-core nodal simulations. The traditional deterministic core simulation procedure includes two stages: (1) generation of homogenized macroscopic cross-section sets and (2) application of these sets to obtain a full 3D core solution with nodal codes. The standard approach adopts the branch methodology, in which the branches represent all expected combinations of operational conditions as a function of burnup (main branch). The main branch is produced for constant, usually averaged, operating conditions (e.g. coolant density). As a result, the spectral history effects associated with coolant density variation are not taken into account properly. A number of methods to solve this problem (such as micro-depletion and spectral indices) have been developed and implemented in modern nodal codes. Recently, we proposed a new and robust method to account for history effects. The methodology was implemented in DYN3D and involves modification of the few-group cross-section sets. The method utilizes the local Pu-239 concentration as an indicator of spectral history. The method was verified for PWR and VVER applications. However, the spectrum variation in a BWR core is more pronounced due to the stronger coolant density change. The purpose of the current work is to investigate the applicability of the method to BWR analysis. The proposed methodology was verified against the recently developed BGCore system, which couples Monte Carlo neutron transport with depletion and thermal-hydraulic solvers and...
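
    Schematically (the linear form and the coefficient below are placeholders for illustration, not the published DYN3D correction), using the local Pu-239 concentration as the history index amounts to adjusting each few-group cross section away from its nominal-history value:

        def corrected_xs(sigma_base, pu9_local, pu9_nominal, alpha=0.1):
            """Adjust a base-history few-group cross section for local spectral history.

            alpha is an invented sensitivity coefficient; in practice it would be
            fitted per cross section and burnup point from lattice calculations.
            """
            history_index = (pu9_local - pu9_nominal) / pu9_nominal
            return sigma_base * (1.0 + alpha * history_index)

        # Example: a node with a harder spectral history built up 5% more Pu-239
        # than assumed in the nominal-history lattice calculation.
        sigma = corrected_xs(sigma_base=0.0425, pu9_local=1.05e-4, pu9_nominal=1.0e-4)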

  7. Dense time discretization technique for verification of real time systems

    International Nuclear Information System (INIS)

    Makackas, Dalius; Miseviciene, Regina

    2016-01-01

    When verifying real-time systems, there are two different models for handling time: discrete and dense time based models. This paper proposes a novel verification technique, which calculates discrete time intervals from dense time in order to create all the system states that can be reached from the initial system state. The technique is designed for real-time systems specified by a piece-linear aggregate approach. Key words: real-time system, dense time, verification, model checking, piece-linear aggregate

  8. Using Monte Carlo Simulation To Improve Cargo Mass Estimates For International Space Station Commercial Resupply Flights

    Science.gov (United States)

    2016-12-01

    EXECUTIVE SUMMARY: Resupplying the International Space Station ... management priorities. This study addresses those challenges by developing Monte Carlo simulations based on over 13 years of as-flown ISS resupply ...
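
    A minimal sketch of the kind of bootstrap Monte Carlo such a study implies (the item masses below are random placeholders, not the 13 years of as-flown history) resamples historical per-item masses to build a distribution of total manifest mass:

        import numpy as np

        rng = np.random.default_rng(1)
        historical_item_masses = rng.lognormal(mean=3.0, sigma=0.8, size=500)  # kg, fake data

        n_trials, items_per_flight = 10_000, 60
        totals = rng.choice(historical_item_masses,
                            size=(n_trials, items_per_flight)).sum(axis=1)

        lo, med, hi = np.percentile(totals, [5, 50, 95])
        print(f"manifest mass: median {med:.0f} kg, 90% interval [{lo:.0f}, {hi:.0f}] kg")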

  9. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  10. Applicability of quasi-Monte Carlo for lattice systems

    International Nuclear Information System (INIS)

    Ammon, Andreas; Deutsches Elektronen-Synchrotron; Hartung, Tobias; Jansen, Karl; Leovey, Hernan; Griewank, Andreas; Mueller-Preussker, Michael

    2013-11-01

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^(-1/2), where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^(-1), or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.

  11. Applicability of quasi-Monte Carlo for lattice systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Physics; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Hartung, Tobias [King' s College London (United Kingdom). Dept. of Mathematics; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leovey, Hernan; Griewank, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Mathematics; Mueller-Preussker, Michael [Berlin Humboldt-Univ. (Germany). Dept. of Physics

    2013-11-15

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^(-1/2), where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^(-1), or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.
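
    The scaling contrast described in these two records can be reproduced on a toy integral with scipy's quasi-Monte Carlo engines (a sketch under the assumption that a smooth 1-D integrand is representative enough to show the trend; lattice observables are far harder):

        import numpy as np
        from scipy.stats import qmc

        rng = np.random.default_rng(2)
        exact = 1.0 / 3.0                                # integral of x^2 on [0, 1]
        for m in (8, 12, 16):                            # N = 2^m samples
            n = 2 ** m
            mc = rng.random(n)                           # pseudo-random points
            ql = qmc.Sobol(d=1, scramble=True, seed=2).random_base2(m)[:, 0]
            print(f"N=2^{m:2d}: MC error {abs((mc**2).mean() - exact):.1e}, "
                  f"QMC error {abs((ql**2).mean() - exact):.1e}")

    The plain Monte Carlo error falls off as N^(-1/2), while the scrambled Sobol estimate falls off close to N^(-1), mirroring the improvement reported for the oscillator observables.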

  12. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    The Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, the document submission requirements at each stage, the V and V activities, the checklists used for reviews at each stage, and the reports

  13. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support of the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...

  14. Verification and Validation for Flight-Critical Systems (VVFCS)

    Science.gov (United States)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small & large industry (47%) and government agencies (27%).

  15. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
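
    The functional core that any TMR insertion must preserve is the 2-of-3 majority vote on each protected signal. A minimal reference model of that vote (illustrative only; the paper's tools operate on FPGA netlists, not Python):

        def majority(a: int, b: int, c: int) -> int:
            """Bitwise 2-of-3 vote: a single corrupted copy is out-voted."""
            return (a & b) | (a & c) | (b & c)

        assert majority(1, 1, 0) == 1                      # upset on one copy is masked
        assert majority(0, 1, 0) == 0
        assert majority(0b1010, 0b1010, 0b0110) == 0b1010  # works bit-by-bit on words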

  16. Compositional Verification of Multi-Station Interlocking Systems

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth

    2016-01-01

    ... pose a big challenge to current verification methodologies, due to the explosion of state space size as soon as large, if not medium-sized, multi-station systems have to be controlled. For these reasons, verification techniques that exploit locality principles related to the topological layout of the controlled system to split the state space in different ways have been investigated. In particular, compositional approaches divide the controlled track network into regions that can be verified separately, once proper assumptions are considered on the way the pieces are glued together. Building on a successful method to verify networks of rather large size, we propose a compositional approach that is particularly suitable to address multi-station interlocking systems which control a whole line composed of stations linked by mainline tracks. Indeed, it turns out that for such networks, and for the adopted...

  17. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    In this paper a prototype of a Requirements Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and interfaces with documents was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  18. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of a Requirements Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and interfaces with documents was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)
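
    At its core, the requirement-tracking function these two records describe is a coverage check over a traceability mapping. A toy sketch (the data model and all identifiers are invented for illustration):

        requirements = {"REQ-001": "trip on high neutron flux",
                        "REQ-002": "log all operator actions"}
        design_trace = {"DES-17": ["REQ-001"],   # design item -> requirements it implements
                        "DES-21": []}

        covered = {req for reqs in design_trace.values() for req in reqs}
        for req_id, text in requirements.items():
            status = "traced" if req_id in covered else "NOT traced"
            print(f"{req_id} ({text}): {status}")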

  19. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to...

  20. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    A seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for making information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  1. Monte Carlo simulations for stereotactic radiotherapy system with various kilo-voltage x-ray energy

    International Nuclear Information System (INIS)

    Deloar, H.M.; Kunieda, E.; Kawase, T.; Kubo, Atsushi; Saitoh, H.; Myojoyama, A.; Ozaki, M.; Fujisaki, T.; Saito, K.

    2005-01-01

    Stereotactic radiotherapy (SRT) of lung tumors with a narrow and precise medium-energy x-ray beam, in which the homogeneous high-dose area is confined within the tumor, is desirable. A conventional x-ray CT operating at medium x-ray energy has been modified to develop a radiotherapy system for lung SRT. A cylindrical tungsten collimator (0.3 cm diameter) was introduced to collimate the x-ray beam. The system was simulated with the BEAMnrc (EGS4) Monte Carlo code, and various x-ray energy spectra were generated to investigate the dose distributions of our kilo-voltage SRT system. Experiments were performed to acquire the energy spectra of 100, 120 and 135 kVp (kilo-voltage peak) from CT measurements, and those results were compared with the spectra obtained from Monte Carlo simulations. Verification of the percentage depth dose (PDD) for 120 and 147.5 kVp was performed in a water phantom with experiments and Monte Carlo simulations. Finally, dose distributions for 120, 135, 147.5, 200, 250, 300, 350, 400 and 500 kVp spectra were investigated in a lung phantom and a human lung. The PDD in the water phantom calculated from the experimental and simulated spectra of 120 and 147.5 kVp showed good agreement with each other; the PDD of the 147.5 and 120 kVp spectra at 9 cm depth was approximately 10% and 9%, respectively. Dose distributions around the lung tumor in the phantom and the human lung were almost uniform for all x-ray energies, but in the human lung the dose absorption at the ribs was more than 35% for energies below 135 kVp and less than 30% for spectra of 147.5 kVp and above; this absorption gradually decreases with increasing x-ray energy. The uniform dose distributions in the lung region of the human and thorax phantom demonstrated the feasibility of an SRT system with medium-energy x-rays. A detailed performance evaluation of this system as a kilo-voltage conformal radiotherapy system is under investigation. (author)

  2. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  3. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  4. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01

  5. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  6. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  7. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  8. MCNP5 development, verification, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Laboratory (United States)]

    2003-07-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  9. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Drawing on decades of experience in developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  10. Formal verification of automated teller machine systems using SPIN

    Science.gov (United States)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.

  11. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring has to happen using a non-intrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
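
    The two error classes reported above can be caught by a very small sequence-number monitor attached to the message stream. A sketch (the field names and single-stream assumption are mine, not MESA's API):

        def check_stream(messages):
            """Yield (finding, message) pairs for duplicates and out-of-order arrivals."""
            seen, high = set(), None
            for msg in messages:
                seq = msg["seq"]
                if seq in seen:
                    yield ("duplicate", msg)
                elif high is not None and seq < high:
                    yield ("out-of-order", msg)
                seen.add(seq)
                high = seq if high is None else max(high, seq)

        findings = list(check_stream([{"seq": 1}, {"seq": 3}, {"seq": 2}, {"seq": 3}]))
        # -> [("out-of-order", {"seq": 2}), ("duplicate", {"seq": 3})]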

  12. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Directory of Open Access Journals (Sweden)

    Xinyan Gao

    2013-01-01

    We propose a verification solution based on the characteristic set of Wu's method for SystemVerilog assertion checking over digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on algebraic representations using the characteristic set of a polynomial system. This symbolic algebraic approach is a useful supplement to existing verification methods based on simulation.
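
    The flavor of the algebraic approach, reduced far below Wu's full characteristic-set machinery, is that gates become polynomials and an assertion holds when a difference polynomial vanishes identically. A simplified illustration with sympy (De Morgan's law; on {0,1} inputs NOT(x) = 1-x, AND(x,y) = xy, OR(x,y) = x+y-xy):

        import sympy as sp

        x, y = sp.symbols("x y")
        NOT = lambda a: 1 - a
        AND = lambda a, b: a * b
        OR = lambda a, b: a + b - a * b

        lhs = NOT(AND(x, y))               # ~(x & y)
        rhs = OR(NOT(x), NOT(y))           # ~x | ~y
        assert sp.expand(lhs - rhs) == 0   # holds identically, so for all 0/1 inputs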

  13. Verification tests for remote controlled inspection system in nuclear power plants

    International Nuclear Information System (INIS)

    Kohno, Tadaaki

    1986-01-01

    Following the increase in the number of nuclear power plants, the total radiation exposure dose accompanying inspection and maintenance work has tended to increase. Japan Power Engineering and Inspection Corp. carried out verification tests of practical power reactor automatic inspection systems from November 1981 to March 1986, and this report describes how these verification tests were carried out. The objects of the verification tests were equipment that is urgently required for reducing radiation exposure dose, that has a high possibility of realization, and that is important for ensuring the safety and reliability of plants: namely, an automatic ultrasonic flaw detector for the welded parts of bend pipes, an automatic disassembling and inspection system for control rod drive mechanisms, an automatic fuel inspection system, and automatic decontamination equipment for steam generator water chambers, primary system crud and radioactive gas in coolant. The results of the verification tests of this equipment were judged satisfactory; therefore, application to actual plants is possible. (Kako, I.)

  14. Research on Monte Carlo improved quasi-static method for reactor space-time dynamics

    International Nuclear Information System (INIS)

    Xu Qi; Wang Kan; Li Shirui; Yu Ganglin

    2013-01-01

    With large time steps, the improved quasi-static (IQS) method can speed up reactor dynamics simulations. The Monte Carlo IQS method is proposed in this paper, combining the advantages of both the IQS and MC methods; it is therefore well suited to solving space-time dynamics problems of new-concept reactors. Based on IQS theory, Monte Carlo algorithms for calculating the adjoint neutron flux, the reactor kinetic parameters and the shape function were designed and implemented. A simple Monte Carlo IQS code and a corresponding diffusion IQS code were developed and used for verification of the Monte Carlo IQS method. (authors)
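
    For readers unfamiliar with IQS, the method rests on the standard factorization of the flux into a rapidly varying amplitude and a slowly varying shape (the textbook form; the record itself does not spell it out):

        \phi(\vec{r}, E, t) = A(t)\, \psi(\vec{r}, E, t),
        \qquad
        \frac{d}{dt} \int \frac{\phi^{\dagger}(\vec{r}, E)\, \psi(\vec{r}, E, t)}{v(E)} \, dV \, dE = 0,

    where the amplitude A(t) is advanced with point-kinetics-like equations on fine time steps, the shape \psi is recomputed (here by Monte Carlo) only on coarse steps, and the adjoint flux \phi^{\dagger} provides the weighting for the kinetic parameters, which is why the record's algorithm for the adjoint neutron flux is needed.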

  15. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
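
    The forward scheme described above is easy to sketch for the simplest case, a single repairable component with constant failure and repair rates (a two-state continuous-time Markov model; all rates and times below are placeholders):

        import numpy as np

        rng = np.random.default_rng(3)
        lam, mu = 1e-3, 1e-1                 # failure and repair rates, per hour
        t_mission, n_hist = 1000.0, 20_000   # mission time (h) and number of histories

        down_time = 0.0
        for _ in range(n_hist):
            t, up = 0.0, True
            while True:
                dt = rng.exponential(1.0 / (lam if up else mu))  # sojourn in current state
                if t + dt >= t_mission:
                    down_time += 0.0 if up else t_mission - t
                    break
                if not up:
                    down_time += dt
                t, up = t + dt, not up

        print(f"mean unavailability ~ {down_time / (n_hist * t_mission):.4f}")
        # steady-state value for comparison: lam / (lam + mu) ~ 0.0099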

  16. Expert system verification and validation for nuclear power industry applications

    International Nuclear Information System (INIS)

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  17. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  18. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  19. A Cache System Design for CMPs with Built-In Coherence Verification

    Directory of Open Access Journals (Sweden)

    Mamata Dalui

    2016-01-01

    This work reports an effective design of a cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing a directory-based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA, referred to as single length cycle 2-attractor cellular automata (TACA), has been planted to detect the inconsistencies in cache line states of processors’ private caches. The TACA module captures the coherence status of the CMPs’ cache system and memorizes any inconsistent recording of the cache line states during the processors’ references to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then to settle to an attractor state, enabling a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs’ processor pool ensures better efficiency in determining the inconsistencies, by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic points to the fact that the overhead of the proposed coherence verification module is much less than that of conventional verification units and is insignificant with respect to the cost involved in the CMPs’ cache system.

  20. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application and overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and on our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle heart rate changes during sport. To our knowledge, this is the first work to address the issue of sport in ECG verification.
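
    A hedged sketch of a mean-interval style check (the record does not give the algorithm, so the feature, the tolerance, and the per-state templates below are all assumptions): verify a probe recording against the enrolled mean R-R interval, keeping one template per heart-rate band ("multiple states") so that exercise does not break the match:

        import numpy as np

        def mean_rr(r_peak_times_s):
            """Mean R-R interval (seconds) from detected R-peak times."""
            return float(np.mean(np.diff(r_peak_times_s)))

        templates = {"rest": 0.85, "sport": 0.48}   # enrolled mean R-R per state, seconds

        def verify(r_peak_times_s, tol=0.06):
            probe = mean_rr(r_peak_times_s)
            state = min(templates, key=lambda k: abs(templates[k] - probe))
            return state, abs(templates[state] - probe) <= tol

        print(verify([0.0, 0.86, 1.70, 2.56, 3.41]))   # -> ('rest', True)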

  1. The inverse method parametric verification of real-time embedded systems

    CERN Document Server

    André, Etienne

    2013-01-01

    This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata. The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solve...

  2. Rule Systems for Runtime Verification: A Short Tutorial

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.

  3. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG 1-1999, the Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly demonstrate that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
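
    The 'pixel equivalent' is simply the millimetre-per-pixel scale obtained by imaging a reference length; once known, detected graduation spacings convert directly to millimetres. A sketch (the reference artefact and all numbers are assumptions):

        def pixel_equivalent(known_length_mm, measured_length_px):
            """mm-per-pixel scale from an imaged reference length."""
            return known_length_mm / measured_length_px

        k = pixel_equivalent(known_length_mm=10.0, measured_length_px=2874.0)
        graduation_px = 287.5                        # detected edge-to-edge spacing
        print(f"graduation spacing: {graduation_px * k:.4f} mm")   # expect ~1 mm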

  4. VALGALI: VERIFICATION AND VALIDATION TOOL FOR THE LIBRARIES PRODUCED BY THE GALILEE SYSTEM

    International Nuclear Information System (INIS)

    Mengelle, S.

    2011-01-01

    Full text: In this paper we present VALGALI, the verification and validation tool for the libraries produced by the nuclear data processing system GALILEE. The aim of this system is to provide libraries with consistent physical data for various application codes (the deterministic transport code APOLLO2, the Monte Carlo transport code TRIPOLI-4, the depletion code DARWIN, ...). For each library: are the data stored in the right place, in the right format, and so on? Are the libraries used by the various codes consistent? What is the physical quality of the cross sections and other data present in the libraries? These three types of tests correspond to the classic stages of V and V. The great strength of VALGALI is that it is generic and not dedicated to one application code. Consequently, it is based on a common physical validation database whose coverage is regularly increased. For all these test cases, the input data are adapted for each relevant application code. Moreover, specific test cases can exist for each application code. At present, VALGALI can check and validate the libraries of APOLLO2 and TRIPOLI-4, but in the near future it will also treat the libraries of DARWIN.

  5. RapidArc treatment verification in 3D using polymer gel dosimetry and Monte Carlo simulation

    DEFF Research Database (Denmark)

    Ceberg, Sofie; Gagne, Isabel; Gustafsson, Helen

    2010-01-01

    The aim of this study was to verify the advanced inhomogeneous dose distribution produced by a volumetric arc therapy technique (RapidArc™) using 3D gel measurements and Monte Carlo (MC) simulations. The TPS (treatment planning system)-calculated dose distribution was compared with gel measurements...

  6. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  7. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    The problem of validation and verification of correctness of present-day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  8. Application of OMEGA Monte Carlo codes for radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Ayyangar, Komanduri M.; Jiang, Steve B.

    1998-01-01

    The accuracy of conventional dose algorithms for radiosurgery treatment planning is limited, due to the inadequate consideration of the lateral radiation transport and the difficulty of acquiring accurate dosimetric data for very small beams. In the present paper, some initial work on the application of the Monte Carlo method in radiation treatment planning in general, and in radiosurgery treatment planning in particular, has been presented. Two OMEGA Monte Carlo codes, BEAM and DOSXYZ, are used. The BEAM code is used to simulate the transport of particles in the linac treatment head and radiosurgery collimator. A phase space file is obtained from the BEAM simulation for each collimator size. The DOSXYZ code is used to calculate the dose distribution in the patient's body reconstructed from CT slices, using the phase space file as input. The accuracy of the OMEGA Monte Carlo simulation for radiosurgery dose calculation is verified by comparing the calculated and measured basic dosimetric data for several radiosurgery beams and a 4 x 4 cm² conventional beam. The dose distributions for three clinical cases are calculated using the OMEGA codes as the dose engine for an in-house developed radiosurgery treatment planning system. The verification using basic dosimetric data and the dose calculation for clinical cases demonstrate the feasibility of applying the OMEGA Monte Carlo code system to radiosurgery treatment planning. (author)

  9. TLM.open: a SystemC/TLM Frontend for the CADP Verification Toolbox

    Directory of Open Access Journals (Sweden)

    Claude Helmstetter

    2014-04-01

    SystemC/TLM models, which are C++ programs, allow the simulation of embedded software before hardware low-level descriptions are available and are used as golden models for hardware verification. The verification of SystemC/TLM models is an important issue, since an error in the model can mislead the system designers or reveal an error in the specifications. An open-source simulator for SystemC/TLM is provided, but there are no tools for formal verification. In order to apply model checking to a SystemC/TLM model, a semantics for standard C++ code and for specific SystemC/TLM features must be provided. The usual approach relies on the translation of the SystemC/TLM code into a formal language for which a model checker is available. We propose another approach that suppresses the error-prone translation effort. Given a SystemC/TLM program, the transitions are obtained by executing the original code using g++ and an extended SystemC library, and we ask the user to provide additional functions to store the current model state. These additional functions generally represent less than 20% of the size of the original model, and allow all CADP verification tools to be applied to the SystemC/TLM model itself.

  10. A New Approach to Nuclear Warhead Verification Using a Zero-Knowledge Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Glaser, Alexander [Princeton University]

    2012-05-16

    Warhead verification systems proposed to date fundamentally rely on the use of information barriers to prevent the release of classified design information. Measurements with information barriers significantly increase the complexity of inspection systems, make their certification and authentication difficult, and may reduce the overall confidence in the verifiability of future arms-control agreements. This talk presents a proof-of-concept of a new approach to nuclear warhead verification that minimizes the role of information barriers from the outset and envisions instead an inspection system that a priori avoids leakage of sensitive information using a so-called zero-knowledge protocol. The proposed inspection system is based on the template-matching approach and relies on active interrogation of a test object with 14-MeV neutrons. The viability of the method is examined with MCNP Monte Carlo neutron transport calculations modeling the experimental setup, an investigation of different diversion scenarios, and an analysis of the simulated data showing that it does not contain information about the properties of the inspected object.

  11. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C system design, development, verification and validation (V and V) are discussed, considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic designs for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, as used in FPGA-based reactor trip systems, is proposed. The results of applying test-based techniques to the assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  12. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project aimed at the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check that the calibration of their instruments is maintained at the installations where they are used, avoiding the use of equipment with an inadequate response to the neutron beam

  13. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    Science.gov (United States)

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  14. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  15. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  16. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are widely used devices in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  17. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of space flight systems.

  18. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    Kumar, Ranajit; Upreti, Anil; Mahaptra, U.; Bhattacharya, S.; Srivastava, G.P.

    2009-01-01

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details like name, designation, division, etc. All transactions are recorded in the Simputer with time and date for future reference. This system finds extensive application in mobile identity verification in nuclear or other industries. (author)

  19. Development of Monte Carlo decay gamma-ray transport calculation system

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Satoshi [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment; Kawasaki, Nobuo [Fujitsu Ltd., Tokyo (Japan); Kume, Etsuo [Japan Atomic Energy Research Inst., Center for Promotion of Computational Science and Engineering, Tokai, Ibaraki (Japan)

    2001-06-01

    In a DT fusion reactor, it is a critical concern to evaluate the decay gamma-ray biological dose rates after reactor shutdown accurately. For this purpose, a three-dimensional Monte Carlo decay gamma-ray transport calculation system has been developed by connecting a three-dimensional Monte Carlo particle transport calculation code and an induced activity calculation code. The developed calculation system consists of the following four functions. (1) The operational neutron flux distribution is calculated by the three-dimensional Monte Carlo particle transport calculation code. (2) The induced activities are calculated by the induced activity calculation code. (3) The decay gamma-ray source distribution is obtained from the induced activities. (4) The decay gamma-rays are generated using the decay gamma-ray source distribution, and the decay gamma-ray transport calculation is conducted by the three-dimensional Monte Carlo particle transport calculation code. In order to reduce the calculation time drastically, a biasing system for the decay gamma-ray source distribution has been developed, and this function is also included in the present system. In this paper, the outline and the details of the system and an execution example are reported. An evaluation of the effect of the biasing system is also reported. (author)
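
    The four-step coupling lends itself to a pipeline sketch. In the code below every function body is a stub with made-up numbers; in the real system, steps (1) and (4) are performed by the Monte Carlo transport code and step (2) by the induced activity code.

        # Skeleton of the four-step decay gamma-ray calculation; all numbers
        # and responses are illustrative stubs, not real physics.
        def neutron_transport(geometry):               # (1) operational neutron flux
            return {"cell_1": 1.0e14, "cell_2": 3.0e13}  # n/cm^2/s; geometry unused in this stub

        def induced_activity(flux, irradiation_s):     # (2) activation calculation
            return {c: 1.0e-10 * phi * irradiation_s for c, phi in flux.items()}

        def gamma_source(activity):                    # (3) decay gamma-ray source
            return {c: 2.0 * a for c, a in activity.items()}   # gammas/s

        def gamma_transport(source):                   # (4) Monte Carlo gamma dose
            return 1.0e-12 * sum(source.values())      # Sv/h, made-up response

        flux = neutron_transport("reactor_model")
        dose = gamma_transport(gamma_source(induced_activity(flux, 3.0e7)))
        print(f"decay gamma-ray dose rate ~ {dose:.3e} Sv/h (illustrative)")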

  20. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures; on the other hand, Prolog's non-determinism and backtracking ease testing of different variations of the program flow with little effort. A rule-based approach with Prolog allows the verification goals to be characterized in a concise and declarative way. In this paper, we describe our approach to verifying the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then tested against the verification goals. The different program-flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study in which we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain-specific language (DSL) in Prolog to express the verification goals.
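
    The fact encoding at the heart of this approach can be shown on a toy example. For consistency with the other sketches in this collection the generator below is written in Python rather than Prolog, and the miniature AST and predicate names (function/1, calls/2) are invented; the real system flattens full C++ ASTs.

        # Flatten a miniature AST into Prolog-style facts.
        ast = ("function", "task", [("call", "sem_take"), ("call", "sem_give")])

        def to_facts(node, parent=None):
            if node[0] == "function":
                _, name, body = node
                yield f"function({name})."
                for child in body:
                    yield from to_facts(child, parent=name)
            elif node[0] == "call":
                yield f"calls({parent}, {node[1]})."

        for fact in to_facts(ast):
            print(fact)   # function(task). calls(task, sem_take). calls(task, sem_give).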

  1. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Testing is the de-facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.

  2. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)
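
    A minimal sketch of how paired remeasurement data can be summarized: the mean of the facility-minus-inspector differences estimates the systematic error, and their scatter the random error. All numbers are invented.

        import statistics

        # Paired results for the same items: facility value vs. remeasurement.
        facility  = [10.02, 5.51, 7.98, 12.03, 9.47]   # kg, made up
        inspector = [10.00, 5.49, 8.01, 11.98, 9.50]   # kg, made up

        diffs = [f - i for f, i in zip(facility, inspector)]
        bias = statistics.mean(diffs)        # estimate of systematic error
        spread = statistics.stdev(diffs)     # estimate of random error
        print(f"systematic difference ~ {bias:+.3f} kg, scatter ~ {spread:.3f} kg")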

  3. Validation and verification of the ORNL Monte Carlo codes for nuclear safety analysis

    International Nuclear Information System (INIS)

    Emmett, M.B.

    1993-01-01

    The process of ensuring the quality of computer codes can be very time consuming and expensive. The Oak Ridge National Laboratory (ORNL) Monte Carlo codes all predate the existence of quality assurance (QA) standards and configuration control. The number of person-years and the amount of money spent on code development make it impossible to adhere strictly to all the current requirements. At ORNL, the Nuclear Engineering Applications Section of the Computing Applications Division is responsible for the development, maintenance, and application of the Monte Carlo codes MORSE and KENO. The KENO code is used for doing criticality analyses; the MORSE code, which has two official versions, CGA and SGC, is used for radiation transport analyses. Because KENO and MORSE were very thoroughly checked out over the many years of extensive use both in the United States and in the international community, the existing codes were "baselined." This means that the versions existing at the time the original configuration plan was written are considered to be validated and verified code systems based on the established experience with them

  4. Verification of tritium production evaluation procedure using Monte Carlo code MCNP for in-pile test of fusion blanket with JMTR

    International Nuclear Information System (INIS)

    Nagao, Y.; Nakamichi, K.; Tsuchiya, M.; Ishitsuka, E.; Kawamura, H.

    2000-01-01

    To evaluate accurately the total amount of tritium production in tritium breeding materials during in-pile tests with the JMTR, a 'tritium monitor' was produced, the total tritium generation was evaluated using the 'tritium monitor' in a preliminary in-pile mock-up, and the tritium production evaluation procedure was verified using the Monte Carlo code MCNP and the nuclear cross-section library FSXLIBJ3R2. A Li-Al alloy (Li 3.4 wt.%, 95.5% enrichment in Li-6) was selected as the tritium monitor material for evaluating the total amount of tritium production in highly Li-6-enriched materials. From the results of the preliminary experiment, the calculated amounts of total tritium production at each 'tritium monitor' installed in the preliminary in-pile mock-up were about 50-290% higher than the measured values. Concerning tritium measurement, an increase of measurement error due to tritium leaking from the measuring system was found in the present experiment when measuring small amounts of tritium (0.2-0.7 mCi in the tritium monitor). A tendency to overestimate the calculated thermal neutron flux in the range of 1-6x10^13 n cm^-2 s^-1 was found in the JMTR; the reason may be the beryllium cross-section database in JENDL3.2

  5. Verification of tritium production evaluation procedure using Monte Carlo code MCNP for in-pile test of fusion blanket with JMTR

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, Y. E-mail: nagao@jmtr.oarai.jaeri.go.jp; Nakamichi, K.; Tsuchiya, M.; Ishitsuka, E.; Kawamura, H

    2000-11-01

    To evaluate accurately the total amount of tritium production in tritium breeding materials during in-pile tests with the JMTR, a 'tritium monitor' was produced, the total tritium generation was evaluated using the 'tritium monitor' in a preliminary in-pile mock-up, and the tritium production evaluation procedure was verified using the Monte Carlo code MCNP and the nuclear cross-section library FSXLIBJ3R2. A Li-Al alloy (Li 3.4 wt.%, 95.5% enrichment in Li-6) was selected as the tritium monitor material for evaluating the total amount of tritium production in highly Li-6-enriched materials. From the results of the preliminary experiment, the calculated amounts of total tritium production at each 'tritium monitor' installed in the preliminary in-pile mock-up were about 50-290% higher than the measured values. Concerning tritium measurement, an increase of measurement error due to tritium leaking from the measuring system was found in the present experiment when measuring small amounts of tritium (0.2-0.7 mCi in the tritium monitor). A tendency to overestimate the calculated thermal neutron flux in the range of 1-6x10^13 n cm^-2 s^-1 was found in the JMTR; the reason may be the beryllium cross-section database in JENDL3.2.

  6. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0
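
    One way an integrated estimate can combine per-technology results is the textbook independence rule P = 1 - Π(1 - p_i); whether IVSEM combines detection probabilities in exactly this way is not stated in the record, and the probabilities below are invented.

        from math import prod

        # Per-technology probabilities of detection for one hypothetical event.
        p_tech = {"seismic": 0.80, "infrasound": 0.40,
                  "radionuclide": 0.55, "hydroacoustic": 0.30}

        # Assuming independent subsystems, the system detects if any one does.
        p_combined = 1.0 - prod(1.0 - p for p in p_tech.values())
        print(f"combined probability of detection ~ {p_combined:.3f}")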

  7. Quasi-Monte Carlo methods for lattice systems. A first look

    International Nuclear Information System (INIS)

    Jansen, K.; Cyprus Univ., Nicosia; Leovey, H.; Griewank, A.; Nube, A.; Humboldt-Universitaet, Berlin; Mueller-Preussker, M.

    2013-02-01

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^-1/2, where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to N^-1. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
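
    The N^-1/2 versus N^-1 contrast is easy to reproduce on a toy one-dimensional integral (not the lattice application itself). The sketch below uses SciPy's scrambled Sobol generator as the Quasi-Monte Carlo point set; the integrand and sample sizes are arbitrary.

        import numpy as np
        from scipy.stats import qmc

        f = lambda x: x * x          # toy observable; exact mean on [0, 1] is 1/3
        exact = 1.0 / 3.0

        for m in (8, 12, 16):        # N = 2**m points, as Sobol sets prefer
            n = 2 ** m
            mc_err = abs(np.mean(f(np.random.default_rng(m).random(n))) - exact)
            points = qmc.Sobol(d=1, scramble=True, seed=m).random(n).ravel()
            qmc_err = abs(np.mean(f(points)) - exact)
            print(f"N = {n:6d}   MC error ~ {mc_err:.2e}   QMC error ~ {qmc_err:.2e}")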

  8. Quasi-Monte Carlo methods for lattice systems. A first look

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Leovey, H.; Griewank, A. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Mathematik; Nube, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Mueller-Preussker, M. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2013-02-15

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^-1/2, where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to N^-1. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.

  9. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  10. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  11. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power system with an organic working fluid. The purposes of the technology verification phase are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  12. Proton therapy treatment monitoring with the DoPET system: activity range, positron emitters evaluation and comparison with Monte Carlo predictions

    Science.gov (United States)

    Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.

    2017-12-01

    Ion beam irradiations can deliver conformal dose distributions minimizing damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainty, so treatment verification is a critical issue. During treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used for treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate radiation transport and interaction with matter. In this work, the FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered to phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads and designed to be installed along the beam line to acquire data also during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detector response for activity range verification.
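
    A common way to compare a measured β+ activity depth profile against its Monte Carlo prediction is to locate the distal fall-off position of each profile. The sketch below uses made-up Gaussian profiles and a 50% threshold; both are illustrative choices, not the DoPET analysis itself.

        import numpy as np

        def distal_falloff(depth, activity, level=0.5):
            """Depth where the distal edge drops to `level` of the maximum
            (linear interpolation on the falling edge)."""
            a = activity / activity.max()
            i = int(np.argmax(a))
            for j in range(i, a.size - 1):
                if a[j] >= level > a[j + 1]:
                    t = (a[j] - level) / (a[j] - a[j + 1])
                    return depth[j] + t * (depth[j + 1] - depth[j])
            return depth[-1]

        depth = np.linspace(0.0, 20.0, 201)                 # cm
        simulated = np.exp(-((depth - 10.0) / 3.0) ** 2)    # made-up profiles
        measured  = np.exp(-((depth - 10.4) / 3.0) ** 2)
        shift = distal_falloff(depth, measured) - distal_falloff(depth, simulated)
        print(f"apparent range shift ~ {shift:.2f} cm")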

  13. Density scaling of phantom materials for a 3D dose verification system.

    Science.gov (United States)

    Tani, Kensuke; Fujita, Yukio; Wakita, Akihisa; Miyasaka, Ryohei; Uehara, Ryuzo; Kodama, Takumi; Suzuki, Yuya; Aikawa, Ako; Mizuno, Norifumi; Kawamori, Jiro; Saitoh, Hidetoshi

    2018-05-21

    In this study, the optimum density scaling factors of phantom materials for a commercially available three-dimensional (3D) dose verification system (Delta4) were investigated in order to improve the accuracy of the calculated dose distributions in the phantom materials. At field sizes of 10 × 10 and 5 × 5 cm^2 with the same geometry, tissue-phantom ratios (TPRs) in water, polymethyl methacrylate (PMMA), and Plastic Water Diagnostic Therapy (PWDT) were measured, and TPRs for various density scaling factors of water were calculated by Monte Carlo simulation, Adaptive Convolve (AdC, Pinnacle3), Collapsed Cone Convolution (CCC, RayStation), and AcurosXB (AXB, Eclipse). Effective linear attenuation coefficients (μ_eff) were obtained from the TPRs. The ratios of μ_eff in phantom and water ((μ_eff)_pl,water) were compared between the measurements and calculations. For each phantom material, the density scaling factor proposed in this study (DSF) was set to the value providing a match between the calculated and measured (μ_eff)_pl,water. The optimum density scaling factor was verified through comparison of the dose distributions measured by Delta4 and calculated with three different density scaling factors: the nominal physical density (PD), the nominal relative electron density (ED), and the DSF. Three plans were used for the verification: a static field of 10 × 10 cm^2 and two intensity-modulated radiation therapy (IMRT) treatment plans. The DSFs were determined to be 1.13 for PMMA and 0.98 for PWDT. The DSF for PMMA showed good agreement for AdC and CCC with 6 MV x rays, and for AdC with 10 MV x rays. The DSF for PWDT showed good agreement regardless of the dose calculation algorithm and x-ray energy. The DSF can be considered one of the references for the density scaling factor of Delta4 phantom materials and may help improve the accuracy of IMRT dose verification using Delta4.
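
    The record's central quantity can be sketched under a simple exponential model: μ_eff follows from TPRs at two depths via TPR(d2)/TPR(d1) = exp(-μ_eff (d2 - d1)), and the scaling factor is the phantom-to-water ratio of μ_eff. The TPR values below are invented, and the procedure is simplified relative to the paper's.

        import math

        def mu_eff(tpr_d1, tpr_d2, d1_cm, d2_cm):
            """Effective linear attenuation coefficient from TPRs at two depths."""
            return math.log(tpr_d1 / tpr_d2) / (d2_cm - d1_cm)

        mu_water   = mu_eff(0.80, 0.62, 10.0, 20.0)   # made-up TPRs
        mu_phantom = mu_eff(0.78, 0.58, 10.0, 20.0)

        # Density scaling factor chosen so scaled water reproduces the phantom.
        dsf = mu_phantom / mu_water
        print(f"(mu_eff)_pl,water = density scaling factor ~ {dsf:.3f}")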

  14. Automated Image Acquisition System for the Verification of Copper-Brass Seal Images

    International Nuclear Information System (INIS)

    Stringa, E.; Bergonzi, C.; Littmann, F.; ); Marszalek, Y.; Tempesta, S.; )

    2015-01-01

    This paper describes a system for the verification of copper-brass seals realized by the JRC according to DG ENER requirements. DG ENER processes about 20,000 metal seals per year. The verification of metal seals consists in visually checking the identity of a removed seal. The identity of a copper-brass seal is defined by a random stain pattern created by the seal producer together with random scratches engraved when the seals are initialized ('seal production'). In order to verify that a seal returned from the field is the expected one, its pattern is compared with an image taken during seal production. Formerly, seal initialization and verification were very laborious tasks, as seal pictures were acquired with a camera one by one in both the initialization and verification stages. During initialization, the Nuclear Safeguards technicians had to place new seals one by one under a camera and acquire the related reference images. During verification, the technician had to take used seals and place them one by one under a camera to take new pictures. The new images were presented to the technicians without any preprocessing, and the technicians had to recognize the seal. The new station described in this paper has an automated image acquisition system that allows seals to be processed easily in batches of 100. To simplify the verification, software automatically centres and rotates the newly acquired seal image so that it overlaps exactly with the reference image acquired during the production phase. The new system significantly speeds up seal production and helps particularly with the demanding task of seal verification. As a large part of the seals is dealt with by a joint Euratom-IAEA team, the IAEA directly profits from this development. The new tool has been in routine use since mid-2013. (author)

  15. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    Directory of Open Access Journals (Sweden)

    FAHIM AZIZ UMRANI

    2010-10-01

    This paper presents a Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems and analyses their performance in terms of the BER (Bit Error Rate). The spreading sequences chosen for CDMA are Perfect Difference Codes. Furthermore, this paper derives the expressions for the noise variances from first principles to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling, as required for the Monte-Carlo simulation. The simulated results conform to the theory and show that receiver gain mismatch and splitter loss at the transceiver degrade the system performance.
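
    A bare-bones Monte Carlo BER estimate for on-off keying in additive Gaussian noise conveys the flavour of such a simulation; the paper's PDC-based optical CDMA model is far richer, and all parameters below are invented.

        import numpy as np

        rng = np.random.default_rng(1)
        n_bits, amplitude, sigma = 200_000, 1.0, 0.25    # illustrative link parameters

        bits = rng.integers(0, 2, n_bits)                      # random data
        received = amplitude * bits + sigma * rng.standard_normal(n_bits)
        decided = (received > amplitude / 2.0).astype(int)     # threshold detector
        print(f"Monte Carlo BER ~ {np.mean(decided != bits):.2e}")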

  16. Verification of the Indicating Measuring Instruments Taking into Account their Instrumental Measurement Uncertainty

    Directory of Open Access Journals (Sweden)

    Zakharov Igor

    2017-12-01

    The specific features of measuring instrument verification based on the results of calibration are considered. It is noted that, in contrast to the verification procedure used in legal metrology, the verification procedure for calibrated measuring instruments has to take the measurement uncertainty into account. As a result, a large number of measuring instruments that are considered compliant after verification in legal metrology turn out to be non-compliant after calibration. In this case, it is necessary to evaluate the probability of compliance of indicating measuring instruments. A procedure for determining the compliance probability on the basis of the Monte Carlo method is considered. An example of the calibration of a Vernier caliper is given.
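
    The Monte Carlo compliance-probability calculation can be sketched generically: sample the indication error from the distribution implied by the calibration result, then count the fraction that falls within the maximum permissible error. The caliper numbers below are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        error_estimate = 0.015   # mm, indication error found by calibration (made up)
        u_standard = 0.010       # mm, standard uncertainty of that estimate (made up)
        mpe = 0.020              # mm, maximum permissible error (made up)

        samples = rng.normal(error_estimate, u_standard, 1_000_000)
        p_compliance = np.mean(np.abs(samples) <= mpe)
        print(f"probability of compliance ~ {p_compliance:.3f}")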

  17. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE JR., ARTHUR C.; TROST, LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  18. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    International Nuclear Information System (INIS)

    Kubota, Shintaro; Usui, Hideo; Kawagoshi, Hiroshi

    2014-06-01

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, a clearance level and a procedure for its verification have been introduced under the laws and regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal wastes. The most prevalent wastes of this type are generated from the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is a part of CLEVES, has been developed to support measurement, evaluation, and the preparation and recording of documents for clearance level verification. In addition, the evaluation results of the CDMS were validated by inputting data from actual clearance activities in the JAEA. Clearance level verification is easily carried out by using the CDMS for clearance activities. (author)

  19. The verification of neutron activation analysis support system (cooperative research)

    Energy Technology Data Exchange (ETDEWEB)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)

    2000-12-01

    The neutron activation analysis support system is a system with which even a user who has little experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, aspects such as the functions, usability, and the precision and accuracy of the analysis of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer installed in the JRR-3M PN-3 facility, together with the analysis software KAYZERO/SOLCOI based on the k0 method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field and analysis of three kinds of environmental standard samples were carried out. The k0 method adopted in this system has recently been used mainly in Europe; it is an analysis method that can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with guaranteed values in the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed with an accuracy within 15%. This report describes the content and the verification results of the neutron activation analysis support system. (author)

  20. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Biometric-based authentication systems provide solutions to the high-security problems that remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good biometric parameters, but they do not guarantee that the person is present and alive. Voice can be copied, a fingerprint can be lifted from glass onto synthetic skin, and in face recognition systems, due to genetic factors, identical twins or father and son may have the same facial appearance. ECG does not have these problems: it cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometrics verification system, developed using Laboratory Virtual Instrument Engineering Workbench (LabVIEW, version 7.1), is discussed. Experiments were conducted on a laboratory database of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
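
    FAR and FRR at a fixed decision threshold follow directly from genuine and impostor match-score lists, as in the sketch below; the scores and threshold are invented, not the paper's data.

        # Higher score = better match; lists and threshold are made up.
        genuine  = [0.91, 0.88, 0.95, 0.62, 0.90, 0.87, 0.93, 0.89, 0.94, 0.58]
        impostor = [0.31, 0.42, 0.71, 0.28, 0.55, 0.47, 0.66, 0.39, 0.73, 0.25]
        threshold = 0.70

        frr = sum(s < threshold for s in genuine) / len(genuine)     # false rejections
        far = sum(s >= threshold for s in impostor) / len(impostor)  # false acceptances
        print(f"FRR = {frr:.0%}, FAR = {far:.0%}")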

  1. 75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations

    Science.gov (United States)

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-04] Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations AGENCY: Office of the Chief Information Officer... Following Information Title of Proposal: Enterprise Income Verification (EIV) System- Debts Owed to PHAs and...

  2. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  3. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  4. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of their knowledge bases takes an important position. The conventional Petri net approach, studied recently for knowledge base verification, has been found inadequate for verifying the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as a modeling method. By applying this tool to the knowledge base of a nuclear power plant, it is shown that it can successfully check most of the anomalies that can occur in a knowledge base

  5. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one

  6. Review of quantum Monte Carlo methods and results for Coulombic systems

    International Nuclear Information System (INIS)

    Ceperley, D.

    1983-01-01

    The various Monte Carlo methods for calculating ground state energies are briefly reviewed. Then a summary of the charged systems that have been studied with Monte Carlo is given. These include the electron gas, small molecules, a metal slab and many-body hydrogen

  7. VERIFICATION OF THE FOOD SAFETY MANAGEMENT SYSTEM IN DEEP FROZEN FOOD PRODUCTION PLANT

    Directory of Open Access Journals (Sweden)

    Peter Zajác

    2010-07-01

    This work presents the verification of the food safety management system for deep-frozen food. The main emphasis is on creating a set of verification questions for the articles of the standard STN EN ISO 22000:2006 and on assessing the effectiveness of the food safety management system. Information was acquired from scientific literature sources and points out the importance of implementing and maintaining an effective food safety management system. doi:10.5219/28

  8. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches for the various stages of verification activities: register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing the verification at each stage is a major bottleneck that demands much activity and time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP that verifies all design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models, and discusses how this approach can facilitate the verification processes. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks.

  9. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches for the various stages of verification activities: register-transfer level (RTL), gate level, and place and route. Writing the test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing the verification at each stage is a major bottleneck that demands much activity and time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP that verifies all design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models, and discusses how this approach can facilitate the verification processes. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks

  10. Simulation-based design process for the verification of ITER remote handling systems

    International Nuclear Information System (INIS)

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

    Highlights: •Verification and validation process for ITER remote handling system. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around simulation lifecycle management system. •Verification and validation roadmap for digital modelling phase. -- Abstract: The work behind this paper takes place within EFDA's European Goal Oriented Training programme on Remote Handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMU). The purpose of this project is to study and develop an efficient approach to using DMUs in the V and V process of ITER RH system design, utilizing a Systems Engineering (SE) framework. Complex engineering systems such as the ITER facilities lead to a substantial rise in cost when manufacturing full-scale prototypes. In the V and V process for ITER RH equipment, physical tests are required to ensure that the system complies with the required operation. Therefore it is essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of the current trends in using digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and V and V processes, in order to increase their cost efficiency and reliability

  11. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, they wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system helps keep personnel exposures low. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur

  12. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1999-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. To help the flow from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of the Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  13. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. To help the flow from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of the Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  14. Alien Registration Number Verification via the U.S. Citizenship and Immigration Service's Systematic Alien Verification for Entitlements System

    National Research Council Canada - National Science Library

    Ainslie, Frances M; Buck, Kelly R

    2008-01-01

    The purpose of this study was to evaluate the implications of conducting high-volume automated checks of the United States Citizenship and Immigration Services Systematic Alien Verification for Entitlements System (SAVE...

  15. Verification station for Sandia/Rockwell Plutonium Protection system

    International Nuclear Information System (INIS)

    Nicholson, N.; Hastings, R.D.; Henry, C.N.; Millegan, D.R.

    1979-04-01

    A verification station has been designed to confirm the presence of plutonium within a container module. These container modules [about 13 cm (5 in.) in diameter and 23 cm (9 in.) high] hold sealed food-pack cans containing either plutonium oxide or metal and were designed by Sandia Laboratories to provide security and continuous surveillance and safety. After the plutonium is placed in the container module, it is closed with a solder seal. The verification station discussed here is used to confirm the presence of plutonium in the container module before it is placed in a carousel-type storage array inside the plutonium storage vault. This measurement represents the only technique that uses nuclear detectors in the plutonium protection system

  16. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  17. Geometrical verification system using Adobe Photoshop in radiotherapy.

    Science.gov (United States)

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films. It is possible to scan images and then view them simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared the following two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured distances between the isocenter on simulation films and that on portal films by aligning the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured distances between the isocenter on simulation films and those on portal films by aligning the bony structures. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes. Adobe Photoshop is available on any PC with this software installed and is useful for improving the accuracy of verification.

  18. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    Science.gov (United States)

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  19. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  20. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    The signature of a person is one of the most popular and legally accepted behavioral biometrics, providing a secure means of verification and personal identification in many applications such as financial, commercial and legal transactions. The objective of a signature verification system is to discriminate between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features: it contains diacritics, ligatures, and overlapping. Because no dynamic information is available from the Arabic signature's writing process, it is more difficult to obtain high verification accuracy. This paper addresses the above difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple-feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt fuzzy sets to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features to deal with the ambiguity of the questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).

  1. A knowledge-base verification of NPP expert systems using extended Petri nets

    International Nuclear Information System (INIS)

    Kwon, Il Won; Seong, Poong Hyun

    1995-01-01

    The verification phase of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assumed error incidence to be limited to rule pairs only. In addition, we consider certainty factors in the checking, because most knowledge bases include certainty factors.
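
    To see why chained errors matter (the point COKEP addresses beyond rule-pair checks), here is a minimal illustrative sketch in Python with invented rules: no rule pair contradicts directly, yet forward chaining derives a contradiction. The actual tool operates on an extended Petri net model, not on code like this.

```python
# Toy rule base: each rule is (set of premises, conclusion). No single pair of
# rules is contradictory, but chaining all three derives P and not_P together.
rules = [
    ({"high_pressure"}, "open_relief_valve"),
    ({"open_relief_valve"}, "pressure_drops"),
    ({"pressure_drops"}, "not_high_pressure"),
]

def forward_chain(facts):
    """Repeatedly fire rules until no new fact is derived."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"high_pressure"})
for fact in sorted(derived):
    if fact.startswith("not_") and fact[4:] in derived:
        print(f"chained inconsistency: {fact[4:]} and {fact}")
```

    A pairwise checker would pass this rule base; only analysis of the chained reachable facts exposes the conflict.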

  2. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company

  3. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  4. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  5. Monte Carlo analysis of a control technique for a tunable white lighting system

    DEFF Research Database (Denmark)

    Chakrabarti, Maumita; Thorseth, Anders; Jepsen, Jørgen

    2017-01-01

    A simulated colour control mechanism for a multi-coloured LED lighting system is presented. The system achieves adjustable and stable white light output and allows for system-to-system reproducibility after application of the control mechanism. The control unit works using a pre-calibrated lookup table for an experimentally realized system, with a calibrated tristimulus colour sensor. A Monte Carlo simulation is used to examine the system performance concerning the variation of luminous flux and chromaticity of the light output. The inputs to the Monte Carlo simulation are variations of the LED peak wavelength, the LED rated luminous flux bin, the influence of the operating conditions, ambient temperature, driving current, and the spectral response of the colour sensor. The system performance is investigated by evaluating the outputs from the Monte Carlo simulation. The outputs show...
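
    To illustrate the kind of tolerance analysis described above, here is a minimal Monte Carlo sketch in Python; the LED mix, the tolerance distributions, and the crude flux-weighted chromaticity proxy are all assumptions for illustration and are not taken from the paper.

```python
# Sample LED peak-wavelength and flux variations and propagate them to the
# combined output, collecting statistics over many virtual systems.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo trials (virtual system builds)

# Nominal peak wavelengths (nm) and luminous fluxes (a.u.) of an assumed
# red/green/blue LED mix.
peak_nominal = np.array([630.0, 525.0, 455.0])
flux_nominal = np.array([1.0, 1.2, 0.8])

# Assumed tolerances: +/-2 nm peak-wavelength spread, 5% flux-bin spread,
# and a 1% sensor-response error.
peaks = peak_nominal + rng.normal(0.0, 2.0, size=(N, 3))
fluxes = flux_nominal * (1.0 + rng.normal(0.0, 0.05, size=(N, 3)))
sensor_gain = 1.0 + rng.normal(0.0, 0.01, size=(N, 1))

total_flux = (fluxes * sensor_gain).sum(axis=1)
# Crude chromaticity proxy: flux-weighted mean peak wavelength.
centroid = (peaks * fluxes).sum(axis=1) / fluxes.sum(axis=1)

print(f"flux:     mean={total_flux.mean():.3f}, sd={total_flux.std():.3f}")
print(f"centroid: mean={centroid.mean():.1f} nm, sd={centroid.std():.2f} nm")
```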

  6. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype of a robust biometric system for verification. The system uses features extracted from images of the human hand using the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images, and is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
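
    For context, the quoted FAR/FRR figures of merit can be computed from match-score distributions as in the following sketch; the score distributions and the threshold are synthetic assumptions, not the SURF matcher itself.

```python
# Estimate FAR, FRR and accuracy from genuine and impostor match scores.
import numpy as np

rng = np.random.default_rng(0)
genuine = rng.normal(0.9, 0.05, 1000)    # scores for genuine pairs (assumed)
impostor = rng.normal(0.3, 0.10, 1000)   # scores for impostor pairs (assumed)
threshold = 0.6                          # accept if score >= threshold

far = np.mean(impostor >= threshold)     # impostors wrongly accepted
frr = np.mean(genuine < threshold)       # genuine users wrongly rejected
acc = 1.0 - 0.5 * (far + frr)            # balanced accuracy
print(f"FAR={far:.4%}  FRR={frr:.4%}  accuracy={acc:.4%}")
```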

  7. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  8. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    Energy Technology Data Exchange (ETDEWEB)

    Saotome, Naoya, E-mail: naosao@nirs.go.jp; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji [Department of Research Center for Charged Particle Therapy, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan)

    2016-04-15

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to, and communicates with, the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference-of-Gaussian (DOG) method and by the 80% distal dose point of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and the DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility for repeated measurements at the same energy is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.
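
    A rough sketch of difference-of-Gaussian (DOG) edge detection applied to a one-dimensional light-output profile, in the spirit of the range check described above; the profile shape, the filter widths, and the use of the DOG minimum as the edge estimate are illustrative assumptions, not the paper's parameters.

```python
# Locate the distal edge of a (synthetic) depth-light-output profile with a
# difference of two Gaussian smoothings.
import numpy as np
from scipy.ndimage import gaussian_filter1d

depth = np.linspace(0.0, 200.0, 1001)                  # mm (0.2 mm/pixel)
rng = np.random.default_rng(1)
profile = 1.0 / (1.0 + np.exp((depth - 150.0) / 2.0))  # toy distal falloff
profile += rng.normal(0.0, 0.01, depth.size)           # measurement noise

# DOG: narrow minus wide smoothing highlights the distal edge; its minimum
# approximates the steepest-falloff position.
dog = gaussian_filter1d(profile, 2.0) - gaussian_filter1d(profile, 8.0)
range_mm = depth[np.argmin(dog)]
print(f"detected range: {range_mm:.1f} mm (edge placed at 150 mm)")
```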

  9. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    International Nuclear Information System (INIS)

    Saotome, Naoya; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji

    2016-01-01

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to, and communicates with, the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference-of-Gaussian (DOG) method and by the 80% distal dose point of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and the DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility for repeated measurements at the same energy is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.

  10. Investigation of novel spent fuel verification system for safeguard application

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Haneol; Yim, Man-Sung [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of repositories, calculating the dose to the public after repository closure for their respective scenarios. Because most spent fuel repositories are non-retrievable, research on the security and safeguards of spent fuel repositories also has to be performed, and design-based security and safeguards have to be developed for future repository designs. This study summarizes the requirements for future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. These requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished using a Cerenkov radiation detector while the spent fuel is stored in a fuel pool. This research investigated an advanced spent fuel verification system which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source.

  11. Investigation of novel spent fuel verification system for safeguard application

    International Nuclear Information System (INIS)

    Lee, Haneol; Yim, Man-Sung

    2016-01-01

    Radioactive waste, especially spent fuel, is generated from the operation of nuclear power plants. The final stage of radioactive waste management is disposal, which isolates radioactive waste from the accessible environment and allows it to decay. The safety, security, and safeguards of a spent fuel repository have to be evaluated before its operation. Many researchers have evaluated the safety of repositories, calculating the dose to the public after repository closure for their respective scenarios. Because most spent fuel repositories are non-retrievable, research on the security and safeguards of spent fuel repositories also has to be performed, and design-based security and safeguards have to be developed for future repository designs. This study summarizes the requirements for future spent fuel repositories, especially safeguards, and suggests a novel system which meets the safeguard requirements. Applying safeguards to a spent fuel repository is becoming increasingly important. The future requirements for a spent fuel repository have been suggested by several expert groups, such as ASTOR in the IAEA. These requirements emphasize surveillance and verification. The surveillance and verification of spent fuel is currently accomplished using a Cerenkov radiation detector while the spent fuel is stored in a fuel pool. This research investigated an advanced spent fuel verification system which converts spent fuel radiation into electricity. The system generates electricity while the fuel is conveyed from a transportation cask to a disposal cask. The electricity conversion system was verified in a lab-scale experiment using an 8.51 GBq Cs-137 gamma source

  12. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  13. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    Science.gov (United States)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and assure the correct functioning of the Ambient Assisted Living (AAL) systems. We validate this approach by its application to an Emergency Assistance System for monitoring people suffering from cardiac alteration with syncope.

  14. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway station Alingsås by using a decision procedure... We express thoughts on what is needed in order to be able to successfully verify large real-life systems.

  15. Verification and Validation of Monte Carlo n-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box

    Science.gov (United States)

    2014-03-27

    Vehicle Code System (VCS), the Monte Carlo Adjoint SHielding (MASH) code, and the Monte Carlo N-Particle (MCNP) code. Of the three, the oldest and still most widely utilized radiation transport code is MCNP. First created at Los Alamos National Laboratory (LANL) in 1957, the code simulated neutral particle types, and previous versions of MCNP were repeatedly validated using both simple and complex geometries [12, 13]. Much greater discussion and...

  16. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are a preparation of the software planning documentation, a verification of the software according to the software life cycle, a software safety analysis, and a software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified with a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of the nuclear safety-critical software in a DRPS. (authors)

  17. Television system for verification and documentation of treatment fields during intraoperative radiation therapy

    International Nuclear Information System (INIS)

    Fraass, B.A.; Harrington, F.S.; Kinsella, T.J.; Sindelar, W.F.

    1983-01-01

    Intraoperative radiation therapy (IORT) involves direct treatment of tumors or tumor beds with large single doses of radiation. The verification of the area to be treated before irradiation and the documentation of the treated area are critical for IORT, just as for other types of radiation therapy. A television system which allows the target area to be directly imaged immediately before irradiation has been developed. Verification and documentation of treatment fields has made the IORT television system indispensable

  18. A multi-microcomputer system for Monte Carlo calculations

    International Nuclear Information System (INIS)

    Hertzberger, L.O.; Berg, B.; Krasemann, H.

    1981-01-01

    We propose a microcomputer system which allows parallel processing for Monte Carlo calculations in lattice gauge theories, simulations of high energy physics experiments, and presumably many other fields of current interest. The master-n-slave multiprocessor system is based on the Motorola MC 68000 microprocessor. One attraction of this processor is that it allows up to 16 MByte of random-access memory. (orig.)
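
    The master-and-slaves organization can be sketched with today's tools as follows; this toy Python version uses multiprocessing in place of MC 68000 hardware and estimates pi by dart throwing, purely for illustration.

```python
# Master/n-slave Monte Carlo: each slave computes an independent partial
# estimate and the master aggregates the results.
import multiprocessing as mp
import random

def slave(args):
    """One slave: count darts landing inside the unit quarter circle."""
    seed, n = args
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

if __name__ == "__main__":
    n_slaves, n_per_slave = 8, 100_000
    with mp.Pool(n_slaves) as pool:  # the "master" farms out work
        hits = pool.map(slave, [(seed, n_per_slave) for seed in range(n_slaves)])
    pi_est = 4.0 * sum(hits) / (n_slaves * n_per_slave)
    print(f"pi ~ {pi_est:.5f}")
```

    Because the trials are independent, the speedup is essentially linear in the number of slaves, which is exactly what made such multiprocessor designs attractive for Monte Carlo work.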

  19. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  20. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    International Nuclear Information System (INIS)

    Elmont, T.H.; Langner, Diana C.; MacArthur, D.W.; Mayo, D.R.; Smith, M.K.; Modenov, A.

    2005-01-01

    This report describes the software development for the plutonium attribute verification system AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are illustrated with software structural flowcharts, a measurement system state diagram, and a description of the software. The current status of the AVNG software development is elucidated.

  1. Ongoing Work on Automated Verification of Noisy Nonlinear Systems with Ariadne

    NARCIS (Netherlands)

    Geretti, Luca; Bresolin, Davide; Collins, Pieter; Zivanovic Gonzalez, Sanja; Villa, Tiziano

    2017-01-01

    Cyber-physical systems (CPS) are hybrid systems that commonly consist of a discrete control part that operates in a continuous environment. Hybrid automata are a convenient model for CPS suitable for formal verification. The latter is based on reachability analysis of the system to trace its hybrid

  2. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun

    1999-01-01

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for system modeling and the Prototype Verification System (PVS) for mathematical verification. In order to support the flow from modeling in CPN to mathematical proof in PVS, an information extractor for CPN models has been developed in this work. In order to convert the extracted information to the PVS specification language, a translator has also been developed. The information extractor and the translator are programmed in ML, a higher-order functional language. This combined method has been applied to a protection system function of Wolsung NPP SDS2 (Steam Generator Low Level Trip). As a result of this application, we could logically prove the completeness and consistency of the requirement. In short, through this work an axiom- or lemma-based analysis method for CPN models is newly suggested in order to complement CPN analysis methods, and a guideline for the use of formal methods is proposed in order to apply them to NPP software verification and validation. (author). 9 refs., 15 figs

  3. Standard practice for verification and classification of extensometer systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...

  4. Study of TXRF experimental system by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Costa, Ana Cristina M.; Leitao, Roberta G.; Lopes, Ricardo T.; Anjos, Marcelino J.; Conti, Claudio C.

    2011-01-01

    The Total-Reflection X-ray Fluorescence (TXRF) technique offers unique possibilities for studying the concentrations of a wide range of trace elements in various types of samples. The technique is widely used to study trace elements in biological, medical and environmental samples due to its multielemental character as well as the simplicity of the sample preparation and quantification methods used. In general, a TXRF experimental setup is not simple and may require substantial experimental effort. On the other hand, portable TXRF experimental systems have been developed in recent years, which motivated us to develop our own portable TXRF system. In this work we present a first step towards optimizing a TXRF experimental setup using Monte Carlo simulation with the MCNP code. The results show that the Monte Carlo simulation method can be used to investigate the design of a TXRF experimental system before its assembly. (author)

  5. Diffusion Monte Carlo calculation of three-body systems

    International Nuclear Information System (INIS)

    Lu Mengjiao; Lin Qihu; Ren Zhongzhou

    2012-01-01

    The application of the diffusion Monte Carlo algorithm to three-body systems is studied. We develop a program and use it to calculate the properties of various three-body systems. Regular Coulomb systems such as atoms, molecules, and ions are investigated. The calculation is then extended to exotic systems where electrons are replaced by muons. Some nuclei with neutron halos are also calculated as three-body systems consisting of a core and two external nucleons. Our results agree well with experiments and others' work. (authors)
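
    For readers unfamiliar with the algorithm, a bare-bones diffusion Monte Carlo sketch for a one-dimensional harmonic oscillator (exact ground-state energy 0.5 in natural units) is shown below; the time step, the population-control constant, and the growth-type energy estimator are simplified assumptions, far from the three-body machinery of the paper.

```python
# Walker diffusion plus birth/death branching; the reference energy E_ref is
# nudged to keep the population stable and averages to the ground-state energy.
import numpy as np

rng = np.random.default_rng(0)
tau, n_target, n_steps = 0.01, 2000, 4000
walkers = rng.normal(0.0, 1.0, n_target)       # initial walker positions
e_ref, e_trace = 0.5, []

for step in range(n_steps):
    walkers = walkers + rng.normal(0.0, np.sqrt(tau), walkers.size)  # diffuse
    weights = np.exp(-tau * (0.5 * walkers**2 - e_ref))  # V(x) = x^2 / 2
    copies = (weights + rng.random(walkers.size)).astype(int)        # branch
    walkers = np.repeat(walkers, copies)
    e_ref += 0.1 * np.log(n_target / walkers.size)   # population control
    if step > n_steps // 2:
        e_trace.append(e_ref)                        # growth-type estimator

print(f"E0 ~ {np.mean(e_trace):.3f} (exact 0.5)")
```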

  6. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Biometric-based identification/verification systems provide a solution to the security concerns of the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. Minutiae-based algorithms depend upon the local discontinuities in the ridge flow pattern and are used when template size is important, while image-based matching algorithms use both the micro and macro features of a fingerprint and are used if a fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database: datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 x 100 and a threshold value of 700 (1000 being a perfect match).
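
    To make the 0-1000 score scale concrete, here is a hypothetical image-based matching sketch using normalized cross-correlation on sub-sampled images; plain regular sub-sampling stands in for the paper's pseudo-random sub-sampling, and the images are random stand-ins.

```python
# Map a normalized cross-correlation between two sub-sampled grayscale images
# onto a 0-1000 score scale and apply an accept threshold of 700.
import numpy as np

def match_score(a, b, step=4):
    """Sub-sample, mean-center, and correlate two equally sized images."""
    a = a[::step, ::step].astype(float).ravel()
    b = b[::step, ::step].astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    ncc = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return 1000.0 * max(ncc, 0.0)   # 1000 = perfect match

rng = np.random.default_rng(3)
enrolled = rng.integers(0, 256, (100, 100))                    # stand-in print
query = np.clip(enrolled + rng.normal(0, 10, enrolled.shape), 0, 255)
score = match_score(enrolled, query)
print(f"score={score:.0f} ->", "accept" if score >= 700 else "reject")
```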

  7. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  8. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  9. A method of knowledge base verification for nuclear power plant expert systems using extended Petri Nets

    International Nuclear Information System (INIS)

    Kwon, I. W.; Seong, P. H.

    1996-01-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. The verification phase of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assume error incidence to be limited to rule pairs only. In addition, we consider certainty factors in the checking, because most knowledge bases have certainty factors. 8 refs., 2 figs., 4 tabs. (author)

  10. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  11. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  12. ATLANTIDES: An Architecture for Alert Verification in Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Crispo, Bruno; Etalle, Sandro

    2007-01-01

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  13. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Directory of Open Access Journals (Sweden)

    Tomás Grimm

    2018-05-01

    The high degree of miniaturization in the electronics industry has been, for several years, a driver pushing embedded systems into different fields and applications. One example is safety-critical systems, where compactness in the form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a great example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard, with the goal of guaranteeing a high level of dependability in designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines to make sure that designs reach the desired dependability level, without explicit instructions. In the end, the success of a design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as standards conformity, but as these are complex hardware designs, complete functional verification is a difficult task. Of the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  14. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data. ... the standardized railway control system ERTMS/ETCS Level 2. Experiments showed that the method can be used for specification, verification and validation of systems of industrial size.

  15. An Improved Constraint-Based System for the Verification of Security Protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov [30]. Our system features (1) a significantly more efficient implementation, and (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs...

  16. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, and (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs...

  17. SWAT2: The improved SWAT code system by incorporating the continuous energy Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Mochizuki, Hiroki; Suyama, Kenya; Okuno, Hiroshi

    2003-01-01

    SWAT is a code system which performs burnup calculations by combining the neutronics calculation code SRAC95 and the one-group burnup calculation code ORIGEN2.1. The SWAT code system can deal with the cell geometries available in SRAC95. However, a precise treatment of resonance absorption by the SRAC95 code using the ultra-fine-group cross-section library is not directly applicable to two- or three-dimensional geometry models, because of restrictions in SRAC95. To overcome this problem, SWAT2, which newly introduces the continuous-energy Monte Carlo code MVP into SWAT, was developed. Thereby, burnup calculations with a continuous-energy treatment in arbitrary geometry became possible. Moreover, using the 147-group cross-section library called the SWAT library, reactions that are not dealt with by SRAC95 and MVP can be treated. The OECD/NEA burnup credit criticality safety benchmark problems Phase-IB (PWR, a single pin cell model) and Phase-IIIB (BWR, fuel assembly model) were calculated as a verification of SWAT2, and the results were compared with the average values of the calculation results of the burnup calculation codes of each organization. Through the two benchmark problems, it was confirmed that SWAT2 is applicable to burnup calculations in complicated geometries. (author)
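
    The ORIGEN-type half of such a coupling is, at its core, a depletion step; the following sketch solves dN/dt = AN over one step with a matrix exponential, where the three-nuclide chain, the one-group flux, and the cross sections are invented for illustration and are not SWAT2's data.

```python
# One toy burnup step: build a transmutation matrix A from one-group reaction
# rates and advance the nuclide vector N with expm(A * dt).
import numpy as np
from scipy.linalg import expm

# Nuclides: [U-235, lumped fission products, capture product] (toy chain).
phi = 3e14                            # assumed one-group flux (n/cm^2/s)
sigma_f, sigma_c = 585e-24, 99e-24    # fission/capture cross sections (cm^2)
A = np.array([
    [-(sigma_f + sigma_c) * phi, 0.0, 0.0],    # U-235 is depleted
    [sigma_f * phi,              0.0, 0.0],    # fissions feed the FP lump
    [sigma_c * phi,              0.0, -1e-9],  # captures feed the product
])

N0 = np.array([1.0e21, 0.0, 0.0])     # initial number densities (1/cm^3)
dt = 30 * 86400                       # 30-day step (s)
N = expm(A * dt) @ N0                 # closed-form solution of dN/dt = A N
print(dict(zip(["U235", "FP", "capture_prod"], N)))
```

    In a real coupling, the Monte Carlo code supplies the flux and spectrum-averaged cross sections that populate A for each burnup zone and step.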

  18. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  19. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Science.gov (United States)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In this article the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex carries out the commutation of the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the introduced algorithm. The developed software allows controlling the commutation of channels, setting values on the calibrator, reading the measured data from the controller, calculating errors and compiling protocols. This system can be used for checking the controllers of the secondary equipment of oil metering units in the automatic verification mode (with an open communication protocol) or in the semi-automatic verification mode (without it). A distinctive feature of the approach used is the development of a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering units' secondary equipment. The use of automatic verification with the help of the hardware and software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
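
    The error-calculation and protocol-compilation step can be pictured with the following sketch; the set points, the readings, and the accuracy limit are hypothetical stand-ins for values that would come from the calibrator and the verified controller.

```python
# Compare controller readings against calibrator reference values at several
# set points and compile a simple pass/fail verification protocol.
set_points = [4.0, 8.0, 12.0, 16.0, 20.0]       # calibrator output, mA
readings = [4.01, 8.02, 11.97, 16.05, 19.96]    # controller measurements, mA
limit = 0.05                                    # allowed absolute error, mA

protocol = []
for ref, meas in zip(set_points, readings):
    err = meas - ref
    protocol.append((ref, meas, err, "PASS" if abs(err) <= limit else "FAIL"))

for ref, meas, err, verdict in protocol:
    print(f"set {ref:5.2f} mA  read {meas:5.2f} mA  error {err:+.3f} mA  {verdict}")
```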

  20. Formal Verification Method for Configuration of Integrated Modular Avionics System Using MARTE

    Directory of Open Access Journals (Sweden)

    Lisong Wang

    2018-01-01

    The configuration information of an Integrated Modular Avionics (IMA) system includes almost all details of the whole system architecture; it is used to configure the hardware interfaces, the operating system, and the interactions among applications to make an IMA system work correctly and reliably. It is very important to ensure the correctness and integrity of the configuration in the IMA system design phase. In this paper, we focus on the modelling and verification of the configuration information of an IMA/ARINC653 system based on MARTE (Modelling and Analysis for Real-time and Embedded systems). Firstly, we define a semantic mapping from key concepts of the configuration (such as modules, partitions, memory, processes, and communications) to elements of the MARTE model, and propose a method for model transformation between XML-formatted configuration information and MARTE models. Then we present a formal verification framework for ARINC653 system configurations based on theorem-proving techniques, including the construction of corresponding REAL theorems according to the semantics of the key components of the configuration information, and the formal verification of theorems for properties of the IMA system such as time constraints, spatial isolation, and health monitoring. After that, the specific issue of schedulability analysis of ARINC653 systems is studied. We design a hierarchical scheduling strategy that takes the characteristics of the ARINC653 system into consideration, and the scheduling analyzer MAST-2 is used to implement the hierarchical schedule analysis. Lastly, we design a prototype tool, called Configuration Checker for ARINC653 (CC653); two case studies show that the methods proposed in this paper are feasible and efficient.
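
    As a flavor of what configuration checks like these verify, here is a minimal sketch that tests temporal isolation for an invented ARINC653-style partition window layout; it is a simplification, not the REAL/MAST-2 machinery of the paper.

```python
# Verify that partition windows within a major frame do not overlap, fit the
# frame, and satisfy each partition's time budget.
major_frame = 100  # ms
windows = {        # partition -> list of (offset, duration) in ms (made up)
    "P1": [(0, 30)],
    "P2": [(30, 40)],
    "P3": [(70, 25)],
}
budgets = {"P1": 30, "P2": 40, "P3": 30}  # required ms per major frame

intervals = sorted((off, off + dur, p)
                   for p, ws in windows.items() for off, dur in ws)
for (s1, e1, p1), (s2, e2, p2) in zip(intervals, intervals[1:]):
    assert e1 <= s2, f"temporal isolation violated between {p1} and {p2}"
assert intervals[-1][1] <= major_frame, "window exceeds major frame"

for p, need in budgets.items():
    got = sum(d for _, d in windows[p])
    print(f"{p}: allocated {got} ms, required {need} ms ->",
          "OK" if got >= need else "VIOLATION")
```

    Run as written, the layout passes the isolation checks but reports a budget violation for P3, the kind of configuration error such verification is meant to catch before integration.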

  1. A new verification film system for routine quality control of radiation fields: Kodak EC-L.

    Science.gov (United States)

    Hermann, A; Bratengeier, K; Priske, A; Flentje, M

    2000-06-01

    The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. For conventional verification we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. In this investigation 68% of 175 Kodak EC-L ap/pa films were judged "good" and 18% "moderate", with only 14% classified "poor"; in contrast, only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be "good". The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. The films could be read more accurately and the detection of set-up deviations was facilitated.

  2. Theoretical analysis of nuclear reactors (Phase III), I-V, Part V, Establishment of Monte Carlo method for solving the integral transport equation

    International Nuclear Information System (INIS)

    Pop-Jordanov, J.

    1963-02-01

    The general mathematical Monte Carlo approach is described, together with the elements that enable the solution of specific problems (verification was done by estimating a simple integral). Special attention was devoted to a systematic presentation, which demanded an explanation of fundamental topics of statistics and probability, as well as a procedure for modelling the stochastic process, i.e., the Monte Carlo method.
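
    The verification device mentioned in the abstract, estimating a simple integral by Monte Carlo, takes only a few lines; in this sketch the integrand e^x on [0, 1] is an arbitrary example, not necessarily the one used in the report.

```python
# Sample-mean Monte Carlo estimate of I = integral_0^1 e^x dx = e - 1,
# with the one-sigma statistical uncertainty of the estimator.
import numpy as np

rng = np.random.default_rng(7)
x = rng.random(100_000)                     # uniform samples on [0, 1]
fx = np.exp(x)
estimate = fx.mean()
stderr = fx.std(ddof=1) / np.sqrt(x.size)   # standard error of the mean
print(f"I ~ {estimate:.5f} +/- {stderr:.5f} (exact {np.e - 1:.5f})")
```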

  3. Data-driven property verification of grey-box systems by Bayesian experiment design

    NARCIS (Netherlands)

    Haesaert, S.; Van den Hof, P.M.J.; Abate, A.

    2015-01-01

    A measurement-based statistical verification approach is developed for systems with partly unknown dynamics. These grey-box systems are subject to identification experiments which, new in this contribution, enable accepting or rejecting system properties expressed in a linear-time logic. We employ a

  4. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

    Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets (CPN) to verify the functional safety of TTDMS and to evaluate the system performance. The paper makes three main contributions. Firstly, it proposes a formalized TTDMS model, whose correctness is validated using state space analysis and simulation-based verification. Secondly, corresponding checking queries are proposed for the purpose of functional safety verification, and the TTDMS performance is evaluated by applying parameters in the formal model. Thirdly, the reliability of a functional TTDMS prototype is estimated. It is found that the procedure can accompany the system development, and both formal and simulation-based verifications were performed. Models produced with our process are easier to read and more reliable than executable code and purely mathematical methods. - Highlights: • A new Train to Train Distance Measurement System. • A new approach for verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance using simulation-based analysis.

  5. Total skin electron therapy treatment verification: Monte Carlo simulation and beam characteristics of large non-standard electron fields

    International Nuclear Information System (INIS)

    Pavon, Ester Carrasco; Sanchez-Doblado, Francisco; Leal, Antonio; Capote, Roberto; Lagares, Juan Ignacio; Perucha, Maria; Arrans, Rafael

    2003-01-01

    Total skin electron therapy (TSET) is a complex technique which requires non-standard measurements and dosimetric procedures. This paper investigates an essential first step towards TSET Monte Carlo (MC) verification. The non-standard 6 MeV 40 x 40 cm² electron beam at a source-to-surface distance (SSD) of 100 cm, as well as its horizontal projection behind a polymethylmethacrylate (PMMA) screen to SSD = 380 cm, were evaluated. The EGS4 OMEGA-BEAM code package running on a home-made Linux cluster of 47 PCs was used for the MC simulations. Percentage depth-dose curves and profiles were calculated and measured experimentally for the 40 x 40 cm² field at both SSD = 100 cm and the patient surface SSD = 380 cm. The output factor (OF) between the reference 40 x 40 cm² open field and its horizontal projection as the TSET beam at SSD = 380 cm was also measured for comparison with the MC results. The accuracy of the simulated beam was validated by the good agreement, to within 2%, between the measured relative dose distributions, including the beam characteristic parameters (R50, R80, R100, Rp, E0), and the MC calculated results. The energy spectrum, fluence and angular distribution at different stages of the beam (at SSD = 100 cm, at SSD = 364.2 cm behind the PMMA beam spoiler screen, and at the treatment surface SSD = 380 cm) were derived from MC simulations. Results showed a final decrease in mean energy of almost 56% from the exit window to the treatment surface. A broader angular distribution (the FWHM of the angular distribution increased from 13° at SSD = 100 cm to more than 30° at the treatment surface) was fully attributable to the PMMA beam spoiler screen. OF calculations and measurements agreed to less than 1%. The effects of changing the electron energy cut-off from 0.7 MeV to 0.521 MeV and of air density fluctuations in the bunker, which could affect the MC results, were shown to have a negligible impact on the beam fluence distributions. Results proved the applicability of using MC
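
    For illustration, the beam-characteristic depths compared above (R100, R80, R50, Rp) can be extracted from a percentage depth-dose curve as sketched below; the synthetic curve and the steepest-tangent construction for Rp are simplifying assumptions, not the paper's measured data.

```python
# Extract characteristic depths from a (synthetic) electron percentage
# depth-dose curve: R100 = depth of maximum, R80/R50 = distal depths at 80%
# and 50% dose, Rp = practical range from the steepest distal tangent.
import numpy as np

depth = np.linspace(0.0, 40.0, 401)                  # mm
pdd = 100.0 * np.exp(-((depth - 13.0) / 9.0) ** 2)   # toy distal falloff
pdd[depth < 13] = 100.0 - 0.5 * (13.0 - depth[depth < 13])  # toy buildup

r100 = depth[np.argmax(pdd)]                         # depth of dose maximum

def falling_depth(level):
    """Depth on the distal falloff where the dose crosses `level` percent."""
    tail = depth >= r100
    # np.interp needs increasing xp, so negate the decreasing dose values.
    return float(np.interp(-level, -pdd[tail], depth[tail]))

r80, r50 = falling_depth(80.0), falling_depth(50.0)

g = np.gradient(pdd, depth)                          # dose gradient
i = np.argmin(g)                                     # steepest distal slope
rp = depth[i] - pdd[i] / g[i]                        # tangent's zero crossing
print(f"R100={r100:.1f}  R80={r80:.1f}  R50={r50:.1f}  Rp={rp:.1f} mm")
```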

  6. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  7. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  8. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP's planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, committed to be complete by the end of FY 2000

  9. FIR signature verification system characterizing dynamics of handwriting features

    Science.gov (United States)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on the finite impulse response (FIR) system characterizing time-frequency characteristics of dynamic handwriting features. First, the barycenter determined from both the center point of signature and two adjacent pen-point positions in the signing process, instead of one pen-point position, is used to reduce the fluctuation of handwriting motion. In this paper, among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. Thus, the stable dynamic handwriting features can be described by the relation of the time-frequency characteristics of the dynamic handwriting features. In this study, the aforesaid relation can be represented by the FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, the signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database consisting of 5,000 signatures from 100 signers. The proposed method yielded equal error rate (EER) of 3.21% on skilled forgeries.
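
    The core identification step, heavily simplified, can be sketched as a least-squares FIR fit between two feature sequences, after which reference and questioned signatures are compared through their impulse responses; synthetic signals stand in for the wavelet coefficients of motion pressure and area pressure used in the paper.

```python
# Fit FIR coefficients h mapping sequence x to sequence y by least squares,
# then compare impulse responses between a reference and a questioned model.
import numpy as np

def fit_fir(x, y, order=8):
    """Least-squares FIR identification: y[n] ~ sum_k h[k] * x[n-k]."""
    X = np.column_stack([np.roll(x, k) for k in range(order)])
    X[:order, :] = 0.0                 # discard wrapped-around samples
    h, *_ = np.linalg.lstsq(X, y, rcond=None)
    return h

rng = np.random.default_rng(5)
x = rng.normal(size=500)               # "motion pressure" stand-in
h_true = np.array([1.0, 0.5, 0.25, 0.1, 0.0, 0.0, 0.0, 0.0])
y = np.convolve(x, h_true)[: x.size] + rng.normal(0, 0.01, x.size)

h_ref = fit_fir(x, y)                                    # reference model
h_query = fit_fir(x, y + rng.normal(0, 0.05, x.size))    # questioned model
dist = np.linalg.norm(h_ref - h_query)
print(f"distance between impulse responses: {dist:.4f}")
```

    In a verification setting, a small distance supports accepting the signature as genuine; the decision threshold would be tuned to trade off FAR against FRR, as with the EER quoted above.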

  10. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  11. Monte Carlo verification of polymer gel dosimetry applied to radionuclide therapy: a phantom study

    International Nuclear Information System (INIS)

    Gear, J I; Partridge, M; Flux, G D; Charles-Edwards, E

    2011-01-01

    This study evaluates the dosimetric performance of the polymer gel dosimeter 'Methacrylic and Ascorbic acid in Gelatin, Initiated by Copper' (MAGIC) and its suitability for quality assurance and analysis of I-131-targeted radionuclide therapy dosimetry. Four batches of gel were manufactured in-house, and sets of calibration vials and phantoms were created containing different concentrations of I-131-doped gel. Multiple dose measurements were made up to 700 h post-preparation and compared to equivalent Monte Carlo simulations. In addition to uniformly filled phantoms, the cross-dose distribution from a hot insert to a surrounding phantom was measured. In this example, comparisons were made with both Monte Carlo and a clinical scintigraphic dosimetry method. Dose-response curves generated from the calibration data followed a sigmoid function. The gels appeared to be stable over many weeks of internal irradiation, with a delay in gel response observed at 29 h post-preparation. This was attributed to chemical inhibitors and the slow reaction rates of long-chain radical species. For this reason, phantom measurements were only made after 190 h of irradiation. For uniformly filled phantoms of I-131, the dose measurements agreed to within 10% of Monte Carlo simulations. A radial cross-dose distribution measured using the gel dosimeter compared well to that calculated with Monte Carlo. Small inhomogeneities were observed in the dosimeter, attributed to non-uniform mixing of monomer during preparation. However, they were not detrimental to this study, where the quantitative accuracy and spatial resolution of polymer gel dosimetry were far superior to those obtained with scintigraphy. The difference between Monte Carlo and gel measurements was of the order of a few cGy, whilst with the scintigraphic method differences of up to 8 Gy were observed. A manipulation technique is also presented which allows 3D scintigraphic dosimetry measurements to be compared to polymer
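
    The sigmoid calibration mentioned above can be sketched as a curve fit; the calibration points and the four-parameter logistic form are fabricated for illustration and do not reproduce the paper's data.

```python
# Fit a four-parameter sigmoid to (dose, response) calibration points, then
# the fit can be inverted to map a measured gel response back to dose.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(dose, r_min, r_max, d50, slope):
    """Logistic dose-response: plateaus at r_min and r_max, midpoint d50."""
    return r_min + (r_max - r_min) / (1.0 + np.exp(-(dose - d50) / slope))

dose = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 40.0, 60.0])  # Gy (assumed)
resp = np.array([1.0, 1.1, 1.5, 2.4, 3.6, 4.1, 4.2])      # response (assumed)

p0 = [resp.min(), resp.max(), 15.0, 5.0]   # initial parameter guesses
popt, _ = curve_fit(sigmoid, dose, resp, p0=p0)
print("fit (r_min, r_max, D50, slope):", np.round(popt, 2))
```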

  12. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures, including prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.

  13. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  14. Entropy Measurement for Biometric Verification Systems.

    Science.gov (United States)

    Lim, Meng-Hui; Yuen, Pong C

    2016-05-01

    Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intra-user variations in the biometric data. This is important to preserve a reasonable acceptance rate for genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the impostor's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accept multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of the adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justifies the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
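
    The following minimal sketch illustrates the underlying notion of guessing entropy that the model builds on: security measured as the expected number of guesses of an adversary who tries candidates in order of decreasing probability. The distributions are illustrative; the paper's model for multiple acceptable measurements per user is considerably more elaborate.

```python
# Hedged sketch: guessing entropy of a discrete distribution over candidate
# measurements. The paper builds a more elaborate model for systems that
# accept multiple similar measurements per user; this toy only illustrates
# the underlying idea that security is the expected number of guesses an
# adversary needs when guessing in order of decreasing probability.
import numpy as np

def guessing_entropy(probs):
    p = np.sort(np.asarray(probs, dtype=float))[::-1]  # best guesses first
    p = p / p.sum()                                     # normalize defensively
    ranks = np.arange(1, len(p) + 1)
    return float(np.dot(ranks, p))                      # E[number of guesses]

uniform = guessing_entropy(np.ones(1024))        # (1024 + 1) / 2 guesses
skewed = guessing_entropy([0.5, 0.25, 0.125, 0.125])
print(uniform, skewed)  # skewed distributions are easier to guess
```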

  15. Effect of error propagation of nuclide number densities on Monte Carlo burn-up calculations

    International Nuclear Information System (INIS)

    Tohjoh, Masayuki; Endo, Tomohiro; Watanabe, Masato; Yamamoto, Akio

    2006-01-01

    As a result of improvements in computer technology, the continuous energy Monte Carlo burn-up calculation has received attention as a good candidate for an assembly calculation method. However, the results of Monte Carlo calculations contain statistical errors, and the results of Monte Carlo burn-up calculations in particular include statistical errors propagated through the variances of the nuclide number densities. Therefore, if the statistical error alone is evaluated, the errors in Monte Carlo burn-up calculations may be underestimated. To clarify this effect of error propagation on Monte Carlo burn-up calculations, we proposed an equation that can predict the variance of the nuclide number densities after burn-up, and we verified this equation against a large number of Monte Carlo burn-up calculations that differed only in their initial random numbers. We also examined the effect of the number of burn-up calculation points on Monte Carlo burn-up calculations. From these verifications, we estimated the errors in Monte Carlo burn-up calculations including both statistical and propagated components. Finally, we quantified the effect of error propagation by comparing the statistical errors alone with the combined statistical and propagated errors. The results revealed that the effect of error propagation on the Monte Carlo burn-up calculations of an 8 x 8 BWR fuel assembly is low up to 60 GWd/t
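
    A brute-force version of the verification idea described above can be sketched as follows: repeat a toy depletion calculation many times with independently perturbed reaction rates (standing in for runs with different random-number seeds) and inspect the spread of the final number densities, which contains the propagated component that a single-run statistical error would miss. All rates and step counts are illustrative assumptions.

```python
# Hedged sketch: brute-force check of error propagation in a burn-up chain.
# A toy two-nuclide depletion step is repeated with independently perturbed
# one-group reaction rates (standing in for the statistical error of a Monte
# Carlo flux/cross-section tally); the spread of the final number densities
# across replicas then contains the propagated component that a single-run
# statistical error would miss. All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_replicas, n_steps = 1000, 10
sigma_rel = 0.01            # assumed 1% relative tally uncertainty per step
r1, r2 = 0.05, 0.02         # assumed mean reaction rates per step (1/step)

final_n2 = []
for _ in range(n_replicas):
    n1, n2 = 1.0, 0.0       # parent and daughter number densities
    for _ in range(n_steps):
        # perturb the rates independently each step, as fresh MC tallies would be
        k1 = r1 * (1.0 + sigma_rel * rng.standard_normal())
        k2 = r2 * (1.0 + sigma_rel * rng.standard_normal())
        n1, n2 = n1 - k1 * n1, n2 + k1 * n1 - k2 * n2
    final_n2.append(n2)

print(np.mean(final_n2), np.std(final_n2))  # spread includes propagation
```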

  16. Sensitivity Verification of PWR Monitoring System Using Neuro-Expert For LOCA Detection

    International Nuclear Information System (INIS)

    Muhammad Subekti

    2009-01-01

    Sensitivity Verification of PWR Monitoring System Using Neuro-Expert For LOCA Detection. The present research verifies a previously developed method for loss of coolant accident (LOCA) detection and performs simulations to determine the sensitivity of a PWR monitoring system that applies the neuro-expert method. The previous research, which the present work continues, developed and tested the neuro-expert method for several anomaly detections in a nuclear power plant (NPP) of the pressurized water reactor (PWR) type. The neuro-expert method can detect a LOCA with a sensitivity of 7 gallon/min of primary coolant leakage, whereas the conventional method could not detect a leakage of 30 gallon/min. The neuro-expert method also detects a LOCA significantly faster than the conventional system in the Surry-1 NPP, so the impact risk is reduced. (author)

  17. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    This article analyzes the reliability and verification problems of widespread electric power system (EPS) simulation tools, all of which are based on numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for verifying any EPS simulation tool.

  18. Joint verification project on environmentally friendly coal utilization systems. Joint verification project on the water-saving coal preparation system; Kankyo chowagata sekitan riyo system kyodo jissho jigyo. Shosuigata sentan system kyodo jissho jigyo

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    In this verification project, clean coal technology that should be disseminated in China was verified and the base structure for its dissemination was prepared, for the purpose of controlling emissions of environmental pollutants associated with coal utilization in China and of contributing to the secure energy acquisition of Japan. As joint verification projects, a general rehabilitation type coal preparation system was installed in the Wangfenggang coal preparation plant, and a central control coal preparation system was installed in the Qingtan coal preparation plant. In the former, a system is verified in which optimum operation, water saving, high quality, and higher efficiency are obtained by introducing two computing systems, for operation control and quality control, along with various measuring instruments and analyzers, into coal preparation plants that are operated with analog control (built with assistance from Russia and Poland) and have quality control problems. In the latter, a central control system achieving water saving is verified by introducing rapid ash meters, scales, densitometers, and computers into coal preparation plants equipped with zigzag or heavy-fluid cyclones and connecting the various kinds of information through a network. For fiscal 1994, investigation and study were conducted. 51 figs., 9 tabs.

  19. A silicon strip detector array for energy verification and quality assurance in heavy ion therapy.

    Science.gov (United States)

    Debrot, Emily; Newall, Matthew; Guatelli, Susanna; Petasecca, Marco; Matsufuji, Naruhiro; Rosenfeld, Anatoly B

    2018-02-01

    The measurement of depth dose profiles for range and energy verification of heavy ion beams is an important aspect of quality assurance procedures for heavy ion therapy facilities. The steep dose gradients in the Bragg peak region of these profiles require the use of detectors with high spatial resolution. The aim of this work is to characterize a one dimensional monolithic silicon detector array called the "serial Dose Magnifying Glass" (sDMG) as an independent ion beam energy and range verification system used for quality assurance conducted for ion beams used in heavy ion therapy. The sDMG detector consists of two linear arrays of 128 silicon sensitive volumes each with an effective size of 2 mm × 50 μm × 100 μm fabricated on a p-type substrate at a pitch of 200 μm along a single axis of detection. The detector was characterized for beam energy and range verification by measuring the response of the detector when irradiated with a 290 MeV/u 12C ion broad beam incident along the single axis of the detector embedded in a PMMA phantom. The energy of the 12C ion beam incident on the detector and the residual energy of an ion beam incident on the phantom was determined from the measured Bragg peak position in the sDMG. Ad hoc Monte Carlo simulations of the experimental setup were also performed to give further insight into the detector response. The relative response profiles along the single axis measured with the sDMG detector were found to have good agreement between experiment and simulation with the position of the Bragg peak determined to fall within 0.2 mm or 1.1% of the range in the detector for the two cases. The energy of the beam incident on the detector was found to vary less than 1% between experiment and simulation. The beam energy incident on the phantom was determined to be (280.9 ± 0.8) MeV/u from the experimental and (280.9 ± 0.2) MeV/u from the simulated profiles. These values coincide with the expected energy of 281 MeV/u. The sDMG detector

  20. Meaningful timescales from Monte Carlo simulations of particle systems with hard-core interactions

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Liborio I., E-mail: liborio78@gmail.com

    2016-12-01

    A new Markov Chain Monte Carlo method for simulating the dynamics of particle systems characterized by hard-core interactions is introduced. In contrast to traditional Kinetic Monte Carlo approaches, where the state of the system is associated with minima in the energy landscape, in the proposed method, the state of the system is associated with the set of paths traveled by the atoms and the transition probabilities for an atom to be displaced are proportional to the corresponding velocities. In this way, the number of possible state-to-state transitions is reduced to a discrete set, and a direct link between the Monte Carlo time step and true physical time is naturally established. The resulting rejection-free algorithm is validated against event-driven molecular dynamics: the equilibrium and non-equilibrium dynamics of hard disks converge to the exact results with decreasing displacement size.
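
    The sketch below illustrates, under stated assumptions, how a kinetic-Monte-Carlo-style scheme in which the displacement probability is proportional to particle speed yields a physical duration for each Monte Carlo event (the inverse of the total rate). It deliberately omits the hard-core overlap handling that is central to the actual method, so it should be read only as an illustration of the time-step bookkeeping.

```python
# Hedged sketch: a kinetic-Monte-Carlo-style move scheme in which the
# probability of displacing a particle is proportional to its speed, and the
# clock advances by the inverse of the total rate, giving each MC step a
# physical duration. The hard-core overlap handling of the actual method is
# omitted here; every quantity below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(1)
n, delta = 100, 0.01                      # particles, displacement size
pos = rng.uniform(0.0, 1.0, size=(n, 2))  # positions in a unit box
vel = rng.standard_normal((n, 2))         # fixed velocities for the demo
speed = np.linalg.norm(vel, axis=1)

t = 0.0
for _ in range(10_000):
    rates = speed / delta                 # rate of a delta-sized hop
    total = rates.sum()
    i = rng.choice(n, p=rates / total)    # pick mover proportional to speed
    pos[i] = (pos[i] + delta * vel[i] / speed[i]) % 1.0  # periodic box
    t += 1.0 / total                      # physical time of this MC event

print("elapsed physical time:", t)
```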

  1. A new verification film system for routine quality control of radiation fields: Kodak EC-L

    International Nuclear Information System (INIS)

    Hermann, A.; Bratengeier, K.; Priske, A.; Flentje, M.

    2000-01-01

    Background: The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. Material and Methods: For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. Results: In this investigation 68% of 175 Kodak EC-L ap/pa films were judged 'good', 18% 'moderate' and 14% 'poor', whereas only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be 'good'. Conclusions: The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. The films could be read more accurately and the detection of set-up deviations was facilitated. (orig.)

  2. Safety verification of non-linear hybrid systems is quasi-decidable

    Czech Academy of Sciences Publication Activity Database

    Ratschan, Stefan

    2014-01-01

    Roč. 44, č. 1 (2014), s. 71-90 ISSN 0925-9856 R&D Projects: GA ČR GCP202/12/J060 Institutional support: RVO:67985807 Keywords: hybrid systems * safety verification * decidability * robustness Subject RIV: IN - Informatics, Computer Science Impact factor: 0.875, year: 2014

  3. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for the V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as rapid obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, still remain important issues to be resolved, and they have now become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verification. The design verification stage includes unit tests – Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition decision coverage (MC/DC) tests, module tests – MATLAB/Simulink co-simulation tests, and integration tests – FPGA hardware test beds. To prove the adequacy of the proposed

  4. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for the V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as rapid obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, still remain important issues to be resolved, and they have now become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verification. The design verification stage includes unit tests – Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition decision coverage (MC/DC) tests, module tests – MATLAB/Simulink co-simulation tests, and integration tests – FPGA hardware test beds. To prove the adequacy of the proposed

  5. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy and was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development, before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  6. A Formal Approach for the Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian

    2011-01-01

    This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed...

  7. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    Science.gov (United States)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully utilize the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge) that image the prompt gamma (PG) rays emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of the multi-slit and knife-edge collimators for non-invasive proton beam range verification. PG imaging was simulated with a validated GATE/GEANT4 Monte Carlo code modeling the spot-scanning proton therapy system and a cylindrical PMMA phantom in detail. For each spot, 10^8 protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff region of the PG profile was fitted with a 3-line-segment curve function as the range estimate. Factors that may influence the accuracy of range detection, including the energy window setting, proton energy, phantom size, and phantom shift, were studied. The results indicate that both collimator systems achieve reasonable accuracy and good response to phantom shifts. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system achieves a higher detection efficiency that leads to a smaller deviation in the predicted range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of both systems, especially the multi-slit system. Therefore, a neutron reduction technique is needed to improve the accuracy of range verification in proton therapy.
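
    To make the range-estimation step concrete, the sketch below fits a 3-line-segment curve to a synthetic PG falloff profile, in the spirit of the procedure described above. The parameterization and the use of the falloff midpoint as the range surrogate are illustrative assumptions.

```python
# Hedged sketch: estimating the range from a prompt-gamma (PG) depth profile
# by fitting a 3-line-segment curve to the distal falloff, as described in
# the abstract. The parameterization (plateau / linear falloff / background),
# the synthetic data, and taking the falloff midpoint as the range surrogate
# are illustrative assumptions, not the authors' exact procedure.
import numpy as np
from scipy.optimize import curve_fit

def three_segments(z, z1, z2, top, bottom):
    """Plateau for z < z1, linear falloff on [z1, z2], background for z > z2."""
    slope = (bottom - top) / (z2 - z1 + 1e-9)  # epsilon guards the fit
    return np.where(z < z1, top, np.where(z > z2, bottom, top + slope * (z - z1)))

# synthetic falloff region of a PG depth profile (counts vs. depth in mm)
rng = np.random.default_rng(7)
z = np.linspace(100.0, 160.0, 121)
counts = rng.poisson(three_segments(z, 128.0, 136.0, 1000.0, 120.0)).astype(float)

p0 = [125.0, 140.0, counts.max(), counts.min()]  # rough initial guess
(z1, z2, top, bottom), _ = curve_fit(three_segments, z, counts, p0=p0)
print(f"falloff midpoint (range surrogate): {0.5 * (z1 + z2):.1f} mm")
```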

  8. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Univ. of New Mexico, Albuquerque, NM

    2016-01-01

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations

  9. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.

    2016-11-29

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations

  10. Comparative Analysis of Speech Parameters for the Design of Speaker Verification Systems

    National Research Council Canada - National Science Library

    Souza, A

    2001-01-01

    Speaker verification systems are basically composed of three stages: feature extraction, feature processing and comparison of the modified features from speaker voice and from the voice that should be...

  11. Monte Carlo patient study on the comparison of prompt gamma and PET imaging for range verification in proton therapy

    Science.gov (United States)

    Moteabbed, M.; España, S.; Paganetti, H.

    2011-02-01

    The purpose of this work was to compare the clinical adaptation of prompt gamma (PG) imaging and positron emission tomography (PET) as independent tools for non-invasive proton beam range verification and treatment validation. The PG range correlation and its differences with PET have been modeled for the first time in a highly heterogeneous tissue environment, using different field sizes and configurations. Four patients with different tumor locations (head and neck, prostate, spine and abdomen) were chosen to compare the site-specific behaviors of the PG and PET images, using both passive scattered and pencil beam fields. Accurate reconstruction of dose, PG and PET distributions was achieved by using the planning computed tomography (CT) image in a validated GEANT4-based Monte Carlo code capable of modeling the treatment nozzle and patient anatomy in detail. The physical and biological washout phenomenon and decay half-lives for PET activity for the most abundant isotopes such as 11C, 15O, 13N, 30P and 38K were taken into account in the data analysis. The attenuation of the gamma signal after traversing the patient geometry and respective detection efficiencies were estimated for both methods to ensure proper comparison. The projected dose, PG and PET profiles along many lines in the beam direction were analyzed to investigate the correlation consistency across the beam width. For all subjects, the PG method showed on average approximately 10 times higher gamma production rates than the PET method before, and 60 to 80 times higher production after including the washout correction and acquisition time delay. This rate strongly depended on tissue density and elemental composition. For broad passive scattered fields, it was demonstrated that large differences exist between PG and PET signal falloff positions and the correlation with the dose distribution for different lines in the beam direction. These variations also depended on the treatment site and the

  12. Verification of the shift Monte Carlo code with the C5G7 reactor benchmark

    International Nuclear Information System (INIS)

    Sly, N. C.; Mervin, B. T.; Mosher, S. W.; Evans, T. M.; Wagner, J. C.; Maldonado, G. I.

    2012-01-01

    Shift is a new hybrid Monte Carlo/deterministic radiation transport code being developed at Oak Ridge National Laboratory. At its current stage of development, Shift includes a parallel Monte Carlo capability for simulating eigenvalue and fixed-source multigroup transport problems. This paper focuses on recent efforts to verify Shift's Monte Carlo component using the two-dimensional and three-dimensional C5G7 NEA benchmark problems. Comparisons were made between the benchmark eigenvalues and those output by the Shift code. In addition, mesh-based scalar flux tally results generated by Shift were compared to those obtained using MCNP5 on an identical model and tally grid. The Shift-generated eigenvalues were within three standard deviations of the benchmark and MCNP5-1.60 values in all cases. The flux tallies generated by Shift were found to be in very good agreement with those from MCNP. (authors)
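
    The agreement criterion quoted above ("within three standard deviations") amounts to a simple consistency test between two statistically independent eigenvalue estimates; a minimal sketch, with made-up keff values, follows.

```python
# Hedged sketch: the kind of consistency check behind "within three standard
# deviations" for two Monte Carlo eigenvalue estimates. The keff values and
# uncertainties below are made up for illustration, not taken from the paper.
import math

def consistent(k_a, sig_a, k_b, sig_b, n_sigma=3.0):
    """Treat the two estimates as independent; compare their difference
    against the combined one-sigma uncertainty."""
    combined = math.sqrt(sig_a**2 + sig_b**2)
    return abs(k_a - k_b) <= n_sigma * combined

# e.g. a Shift-style result vs. a reference value (illustrative numbers)
print(consistent(1.18655, 0.00010, 1.18638, 0.00012))  # True -> agrees
```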

  13. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for the ITER remote handling system. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), "GOT-RH". The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is based on the results of a project "verification and validation (V&V) of the ITER RH system using digital mock-ups (DMUs)". The purpose of this project is to study an efficient approach to using DMUs for the V&V of the ITER RH system design, utilizing a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and gives an overview of the current trends in the use of virtual prototyping in industry during the early design phase. Based on a survey of best industrial practices, this paper proposes ways to improve the V&V process for the ITER RH system utilizing DMUs.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: TRITON SYSTEMS, LLC SOLID BOWL CENTRIFUGE, MODEL TS-5000

    Science.gov (United States)

    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft3. ...

  15. Theoretical analysis of nuclear reactors (Phase III), I-V, Part V, Establishment of Monte Carlo method for solving the integral transport equation; Razrada metoda teorijske analize nuklearnih reaktora (III faza) I-V, V Deo, Postavljanje Monte Carlo metode za resavanje integralnog oblika transportne jednacine

    Energy Technology Data Exchange (ETDEWEB)

    Pop-Jordanov, J [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1963-02-15

    The general mathematical approach of the Monte Carlo method is described, together with the elements that enable the solution of specific problems (verification was done by estimating a simple integral). Special attention was devoted to a systematic presentation, which required treatment of fundamental topics of statistics and probability, all of which together constitute a procedure for modelling the stochastic process, i.e. the Monte Carlo method. (author)
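
    In the same spirit as the report's verification by estimating a simple integral, the following minimal sketch estimates ∫₀¹ x² dx by Monte Carlo sampling; the integrand and sample size are illustrative choices.

```python
# Hedged sketch: verifying a Monte Carlo implementation by estimating a
# simple integral, as the report describes. The integrand and sample size
# are illustrative; the point is that the estimate converges to the known
# answer with a ~1/sqrt(N) statistical error.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
x = rng.uniform(0.0, 1.0, N)          # uniform samples on [0, 1]
f = x**2                               # integrand; exact integral is 1/3
estimate = f.mean()
stderr = f.std(ddof=1) / np.sqrt(N)    # one-sigma statistical error

print(f"{estimate:.5f} +/- {stderr:.5f}  (exact: {1/3:.5f})")
```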

  16. Improved local lattice Monte Carlo simulation for charged systems

    Science.gov (United States)

    Jiang, Jian; Wang, Zhen-Gang

    2018-03-01

    Maggs and Rossetto [Phys. Rev. Lett. 88, 196402 (2002)] proposed a local lattice Monte Carlo algorithm for simulating charged systems based on Gauss's law, which scales with the particle number N as O(N). This method includes two degrees of freedom: the configuration of the mobile charged particles and the electric field. In this work, we consider two important issues in the implementation of the method, the acceptance rate of configurational change (particle move) and the ergodicity in the phase space sampled by the electric field. We propose a simple method to improve the acceptance rate of particle moves based on the superposition principle for electric field. Furthermore, we introduce an additional updating step for the field, named "open-circuit update," to ensure that the system is fully ergodic under periodic boundary conditions. We apply this improved local Monte Carlo simulation to an electrolyte solution confined between two low dielectric plates. The results show excellent agreement with previous theoretical work.

  17. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    Chen Lijie; Tang Tao; Zhao Xianqiong; Schnieder, Eckehard

    2012-01-01

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and its modeling flexibility with respect to changes in system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: the dead markings are correct, there are no dead transitions, and the net is fair. Further analysis results have been obtained using formal and simulation-based verification approaches. Timed models of the open transmit system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also provide the relevant timed and stochastic factors, such as time delay and packet loss, which may influence the time for establishment of a safety connection. The time for establishment of a safety connection in the normal state is verified formally, and the time for establishment under different packet-loss probabilities is then simulated. The verification shows that the time for establishment of a safety connection of the safety communication protocol satisfies the safety requirements.

  18. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic...... safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  19. Monte Carlo simulation of hybrid systems: An example

    International Nuclear Information System (INIS)

    Bacha, F.; D'Alencon, H.; Grivelet, J.; Jullien, E.; Jejcic, A.; Maillard, J.; Silva, J.; Zukanovich, R.; Vergnes, J.

    1997-01-01

    Simulation of hybrid systems needs tracking of particles from the GeV range (incident proton beam) down to a fraction of an eV (thermal neutrons). We show how a GEANT-based Monte Carlo program can achieve this with realistic computer time, together with its accompanying tools. An example of a dedicated original actinide burner is simulated with this chain. 8 refs., 5 figs

  20. SU-F-T-440: The Feasibility Research of Checking Cervical Cancer IMRT Pre-Treatment Dose Verification by Automated Treatment Planning Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X; Yin, Y; Lin, X [Shandong Cancer Hospital and Institute, China, Jinan, Shandong (China)

    2016-06-15

    Purpose: To assess the preliminary feasibility of an automated treatment planning verification system for cervical cancer IMRT pre-treatment dose verification. Methods: The study randomly selected clinical IMRT treatment planning data for twenty patients with cervical cancer; all IMRT plans were divided into 7 fields to meet the dosimetric goals using a commercial treatment planning system (Pinnacle Version 9.2 and Eclipse Version 13.5). The plans were exported to the Mobius 3D (M3D) server, and percentage differences in the volume of regions of interest (ROIs) and in the dose calculated for the target region and organs at risk were evaluated, in order to validate the accuracy of the automated treatment planning verification system. Results: The difference in volume for Pinnacle against M3D was smaller than for Eclipse against M3D in the ROIs; the largest differences were 0.22±0.69% and 3.5±1.89% for Pinnacle and Eclipse, respectively. M3D showed slightly better agreement in the dose to the target and organs at risk compared with the TPS. After recalculating the plans in M3D, the dose difference for Pinnacle was smaller than for Eclipse on average, with results within 3%. Conclusion: The method of using the automated treatment planning system to validate the accuracy of plans is convenient, but the scope of the differences still needs more clinical patient cases to determine. At present, it should be used as a secondary check tool to improve safety in clinical treatment planning.

  1. RRB's SVES Input File - Post Entitlement State Verification and Exchange System (PSSVES)

    Data.gov (United States)

    Social Security Administration — Several PSSVES request files are transmitted to SSA each year for processing in the State Verification and Exchange System (SVES). This is a first step in obtaining...

  2. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, as committed to be complete by the end of FY-2000

  3. Real-Time System Verification by Kappa-Induction

    Science.gov (United States)

    Pike, Lee S.

    2005-01-01

    We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
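
    The two proof obligations of k-induction can be sketched on a toy system as follows. Real tools such as SAL discharge these obligations with solvers over symbolic, possibly infinite-state models; the brute-force enumeration, the example transition system, and the property below are illustrative assumptions only.

```python
# Hedged sketch: the shape of k-induction over an explicitly enumerable
# transition system. Real tools (such as SAL, used in the paper) discharge
# the two obligations with SAT/SMT solvers over symbolic models; this toy
# checks them by brute-force path enumeration on a tiny example.
STATES = range(8)                       # toy 3-bit counter state space
def init(s): return s == 0
def step(s): return [(s + 2) % 8]       # successor relation: add 2 mod 8
def prop(s): return s % 2 == 0          # property: the counter stays even

def paths(length):
    """All length-`length` state paths through the transition relation."""
    for start in STATES:
        frontier = [[start]]
        for _ in range(length - 1):
            frontier = [p + [t] for p in frontier for t in step(p[-1])]
        yield from frontier

def k_induction(k):
    # Base case: no path of k states from an initial state violates prop.
    for p in paths(k):
        if init(p[0]) and not all(prop(s) for s in p):
            return False, "base-case counterexample: %s" % p
    # Inductive step: k consecutive prop-states force prop in the next state.
    for p in paths(k + 1):
        if all(prop(s) for s in p[:-1]) and not prop(p[-1]):
            return False, "induction fails at k=%d: %s" % (k, p)
    return True, "property proved by %d-induction" % k

print(k_induction(1))
```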

  4. Formal modeling and verification of systems with self-x properties

    OpenAIRE

    Reif, Wolfgang

    2006-01-01

    Formal modeling and verification of systems with self-x properties / Matthias Güdemann, Frank Ortmeier and Wolfgang Reif. - In: Autonomic and trusted computing : third international conference, ATC 2006, Wuhan, China, September 3-6, 2006 ; proceedings / Laurence T. Yang ... (eds.). - Berlin [u.a.] : Springer, 2006. - S. 38-47. - (Lecture notes in computer science ; 4158)

  5. Atmosphere Re-Entry Simulation Using Direct Simulation Monte Carlo (DSMC) Method

    Directory of Open Access Journals (Sweden)

    Francesco Pellicani

    2016-05-01

    Aerothermodynamic investigations of hypersonic re-entry vehicles provide fundamental information to other important disciplines, such as materials and structures, assisting the development of thermal protection systems (TPS) that are efficient and lightweight. For the transitional flow regime, where thermal and chemical equilibrium is almost absent, a dedicated numerical method has been introduced for such studies: the direct simulation Monte Carlo (DSMC) technique. The acceptance and applicability of the DSMC method have increased significantly in the 50 years since its invention, thanks to increases in computer speed and to parallel computing. Nevertheless, further verification and validation efforts are needed to lead to its wider acceptance. In this study, the Monte Carlo simulators OpenFOAM and Sparta have been studied and benchmarked against numerical and theoretical data for inert and chemically reactive flows, and the same will be done against experimental data in the near future. The results show the validity of the data obtained with the DSMC. The best settings of the fundamental parameters used by a DSMC simulator are presented for each software package and compared with the guidelines derived from the theory behind the Monte Carlo method. In particular, the number of particles per cell was found to be the most relevant parameter for achieving valid and optimized results. It is shown that a simulation with a mean value of one particle per cell gives sufficiently good results with very low computational resources. This achievement aims to reconsider the correct investigation method in the transitional regime, where both direct simulation Monte Carlo (DSMC) and computational fluid dynamics (CFD) can work, but with different computational effort.

  6. Verification test of control rod system for HTR-10

    International Nuclear Information System (INIS)

    Zhou Huizhong; Diao Xingzhong; Huang Zhiyong; Cao Li; Yang Nianzu

    2002-01-01

    There are 10 sets of control rods and driving devices in the 10 MW High Temperature Gas-cooled Test Reactor (HTR-10). The control rod system is the control and shutdown system of HTR-10, designed for reactor criticality, operation, and shutdown. In order to guarantee technical feasibility, a series of verification tests were performed, including a room temperature test, a thermal test, a test after the control rod system was installed in HTR-10, and a test of the control rod system before HTR-10 first criticality. All test data showed that the driving devices worked well and that the control rods ran smoothly up and down, settled correctly at arbitrary positions, and indicated their positions exactly

  7. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    Report AFRL-RV-PS-TR-2018-0008: Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation. Contract FA9453-15-1-0315; program element 62601F; author: Norman Fitz-Coy; project 4846; task PPM00015968; work unit EF125135.

  8. Novel hybrid Monte Carlo/deterministic technique for shutdown dose rate analyses of fusion energy systems

    International Nuclear Information System (INIS)

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; Grove, Robert E.

    2014-01-01

    Highlights: •Develop the novel Multi-Step CADIS (MS-CADIS) hybrid Monte Carlo/deterministic method for multi-step shielding analyses. •Accurately calculate shutdown dose rates using full-scale Monte Carlo models of fusion energy systems. •Demonstrate the dramatic efficiency improvement of the MS-CADIS method for the rigorous two step calculations of the shutdown dose rate in fusion reactors. -- Abstract: The rigorous 2-step (R2S) computational system uses three-dimensional Monte Carlo transport simulations to calculate the shutdown dose rate (SDDR) in fusion reactors. Accurate full-scale R2S calculations are impractical in fusion reactors because they require calculating space- and energy-dependent neutron fluxes everywhere inside the reactor. The use of global Monte Carlo variance reduction techniques was suggested for accelerating the R2S neutron transport calculation. However, the prohibitive computational costs of these approaches, which increase with the problem size and amount of shielding materials, inhibit their ability to accurately predict the SDDR in fusion energy systems using full-scale modeling of an entire fusion plant. This paper describes a novel hybrid Monte Carlo/deterministic methodology that uses the Consistent Adjoint Driven Importance Sampling (CADIS) method but focuses on multi-step shielding calculations. The Multi-Step CADIS (MS-CADIS) methodology speeds up the R2S neutron Monte Carlo calculation using an importance function that represents the neutron importance to the final SDDR. Using a simplified example, preliminary results showed that the use of MS-CADIS enhanced the efficiency of the neutron Monte Carlo simulation of an SDDR calculation by a factor of 550 compared to standard global variance reduction techniques, and that the efficiency enhancement compared to analog Monte Carlo is higher than a factor of 10,000

  9. submitter Proton therapy treatment monitoring with the DoPET system: activity range, positron emitters evaluation and comparison with Monte Carlo predictions

    CERN Document Server

    Muraro, S; Belcari, N; Bisogni, M G; Camarlinghi, N; Cristoforetti, L; Guerra, A Del; Ferrari, A; Fracchiolla, F; Morrocchi, M; Righetto, R; Sala, P; Schwarz, M; Sportelli, G; Topi, A; Rosso, V

    2017-01-01

    Ion beam irradiations can deliver conformal dose distributions minimizing damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainties: a critical issue is the treatment verification. During the treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used to perform the treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate the radiation transport and interaction with matter. In this work, FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered on phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two plana...

  10. Dielectric response of periodic systems from quantum Monte Carlo calculations.

    Science.gov (United States)

    Umari, P; Willamson, A J; Galli, Giulia; Marzari, Nicola

    2005-11-11

    We present a novel approach that allows us to calculate the dielectric response of periodic systems in the quantum Monte Carlo formalism. We employ a many-body generalization for the electric-enthalpy functional, where the coupling with the field is expressed via the Berry-phase formulation for the macroscopic polarization. A self-consistent local Hamiltonian then determines the ground-state wave function, allowing for accurate diffusion quantum Monte Carlo calculations where the polarization's fixed point is estimated from the average on an iterative sequence, sampled via forward walking. This approach has been validated for the case of an isolated hydrogen atom and then applied to a periodic system, to calculate the dielectric susceptibility of molecular-hydrogen chains. The results found are in excellent agreement with the best estimates obtained from the extrapolation of quantum-chemistry calculations.

  11. Six types Monte Carlo for estimating the current unavailability of Markov system with dependent repair

    International Nuclear Information System (INIS)

    Xiao Gang; Li Zhizhong

    2004-01-01

    Based on an integral equation describing the life-history of a Markov system, six types of estimators of the current unavailability of a Markov system with dependent repair are propounded. Combined with biased sampling of the system's state transition times, six types of Monte Carlo methods for estimating the current unavailability are given. Two numerical examples are given to examine the variances and efficiencies of the six types of Monte Carlo methods. (authors)
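
    As a point of reference for what "current unavailability" means, the sketch below gives a plain (analog) Monte Carlo estimate of U(t) for a single repairable component with exponential failure and repair times; it uses none of the paper's dependent-repair structure or biased transition-time sampling, and all rates are illustrative.

```python
# Hedged sketch: a plain (analog) Monte Carlo estimate of the current
# unavailability U(t) of a single repairable component with exponential
# failure and repair times -- a much simpler setting than the paper's
# dependent-repair Markov systems and biased transition-time sampling.
import random

LAMBDA, MU, T = 0.01, 0.1, 200.0   # failure rate, repair rate, mission time

def down_at(t_end, rng):
    """Simulate one life history; return True if the component is down at t_end."""
    t, up = 0.0, True
    while True:
        rate = LAMBDA if up else MU        # exponential sojourn in each state
        t += rng.expovariate(rate)
        if t > t_end:
            return not up                  # state occupied when the clock passes t_end
        up = not up                        # fail or finish repair

rng = random.Random(123)
n = 100_000
estimate = sum(down_at(T, rng) for _ in range(n)) / n
print("U(T) ~", estimate, " (analytic limit:", LAMBDA / (LAMBDA + MU), "as t -> inf)")
```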

  12. Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Adina Aniculaesei

    2016-12-01

    There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two-phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata and the safety property as a TCTL formula, and to verify the correctness of these models with respect to this property. For the runtime phase, we build a monitor to check whether the assumptions made at design time also hold at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.

  13. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Science.gov (United States)

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System that automates the incubation and reading of membrane filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. LAY ABSTRACT: The Growth Direct™ System that automates the incubation and reading of microbial counts on membranes on solid agar differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. © PDA, Inc. 2018.

  14. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  15. Towards the Formal Verification of a Distributed Real-Time Automotive System

    Science.gov (United States)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, formally and pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  16. Translating Activity Diagram from Duration Calculus for Modeling of Real-Time Systems and its Formal Verification using UPPAAL and DiVinE

    Directory of Open Access Journals (Sweden)

    Muhammad Abdul Basit Ur Rehman

    2016-01-01

    RTS (Real-Time Systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems need more accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is really challenging. Formal methods have made it possible to model such systems with more accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel-filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of the methodology for modeling and verification of large-scale real-time systems with reduced verification cost.

  17. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    Rahim, M.A.B.U.; Arif, F.

    2016-01-01

    RTS (Real-Time Systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems need more accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is really challenging. Formal methods have made it possible to model such systems with more accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel-filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of the methodology for modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  18. Automated four-dimensional Monte Carlo workflow using log files and real-time motion monitoring

    International Nuclear Information System (INIS)

    Sibolt, P; Andersen, C E; Cronholm, R O; Heath, E; Behrens, C F

    2017-01-01

    With emerging tracking and gating techniques in radiotherapy of lung cancer patients, there is an increasing need for efficient four-dimensional Monte Carlo (4DMC) based quality assurance (QA). An automated and flexible workflow for 4DMC QA, based on the 4DdefDOSXYZnrc user code, has been developed in Python. The workflow has been tested and verified using an in-house developed dosimetry system comprising a dynamic thorax phantom constructed for plastic scintillator dosimetry. The workflow is directly compatible with any treatment planning system and can also be triggered by the appearance of linac log files. It requires minimal user interaction and, with the use of linac log files, provides a method for verification of the actually delivered dose in the patient geometry. (paper)
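
    A minimal sketch of the log-file trigger idea using only the Python standard library; the directory, file pattern and QA command are hypothetical placeholders, not the workflow's actual interface:

        import glob, os, subprocess, time

        def watch_for_log_files(log_dir="/data/linac_logs", pattern="*.bin",
                                interval_s=5.0):
            """Poll for newly appearing linac log files and launch the
            4DMC QA pipeline for each one (command is illustrative)."""
            seen = set()
            while True:
                for path in glob.glob(os.path.join(log_dir, pattern)):
                    if path not in seen:
                        seen.add(path)
                        subprocess.Popen(["python", "run_4dmc_qa.py", path])
                time.sleep(interval_s)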

  19. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  20. Range uncertainties in proton therapy and the role of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Paganetti, Harald

    2012-01-01

    The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)

  1. Use of Monte Carlo computation in benchmarking radiotherapy treatment planning system algorithms

    International Nuclear Information System (INIS)

    Lewis, R.D.; Ryde, S.J.S.; Seaby, A.W.; Hancock, D.A.; Evans, C.J.

    2000-01-01

    Radiotherapy treatments are becoming more complex, often requiring the dose to be calculated in three dimensions and sometimes involving the application of non-coplanar beams. The ability of treatment planning systems to accurately calculate dose under a range of these and other irradiation conditions requires evaluation. Practical assessment of such arrangements can be problematical, especially when a heterogeneous medium is used. This work describes the use of Monte Carlo computation as a benchmarking tool to assess the dose distribution of external photon beam plans obtained in a simple heterogeneous phantom by several commercially available 3D and 2D treatment planning system algorithms. For comparison, practical measurements were undertaken using film dosimetry. The dose distributions were calculated for a variety of irradiation conditions designed to show the effects of surface obliquity, inhomogeneities and missing tissue above tangential beams. The results show maximum dose differences of 47% between some planning algorithms and film at a point 1 mm below a tangentially irradiated surface. Overall, the dose distribution obtained from film was most faithfully reproduced by the Monte Carlo N-Particle results illustrating the potential of Monte Carlo computation in evaluating treatment planning system algorithms. (author)

  2. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    International Nuclear Information System (INIS)

    Kostyuchenko, V.I.; Makarova, A.S.; Ryazantsev, O.B.; Samarin, S.I.; Uglov, A.S.

    2013-01-01

    Proton interaction with the material of an exposed object needs to be modeled with account taken of three basic processes: electromagnetic stopping of protons in matter, multiple Coulomb scattering, and nuclear interactions. The last of these is the topic of this paper. Monte Carlo codes are often used to simulate high-energy particle interaction with matter. However, the nuclear interaction models implemented in these codes are rather extensive, and their use in treatment planning systems requires huge computational resources. We selected the IThMC code for its ability to reproduce experiments which measure the distribution of the projected ranges of nuclear secondary particles generated by proton beams in a multi-layer Faraday cup. Multi-layer Faraday cup detectors measure charge rather than dose and allow distinguishing between electromagnetic and nuclear interactions. The event generator used in the IThMC code is faster, but less accurate, than any other used in testing. Our model of nuclear reactions demonstrates quite good agreement with experiment in the context of their effect on the Bragg peak in therapeutic applications

  3. Efficiencies of dynamic Monte Carlo algorithms for off-lattice particle systems with a single impurity

    KAUST Repository

    Novotny, M.A.; Watanabe, Hiroshi; Ito, Nobuyasu

    2010-01-01

    The efficiency of dynamic Monte Carlo algorithms for off-lattice systems composed of particles is studied for the case of a single impurity particle. The theoretical efficiencies of the rejection-free method and of the Monte Carlo with Absorbing

  4. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  5. An evaluation of the management system verification pilot at Hanford

    International Nuclear Information System (INIS)

    Briggs, C.R.; Ramonas, L.; Westendorf, W.

    1998-01-01

    The Chemical Management System (CMS), currently under development at Hanford, was used as the "test program" for pilot testing the value-added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial-sector "best practice" for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS, including the Department of Energy Richland Operations Office (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCs). A semi-structured interview process was employed by the team of three "verifiers" who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An "MSV Pilot Effectiveness Survey" also was completed by each panel participant immediately following the interview.

  6. International exchange on nuclear safety related expert systems: The role of software verification and validation

    International Nuclear Information System (INIS)

    Sun, B.K.H.

    1996-01-01

    An important lesson learned from the Three Mile Island accident is that human errors can be significant contributors to risk. Recent advancements in computer hardware and software technology have helped make expert system techniques potentially viable tools for improving nuclear power plant safety and reliability. As part of the general man-machine interface technology, expert systems have recently become increasingly prominent as a potential solution to a number of previously intractable problems in many phases of human activity, including operation, maintenance, and engineering functions. Traditional methods for testing and analyzing analog systems are no longer adequate to handle the increased complexity of software systems. The role of Verification and Validation (V and V) is to add rigor to the software development and maintenance cycle to guarantee the high level of confidence needed for applications. Verification includes the process and techniques for confirming that all the software requirements in one stage of the development are met before proceeding on to the next stage. Validation involves testing the integrated software and hardware system to ensure that it reliably fulfills its intended functions. Only through a comprehensive V and V program can a high level of confidence be achieved. There exist many different standards and techniques for software verification and validation, yet they lack uniform approaches that provide adequate levels of practical guidance that users can apply in nuclear power plant applications. There is a need to unify different approaches for addressing software verification and validation and to develop practical and cost-effective guidelines for user and regulatory acceptance. (author). 8 refs

  7. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  8. Dosimetry for synchrotron stereotactic radiotherapy: Monte Carlo simulations and radiosensitive gels; Dosimetrie pour la radiotherapie stereotaxique en rayonnement synchrotron: calculs Monte-Carlo et gels radiosensibles

    Energy Technology Data Exchange (ETDEWEB)

    Boudou, C

    2006-09-15

    High-grade gliomas are extremely aggressive brain tumours. Specific techniques were proposed that combine the presence of high-atomic-number elements within the tumour with irradiation by a low-energy x-ray (below 100 keV) beam from a synchrotron source. With a view to clinical trials, the use of treatment planning systems has to be foreseen, as well as tailored dosimetry protocols. The objectives of this thesis work were (1) the development of a dose calculation tool based on a Monte Carlo particle transport code and (2) the implementation of an experimental method for three-dimensional verification of the delivered dose. The dosimetric tool is an interface between tomography images of the patient or sample and the MCNPX general-purpose code. In addition, dose distributions were measured with a radiosensitive polymer gel, giving acceptable agreement with the calculations.

  9. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.

  10. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.

  11. Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

    Science.gov (United States)

    Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik

    2018-05-01

    Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
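
    A compact sketch of the particle Metropolis-Hastings idea on a toy linear-Gaussian state-space model x_t = a*x_{t-1} + v_t, y_t = x_t + e_t; the model, noise variances, particle count and random-walk proposal are all invented for illustration, and a flat prior on a is assumed:

        import numpy as np

        rng = np.random.default_rng(0)

        def pf_loglik(a, y, q=0.1, r=0.5, n_part=200):
            """Bootstrap particle filter estimate of log p(y | a)."""
            x = rng.normal(0.0, 1.0, n_part)
            ll = 0.0
            for yt in y:
                x = a * x + rng.normal(0.0, np.sqrt(q), n_part)        # propagate
                w = np.exp(-0.5 * (yt - x) ** 2 / r) + 1e-300          # weights
                ll += np.log(w.mean() / np.sqrt(2 * np.pi * r))
                x = rng.choice(x, n_part, p=w / w.sum())               # resample
            return ll

        def particle_mh(y, n_iter=2000, step=0.05):
            """Metropolis-Hastings over a, guided by the PF likelihood estimate."""
            a, ll = 0.5, pf_loglik(0.5, y)
            chain = []
            for _ in range(n_iter):
                a_prop = a + step * rng.normal()
                ll_prop = pf_loglik(a_prop, y)
                if np.log(rng.uniform()) < ll_prop - ll:  # flat prior assumed
                    a, ll = a_prop, ll_prop
                chain.append(a)
            return np.array(chain)

    The key point the tutorial makes holds even in this sketch: the acceptance ratio uses a noisy, particle-filter estimate of the likelihood, yet the chain still targets the exact posterior.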

  12. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

    This paper presents work performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray-tracing tool for the MOC, verification of the ray-tracing indexing scheme developed to represent the MCNP geometry in the MOC, and verification of the prototype 2-D MOC flux solver. (author)

  13. Development and verification of the neutron diffusion solver for the GeN-Foam multi-physics platform

    International Nuclear Information System (INIS)

    Fiorina, Carlo; Kerkar, Nordine; Mikityuk, Konstantin; Rubiolo, Pablo; Pautz, Andreas

    2016-01-01

    Highlights: • Development and verification of a neutron diffusion solver based on OpenFOAM. • Integration in the GeN-Foam multi-physics platform. • Implementation and verification of acceleration techniques. • Implementation of isotropic discontinuity factors. • Automatic adjustment of discontinuity factors. - Abstract: The Laboratory for Reactor Physics and Systems Behaviour at PSI and EPFL has in recent years been developing a new code system for reactor analysis based on OpenFOAM®. The objective is to supplement available legacy codes with a modern tool featuring state-of-the-art characteristics in terms of scalability, programming approach and flexibility. As part of this project, a new solver has been developed for the eigenvalue and transient solution of multi-group diffusion equations. Several features distinguish the developed solver from other available codes, in particular: object-oriented programming to ease code modification and maintenance; modern parallel computing capabilities; use of general unstructured meshes; possibility of mesh deformation; cell-wise parametrization of cross-sections; and arbitrary energy group structure. In addition, the solver is integrated into the GeN-Foam multi-physics solver. The general features of the solver and its integration with GeN-Foam have already been presented in previous publications. The present paper describes the diffusion solver in more detail and provides an overview of new features recently implemented, including the use of acceleration techniques and discontinuity factors. In addition, a code verification is performed through a comparison with Monte Carlo results for both a thermal and a fast reactor system.
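
    As a toy illustration of the eigenvalue solution step (not the GeN-Foam implementation, which is multi-group, finite-volume and OpenFOAM-based), a one-group, 1-D finite-difference k-eigenvalue solve by power iteration with made-up cross-sections:

        import numpy as np

        def keff_power_iteration(D=1.0, sig_a=0.06, nu_sig_f=0.07,
                                 length=100.0, n=200, tol=1e-8):
            """k-eigenvalue of -D phi'' + sig_a phi = (1/k) nu_sig_f phi
            on a slab with zero-flux boundaries, by power iteration."""
            h = length / n
            main = np.full(n - 1, 2.0 * D / h**2 + sig_a)
            off = np.full(n - 2, -D / h**2)
            A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
            phi, k = np.ones(n - 1), 1.0
            for _ in range(500):
                phi_new = np.linalg.solve(A, nu_sig_f * phi / k)
                k_new = k * (nu_sig_f * phi_new).sum() / (nu_sig_f * phi).sum()
                if abs(k_new - k) < tol:
                    return k_new, phi_new
                phi, k = phi_new, k_new
            return k, phi

        # Analytic check: k = nu_sig_f / (sig_a + D * (pi / length)**2) ~ 1.148
        print(keff_power_iteration()[0])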

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    Science.gov (United States)

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  15. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
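
    The paper's specific analysis parameter is not reproduced in this record; as a generic stand-in, the sketch below computes the standard global gamma map for two co-registered dose planes at a given dose-difference/distance-to-agreement criterion (array shapes and pixel spacing are assumptions):

        import numpy as np

        def gamma_map(ref, meas, pixel_mm, dd=0.03, dta_mm=3.0):
            """Global gamma of a measured dose plane against a reference,
            both on the same grid. Brute-force search over the whole plane."""
            dmax = ref.max()
            ny, nx = ref.shape
            yy, xx = np.mgrid[0:ny, 0:nx]
            gamma = np.empty_like(ref, dtype=float)
            for j in range(ny):
                for i in range(nx):
                    dist2 = ((yy - j) ** 2 + (xx - i) ** 2) * pixel_mm**2
                    dose2 = (meas - ref[j, i]) ** 2
                    gamma[j, i] = np.sqrt(
                        (dist2 / dta_mm**2 + dose2 / (dd * dmax) ** 2).min())
            return gamma

        # Single quantitative figure, e.g. pass rate at 3%/3 mm:
        # pass_rate = (gamma_map(ref, meas, pixel_mm=1.0) <= 1.0).mean()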

  16. Weak lensing magnification in the Dark Energy Survey Science Verification data

    Science.gov (United States)

    Garcia-Fernandez, M.; Sanchez, E.; Sevilla-Noarbe, I.; Suchyta, E.; Huff, E. M.; Gaztanaga, E.; Aleksić, J.; Ponce, R.; Castander, F. J.; Hoyle, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Benoit-Lévy, A.; Bernstein, G. M.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Eifler, T. F.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gschwend, J.; Gutierrez, G.; James, D. J.; Jarvis, M.; Kirk, D.; Krause, E.; Kuehn, K.; Kuropatkin, N.; Lahav, O.; Lima, M.; MacCrann, N.; Maia, M. A. G.; March, M.; Marshall, J. L.; Melchior, P.; Miquel, R.; Mohr, J. J.; Plazas, A. A.; Romer, A. K.; Roodman, A.; Rykoff, E. S.; Scarpine, V.; Schubnell, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Tarle, G.; Thomas, D.; Walker, A. R.; Wester, W.; DES Collaboration

    2018-05-01

    In this paper, the effect of weak lensing magnification on galaxy number counts is studied by cross-correlating the positions of two galaxy samples, separated by redshift, using the Dark Energy Survey Science Verification data set. This analysis is carried out for galaxies that are selected only by their photometric redshifts. An extensive analysis of the systematic effects is performed using new methods based on simulations, including a Monte Carlo sampling of the selection function of the survey.

  17. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    Science.gov (United States)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for the Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. Objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.

  18. Improving system modeling accuracy with Monte Carlo codes

    International Nuclear Information System (INIS)

    Johnson, A.S.

    1996-01-01

    The use of computer codes based on Monte Carlo methods to perform criticality calculations has become commonplace. Although results frequently published in the literature report calculated k-eff values to four decimal places, people who use the codes in their everyday work say that they only believe the first two decimal places of any result. The lack of confidence in the computed k-eff values may be due to the tendency of the reported standard deviation to underestimate errors associated with the Monte Carlo process. The standard deviation as reported by the codes is the standard deviation of the mean of the k-eff values for individual generations in the computer simulation, not the standard deviation of the computed k-eff value compared with the physical system. A more subtle problem with the standard deviation of the mean as reported by the codes is that all the k-eff values from the separate generations are not statistically independent, since the k-eff of a given generation is a function of the k-eff of the previous generation, which is ultimately based on the starting source. To produce a standard deviation that is more representative of the physical system, statistically independent values of k-eff are needed.
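
    One common remedy, consistent with the point made above, is to batch successive generations before forming the standard deviation of the mean; a small self-contained demonstration on a synthetic correlated sequence (the AR(1) surrogate and its parameters are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(1)

        def batch_means_sd(x, batch=50):
            """SD of the mean from non-overlapping batch averages; batching
            damps the generation-to-generation correlation."""
            m = len(x) // batch
            b = x[:m * batch].reshape(m, batch).mean(axis=1)
            return b.std(ddof=1) / np.sqrt(m)

        # Surrogate for correlated generation-wise k-eff estimates: AR(1), rho = 0.6.
        rho, n = 0.6, 5000
        k = np.empty(n)
        k[0] = 1.0
        for i in range(1, n):
            k[i] = 1.0 + rho * (k[i - 1] - 1.0) + rng.normal(0.0, 1e-3)

        naive = k.std(ddof=1) / np.sqrt(n)   # assumes independent generations
        print(naive, batch_means_sd(k))      # batched SD is roughly 2x larger here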

  19. Advanced control and instrumentation systems in nuclear power plants. Design, verification and validation

    International Nuclear Information System (INIS)

    Haapanen, P.

    1995-01-01

    The Technical Committee Meeting on design, verification and validation of advanced control and instrumentation systems in nuclear power plants was held in Espoo, Finland on 20-23 June 1994. The meeting was organized by the International Atomic Energy Agency's (IAEA) International Working Groups (IWGs) on Nuclear Power Plant Control and Instrumentation (NPPCI) and on Advanced Technologies for Water Cooled Reactors (ATWR). VTT Automation, together with Imatran Voima Oy and Teollisuuden Voima Oy, was responsible for the practical arrangements of the meeting. In total, 96 participants from 21 countries and the Agency took part in the meeting, and 34 full papers and 8 posters were presented. The following topics were covered in the papers: (1) experience with advanced and digital systems, (2) safety and reliability analysis, (3) advanced digital systems under development and implementation, (4) verification and validation methods and practices, (5) future development trends. (orig.)

  20. VerifEYE: a real-time meat inspection system for the beef processing industry

    Science.gov (United States)

    Kocak, Donna M.; Caimi, Frank M.; Flick, Rick L.; Elharti, Abdelmoula

    2003-02-01

    Described is a real-time meat inspection system developed for the beef processing industry by eMerge Interactive. Designed to detect and localize trace amounts of contamination on cattle carcasses in the packing process, the system affords the beef industry an accurate, high-speed, passive optical method of inspection. Using a method patented by the United States Department of Agriculture and Iowa State University, the system takes advantage of fluorescing chlorophyll found in the animal's diet, and therefore the digestive tract, to allow detection and imaging of contaminated areas that may harbor potentially dangerous microbial pathogens. Featuring real-time image processing and documentation of performance, the system can be easily integrated into a processing facility's Hazard Analysis and Critical Control Point quality assurance program. This paper describes the VerifEYE carcass inspection and removal verification system. Results indicating the feasibility of the method, as well as field data collected using a prototype system during four university trials conducted in 2001, are presented. Two successful demonstrations using the prototype system were held at a major U.S. meat processing facility in early 2002.

  1. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  2. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    PARSONS, J.E.

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  3. Monte Carlo study of the multiquark systems

    International Nuclear Information System (INIS)

    Kerbikov, B.O.; Polikarpov, M.I.; Zamolodchikov, A.B.

    1986-01-01

    Random walks have been used to calculate the energies of the ground states in systems of N=3, 6, 9, 12 quarks. Multiquark states with N>3 are unstable with respect to spontaneous dissociation into color-singlet hadrons. A modified Green's function Monte Carlo algorithm, which proved to be simpler and more accurate than conventional few-body methods, has been employed. In contrast to other techniques, the same equations are used for any number of particles, while the computer time increases only linearly with the number of particles.

  4. Verification of Opacity and Diagnosability for Pushdown Systems

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    In control theory of discrete event systems (DESs), one of the challenging topics is the extension of the theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems). First, we discuss opacity of pushdown systems and prove that opacity of pushdown systems is in general undecidable. In addition, a decidable class is clarified. Next, for diagnosability, we prove that under a certain assumption, which is different from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language). The obtained result is useful for developing control theory of infinite-state DESs.

  5. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  6. Study on critical effect in lattice homogenization via Monte Carlo method

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    In contrast to traditional deterministic lattice codes, generating the homogenized multigroup constants via the Monte Carlo method overcomes the difficulties in geometry and treats energy in continuum, thus providing more accurate parameters. An infinite lattice of identical symmetric motifs is usually assumed when performing the homogenization. However, the finite size of a reactor is a reality and should influence the lattice calculation. In the practice of homogenization with the Monte Carlo method, B_N theory is applied to take the leakage effect into account. The fundamental mode with buckling B is used as a measure of the finite size. The critical spectrum obtained from the solution of the 0-dimensional fine-group B_1 equations is used to correct the weighted spectrum for homogenization. A PWR prototype core is examined to verify that the presented method indeed generates few-group constants effectively. In addition, a zero-power physical experiment verification is performed. The results show that B_N theory is adequate for leakage correction in multigroup constants generation via the Monte Carlo method. (authors)

  7. Expert system verification and validation survey. Delivery 3: Recommendations

    Science.gov (United States)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues is being addressed and to what extent they have impacted the development of ESs.

  8. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  9. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminate auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
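
    For orientation only, the textbook single-stage phase-only correlation that the paper's dual, nonlinearly encoded scheme builds upon (the threshold rule in the comment is a hypothetical stand-in):

        import numpy as np

        def phase_only_correlation(f, g):
            """Phase-only correlation of two 2-D arrays: the cross spectrum
            is normalized to unit magnitude before the inverse FFT, so the
            output peaks sharply for matching inputs."""
            cross = np.fft.fft2(f) * np.conj(np.fft.fft2(g))
            cross /= np.maximum(np.abs(cross), 1e-12)
            return np.real(np.fft.ifft2(cross))

        # A verification rule then thresholds the correlation peak:
        # accept = phase_only_correlation(lock, key).max() >= threshold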

  10. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Science.gov (United States)

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system's operation in real time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
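
    A schematic of the cumulative signal checking idea (frame contents, the 10% tolerance and the error-declaration rule are illustrative; the actual system also synchronizes frames to control points and runs MLC aperture and gamma comparisons):

        import numpy as np

        def cumulative_signal_check(pred_frames, meas_frames, tol=0.10):
            """Frame-by-frame comparison of cumulative EPID signal against
            the prediction; returns the frame index at which a gross error
            is declared, or None."""
            cum_pred = cum_meas = 0.0
            for n, (p, m) in enumerate(zip(pred_frames, meas_frames), start=1):
                cum_pred += float(np.sum(p))
                cum_meas += float(np.sum(m))
                if cum_pred > 0 and abs(cum_meas - cum_pred) / cum_pred > tol:
                    return n
            return None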

  11. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    International Nuclear Information System (INIS)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J; Watkins, W

    2016-01-01

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them with 1024×768 resolution EPID frames acquired at ∼8.5 Hz from a Varian Truebeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real-time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for the scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software, with

  12. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States); Watkins, W

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them with 1024×768 resolution EPID frames acquired at ∼8.5 Hz from a Varian Truebeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real-time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for the scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software, with

  13. A study of Monte Carlo methods for weak approximations of stochastic particle systems in the mean-field?

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-01-08

    I discuss using single-level and multilevel Monte Carlo methods to compute quantities of interest of a stochastic particle system in the mean-field. In this context, the stochastic particles follow a coupled system of Ito stochastic differential equations (SDEs). Moreover, this stochastic particle system converges to a stochastic mean-field limit as the number of particles tends to infinity. I start by recalling the results of applying different versions of Multilevel Monte Carlo (MLMC) to particle systems, both with respect to time steps and the number of particles, and using a partitioning estimator. Next, I expand on these results by proposing the use of our recent Multi-index Monte Carlo method to obtain improved convergence rates.

  14. A study of Monte Carlo methods for weak approximations of stochastic particle systems in the mean-field?

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-01-01

    I discuss using single-level and multilevel Monte Carlo methods to compute quantities of interest of a stochastic particle system in the mean-field. In this context, the stochastic particles follow a coupled system of Ito stochastic differential equations (SDEs). Moreover, this stochastic particle system converges to a stochastic mean-field limit as the number of particles tends to infinity. I start by recalling the results of applying different versions of Multilevel Monte Carlo (MLMC) to particle systems, both with respect to time steps and the number of particles, and using a partitioning estimator. Next, I expand on these results by proposing the use of our recent Multi-index Monte Carlo method to obtain improved convergence rates.
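
    A bare-bones single-quantity MLMC sketch with coupled fine/coarse Euler-Maruyama paths; the SDE (geometric Brownian motion), payoff and per-level sample counts are arbitrary stand-ins for the particle-system setting of the talk:

        import numpy as np

        rng = np.random.default_rng(2)

        def mlmc_level(level, n_samples, T=1.0, x0=1.0, mu=0.05, sig=0.2):
            """E[P_l - P_{l-1}] (or E[P_0] on level 0) for payoff P = X_T of
            dX = mu X dt + sig X dW, with coupled Euler-Maruyama paths."""
            n_fine = 2 ** level
            h = T / n_fine
            dW = rng.normal(0.0, np.sqrt(h), (n_samples, n_fine))
            xf = np.full(n_samples, x0)
            for i in range(n_fine):
                xf += mu * xf * h + sig * xf * dW[:, i]
            if level == 0:
                return xf.mean()
            xc = np.full(n_samples, x0)
            for i in range(n_fine // 2):  # coarse step reuses the summed fine dW
                xc += mu * xc * 2 * h + sig * xc * (dW[:, 2 * i] + dW[:, 2 * i + 1])
            return (xf - xc).mean()

        # Telescoping sum over levels, with more samples on the cheap levels:
        est = sum(mlmc_level(l, n) for l, n in enumerate([40000, 10000, 2500, 625]))
        print(est)  # converges toward x0 * exp(mu * T) = 1.0513...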

  15. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  16. Dosimetry for synchrotron stereotactic radiotherapy: Monte Carlo simulations and radiosensitive gels

    International Nuclear Information System (INIS)

    Boudou, C.

    2006-09-01

    High-grade gliomas are extremely aggressive brain tumours. Specific techniques were proposed that combine the presence of high-atomic-number elements within the tumour with irradiation by a low-energy x-ray (below 100 keV) beam from a synchrotron source. With a view to clinical trials, the use of treatment planning systems has to be foreseen, as well as tailored dosimetry protocols. The objectives of this thesis work were (1) the development of a dose calculation tool based on a Monte Carlo particle transport code and (2) the implementation of an experimental method for three-dimensional verification of the delivered dose. The dosimetric tool is an interface between tomography images of the patient or sample and the MCNPX general-purpose code. In addition, dose distributions were measured with a radiosensitive polymer gel, giving acceptable agreement with the calculations.

  17. The performance of a hybrid analytical-Monte Carlo system response matrix in pinhole SPECT reconstruction

    International Nuclear Information System (INIS)

    El Bitar, Z; Pino, F; Candela, C; Ros, D; Pavía, J; Rannou, F R; Ruibal, A; Aguiar, P

    2014-01-01

    It is well known that in pinhole SPECT (single-photon-emission computed tomography), iterative reconstruction methods including accurate estimations of the system response matrix can lead to submillimeter spatial resolution. There are two different methods for obtaining the system response matrix: those that model the system analytically using an approach including an experimental characterization of the detector response, and those that make use of Monte Carlo simulations. Methods based on analytical approaches are faster and handle the statistical noise better than those based on Monte Carlo simulations, but they require tedious experimental measurements of the detector response. One suggested approach for avoiding an experimental characterization, circumventing the problem of statistical noise introduced by Monte Carlo simulations, is to perform an analytical computation of the system response matrix combined with a Monte Carlo characterization of the detector response. Our findings showed that this approach can achieve high spatial resolution similar to that obtained when the system response matrix computation includes an experimental characterization. Furthermore, we have shown that using simulated detector responses has the advantage of yielding a precise estimate of the shift between the point of entry of the photon beam into the detector and the point of interaction inside the detector. Considering this, it was possible to slightly improve the spatial resolution at the edge of the field of view. (paper)
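
    For context, reconstructions of the kind referred to above typically iterate an MLEM-type update built on the system response matrix; a dense-matrix toy sketch (real pinhole SPECT matrices are far larger and sparse):

        import numpy as np

        def mlem(A, y, n_iter=50):
            """MLEM reconstruction: x maximizes the Poisson likelihood of
            projections y given the system response matrix A (y ~ A @ x)."""
            x = np.ones(A.shape[1])
            sens = A.sum(axis=0)  # sensitivity image, A^T 1
            for _ in range(n_iter):
                proj = A @ x
                ratio = np.where(proj > 0, y / proj, 0.0)
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x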

  18. A GIS support system for declaration and verification

    International Nuclear Information System (INIS)

    Poucet, A.; Contini, S.; Bellezza, F.

    2001-01-01

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declarations and verification activities. Information dealt with is that requested by both the INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. Main characteristics of SIT are: Capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; Easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; Selected access to open source information via the Internet; Capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; Capability of performing aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; Capability to easily add and run external models for e.g. material data accounting, completeness check, air dispersion models, material flow graph generation and to describe results in graphical form; Capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, in order to allow open source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of

  19. A GIS support system for declaration and verification

    Energy Technology Data Exchange (ETDEWEB)

    Poucet, A; Contini, S; Bellezza, F [European Commission, Joint Research Centre, Institute for Systems Informatics and Safety (ISIS), Ispra (Italy)

    2001-07-01

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle is a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declaration and verification activities. The information dealt with is that requested by both the INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform, and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. The main characteristics of SIT are: capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; selected access to open source information via the Internet; capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; capability to perform aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; capability to easily add and run external models, e.g. for material data accounting, completeness checking, air dispersion modelling and material flow graph generation, and to present the results in graphical form; and capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, in order to allow open source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of its use

  20. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique records the failure of a component as an incident and determines its occurrence probability by generating incident samples from random numbers. The technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the two methods, with simulation times of up to 100,000 years, were then compared. The comparative evaluation showed that less computing time was required with the dagger-sampling technique owing to its higher convergence speed: for a simulation time of 1000 years, the dagger-sampling method required 0.05 seconds to complete an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
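
    To make the contrast between the two sampling schemes concrete, here is a minimal sketch of textbook dagger sampling for a component with per-trial failure probability p: one uniform random number serves a whole group of k = floor(1/p) trials and produces at most one failure, so far fewer random numbers are drawn than in direct sequential sampling. The parameter values are illustrative, and this is not the authors' code.

        import numpy as np

        rng = np.random.default_rng(42)

        def direct_failures(p, n_trials):
            """Direct sequential sampling: one uniform number per trial."""
            return rng.random(n_trials) < p

        def dagger_failures(p, n_trials):
            """Dagger sampling: one uniform number per group of k = floor(1/p)
            trials; each group contains at most one failure, but the marginal
            failure probability per trial is still exactly p."""
            k = int(np.floor(1.0 / p))
            n_groups = -(-n_trials // k)               # ceiling division
            u = rng.random(n_groups)
            fails = np.zeros(n_groups * k, dtype=bool)
            hit = u < k * p                            # this group contains a failure
            pos = np.minimum((u / p).astype(int), k - 1)
            fails[np.arange(n_groups) * k + pos] = hit
            return fails[:n_trials]

        p, n = 0.01, 1_000_000
        print(direct_failures(p, n).mean(), dagger_failures(p, n).mean())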

  1. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  2. Reactor physics verification of the MCNP6 unstructured mesh capability

    International Nuclear Information System (INIS)

    Burke, T. P.; Kiedrowski, B. C.; Martz, R. L.; Martin, W. R.

    2013-01-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and that the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  3. Reactor physics verification of the MCNP6 unstructured mesh capability

    Energy Technology Data Exchange (ETDEWEB)

    Burke, T. P. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States); Kiedrowski, B. C.; Martz, R. L. [X-Computational Physics Division, Monte Carlo Codes Group, Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Martin, W. R. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States)

    2013-07-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and that the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  4. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  5. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  6. Monte Carlo methods for medical physics a practical introduction

    CERN Document Server

    Schuemann, Jan; Paganetti, Harald

    2018-01-01

    The Monte Carlo (MC) method, established as the gold standard to predict results of physical processes, is now fast becoming a routine clinical tool for applications that range from quality control to treatment verification. This book provides a basic understanding of the fundamental principles and limitations of the MC method in the interpretation and validation of results for various scenarios. It shows how user-friendly and speed optimized MC codes can achieve online image processing or dose calculations in a clinical setting. It introduces this essential method with emphasis on applications in hardware design and testing, radiological imaging, radiation therapy, and radiobiology.

  7. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release. Using SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  8. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release. Using SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  9. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. This practice is not intended to be a complete purchase specification for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems may result in non-conformance with the standard.

  10. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while functional verification is not explicitly considered at the earlier stages, where the soundest design decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing simulation and debugging overheads.

  11. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies, the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  12. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using the 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
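
    The frame-by-frame gamma evaluation reported above can be illustrated with a minimal 1-D implementation of the standard global gamma index (dose difference as a fraction of the maximum reference dose, distance-to-agreement in mm). The profiles and criteria below are illustrative stand-ins, not the study's data.

        import numpy as np

        def gamma_1d(dose_ref, dose_eval, dx, dd=0.03, dta=3.0):
            """Global 1-D gamma index: dd is the dose-difference criterion as a
            fraction of the maximum reference dose, dta the distance-to-agreement
            in mm, and dx the grid spacing in mm."""
            x = np.arange(len(dose_ref)) * dx
            d_max = dose_ref.max()
            gamma = np.empty_like(dose_ref)
            for i, (xi, di) in enumerate(zip(x, dose_ref)):
                dist_term = ((x - xi) / dta) ** 2
                dose_term = ((dose_eval - di) / (dd * d_max)) ** 2
                gamma[i] = np.sqrt((dist_term + dose_term).min())
            return gamma

        # Toy profiles: a Gaussian reference and a 2% hotter evaluated profile.
        ref = np.exp(-((np.arange(100) - 50.0) / 20.0) ** 2)
        meas = 1.02 * ref
        g = gamma_1d(ref, meas, dx=1.0)              # 3%/3 mm criteria
        print(f"gamma pass rate: {(g <= 1.0).mean():.1%}")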

  13. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    International Nuclear Information System (INIS)

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-01-01

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using the 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy

  14. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  15. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Samuel, D [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Testa, M; Park, Y [Massachusetts General Hospital, Boston, MA (United States); Schneider, R; Moteabbed, M [General Hospital, Boston, MA (United States); Janssens, G; Prieels, D [Ion Beam Applications, Louvain-la-neuve, Brabant Wallon (Belgium); Orban de Xivry, J [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Lu, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Bentefour, E

    2014-06-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols, to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals from very low proton beam currents. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLD and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm is achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.

  16. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    International Nuclear Information System (INIS)

    Samuel, D; Testa, M; Park, Y; Schneider, R; Moteabbed, M; Janssens, G; Prieels, D; Orban de Xivry, J; Lu, H; Bentefour, E

    2014-01-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols, to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals from very low proton beam currents. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLD and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm is achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients

  17. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms that guarantee the security/privacy of the fingerprint data transmitted between the smart card and the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  18. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions in each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms that guarantee the security/privacy of the fingerprint data transmitted between the smart card and the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  19. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has a need for verification of 'as-built' spent-fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that were developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  20. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  1. 75 FR 38765 - Domestic Origin Verification System Questionnaire and Regulations Governing Inspection and...

    Science.gov (United States)

    2010-07-06

    ..., facility assessment services, certifications of quantity and quality, import product inspections, and... control number. These include export certification, inspection of section 8e import products, and...] Domestic Origin Verification System Questionnaire and Regulations Governing Inspection and Certification of...

  2. Efficiencies of dynamic Monte Carlo algorithms for off-lattice particle systems with a single impurity

    KAUST Repository

    Novotny, M.A.

    2010-02-01

    The efficiency of dynamic Monte Carlo algorithms for off-lattice systems composed of particles is studied for the case of a single impurity particle. The theoretical efficiencies of the rejection-free method and of the Monte Carlo with Absorbing Markov Chains method are given. Simulation results are presented to confirm the theoretical efficiencies. © 2010.
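
    As a point of reference for the rejection-free method discussed above, the sketch below shows the generic rejection-free (BKL/Gillespie-style) step: an event is chosen with probability proportional to its rate, and the clock advances by an exponentially distributed waiting time, so no moves are ever rejected. The rates are illustrative (e.g., fast host moves alongside one slow impurity move); this is not the paper's off-lattice algorithm.

        import numpy as np

        rng = np.random.default_rng(0)

        def rejection_free_step(rates):
            """Pick an event with probability rate_i / sum(rates) and advance the
            clock by an exponentially distributed waiting time with mean 1/sum."""
            total = rates.sum()
            event = rng.choice(len(rates), p=rates / total)
            dt = rng.exponential(1.0 / total)
            return event, dt

        # Two fast "host" event classes and one slow "impurity" event class.
        rates = np.array([1.0, 0.5, 1e-4])
        t, counts = 0.0, np.zeros(len(rates), dtype=int)
        for _ in range(100_000):
            event, dt = rejection_free_step(rates)
            counts[event] += 1
            t += dt
        print(counts, t)   # event frequencies follow the rates; time is exact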

  3. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  4. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized light water cooled, medium power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. In the SMART design verification phase, various separate effects tests and comprehensive integral effect tests have been conducted. The high temperature/high pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents) has been constructed by KAERI to simulate SMART-P (the one-fifth scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and the natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the steam generator (SG) primary side, the secondary system, and the PRHRS. The heat transfer through the PRHRS heat exchanger and ECT is sufficient to enable natural circulation of the coolant

  5. Analysis of an indirect neutron signature for enhanced UF_6 cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA-NT). HEVA-NT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA-NT in terms of the individual contributions to HEVA-NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA-NT signature to manipulation by the nearby placement of neutron-conversion materials.

  6. Monte Carlo Techniques for the Comprehensive Modeling of Isotopic Inventories in Future Nuclear Systems and Fuel Cycles. Final Report

    International Nuclear Information System (INIS)

    Paul P.H. Wilson

    2005-01-01

    The development of Monte Carlo techniques for isotopic inventory analysis has been explored in order to facilitate the modeling of systems with flowing streams of material through varying neutron irradiation environments. This represents a novel application of Monte Carlo methods to a field that has traditionally relied on deterministic solutions to systems of first-order differential equations. The Monte Carlo techniques were based largely on the known modeling techniques of Monte Carlo radiation transport, but with important differences, particularly in the area of variance reduction and efficiency measurement. The software that was developed to implement and test these methods now provides a basis for validating approximate modeling techniques that are available to deterministic methodologies. The Monte Carlo methods have been shown to be effective in reproducing the solutions of simple problems that are possible using both stochastic and deterministic methods. The Monte Carlo methods are also effective for tracking flows of materials through complex systems including the ability to model removal of individual elements or isotopes in the system. Computational performance is best for flows that have characteristic times that are large fractions of the system lifetime. As the characteristic times become short, leading to thousands or millions of passes through the system, the computational performance drops significantly. Further research is underway to determine modeling techniques to improve performance within this range of problems. This report describes the technical development of Monte Carlo techniques for isotopic inventory analysis. The primary motivation for this solution methodology is the ability to model systems of flowing material being exposed to varying and stochastically varying radiation environments. The methodology was developed in three stages: analog methods which model each atom with true reaction probabilities (Section 2), non-analog methods
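
    The analog stage can be pictured with a toy atom-by-atom depletion of a two-step chain A -> B -> C: every atom is followed individually and transmutes in a time step with its true reaction probability 1 - exp(-sigma*phi*dt). The rates, step size, and population below are illustrative assumptions, not the report's models.

        from collections import Counter
        import numpy as np

        rng = np.random.default_rng(1)

        sigma_phi = {"A": 1e-2, "B": 5e-3}   # reaction rates in 1/s (assumed)
        chain = {"A": "B", "B": "C"}         # transmutation chain A -> B -> C
        dt, n_steps = 1.0, 400

        atoms = ["A"] * 10_000               # analog: one entry per atom
        for _ in range(n_steps):
            atoms = [
                chain[s]
                if s in sigma_phi and rng.random() < 1.0 - np.exp(-sigma_phi[s] * dt)
                else s
                for s in atoms
            ]

        print(Counter(atoms))                # stochastic inventory of A, B, C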

  7. OGRE, Monte-Carlo System for Gamma Transport Problems

    International Nuclear Information System (INIS)

    1984-01-01

    1 - Nature of physical problem solved: The OGRE programme system was designed to calculate, by Monte Carlo methods, any quantity related to gamma-ray transport. The system is represented by two examples - OGRE-P1 and OGRE-G. The OGRE-P1 programme is a simple prototype which calculates dose rate on one side of a slab due to a plane source on the other side. The OGRE-G programme, a prototype of a programme utilizing a general-geometry routine, calculates dose rate at arbitrary points. A very general source description in OGRE-G may be employed by reading a tape prepared by the user. 2 - Method of solution: Case histories of gamma rays in the prescribed geometry are generated and analyzed to produce averages of any desired quantity which, in the case of the prototypes, are gamma-ray dose rates. The system is designed to achieve generality by ease of modification. No importance sampling is built into the prototypes; a very general geometry subroutine permits the treatment of complicated geometries. This is essentially the same routine used in the O5R neutron transport system. Boundaries may be either planes or quadratic surfaces, arbitrarily oriented and intersecting in arbitrary fashion. Cross section data are prepared by the auxiliary master cross section programme XSECT, which may be used to originate, update, or edit the master cross section tape. The master cross section tape is utilized in the OGRE programmes to produce detailed tables of macroscopic cross sections which are used during the Monte Carlo calculations. 3 - Restrictions on the complexity of the problem: Maximum cross-section array information may be estimated by a given formula for a specific problem. The number of regions must be less than or equal to 50

  8. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm, and slice thickness 1 mm. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB code. Results: Gamma index analysis of the film measurements in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions on the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.
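
    The calibration step described above (films exposed from 0 to 12 Gy) amounts to fitting a dose-versus-response curve and inverting it over the scanned film. A minimal sketch, assuming net optical density (netOD) readings, the commonly used fitting form D = a*netOD + b*netOD**n, and the availability of scipy; all numbers are made up for illustration.

        import numpy as np
        from scipy.optimize import curve_fit

        cal_dose = np.array([0, 1, 2, 4, 6, 8, 10, 12], dtype=float)       # Gy
        cal_netod = np.array([0, .08, .15, .26, .35, .42, .48, .53])       # assumed readings

        def model(netod, a, b, n):
            return a * netod + b * netod ** n

        popt, _ = curve_fit(model, cal_netod, cal_dose, p0=(10.0, 30.0, 2.0),
                            bounds=([0, 0, 1], [np.inf, np.inf, 5]))

        film_netod = np.array([[0.20, 0.30], [0.25, 0.40]])  # stand-in scanned film
        print(model(film_netod, *popt))                      # 2-D dose map in Gy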

  9. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    International Nuclear Information System (INIS)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor

    2013-01-01

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm, and slice thickness 1 mm. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB code. Results: Gamma index analysis of the film measurements in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions on the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film

  10. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy, which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the configuration of testing machine and specimen.

  11. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies - the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW) - share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States - India, Pakistan and Israel - from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  12. Linear models to perform treaty verification tasks for enhanced information security

    International Nuclear Information System (INIS)

    MacGahan, Christopher J.; Kupinski, Matthew A.; Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A.

    2017-01-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.
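
    A minimal numerical sketch of the Hotelling observer named above: the template w = S^-1 (mu1 - mu0) is estimated from training data of the two classes, the scalar test statistic is t = w.g, and performance is summarized by the area under the ROC curve. The simulated data below are Gaussian stand-ins, not the GEANT4 list-mode data of the study.

        import numpy as np

        rng = np.random.default_rng(7)

        n_bins, n_train = 16, 2000
        mu0 = np.zeros(n_bins)                      # class 0 mean detector data
        mu1 = np.linspace(0.0, 0.5, n_bins)         # class 1 mean detector data
        g0 = rng.normal(mu0, 1.0, size=(n_train, n_bins))
        g1 = rng.normal(mu1, 1.0, size=(n_train, n_bins))

        # Hotelling template: w = S^-1 (mean1 - mean0), with S the average
        # within-class covariance matrix.
        S = 0.5 * (np.cov(g0.T) + np.cov(g1.T))
        w = np.linalg.solve(S, g1.mean(axis=0) - g0.mean(axis=0))
        t0, t1 = g0 @ w, g1 @ w                     # scalar test statistics

        # AUC via the rank-sum identity: P(t1 > t0) over independent pairs.
        auc = (t1[:, None] > t0[None, :]).mean()
        print(f"AUC = {auc:.3f}")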

  13. Linear models to perform treaty verification tasks for enhanced information security

    Energy Technology Data Exchange (ETDEWEB)

    MacGahan, Christopher J., E-mail: cmacgahan@optics.arizona.edu [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Sandia National Laboratories, Livermore, CA 94551 (United States); Kupinski, Matthew A. [College of Optical Sciences, The University of Arizona, 1630 E. University Blvd, Tucson, AZ 85721 (United States); Brubaker, Erik M.; Hilton, Nathan R.; Marleau, Peter A. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2017-02-01

    Linear mathematical models were applied to binary-discrimination tasks relevant to arms control verification measurements in which a host party wishes to convince a monitoring party that an item is or is not treaty accountable. These models process data in list-mode format and can compensate for the presence of variability in the source, such as uncertain object orientation and location. The Hotelling observer applies an optimal set of weights to binned detector data, yielding a test statistic that is thresholded to make a decision. The channelized Hotelling observer applies a channelizing matrix to the vectorized data, resulting in a lower dimensional vector available to the monitor to make decisions. We demonstrate how incorporating additional terms in this channelizing-matrix optimization offers benefits for treaty verification. We present two methods to increase shared information and trust between the host and monitor. The first method penalizes individual channel performance in order to maximize the information available to the monitor while maintaining optimal performance. Second, we present a method that penalizes predefined sensitive information while maintaining the capability to discriminate between binary choices. Data used in this study was generated using Monte Carlo simulations for fission neutrons, accomplished with the GEANT4 toolkit. Custom models for plutonium inspection objects were measured in simulation by a radiation imaging system. Model performance was evaluated and presented using the area under the receiver operating characteristic curve.

  14. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or the appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  15. Linear filtering applied to Monte Carlo criticality calculations

    International Nuclear Information System (INIS)

    Morrison, G.W.; Pike, D.H.; Petrie, L.M.

    1975-01-01

    A significant improvement in the acceleration of the convergence of the eigenvalue computed by Monte Carlo techniques has been developed by applying linear filtering theory to Monte Carlo calculations for multiplying systems. A Kalman filter was applied to a KENO Monte Carlo calculation of an experimental critical system consisting of eight interacting units of fissile material. A comparison of the filter estimate and the Monte Carlo realization was made. The Kalman filter converged in five iterations to 0.9977. After 95 iterations, the average k-eff from the Monte Carlo calculation was 0.9981. This demonstrates that the Kalman filter has the potential of reducing the calculational effort of multiplying systems. Other examples and results are discussed
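
    The idea of filtering the generation-wise eigenvalue estimates can be sketched with a scalar Kalman filter: the state is the (assumed nearly constant) k-eff, each Monte Carlo generation supplies a noisy measurement, and the filter's estimate settles much faster than a plain running average. All variances below are illustrative assumptions, not those of the KENO calculation.

        import numpy as np

        rng = np.random.default_rng(3)

        k_true = 0.998          # "true" eigenvalue of the toy problem
        r = 0.01 ** 2           # measurement (per-generation) variance, assumed
        q = 1e-10               # tiny process noise: k-eff is nearly constant

        x, p = 1.0, 1.0         # initial estimate and its variance
        for generation in range(100):
            z = k_true + rng.normal(0.0, np.sqrt(r))   # noisy k-eff estimate
            p += q                                     # predict
            gain = p / (p + r)                         # Kalman gain
            x += gain * (z - x)                        # update estimate
            p *= 1.0 - gain                            # update variance
        print(f"filtered k-eff after 100 generations: {x:.4f}")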

  16. Groebner Bases Based Verification Solution for SystemVerilog Concurrent Assertions

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2014-01-01

    This work applies methods of polynomial ring algebra to perform SystemVerilog assertion verification over digital circuit systems. The method is based on Groebner bases theory and sequential property checking. We define a constrained subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using Groebner bases for concurrent SVA checking. Case studies show that computer algebra can provide canonical symbolic representations for both assertions and circuit designs and can act as a novel solver engine from the viewpoint of symbolic computation.
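
    A toy version of the polynomial-modeling idea, using sympy: gates of a small circuit are encoded as polynomials over GF(2) (with x**2 - x = 0 enforcing Boolean values), a Groebner basis of the circuit ideal is computed, and an assertion polynomial that reduces to zero is guaranteed to hold in every consistent assignment. The circuit and assertion are illustrative; the paper's actual modeling of SVA sequences is richer than this.

        from sympy import symbols, groebner

        x, y, z, w = symbols("x y z w")
        circuit = [
            z - x * y,                                # z = AND(x, y)
            w - (x + y + x * y),                      # w = OR(x, y) over GF(2)
            x**2 - x, y**2 - y, z**2 - z, w**2 - w,   # Boolean constraints
        ]
        G = groebner(circuit, x, y, z, w, order="lex", modulus=2)

        # Assertion "z implies w", i.e. z*(w - 1) = 0 in every consistent state.
        assertion = z * w - z
        _, remainder = G.reduce(assertion)
        print(remainder)          # 0 -> the assertion always holds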

  17. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  18. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  19. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  20. Monte Carlo calculations of neutron thermalization in a heterogeneous system

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, T

    1959-07-15

    The slowing down of neutrons in a heterogeneous system (a slab geometry) of uranium and heavy water has been investigated by Monte Carlo methods. Effects on the neutron spectrum due to the thermal motions of the scattering and absorbing atoms are taken into account. It has been assumed that the speed distribution of the moderator atoms is Maxwell-Boltzmann in character.
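
    As a minimal sketch of the stated Maxwell-Boltzmann assumption, moderator-atom velocities can be sampled component-wise from a normal distribution; the temperature and the use of the deuteron mass below are illustrative assumptions, not values from the report:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Maxwell-Boltzmann speeds follow from normally distributed velocity
    # components (assumed values: deuterium scatterers at 300 K).
    k_B = 1.380649e-23         # Boltzmann constant, J/K
    m = 2 * 1.66053906660e-27  # deuteron mass, kg (illustrative choice)
    T = 300.0                  # moderator temperature, K

    sigma = np.sqrt(k_B * T / m)              # per-component std. dev.
    v = sigma * rng.normal(size=(100_000, 3))
    speeds = np.linalg.norm(v, axis=1)

    print(f"sampled mean speed: {speeds.mean():.1f} m/s")
    print(f"theory sqrt(8kT/(pi m)): {np.sqrt(8*k_B*T/(np.pi*m)):.1f} m/s")
    ```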

  1. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in the vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  2. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314, Tank Farm Restoration and Safe Operations

    International Nuclear Information System (INIS)

    MCGREW, D.L.

    1999-01-01

    This Requirements Verification Report (RVR) for the Project W-314 "AN Farm to 200E Waste Transfer System" package provides documented verification of design compliance with all applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate.

  3. Verification on reliability of heat exchanger for primary cooling system

    International Nuclear Information System (INIS)

    Koike, Sumio; Gorai, Shigeru; Onoue, Ryuji; Ohtsuka, Kaoru

    2010-07-01

    Prior to the JMTR refurbishment, verification of the reliability of the heat exchangers for the primary cooling system was carried out to investigate the integrity of these continuously used components. No significant corrosion, decrease of tube thickness, or cracks were observed on the heat exchangers, and their integrity was confirmed. For long-term usage of the heat exchangers, maintenance based on periodical inspection and a long-term maintenance plan is scheduled. (author)

  4. Formal specification and verification of interactive systems with plasticity: Applications to nuclear-plant supervision

    International Nuclear Information System (INIS)

    Oliveira, Raquel Araujo de

    2015-01-01

    The advent of ubiquitous computing and the increasing variety of platforms and devices change user expectations in terms of user interfaces. Systems should be able to adapt themselves to their context of use, i.e., the platform (e.g. a PC or a tablet), the users who interact with the system (e.g. administrators or regular users), and the environment in which the system executes (e.g. a dark room or outdoors). The capacity of a UI to withstand variations in its context of use while preserving usability is called plasticity. Plasticity provides users with different versions of a UI. Although it enhances UI capabilities, plasticity adds complexity to the development of user interfaces: the consistency between multiple versions of a given UI should be ensured. Given the large number of possible versions of a UI, it is time-consuming and error-prone to check these requirements by hand. Some automation must be provided to verify plasticity. This complexity is further increased when it comes to UIs of safety-critical systems. Safety-critical systems are systems in which a failure has severe consequences. The complexity of such systems is reflected in the UIs, which are now expected not only to provide correct, intuitive, non-ambiguous and adaptable means for users to accomplish a goal, but also to cope with safety requirements aiming to make sure that systems are reasonably safe before they enter the market. Several techniques to ensure quality of systems in general exist, which can also be applied to safety-critical systems. Formal verification provides a rigorous way to perform verification, which is suitable for safety-critical systems. Our contribution is an approach to verify safety-critical interactive systems provided with plastic UIs using formal methods. Using powerful tool support, our approach permits the verification of sets of properties over a model of the system. Using model checking, our approach permits the verification of properties over the system formal

  5. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase 1) and ISMS implementation (Phase 2) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  6. Analysis of an indirect neutron signature for enhanced UF{sub 6} cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF{sub 6}) cylinders. The current method provides relatively low accuracy for the assay of {sup 235}U enrichment, especially for natural and depleted UF{sub 6}. Furthermore, the current method provides no capability to assay the absolute mass of {sup 235}U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from {sup 235}U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA{sub NT}). HEVA{sub NT} enables full-volume assay of UF{sub 6} cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF{sub 6}. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA{sub NT} in terms of the individual contributions to HEVA{sub NT} from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA{sub NT} signature to manipulation by the nearby placement of neutron-conversion materials.
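
    The penetration argument behind HEVA{sub NT} can be illustrated with a toy one-dimensional Monte Carlo escape estimate; the fill depth and mean free paths below are illustrative round numbers, not evaluated data:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def escape_fraction(depth_cm, mfp_cm, n=200_000):
        """Toy 1-D estimate: particles born uniformly within `depth_cm` of
        the surface escape if a sampled exponential path exceeds their depth."""
        birth_depth = rng.uniform(0.0, depth_cm, n)
        path = rng.exponential(mfp_cm, n)
        return np.mean(path > birth_depth)

    R = 15.0  # illustrative UF6 fill depth, cm
    # Illustrative mean free paths (not evaluated data): 186-keV gammas are
    # attenuated within a few cm of UF6; fast neutrons travel much farther.
    print("gamma   (mfp ~  1 cm):", escape_fraction(R, 1.0))
    print("neutron (mfp ~ 10 cm):", escape_fraction(R, 10.0))
    ```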

  7. Optimization of accelerator target and detector for portal imaging using Monte Carlo simulation and experiment

    International Nuclear Information System (INIS)

    Flampouri, S.; Evans, P.M.; Partridge, M.; Nahum, A.E.; Verhaegen, A.E.; Spezi, E.

    2002-01-01

    Megavoltage portal images suffer from poor quality compared to those produced with kilovoltage x-rays. Several authors have shown that the image quality can be improved by modifying the linear accelerator to generate more low-energy photons. This work addresses the problem of using Monte Carlo simulation and experiment to optimize the beam and detector combination to maximize image quality for a given patient thickness. A simple model of the whole imaging chain was developed for investigation of the effect of the target parameters on the quality of the image. The optimum targets (6 mm thick aluminium and 1.6 mm copper) were installed in an Elekta SL25 accelerator. The first beam will be referred to as Al6 and the second as Cu1.6. A tissue-equivalent contrast phantom was imaged with the 6 MV standard photon beam and with the experimental beams, using standard radiotherapy and mammography film/screen systems. The arrangement with a thin Al target and mammography system improved the contrast of 1.4 cm bone in 5 cm water to 19%, compared with 2% for the standard arrangement of a thick, high-Z target with a radiotherapy verification system. The linac/phantom/detector system was simulated with the BEAM/EGS4 Monte Carlo code. Contrast calculated from the predicted images was in good agreement with the experiment (to within 2.5%). The use of MC techniques to predict images accurately, taking into account the whole imaging system, is a powerful new method for portal imaging system design optimization. (author)
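
    As a back-of-the-envelope illustration of why a softer beam raises subject contrast, Beer-Lambert attenuation can be compared at two energies; the attenuation coefficients below are assumed round numbers, not the paper's measured data:

    ```python
    import numpy as np

    def subject_contrast(mu_water, mu_bone, t_water=5.0, t_bone=1.4):
        """Contrast between a path through water only and a path where
        t_bone cm of water is replaced by bone (narrow-beam Beer-Lambert)."""
        i_water = np.exp(-mu_water * t_water)
        i_bone = np.exp(-mu_water * (t_water - t_bone) - mu_bone * t_bone)
        return (i_water - i_bone) / (i_water + i_bone)

    # Assumed round-number attenuation coefficients (1/cm): at megavoltage
    # energies mu_bone/mu_water is close to the density ratio, while at low
    # energies photoelectric absorption widens the gap, raising contrast.
    print("high-energy beam:", subject_contrast(mu_water=0.05, mu_bone=0.09))
    print("low-energy beam: ", subject_contrast(mu_water=0.20, mu_bone=0.55))
    ```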

  8. Monte Carlo alpha calculation

    Energy Technology Data Exchange (ETDEWEB)

    Brockway, D.; Soran, P.; Whalen, P.

    1985-01-01

    A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = N{sub 0}e{sup αt}, for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static alpha is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static alpha-eigenvalue problem and regress alpha on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.
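
    The direct approach described above amounts to fitting the logarithmic derivative of the neutron population; a minimal sketch on a synthetic population history (not a transport calculation) looks like this:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-in for a supercritical population N(t) = N0*exp(alpha*t),
    # with multiplicative noise mimicking Monte Carlo tally scatter.
    alpha_true, n0 = 25.0, 1.0e4        # 1/s and initial population (illustrative)
    t = np.linspace(0.0, 0.2, 50)       # s
    n = n0 * np.exp(alpha_true * t) * np.exp(0.05 * rng.normal(size=t.size))

    # The logarithmic derivative d(ln N)/dt is alpha: fit ln N linearly in t.
    alpha_fit = np.polyfit(t, np.log(n), 1)[0]
    print(f"fitted alpha = {alpha_fit:.2f} 1/s (true {alpha_true})")
    ```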

  9. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  10. Selection and verification of safety parameters in safety parameter display system for nuclear power plants

    International Nuclear Information System (INIS)

    Zhang Yuangfang

    1992-02-01

    The method and results for safety parameter selection and its verification in the safety parameter display system of nuclear power plants are introduced. According to safety analysis, overall safety is divided into six critical safety functions, and a certain number of safety parameters which can represent the degree of integrity of each function and the causes of change are strictly selected. The verification of safety parameter selection is carried out from the viewpoint of applying the plant emergency procedures and in accident manoeuvres on a full-scale nuclear power plant simulator.

  11. Verification and testing of the RTOS for safety-critical embedded systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Na Young [Seoul National University, Seoul (Korea, Republic of); Kim, Jin Hyun; Choi, Jin Young [Korea University, Seoul (Korea, Republic of); Sung, Ah Young; Choi, Byung Ju [Ewha Womans University, Seoul (Korea, Republic of); Lee, Jang Soo [KAERI, Taejon (Korea, Republic of)

    2003-07-01

    Development in Instrumentation and Control (I and C) technology provides more convenience and better performance and is thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires a rigorous V and V procedure and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for nuclear application. In this work, we show how to verify the developed RTOS at each stage of the development lifecycle. A commercial formal tool is used in specification and verification of the system. Based on the developed model, software in the C language is automatically generated. Tests are performed for two purposes; one is to identify consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, the latter that it is incorrect. Test data are generated separately to satisfy each purpose. After testing the RTOS software, we implement the test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to identify whether the reliability criteria are satisfied is also designed in this work. The results show that the developed RTOS software works well when it is embedded in the system.

  12. Verification and testing of the RTOS for safety-critical embedded systems

    International Nuclear Information System (INIS)

    Lee, Na Young; Kim, Jin Hyun; Choi, Jin Young; Sung, Ah Young; Choi, Byung Ju; Lee, Jang Soo

    2003-01-01

    Development in Instrumentation and Control (I and C) technology provides more convenience and better performance and is thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires a rigorous V and V procedure and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for nuclear application. In this work, we show how to verify the developed RTOS at each stage of the development lifecycle. A commercial formal tool is used in specification and verification of the system. Based on the developed model, software in the C language is automatically generated. Tests are performed for two purposes; one is to identify consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, the latter that it is incorrect. Test data are generated separately to satisfy each purpose. After testing the RTOS software, we implement the test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to identify whether the reliability criteria are satisfied is also designed in this work. The results show that the developed RTOS software works well when it is embedded in the system.

  13. In pursuit of carbon accountability: the politics of REDD+ measuring, reporting and verification systems

    NARCIS (Netherlands)

    Gupta, A.; Lövbrand, E.; Turnhout, E.; Vijge, M.J.

    2012-01-01

    This article reviews critical social science analyses of carbon accounting and monitoring, reporting and verification (MRV) systems associated with reducing emissions from deforestation, forest degradation and conservation, sustainable use and enhancement of forest carbon stocks (REDD+). REDD+ MRV

  14. Proceedings of the 7th International Workshop on Verification of Infinite-State Systems (INFINITY'05)

    DEFF Research Database (Denmark)

    2005-01-01

    The aim of the workshop is to provide a forum for researchers interested in the development of mathematical techniques for the analysis and verification of systems with infinitely many states. Topics: Techniques for modeling and analysis of infinite-state systems; Equivalence-checking and model-...

  15. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
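
    The final decision rule, accept when the Hamming distance to the enrolled binary template is under a threshold, is easy to sketch in the clear (the homomorphic protocol itself is omitted; the template length and threshold below are assumed values):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    TEMPLATE_BITS = 256
    THRESHOLD = 45  # assumed acceptance threshold, in bits

    def verify(enrolled: np.ndarray, query: np.ndarray) -> bool:
        """Accept the query when its Hamming distance to the enrolled
        binary template is below the threshold."""
        distance = int(np.count_nonzero(enrolled != query))
        return distance < THRESHOLD

    enrolled = rng.integers(0, 2, TEMPLATE_BITS, dtype=np.uint8)
    genuine = enrolled.copy()
    flip = rng.choice(TEMPLATE_BITS, 20, replace=False)  # genuine: small bit noise
    genuine[flip] ^= 1
    impostor = rng.integers(0, 2, TEMPLATE_BITS, dtype=np.uint8)

    print("genuine accepted:", verify(enrolled, genuine))    # expected True
    print("impostor accepted:", verify(enrolled, impostor))  # expected False
    ```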

  16. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  17. An automated portal verification system for the tangential breast portal field

    International Nuclear Information System (INIS)

    Yin, F.-F.; Lai, W.; Chen, C. W.; Nelson, D. F.

    1995-01-01

    Purpose/Objective: In order to ensure the treatment is delivered as planned, a portal image is acquired in the accelerator and is compared to the reference image. At present, this comparison is performed by radiation oncologists based on manually identified features, which is both time-consuming and potentially error-prone. With the introduction of various electronic portal imaging devices, real-time patient positioning correction is becoming clinically feasible, replacing time-delayed analysis using films. However, this procedure requires the presence of radiation oncologists during patient treatment, which is not cost-effective and practically not realistic. Therefore, the efficiency and quality of radiation therapy could be substantially improved if this procedure were automated. The purpose of this study is to develop a fully computerized verification system for the radiation therapy of breast cancer, for which a similar treatment setup is generally employed. Materials/Methods: The automated verification system involves image acquisition, image feature extraction, feature correlation between reference and portal images, and quantitative evaluation of patient setup. In this study, a matrix liquid ion-chamber EPID, directly attached to a Varian CL2100C accelerator, was used to acquire digital portal images. For effective use of computation memory, the 12-bit gray levels in the original portal images were quantized to a range of 8-bit gray levels. A typical breast portal image includes three important components: breast and lung tissues in the treatment field, air space within the treatment field, and the non-irradiated region. A hierarchical region processing technique was developed to separate these regions sequentially. The inherent hierarchical features were formulated based on the different radiation attenuation of the different regions as: treatment field edge -- breast skin line -- chest wall. Initially, a combination of a Canny edge detector and a constrained
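
    The edge-extraction step mentioned at the end can be sketched with a standard Canny detector; the synthetic image and thresholds below are illustrative, not the authors' pipeline:

    ```python
    import cv2
    import numpy as np

    # Synthetic 8-bit "portal image": a bright treatment field on a dark
    # non-irradiated background, with mild noise.
    rng = np.random.default_rng(5)
    image = np.full((256, 256), 30, dtype=np.uint8)
    image[60:200, 50:220] = 180  # treatment field
    noise = rng.integers(0, 10, image.shape, dtype=np.uint8)
    image = cv2.add(image, noise)

    # Smooth, then extract the field edge with a Canny detector
    # (thresholds are illustrative and would be tuned on real images).
    blurred = cv2.GaussianBlur(image, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    print("edge pixels found:", int(np.count_nonzero(edges)))
    ```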

  18. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
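
    A minimal sketch of such pattern-based generation, with invented pattern names and operator syntax rather than the paper's actual BPMN pattern definitions, might look like:

    ```python
    # Minimal sketch: translate BPMN-style workflow patterns into LTL strings.
    # Pattern names and formula templates are illustrative assumptions.
    PATTERNS = {
        # task a is always eventually followed by task b
        "sequence": lambda a, b: f"G({a} -> F({b}))",
        # exclusive choice: exactly one branch is eventually taken
        "xor_split": lambda a, b: f"(F({a}) | F({b})) & !(F({a}) & F({b}))",
        # parallel split: both branches are eventually taken
        "and_split": lambda a, b: f"F({a}) & F({b})",
    }

    def generate_spec(workflow):
        """Conjoin per-pattern formulas into one logical specification."""
        return " & ".join(PATTERNS[p](a, b) for p, a, b in workflow)

    workflow = [("sequence", "receive_order", "check_stock"),
                ("xor_split", "ship", "reject")]
    print(generate_spec(workflow))
    ```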

  19. MO-AB-BRA-03: Development of Novel Real Time in Vivo EPID Treatment Verification for Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, G; Podesta, M [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Reniers, B [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Research Group NuTeC, CMK, Hasselt University, Agoralaan Gebouw H, Diepenbeek B-3590 (Belgium); Verhaegen, F [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Medical Physics Unit, Department of Oncology, McGill University, Montreal, Quebec H3G 1A4 (Canada)

    2016-06-15

    Purpose: High Dose Rate (HDR) brachytherapy treatments are employed worldwide to treat a wide variety of cancers. However, in vivo dose verification remains a challenge, with no commercial dosimetry system available to verify the treatment dose delivered to the patient. We propose a novel dosimetry system that couples an independent Monte Carlo (MC) simulation platform and an amorphous silicon Electronic Portal Imaging Device (EPID) to provide real-time treatment verification. Methods: MC calculations predict the EPID response to the photon fluence emitted by the HDR source by simulating the patient, the source dwell positions and times, and treatment complexities such as tissue compositions/densities and different applicators. Simulated results are then compared against EPID measurements acquired with ∼0.14 s time resolution, which allows dose measurements for each dwell position. The EPID was calibrated using an Ir-192 HDR source, and experiments were performed using different phantoms, including tissue-equivalent materials (PMMA, lung and bone). A source positioning accuracy of 0.2 mm, excluding the afterloader uncertainty, was ensured using a robotic arm to move the source. Results: An EPID can acquire 3D Cartesian source positions, and its response varies significantly with the material composition/density of the irradiated object, allowing detection of changes in patient geometry. The panel time resolution allows dose rate and dwell time measurements. Moreover, predicted EPID images obtained from clinical treatment plans provide anatomical information that can be related to the patient anatomy, mostly bone and air cavities, localizing the source inside the patient using the anatomy as reference. Conclusion: The results obtained show the feasibility of the proposed dose verification system, which is capable of verifying all the brachytherapy treatment steps in real time, providing data about treatment delivery quality and also applicator

  20. Verification of absorbed dose calculation with XIO Radiotherapy Treatment Planning System

    International Nuclear Information System (INIS)

    Bokulic, T.; Budanec, M.; Frobe, A.; Gregov, M.; Kusic, Z.; Mlinaric, M.; Mrcela, I.

    2013-01-01

    Modern radiotherapy relies on computerized treatment planning systems (TPS) for absorbed dose calculation. Most TPS require a detailed model of a given machine and its therapy beams. The International Atomic Energy Agency (IAEA) recommends acceptance testing for the TPS (IAEA-TECDOC-1540). In this study we present the customization of those tests for measurements aimed at verifying beam models intended for clinical use in our department. Installation of an Elekta Synergy S linear accelerator and data acquisition for the Elekta CMS XiO 4.62 TPS were finished in 2011. After the completion of beam modelling in the TPS, tests were conducted in accordance with the IAEA protocol for TPS dose calculation verification. The deviations between the measured and calculated dose were recorded for 854 points and 11 groups of tests in a homogeneous phantom. Most of the deviations were within tolerance. Similar to previously published results, the results for an irregular L-shaped field and asymmetric wedged fields were out of tolerance for certain groups of points. (author)
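
    The per-point comparison in such acceptance tests reduces to a percentage deviation checked against a tolerance; a minimal sketch with hypothetical dose values and an illustrative 3% tolerance:

    ```python
    import numpy as np

    def deviation_report(measured, calculated, tolerance_pct):
        """Percentage deviations of TPS-calculated dose from measurement,
        flagged against a per-group tolerance (illustrative values)."""
        measured = np.asarray(measured, dtype=float)
        calculated = np.asarray(calculated, dtype=float)
        dev = 100.0 * (calculated - measured) / measured
        return dev, np.abs(dev) <= tolerance_pct

    # Hypothetical points from one test group (relative dose), 3% tolerance.
    dev, ok = deviation_report(
        measured=[1.000, 0.842, 0.511, 0.305],
        calculated=[1.012, 0.848, 0.530, 0.301],
        tolerance_pct=3.0,
    )
    for d, flag in zip(dev, ok):
        print(f"{d:+5.1f}%  {'within' if flag else 'OUT OF'} tolerance")
    ```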

  1. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  2. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  3. Verification and Validation of Flight-Critical Systems

    Science.gov (United States)

    Brat, Guillaume

    2010-01-01

    For the first time in many years, the NASA budget presented to Congress calls for a focused effort on the verification and validation (V&V) of complex systems. This is mostly motivated by the results of the VVFCS (V&V of Flight-Critical Systems) study, which should materialize as a concrete effort under the Aviation Safety program. This talk will present the results of the study, from requirements coming out of discussions with the FAA and the Joint Planning and Development Office (JPDO), to a technical plan addressing the issue, and its proposed current and future V&V research agenda, which will be addressed by NASA Ames, Langley, and Dryden as well as external partners through NASA Research Announcements (NRA) calls. This agenda calls for pushing V&V earlier in the life cycle and taking advantage of formal methods to increase safety and reduce the cost of V&V. I will present the ongoing research work (especially the four main technical areas: Safety Assurance, Distributed Systems, Authority and Autonomy, and Software-Intensive Systems), possible extensions, and how VVFCS plans on grounding the research in realistic examples, including an intended V&V test-bench based on an Integrated Modular Avionics (IMA) architecture and hosted by Dryden.

  4. Verification and validation of the safety parameter display system for nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Yuanfang

    1993-05-01

    During the design and development phase of the safety parameter display system for nuclear power plants, a verification and validation (V and V) plan was implemented to improve the quality of the system design. The V and V activities, executed in the four stages of feasibility research, system design, code development, and system integration and regulation, are briefly introduced. The evaluation plan, the process of its implementation, and the evaluation conclusion of the final technical validation for this system are also presented in detail.

  5. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  6. Systems guide to MCNP (Monte Carlo Neutron and Photon Transport Code)

    International Nuclear Information System (INIS)

    Kirk, B.L.; West, J.T.

    1984-06-01

    The subject of this report is the implementation of the Los Alamos National Laboratory Monte Carlo Neutron and Photon Transport Code - Version 3 (MCNP) on different types of computer systems, especially the IBM MVS system. The report supplements the documentation of the RSIC computer code package CCC-200/MCNP. Details of the procedure to follow in executing MCNP on the IBM computers, either in batch mode or interactive mode, are provided.

  7. The dynamic flowgraph methodology as a safety analysis tool : programmable electronic system design and verification

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2002-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM, and

  8. Monte Carlo technique for local perturbations in multiplying systems

    International Nuclear Information System (INIS)

    Bernnat, W.

    1974-01-01

    The use of the Monte Carlo method for the calculation of reactivity perturbations in multiplying systems due to changes in geometry or composition requires a correlated sampling technique to make such calculations economical or, in the case of very small perturbations, even feasible. The technique discussed here is suitable for local perturbations. Very small perturbation regions are treated by an adjoint mode. The perturbation of the source distribution due to the changed system, and its effect on the reactivity worth or other quantities of interest, is taken into account by a fission matrix method. The formulation of the method and its application are discussed. 10 references. (U.S.)
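
    The essence of correlated sampling, reusing the same random histories for the perturbed and unperturbed systems so that statistical noise cancels in their difference, can be shown on a toy integral (not a transport problem):

    ```python
    import numpy as np

    # Toy demonstration: estimate the small change in I(c) = E[exp(-c*x)],
    # x ~ U(0,1), when c is perturbed by 0.1%.
    c0, c1, n = 1.000, 1.001, 100_000

    rng = np.random.default_rng(11)
    x = rng.uniform(0.0, 1.0, n)  # one shared set of histories
    corr = np.mean(np.exp(-c1 * x) - np.exp(-c0 * x))  # correlated estimate

    rng2 = np.random.default_rng(99)
    y = rng2.uniform(0.0, 1.0, n)  # independent histories
    indep = np.mean(np.exp(-c1 * y)) - np.mean(np.exp(-c0 * x))

    exact = (1 - np.exp(-c1)) / c1 - (1 - np.exp(-c0)) / c0
    print(f"exact change      : {exact:+.6f}")
    print(f"correlated sample : {corr:+.6f}")   # close to exact
    print(f"independent sample: {indep:+.6f}")  # noise swamps the difference
    ```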

  9. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio

    In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperation process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns in an early software development stage. However, it is very difficult to verify task cooperation patterns in an early development stage, where task program codes are not yet completed. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that has a function of recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.

  10. Burnup verification tests with the FORK measurement system-implementation for burnup credit

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. It was designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program and is well suited to verify burnup and cooling time records at commercial Pressurized Water Reactor (PWR) sites. This report deals with the application of the FORK system to burnup credit operations

  11. Dosimetric verification of small fields in the lung using lung-equivalent polymer gel and Monte Carlo simulation.

    Science.gov (United States)

    Gharehaghaji, Nahideh; Dadgar, Habib Alah

    2018-01-01

    The main purpose of this study was to evaluate a polymer-gel dosimeter (PGD) for three-dimensional verification of dose distributions in the lung, called lung-equivalent gel (LEG), and to compare its results with the Monte Carlo (MC) method. In the present study, to achieve a lung density for the PGD, the gel is beaten until foam is obtained, and then sodium dodecyl sulfate is added as a surfactant. The foam gel was irradiated with a 1 cm × 1 cm field size in the 6 MV photon beam of an ONCOR SIEMENS LINAC, along the central axis of the gel. The LEG was then scanned on a 1.5 Tesla magnetic resonance imaging scanner after irradiation using a multiple-spin-echo sequence. The R2 relaxation rates were derived by least-squares fitting of the pixel values from 32 consecutive images to a single-exponential decay function. Moreover, the 6 and 18 MV photon beams of the ONCOR SIEMENS LINAC were simulated using the MCNPX MC code. The MC model was used to calculate the depth dose in water and in low-density water resembling soft tissue and lung, respectively. Percentages of dose reduction in the lung region relative to the homogeneous phantom for the 6 MV photon beam were 44.6%, 39%, 13%, and 7% for 0.5 cm × 0.5 cm, 1 cm × 1 cm, 2 cm × 2 cm, and 3 cm × 3 cm fields, respectively. For the 18 MV photon beam, the results were 82%, 69%, 46%, and 25.8% for the same field sizes. Preliminary results show good agreement between the depth dose measured with the LEG and the depth dose calculated using the MCNP code. Our study showed that the dose reduction with small fields in the lung was very high. Thus, inaccurate prediction of absorbed dose inside the lung and at lung/soft-tissue interfaces with small photon beams may lead to critical consequences for treatment outcome.
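
    The R2 derivation step, least-squares fitting of a single-exponential decay to the multi-echo signal, can be sketched with scipy; the echo times, noise level, and R2 value below are assumed, not the study's data:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(8)

    def decay(te, s0, r2):
        """Single-exponential multi-echo signal model S(TE) = S0*exp(-R2*TE)."""
        return s0 * np.exp(-r2 * te)

    # Synthetic 32-echo signal for one voxel (TE in s, R2 in 1/s; assumed values).
    te = 0.02 * np.arange(1, 33)  # echo times: 20, 40, ..., 640 ms
    r2_true, s0_true = 3.5, 1000.0
    signal = decay(te, s0_true, r2_true) + 5.0 * rng.normal(size=te.size)

    popt, _ = curve_fit(decay, te, signal, p0=(signal[0], 1.0))
    print(f"fitted R2 = {popt[1]:.2f} 1/s (true {r2_true})")
    ```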

  12. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information will be a key issue during the negotiations of an FMCT verification provision. This paper explores the general concerns regarding FMCT verification and demonstrates what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper focuses on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  13. Burnup calculations using Monte Carlo method

    International Nuclear Information System (INIS)

    Ghosh, Biplab; Degweker, S.B.

    2009-01-01

    In recent years, interest in burnup calculations using Monte Carlo methods has gained momentum. Previous burnup codes have used multigroup transport theory based calculations followed by diffusion theory based core calculations for the neutronic portion of the codes. The transport theory methods invariably make approximations with regard to the treatment of the energy and angle variables involved in scattering, besides approximations related to geometry simplification. Cell homogenisation to produce diffusion theory parameters adds to these approximations. Moreover, while diffusion theory works for most reactors, it does not produce accurate results in systems that have strong gradients, strong absorbers or large voids. Also, diffusion theory codes are geometry limited (rectangular, hexagonal, cylindrical, and spherical coordinates). Monte Carlo methods are ideal for solving very heterogeneous reactors and/or lattices/assemblies in which considerable burnable poisons are used. The key feature of this approach is that Monte Carlo methods permit essentially 'exact' modeling of all geometrical detail, without resort to energy and spatial homogenization of neutron cross sections. Monte Carlo methods would also be better for Accelerator Driven Systems (ADS), which could have strong gradients due to the external source and a subcritical assembly. To meet the demand for an accurate burnup code, we have developed a Monte Carlo burnup calculation code system in which a Monte Carlo neutron transport code is coupled with a versatile code (McBurn) for calculating the buildup and decay of nuclides in nuclear materials. McBurn was developed from scratch by the authors. In this article we discuss our effort in developing the continuous energy Monte Carlo burnup code, McBurn. McBurn is intended for entire reactor cores as well as for unit cells and assemblies. Generally, McBurn can perform burnup of any geometrical system which can be handled by the underlying Monte Carlo transport code.
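
    The buildup-and-decay step such a code performs between transport solutions reduces to solving the Bateman equations over a burnup interval; a two-nuclide sketch with a constant one-group flux (all rates below are illustrative round numbers, and total absorption is simplified to feed the daughter):

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Bateman equations dN/dt = A N for a toy parent -> daughter chain under
    # constant flux: the parent is destroyed by absorption; the daughter
    # builds up from the parent and is destroyed by absorption plus decay.
    phi = 1.0e14            # flux, n/cm^2/s (illustrative)
    sig_a_parent = 5.0e-24  # absorption cross sections, cm^2 (illustrative)
    sig_a_daughter = 50.0e-24
    lam_daughter = 1.0e-9   # daughter decay constant, 1/s (illustrative)

    A = np.array([
        [-phi * sig_a_parent, 0.0],
        [phi * sig_a_parent, -(phi * sig_a_daughter + lam_daughter)],
    ])

    n0 = np.array([1.0e22, 0.0])  # initial number densities, 1/cm^3
    t = 365.0 * 24 * 3600         # one-year burnup step, s
    n = expm(A * t) @ n0          # matrix-exponential solution
    print(f"parent:   {n[0]:.3e} 1/cm^3")
    print(f"daughter: {n[1]:.3e} 1/cm^3")
    ```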

  14. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  15. Results of verifications of the control automatic exposure in equipment of RX with CR systems

    International Nuclear Information System (INIS)

    Ruiz Manzano, P.; Rivas Ballarin, M. A.; Ortega Pardina, P.; Villa Gazulla, D.; Calvo Carrillo, S.; Canellas Anoz, M.; Millan Cebrian, E.

    2013-01-01

    Following the entry into force in 2012 of the new Spanish radiology quality control protocol, this work lists and discusses the results obtained after verification of the automatic exposure control in computed radiography systems. (Author)

  16. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    International Nuclear Information System (INIS)

    Yamashita, M; Kokubo, M; Takahashi, R; Takayama, K; Tanabe, H; Sueoka, M; Okuuchi, N; Ishii, M; Iwamoto, Y; Tachibana, H

    2016-01-01

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been commercially available for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, Mean ± 2SD %) were compared to those from a general-purpose linac. Results: For the CLs, conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical Research and
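
    A Clarkson-type calculation decomposes the field around the calculation point into angular sectors and averages tabulated scatter contributions at each sector's field-edge radius; a minimal sketch with a made-up scatter-air-ratio table, not the SMU implementation:

    ```python
    import numpy as np

    # Toy scatter-air ratio (SAR) lookup vs. sector radius (cm) -- made-up values.
    SAR_RADII = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
    SAR_VALUES = np.array([0.00, 0.05, 0.09, 0.12, 0.14, 0.15])

    def clarkson_sar(sector_radii_cm):
        """Classic Clarkson sector integration: average the SAR interpolated
        at each sector's field-edge radius over all angular sectors."""
        sar = np.interp(sector_radii_cm, SAR_RADII, SAR_VALUES)
        return float(sar.mean())

    # Radii from the calculation point to the field edge in 36 sectors of 10
    # degrees (hypothetical irregular field: wider on one side than the other).
    radii = np.where(np.arange(36) < 18, 8.0, 4.0)
    print(f"sector-averaged SAR = {clarkson_sar(radii):.3f}")
    ```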

  17. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    Energy Technology Data Exchange (ETDEWEB)

    Yamashita, M; Kokubo, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takayama, K [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tanabe, H; Sueoka, M; Okuuchi, N [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Ishii, M; Iwamoto, Y [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been commercially available for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, Mean ± 2SD %) were compared to those from a general-purpose linac. Results: For the CLs, conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical Research and

  18. Electroacoustic verification of frequency modulation systems in cochlear implant users.

    Science.gov (United States)

    Fidêncio, Vanessa Luisa Destro; Jacob, Regina Tangerino de Souza; Tanamati, Liége Franzini; Bucuvic, Érika Cristina; Moret, Adriane Lima Mortari

    2017-12-26

    The frequency modulation system is a device that helps to improve speech perception in noise and is considered the most beneficial approach to improving speech recognition in noise in cochlear implant users. According to guidelines, a check must be performed before fitting the frequency modulation system. Although there are recommendations regarding the behavioral tests that should be performed at the fitting of the frequency modulation system to cochlear implant users, there are no published recommendations regarding the electroacoustic test that should be performed. The aim was to perform and determine the validity of an electroacoustic verification test for frequency modulation systems coupled to different cochlear implant speech processors. The sample included 40 participants between 5 and 18 years of age, users of four different models of speech processors. For the electroacoustic evaluation, we used the Audioscan Verifit device with the HA-1 coupler and the listening check devices corresponding to each speech processor model. In cases where transparency was not achieved, a modification was made to the frequency modulation gain adjustment, and we used the Brazilian version of the "Phrases in Noise Test" to evaluate speech perception in competing noise. Transparency between the frequency modulation system and the cochlear implant was observed in 85% of the participants evaluated. After adjusting the gain of the frequency modulation receiver in the other participants, the devices showed transparency when the electroacoustic verification test was repeated. It was also observed that patients demonstrated better performance in speech perception in noise after the new adjustment; that is, in these cases, electroacoustic transparency produced behavioral transparency. The suggested electroacoustic evaluation protocol was effective in the evaluation of transparency between the frequency modulation system and the cochlear implant. Performing the adjustment of

  19. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    Science.gov (United States)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  20. Monte Carlo Analysis of the Accelerator-Driven System at Kyoto University Research Reactor Institute

    Directory of Open Access Journals (Sweden)

    Wonkyeong Kim

    2016-04-01

    Full Text Available An accelerator-driven system consists of a subcritical reactor and a controllable external neutron source. The reactor in an accelerator-driven system can sustain fission reactions in a subcritical state using an external neutron source, which is an intrinsic safety feature of the system. The system can provide efficient transmutations of nuclear wastes such as minor actinides and long-lived fission products and generate electricity. Recently at Kyoto University Research Reactor Institute (KURRI; Kyoto, Japan), a series of reactor physics experiments was conducted with the Kyoto University Critical Assembly and a Cockcroft–Walton type accelerator, which generates the external neutron source by deuterium–tritium reactions. In this paper, neutronic analyses of a series of experiments have been re-estimated by using the latest Monte Carlo code and nuclear data libraries. This feasibility study is presented through the comparison of Monte Carlo simulation results with measurements.

  1. Monte Carlo analysis of the accelerator-driven system at Kyoto University Research Reactor Institute

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won Kyeong; Lee, Deok Jung [Nuclear Engineering Division, Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Hyun Chul [VHTR Technology Development Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Pyeon, Cheol Ho [Nuclear Engineering Science Division, Kyoto University Research Reactor Institute, Osaka (Japan); Shin, Ho Cheol [Core and Fuel Analysis Group, Korea Hydro and Nuclear Power Central Research Institute, Daejeon (Korea, Republic of)

    2016-04-15

    An accelerator-driven system consists of a subcritical reactor and a controllable external neutron source. The reactor in an accelerator-driven system can sustain fission reactions in a subcritical state using an external neutron source, which is an intrinsic safety feature of the system. The system can provide efficient transmutations of nuclear wastes such as minor actinides and long-lived fission products and generate electricity. Recently at Kyoto University Research Reactor Institute (KURRI; Kyoto, Japan), a series of reactor physics experiments was conducted with the Kyoto University Critical Assembly and a Cockcroft-Walton type accelerator, which generates the external neutron source by deuterium-tritium reactions. In this paper, neutronic analyses of a series of experiments have been re-estimated by using the latest Monte Carlo code and nuclear data libraries. This feasibility study is presented through the comparison of Monte Carlo simulation results with measurements.
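
    The subcritical operating principle described above can be made concrete with the textbook source-multiplication relation M = 1/(1 − k_s); a short sketch, in which the value of k_s is purely illustrative:

        # Each external source neutron induces, on average, k_s + k_s^2 + ...
        # further fission neutrons, so the total neutron population per
        # source neutron is M = 1 / (1 - k_s) for a subcritical core.
        def source_multiplication(k_s):
            if not 0.0 <= k_s < 1.0:
                raise ValueError("subcritical operation requires 0 <= k_s < 1")
            return 1.0 / (1.0 - k_s)

        print(source_multiplication(0.97))  # ~33.3 neutrons per source neutron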

  2. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  3. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  4. The MCU-RFFI Monte Carlo code for reactor design applications

    International Nuclear Information System (INIS)

    Gomin, E.A.; Maiorov, L.V.

    1995-01-01

    MCU-RFFI is a general-purpose, continuous-energy, general-geometry Monte Carlo code for solving external source or criticality problems for neutron transport in the energy range of 20 MeV to 10^-5 eV. The main fields of MCU-RFFI applications are as follows: (a) nuclear data validation; (b) design calculations (space reactors and other); (c) verification of design codes. MCU-RFFI is also supplied with tools to check the accuracy of design codes. These tools permit the user to calculate: the few-group parameters of reactor cells, including diffusion coefficients defined in a variety of ways; reaction rates for various nuclei, energy and space bins; and the kinetic parameters of systems, taking into account delayed neutrons. Boundary conditions include vacuum, white and specular reflection, and the condition of translational symmetry. Critical systems with neutron leakage specified by a buckling vector may be calculated by solving Benoist's problem. The dependence of the criticality coefficient on buckling, and hence the critical buckling, may be determined in a single code run. Doubly heterogeneous systems with fuel elements containing many thousands of spherical microcells can be solved

  5. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: a low-penetration simulation and a high-penetration simulation. The load-shedding strategy and the simulation process are introduced in detail for each FMEA step. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
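
    A toy sequential Monte Carlo of the kind used in such reliability assessments, assuming exponentially distributed times to failure and repair for a single feeder; the rates are illustrative, and the study's load-shedding and FMEA logic is not modeled:

        import random

        def simulate_unavailability(failure_rate=0.5, repair_hours=4.0,
                                    years=100000, seed=1):
            """Estimate feeder unavailability by sampling failure/repair
            cycles; failure_rate in occurrences/year, repair in hours."""
            rng = random.Random(seed)
            down_hours = 0.0
            t = 0.0
            horizon = years * 8760.0  # simulation horizon in hours
            while t < horizon:
                t += rng.expovariate(failure_rate) * 8760.0  # time to failure
                outage = rng.expovariate(1.0 / repair_hours)  # repair duration
                if t < horizon:
                    down_hours += outage
                    t += outage
            return down_hours / horizon

        # Approaches lambda * r / 8760 for small outage fractions (~2.3e-4):
        print(simulate_unavailability())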

  6. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems are being developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that adapt traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct

  7. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. The book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation and variance reduction techniques - and covers several aspects of quasi-Monte Carlo methods.
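
    A small taste of the book's subject: estimating the same integral with plain Monte Carlo and with a scrambled Sobol sequence (assuming SciPy's stats.qmc module is available; the integrand is arbitrary):

        import numpy as np
        from scipy.stats import qmc

        # Integrate f over the unit square with pseudo-random and Sobol points.
        def f(u):
            return np.exp(-np.sum(u ** 2, axis=1))

        n = 2 ** 12  # powers of two keep the Sobol sequence well balanced
        mc_pts = np.random.default_rng(0).random((n, 2))
        qmc_pts = qmc.Sobol(d=2, scramble=True, seed=0).random(n)

        print("MC :", f(mc_pts).mean())   # both approach the true value,
        print("QMC:", f(qmc_pts).mean())  # QMC typically with lower error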

  8. Application of the measurement-based Monte Carlo method in nasopharyngeal cancer patients for intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Yeh, C.Y.; Lee, C.C.; Chao, T.C.; Lin, M.H.; Lai, P.A.; Liu, F.H.; Tung, C.J.

    2014-01-01

    This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated using the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for nasopharyngeal carcinoma (NPC) patients treated with intensity-modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity-modulated field by that of the open field, both acquired from an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open-field phase-space file for IMRT applications. Dose differences were observed at the boundary between the tumor and the air cavity. The mean difference between MBMC and TPS in terms of planning target volume coverage was 0.6% (range: 0.0–2.3%). The mean difference for the conformity index was 0.01 (range: 0.0–0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. - Highlights: ► The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses. ► 3D dose distributions for NPC patients have been verified by the Monte Carlo method. ► Doses predicted by the Monte Carlo method matched closely with those by the TPS. ► The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS. ► Critical organ doses should be confirmed to avoid overdose to normal organs
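
    A sketch of the efficiency-map step described above: the IMRT-field EPID fluence is divided by the open-field fluence, and the ratio rescales the weights of open-field phase-space particles. Array conventions and function names are assumptions, not the authors' code:

        import numpy as np

        def efficiency_map(imrt_epid, open_epid, eps=1e-6):
            """Pixel-wise ratio of IMRT-field to open-field EPID fluence."""
            return imrt_epid / np.maximum(open_epid, eps)

        def reweight(weights, u, v, eff, pixel_cm):
            """Scale each particle weight by the map value at its (u, v)
            crossing point on the scoring plane (nearest-pixel lookup)."""
            i = np.clip((u / pixel_cm + eff.shape[0] // 2).astype(int),
                        0, eff.shape[0] - 1)
            j = np.clip((v / pixel_cm + eff.shape[1] // 2).astype(int),
                        0, eff.shape[1] - 1)
            return weights * eff[i, j]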

  9. Development of the criticality capability for the SAM-CE Monte Carlo System

    International Nuclear Information System (INIS)

    Lichtenstein, H.; Troubetzkoy, E.; Steinberg, H.; Cohen, M.O.

    1979-04-01

    A criticality capability has been developed and implemented in the SAM-CE Monte Carlo system. The data processing component, SAM-X, preserves, to any required accuracy, the data quality inherent in the ENDF/B library. The generated data are Doppler-broadened and include (where applicable) probability tables for the unresolved resonance range and thermal-scattering-law data. Curves of several total and partial cross sections are generated and displayed. The Monte Carlo component, SAM-F, includes several eigenvalue estimators and variance reduction schemes. Stratification was found to effect a significant improvement in calculational efficiency, but the usefulness of importance sampling is marginal in criticality problems. The entire system has been installed at BNL for the analysis of TRX benchmarks. The TRX-1 and TRX-2 cell calculations have been performed, with estimated eigenvalues of 1.1751 ± 0.0016 and 1.1605 ± 0.0015, respectively. These results are shown to be statistically consistent with other sources

  10. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  11. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of each verification criterion varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for the seven routine biochemical tests were obtained.
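
    As an illustration of the criteria surveyed, a toy autoverification pass combining reference-limit and delta-check rules; the analytes, limits, and delta thresholds below are placeholders, not values from the survey:

        # Toy autoverification: range and delta-check rules decide whether a
        # result is released automatically or held for manual review.
        REFERENCE = {"glucose": (70, 110), "potassium": (3.5, 5.1)}  # illustrative
        DELTA_LIMIT = {"glucose": 0.30, "potassium": 0.15}           # fractional

        def autoverify(analyte, value, previous=None):
            lo, hi = REFERENCE[analyte]
            if not lo <= value <= hi:
                return "hold: outside verification limits"
            if previous is not None:
                delta = abs(value - previous) / previous
                if delta > DELTA_LIMIT[analyte]:
                    return "hold: delta check failed"
            return "release"

        print(autoverify("potassium", 4.2, previous=4.0))  # release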

  12. Preface of Special issue on Automated Verification of Critical Systems (AVoCS'14)

    NARCIS (Netherlands)

    Huisman, Marieke; van de Pol, Jaco

    2016-01-01

    AVoCS 2014, the 14th International Conference on Automated Verification of Critical Systems has been hosted by the University of Twente, and has taken place in Enschede, Netherlands, on 24–26 September, 2014. The aim of the AVoCS series is to contribute to the interaction and exchange of ideas among

  13. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation: If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focusses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete 'design verification' strategy comprises: informing the Agency of any changes in the plant system which are defined as 'safeguards relevant'; and reverification by the Agency, upon receiving notice of any changes from the operator, of the 'design information'. 13 refs

  14. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and it is impossible to test all potential interactions, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  15. Dual-use benefits of the CTBT verification system

    International Nuclear Information System (INIS)

    Meade, C.E.F.

    1999-01-01

    Since its completion in September 1996, the CTBT has been signed by 151 countries. Pending the 44 required ratifications and entry into force, all of the nuclear powers have imposed unilateral moratoria on nuclear test explosions. The end of these weapons development activities is often cited as the principal benefit of the CTBT. As the world begins to implement the Treaty, it has become clear that the development and operation of the CTBT verification system will provide a wide range of additional benefits if the data analysis products are available for dual-purpose applications. As this paper describes, these could have economic and social implications, especially for countries with limited technical infrastructures. They involve seismic monitoring, mineral exploration, and scientific and technical training

  16. 'PET -Compton' system. Comparative evaluation with PET system using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Diaz Garcia, Angelina; Arista Romeu, Eduardo; Abreu Alfonso, Yamiel; Leyva Fabelo, Antonio; Pinnera Hernandez, Ibrahin; Bolannos Perez, Lourdes; Rubio Rodriguez, Juan A; Perez Morales, Jose M.; Arce Dubois, Pedro; Vela Morales, Oscar; Willmott Zappacosta, Carlos

    2011-01-01

    Positron emission tomography (PET) in small animals has currently achieved a spatial resolution of about 1 mm, and different approaches to improving this spatial resolution are under study. One of them combines PET technology with Compton cameras. This paper presents the idea of the so-called 'PET-Compton' systems and includes a comparative evaluation of the spatial resolution and global efficiency of both PET and PET-Compton systems by means of Monte Carlo simulations using the Geant4 code. The simulation is done on a PET-Compton system consisting of the LYSO-LuYAP scintillating detectors of the particular small-animal PET scanner named 'Clear-PET' and Compton detectors based on CdZnTe semiconductor. A group of radionuclides that emit a positron (e+) and a γ quantum almost simultaneously and fulfill selection criteria for possible use in PET-Compton systems for medical and biological applications was studied under simulation conditions. (Author)

  17. Specification and verification of the RTOS for plant protection systems

    International Nuclear Information System (INIS)

    Kim, Jin Hyun; Ahn, Young Ah; Lee, Su-Young; Choi, Jin Young; Lee, Na Young

    2004-01-01

    A PLC is a computer system for instrumentation and control (I and C) applications such as control of machinery on factory assembly lines and in nuclear power plants. In the nuclear power industry, systems are classified into three classes - non-safety, safety-related, and safety-critical - according to the integrity required for their intended use. If a PLC is used for controlling the reactor in a nuclear power plant, it should be identified as safety-critical. A PLC carries several I and C logics in software, including a real-time operating system (RTOS). Hence, the RTOS must also be proven safe and reliable by various means and methods. In this paper, we apply formal methods to the development of an RTOS for a safety-critical PLC: Statecharts for specification and model checking for verification. We present the results of applying these formal methods to the RTOS. (author)
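
    A minimal explicit-state flavour of the verification side: exhaustively exploring a toy task state machine and checking a safety predicate on every reachable state. The states and predicate are invented for illustration, not the RTOS model of the paper:

        from collections import deque

        # Toy RTOS-like task state machine; real models are far larger.
        TRANSITIONS = {
            "ready":   ["running"],
            "running": ["ready", "blocked", "done"],
            "blocked": ["ready"],
            "done":    [],
        }

        def violates(state):
            return state == "blocked_forever"  # example safety predicate

        def check(initial="ready"):
            """Breadth-first exploration of all reachable states."""
            seen, frontier = {initial}, deque([initial])
            while frontier:
                s = frontier.popleft()
                if violates(s):
                    return f"counterexample reaches {s}"
                for nxt in TRANSITIONS.get(s, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return "safety property holds on all reachable states"

        print(check())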

  18. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    Science.gov (United States)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
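
    The log-ratio technique mentioned above reduces the position estimate to one line; a sketch in which the scale factor k stands in for the geometry-dependent calibration constant:

        import math

        def log_ratio_position(a, b, k=8.0):
            """Beam position from two opposing pickup signals a and b
            (arbitrary units); k is an assumed scale factor in mm."""
            return k * math.log10(a / b)

        # A centred beam gives 0 mm; an offset shows up as signal imbalance.
        print(log_ratio_position(1.00, 1.00))  # 0.0 mm
        print(log_ratio_position(1.20, 0.80))  # ~1.41 mm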

  19. REQUIREMENT VERIFICATION AND SYSTEMS ENGINEERING TECHNICAL REVIEW (SETR) ON A COMMERCIAL DERIVATIVE AIRCRAFT (CDA) PROGRAM

    Science.gov (United States)

    2017-09-01

    by Theresa L. Thomas. Abstract: The Naval Air Systems Command (NAVAIR) systems engineering technical review (SETR) process does not

  20. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  1. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  2. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Science.gov (United States)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  3. European Train Control System: A Case Study in Formal Verification

    Science.gov (United States)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
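
    To give a flavour of the parameter constraints involved, a sketch of a simplified braking-distance bound of the kind such a controller must respect (v^2 <= 2*b*d); the deceleration value is illustrative, and the verified ETCS dynamics are far richer:

        def max_safe_speed(distance_m, braking_m_s2):
            """Largest speed from which the train can still stop within the
            given movement authority, from v^2 <= 2*b*d (a simplified
            ETCS-style bound, not the verified hybrid model)."""
            return (2.0 * braking_m_s2 * distance_m) ** 0.5

        def controller_ok(speed_m_s, distance_m, braking_m_s2=0.7):
            return speed_m_s <= max_safe_speed(distance_m, braking_m_s2)

        print(controller_ok(speed_m_s=40.0, distance_m=2000.0))  # True: 40 <= ~52.9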

  4. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    International Nuclear Information System (INIS)

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.

    2015-01-01

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptops to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results

  5. Reliability program plan for the Kilowatt Isotope Power System (KIPS) technology verification phase

    International Nuclear Information System (INIS)

    1978-01-01

    This document is an integral part of the Kilowatt Isotope Power System (KIPS) Program Plan. It defines the KIPS Reliability Program Plan for the Technology Verification Phase and delineates the reliability assurance tasks that are to be accomplished by Sundstrand and its suppliers during the design, fabrication, and testing of the KIPS

  6. Head simulation of linear accelerators and spectra considerations using EGS4 Monte Carlo code in a PC

    Energy Technology Data Exchange (ETDEWEB)

    Malatara, G; Kappas, K [Medical Physics Department, Faculty of Medicine, University of Patras, 265 00 Patras (Greece); Sphiris, N [Ethnodata S.A., Athens (Greece)

    1994-12-31

    In this work, the Monte Carlo EGS4 code was used to simulate radiation transport through linear accelerators to produce and score energy spectra and angular distributions of 6, 12, 15 and 25 MeV bremsstrahlung photons exiting from different accelerator treatment heads. The energy spectra were used as input to a convolution-method program to calculate the tissue-maximum ratio in water. 100,000 histories were recorded in the scoring plane for each simulation. The validity of the Monte Carlo simulation and the precision of the calculated spectra were verified experimentally and found to be in good agreement. We believe that accurate simulation of the different components of the linear accelerator head is very important for the precision of the results. The results of the Monte Carlo and convolution methods can be compared with experimental data for verification, and they are powerful and practical tools for generating accurate spectra and dosimetric data. (authors). 10 refs., 5 figs., 2 tabs.

  7. Head simulation of linear accelerators and spectra considerations using EGS4 Monte Carlo code in a PC

    International Nuclear Information System (INIS)

    Malatara, G.; Kappas, K.; Sphiris, N.

    1994-01-01

    In this work, the Monte Carlo EGS4 code was used to simulate radiation transport through linear accelerators to produce and score energy spectra and angular distributions of 6, 12, 15 and 25 MeV bremsstrahlung photons exiting from different accelerator treatment heads. The energy spectra were used as input to a convolution-method program to calculate the tissue-maximum ratio in water. 100,000 histories were recorded in the scoring plane for each simulation. The validity of the Monte Carlo simulation and the precision of the calculated spectra were verified experimentally and found to be in good agreement. We believe that accurate simulation of the different components of the linear accelerator head is very important for the precision of the results. The results of the Monte Carlo and convolution methods can be compared with experimental data for verification, and they are powerful and practical tools for generating accurate spectra and dosimetric data. (authors)
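
    A sketch of the convolution-method idea: folding a photon spectrum with per-energy depth behaviour to obtain a polyenergetic tissue-maximum ratio. The spectral weights and effective attenuation coefficients below are placeholders, not the simulated spectra:

        import numpy as np

        energies = np.array([1.0, 2.0, 4.0, 6.0])          # MeV bins
        weights  = np.array([0.40, 0.30, 0.20, 0.10])      # relative fluence
        mu_eff   = np.array([0.070, 0.055, 0.045, 0.040])  # cm^-1, effective

        def tmr(depth_cm, dmax_cm=1.5):
            """Polyenergetic TMR(d): each energy bin approximated by
            exponential attenuation past dmax, then fluence-weighted."""
            per_energy = np.exp(-mu_eff * (depth_cm - dmax_cm))
            return float(np.sum(weights * per_energy) / np.sum(weights))

        print(tmr(10.0))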

  8. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan; Ford, Eric C., E-mail: eford@uw.edu [Department of Radiation Oncology, University of Washington, 1959 N. E. Pacific Street, Seattle, Washington 98195 (United States)

    2015-09-15

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of detectable incidents divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.

  9. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example of the life cycle development process and V&V for FPGA-based I&C is presented. • Software standards and guidelines are used in the V&V of FPGA-based NPP I&C system logic. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in and ensure the reliability of a nuclear FPGA-based safety system, disciplined life cycle processes for specification and design implementation, together with verification and validation (V&V) against regulations, are needed. A specific example of how to conduct the life cycle development process and V&V for an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for the PWR has been designed, implemented, verified, and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios and to provide input data to the FPGA-based CHR protection system under test and a verified C-code CHR function module. The evaluation results are applied to validate the FPGA-based CHR protection system under test. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  10. Independent verification of monitor unit calculation for radiation treatment planning system.

    Science.gov (United States)

    Chen, Li; Chen, Li-Xin; Huang, Shao-Min; Sun, Wen-Zhao; Sun, Hong-Qiang; Deng, Xiao-Wu

    2010-02-01

    Ensuring the accuracy of dose calculations for radiation treatment plans is an important part of quality assurance (QA) procedures in radiotherapy. This study evaluated the monitor unit (MU) calculation accuracy of a third-party QA software package and a 3-dimensional treatment planning system (3D TPS), to investigate the feasibility and reliability of independent verification of radiation treatment planning. Test plans in a homogeneous phantom were designed with the 3D TPS according to International Atomic Energy Agency (IAEA) Technical Report No. 430, including open, blocked, wedge, and multileaf collimator (MLC) fields. The test plans were delivered and measured in the phantom. The delivered doses were input to the QA software, and the independently calculated MUs were compared with delivery. All test plans were verified with independent calculation and phantom measurements separately, and the differences between the two kinds of verification were then compared. The deviation of the independent calculation from the measurements was (0.1 ± 0.9)%; the largest difference occurred for plans that used block and wedge fields (2.0%). The mean MU difference between the TPS and the QA software was (0.6 ± 1.0)%, ranging from −0.8% to 2.8%. The deviation in dose of the TPS calculation from the measurements was (−0.2 ± 1.7)%, ranging from −3.9% to 2.9%. The MU accuracy of the third-party QA software is clinically acceptable. Similar results were achieved with the independent calculations and the phantom measurements for all test plans. The tested independent calculation software can be used as an efficient tool for TPS plan verification.
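
    The comparison statistics of this kind are simple mean ± 2SD intervals over per-plan percentage differences; a sketch with illustrative numbers:

        import statistics

        def confidence_limits(percent_diffs):
            """CL = mean ± 2 SD of TPS-vs-independent-calculation dose
            differences, as used to set action levels for a secondary check."""
            mean = statistics.mean(percent_diffs)
            sd = statistics.stdev(percent_diffs)
            return mean - 2 * sd, mean + 2 * sd

        diffs = [0.1, -0.8, 2.8, 0.6, 1.2, -0.3]  # illustrative MU differences (%)
        print(confidence_limits(diffs))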

  11. Usefulness of the Monte Carlo method in reliability calculations

    International Nuclear Information System (INIS)

    Lanore, J.M.; Kalli, H.

    1977-01-01

    Three examples of reliability Monte Carlo programs developed in the LEP (Laboratory for Radiation Shielding Studies in the Nuclear Research Center at Saclay) are presented. First, an uncertainty analysis is given for a simplified spray system; the Monte Carlo program PATREC-MC was written to solve the problem with the system components given in fault tree representation. The second program, MONARC 2, was written to solve complex-system reliability problems by Monte Carlo simulation; here again the system (a residual heat removal system) is given in fault tree representation. Third, the Monte Carlo program MONARC was used instead of a Markov diagram to simulate an electric power supply comprising two nets and two stand-by diesels

  12. Fault-specific verification (FSV) - An alternative VV ampersand T strategy for high reliability nuclear software systems

    International Nuclear Information System (INIS)

    Miller, L.A.

    1994-01-01

    The author argues that digital instrumentation and control systems can be safely applied in the nuclear industry, but that this will require changes to the way software for such systems is developed and tested. He argues for a fault-specific verification procedure to be applied to software development. This plan includes enumerating and classifying all software faults at all levels of the product, over the whole development process. While collecting these data, different methods for software verification, validation, and testing would be developed and validated, and applied against all the detected faults. All of this development would be driven toward an automated product for performing this testing. Such testing methods would then continue to be developed, expanded, tested, and shared across a wide array of software products

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: NEW CONDENSATOR, INC.--THE CONDENSATOR DIESEL ENGINE RETROFIT CRANKCASE VENTILATION SYSTEM

    Science.gov (United States)

    EPA's Environmental Technology Verification Program has tested New Condensator Inc.'s Condensator Diesel Engine Retrofit Crankcase Ventilation System. Brake specific fuel consumption (BSFC), the ratio of engine fuel consumption to the engine power output, was evaluated for engine...

  14. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.; Ryabov, Yu.

    2003-01-01

    Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators, who are not aware of system functionality details. It is therefore necessary to help the operator to control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed in the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. The test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about components hierarchy and dependencies, and allowing the operator to verify the fun...

  15. Portal verification using the KODAK ACR 2000 RT storage phosphor plate system and EC films. A semiquantitative comparison.

    Science.gov (United States)

    Geyer, Peter; Blank, Hilbert; Alheit, Horst

    2006-03-01

    The suitability of the storage phosphor plate system ACR 2000 RT (Eastman Kodak Corp., Rochester, MN, USA), which is intended for portal verification as well as portal simulation imaging in radiotherapy, had to be proven by comparison with a highly sensitive verification film. The comparison included portal verification images of different regions (head and neck, thorax, abdomen, and pelvis) irradiated with 6- and 15-MV photons and with electrons. Each portal verification image was acquired on both the storage screen and the EC film, using EC-L cassettes (both: Eastman Kodak Corp., Rochester, MN, USA) for the two systems. The soft-tissue contrast, bony contrast, and brightness were evaluated and compared in a ranking of the two images. Different phantoms were irradiated to investigate the high- and low-contrast resolution. With quality assurance applications in mind, brief exposure of the unpacked, irradiated storage screen to green and red room lasers was also investigated. In general, the quality of the processed ACR images was slightly higher than that of the films, mostly owing to cases of insufficient film exposure. The storage screen was able to verify electron portals even at low electron energies with only minor photon contamination. The laser lines were sharply and clearly visible on the ACR images. The ACR system may replace film without any noticeable decrease in image quality, thereby reducing processing time, saving film costs, and avoiding incorrect exposures.

  16. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity-modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the CORVUS 6.0 IMRT planning system and the MIMiC device (multilamellar intensity-modulated collimator), and the overall process of verifying the created plan. The aim of verification is, in particular, good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  17. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Singh, G.P.; Cadena, D.; Burgess, J.

    1992-01-01

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  18. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
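
    The Weibull treatment of ceramic strength mentioned above can be illustrated with the two-parameter failure-probability formula, including the volume scaling used when transferring coupon data to a full-scale part; the numbers are illustrative:

        import math

        def weibull_failure_probability(stress, sigma0, m, volume_ratio=1.0):
            """Two-parameter Weibull: P_f = 1 - exp(-V/V0 * (sigma/sigma0)^m).
            sigma0 is the characteristic strength, m the Weibull modulus; the
            volume_ratio term carries the size effect from coupon to part."""
            return 1.0 - math.exp(-volume_ratio * (stress / sigma0) ** m)

        # A stress at 60% of characteristic strength, modulus 10, equal volumes:
        print(weibull_failure_probability(60.0, 100.0, 10))  # ~0.006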

  19. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    International Nuclear Information System (INIS)

    Locke, C.; Zavgorodni, S.; British Columbia Cancer Agency, Vancouver Island Center, Victoria BC

    2008-01-01

    Monte Carlo (MC) methods provide the most accurate dose calculations to date in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into the treatment planning quality assurance process. This involves MC dose calculations for clinically produced treatment plans. To perform these calculations, a number of treatment plan parameters specifying radiation beam

  20. Verification of the Monte Carlo differential operator technique for MCNP trademark

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1996-02-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first and second order terms of the Taylor series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Perturbation and sensitivity analyses can benefit from this technique in that predicted changes in one or more tally responses may be obtained for multiple perturbations in a single run. The user interface is intuitive, yet flexible enough to allow for changes in a specific microscopic cross section over a specified energy range. With this technique, a precise estimate of a small change in response is easily obtained, even when the standard deviation of the unperturbed tally is greater than the change. Furthermore, results presented in this report demonstrate that first and second order terms can offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response
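
    The technique estimates response changes from first- and second-order Taylor terms; a sketch of the bookkeeping, with the derivative coefficients standing in for the differential-operator tally scores that the code would accumulate:

        def perturbed_response(r0, dr_dp, d2r_dp2, dp):
            """Second-order Taylor estimate of a tally response under a data
            perturbation dp: R(p+dp) ~ R + R'*dp + 0.5*R''*dp^2. The
            derivative terms are placeholders for the tally-scored values."""
            return r0 + dr_dp * dp + 0.5 * d2r_dp2 * dp ** 2

        # A 20% density change with illustrative first/second-order terms:
        print(perturbed_response(r0=1.000, dr_dp=0.35, d2r_dp2=-0.50, dp=0.20))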

  1. Runtime verification of embedded real-time systems.

    Science.gov (United States)

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability, thus facilitating applications of the framework in both the prototyping and the post-deployment phases of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and in the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
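
    For flavour, a software toy of such an observer: a sliding-window monitor for a bounded past-time property ("every alarm must be preceded by a trigger within the last N steps"); the hardware observer blocks of the paper are, of course, quite different in implementation:

        from collections import deque

        class BoundedOnceMonitor:
            """Checks a bounded Once-style property over a discrete trace."""
            def __init__(self, bound):
                self.recent = deque(maxlen=bound)  # sliding window of triggers

            def step(self, trigger, alarm):
                self.recent.append(trigger)
                # Violated if alarm holds but no trigger within the window.
                return (not alarm) or any(self.recent)

        mon = BoundedOnceMonitor(bound=3)
        trace = [(True, False), (False, False), (False, True),
                 (False, True), (False, True)]
        print([mon.step(t, a) for t, a in trace])
        # [True, True, True, False, False]: the last two alarms are too late.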

  2. Verification of criticality safety in on-site spent fuel storage systems

    International Nuclear Information System (INIS)

    Rasmussen, R.W.

    1989-01-01

    On February 15, 1984, Duke Power Company received approval for a two-region, burnup credit, spent fuel storage rack design at both Units 1 and 2 of the McGuire Nuclear Station. Duke also hopes to obtain approval by January of 1990 for a dry spent fuel storage system at the Oconee Nuclear Station, which will incorporate the use of burnup credit in the criticality analysis governing the design of the individual storage units. While experiences in burnup verification for criticality safety for their dry storage system at Oconee are in the future, the methods proposed for burnup verification will be similar to those currently used at the McGuire Nuclear Station in the two-region storage racks installed in both pools. In conclusion, the primary benefit of the McGuire rerack effort has obviously been the amount of storage expansion it provided. A total increase of about 2,000 storage cells was realized, 1,000 of which were the result of pursuing the two-region rather than the conventional poison rack design. Less impacting, but equally as important, however, has been the experience gained during the planning, installation, and operation of these storage racks. This experience should prove useful for future rerack efforts likely to occur at Duke's Catawba Nuclear Station as well as for the current dry storage effort underway for the Oconee Nuclear Station

  3. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting, whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  4. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage, verification may lead to consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  5. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  6. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  7. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications that have requirements for safety. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice among the existing open-source model verification engines, MODUS performs model verification, producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  8. SU-E-J-60: Efficient Monte Carlo Dose Calculation On CPU-GPU Heterogeneous Systems

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, K; Chen, D. Z; Hu, X. S [University of Notre Dame, Notre Dame, IN (United States); Zhou, B [Altera Corp., San Jose, CA (United States)

    2014-06-01

    Purpose: It is well-known that the performance of GPU-based Monte Carlo dose calculation implementations is bounded by memory bandwidth. One major cause of this bottleneck is the random memory writing pattern of dose deposition, which leads to several memory efficiency issues on GPUs such as un-coalesced writes and atomic operations. We propose a new method to alleviate these issues on CPU-GPU heterogeneous systems, which achieves an overall performance improvement for Monte Carlo dose calculation. Methods: Dose deposition accumulates dose into the voxels of a dose volume along the trajectories of radiation rays. Our idea is to partition this procedure into the following three steps, which are fine-tuned for CPU or GPU: (1) each GPU thread writes dose results with location information to a buffer in GPU memory, which achieves fully-coalesced and atomic-free memory transactions; (2) the dose results in the buffer are transferred to CPU memory; (3) the dose volume is constructed from the dose buffer on the CPU. We organize the processing of all radiation rays into streams. Since the steps within a stream use different hardware resources (i.e., GPU, DMA, CPU), we can overlap the execution of these steps for different streams by pipelining. Results: We evaluated our method using a Monte Carlo Convolution Superposition (MCCS) program and tested our implementation on various clinical cases on a heterogeneous system containing an Intel i7 quad-core CPU and an NVIDIA TITAN GPU. Compared with a straightforward MCCS implementation on the same system (using both CPU and GPU for radiation ray tracing), our method gained a 2-5X speedup without losing dose calculation accuracy. Conclusion: The results show that our new method improves the effective memory bandwidth and overall performance of MCCS on CPU-GPU systems. Our proposed method can also be applied to accelerate other Monte Carlo dose calculation approaches. This research was supported in part by NSF under Grants CCF
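
    The following numpy sketch illustrates steps (1) and (3) of the proposed partition from the host side; the buffer contents and sizes are illustrative, and the GPU kernel and DMA transfer of step (2) are assumed away:

        import numpy as np

        def deposit_buffered(n_voxels, buf_idx, buf_dose):
            """Build the dose volume from a flat buffer of (voxel index,
            dose) records. On the GPU side each thread only appends to the
            buffer (coalesced, atomic-free writes); the scatter-add below
            stands in for the CPU-side accumulation pass."""
            volume = np.zeros(n_voxels)
            np.add.at(volume, buf_idx, buf_dose)  # duplicate indices accumulate
            return volume

        # Records from two illustrative rays crossing a 10-voxel volume:
        buf_idx = np.array([0, 1, 2, 2, 3, 4])
        buf_dose = np.array([0.5, 0.4, 0.3, 0.6, 0.2, 0.1])
        print(deposit_buffered(10, buf_idx, buf_dose))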

  9. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  10. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  11. The Cherenkov Telescope Array production system for Monte Carlo simulations and analysis

    Science.gov (United States)

    Arrabito, L.; Bernloehr, K.; Bregeon, J.; Cumani, P.; Hassan, T.; Haupt, A.; Maier, G.; Moralejo, A.; Neyroud, N.; for the CTA Consortium and the DIRAC Consortium

    2017-10-01

    The Cherenkov Telescope Array (CTA), an array of many tens of Imaging Atmospheric Cherenkov Telescopes deployed on an unprecedented scale, is the next-generation instrument in the field of very high energy gamma-ray astronomy. An average data stream of about 0.9 GB/s for about 1300 hours of observation per year is expected, resulting in 4 PB of raw data per year and a total of 27 PB/year including archive and data processing. The start of CTA operation is foreseen in 2018 and it will last about 30 years. The installation of the first telescopes at the two selected locations (Paranal, Chile and La Palma, Spain) will start in 2017. In order to select the best candidate sites to host CTA telescopes (in the Northern and in the Southern hemispheres), massive Monte Carlo simulations have been performed since 2012. Once the two sites had been selected, we started new Monte Carlo simulations to determine the optimal array layout with respect to the obtained sensitivity. Taking into account that CTA may finally be composed of 7 different telescope types coming in 3 different sizes, many different combinations of telescope position and multiplicity as a function of the telescope type have been proposed. This last Monte Carlo campaign represented a huge computational effort, since several hundred telescope positions have been simulated, while for future instrument response function simulations, only the operating telescopes will be considered. In particular, during the last 18 months, about 2 PB of Monte Carlo data have been produced and processed with different analysis chains, with a corresponding overall CPU consumption of about 125 M HS06 hours. In these proceedings, we describe the employed computing model, based on the use of grid resources, as well as the production system setup, which relies on the DIRAC interware. Finally, we present the envisaged evolutions of the CTA production system for the off-line data processing during CTA operations and

  12. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics appears as one of the solutions capable of solving problems that occur with password-based data access; for example, passwords can be forgotten, and it is hard to recall many different ones. With biometrics, physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics is chosen for its low-cost implementation and the quite accurate results it generates for user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce dimensionality as well as encrypt the facial test image by representing it as a sparse signal. The encrypted data can be reconstructed using a Sparse Coding algorithm. Two types of Sparse Coding, namely Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signals are then compared, via the Euclidean norm, with the sparse signal of the user previously saved in the system to determine the validity of the facial test image. The system accuracies obtained in this research are 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using the non-optimized sensing matrix, versus 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using the optimized sensing matrix.
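
    As a sketch of the Sparse Coding step, the following Python implements textbook Orthogonal Matching Pursuit together with the Euclidean-norm comparison used for the verification decision; the dictionary, codes and threshold here are synthetic stand-ins, not the authors' face data:

        import numpy as np

        def omp(D, y, k):
            """Recover a k-sparse code x with y ~ D @ x by greedily picking
            the dictionary column most correlated with the residual."""
            residual, support = y.copy(), []
            for _ in range(k):
                support.append(int(np.argmax(np.abs(D.T @ residual))))
                coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coef
            x = np.zeros(D.shape[1])
            x[support] = coef
            return x

        rng = np.random.default_rng(0)
        D = rng.normal(size=(64, 128))
        D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
        enrolled = np.zeros(128)
        enrolled[[5, 40, 99]] = [1.0, -0.7, 0.4]   # stored user code
        probe = omp(D, D @ enrolled, k=3)          # reconstruct the test image
        # Verification decision: small distance to the enrolled code = accept
        print(np.linalg.norm(probe - enrolled) < 1e-6)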

  13. Validation of variance reduction techniques in Mediso (SPIRIT DH-V) SPECT system by Monte Carlo

    International Nuclear Information System (INIS)

    Rodriguez Marrero, J. P.; Diaz Garcia, A.; Gomez Facenda, A.

    2015-01-01

    Monte Carlo simulation of nuclear medical imaging systems is a widely used method for reproducing their operation in a real clinical environment. There are several Single Photon Emission Tomography (SPECT) systems in Cuba. For this reason it is clearly necessary to introduce a reliable and fast simulation platform in order to obtain consistent image data that reproduce the original measurement conditions. In order to fulfill these requirements, the Monte Carlo platform GAMOS (Geant4 Medicine Oriented Architecture for Applications) has been used. Due to the size and complex configuration of parallel-hole collimators in real clinical SPECT systems, Monte Carlo simulation usually consumes excessively high time and computing resources. The main goal of the present work is to optimize the efficiency of calculation by means of new GAMOS functionality. Two GAMOS variance reduction techniques were developed and validated to speed up calculations. These procedures focus and limit transport of gamma quanta inside the collimator. The obtained results were assessed experimentally in the Mediso (SPIRIT DH-V) SPECT system. Main quality control parameters, such as sensitivity and spatial resolution, were determined. Differences of 4.6% in sensitivity and 8.7% in spatial resolution were reported against manufacturer values. Simulation time was decreased up to 650 times. Using these techniques it was possible to perform several studies in almost 8 hours each. (Author)
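
    The abstract does not spell out the two GAMOS techniques, but the general idea of focusing and limiting photon transport inside a parallel-hole collimator can be sketched as an angular cut with Russian roulette to keep the tally unbiased (a generic illustration, not the authors' implementation):

        import numpy as np

        rng = np.random.default_rng(1)

        def limit_to_collimator(weight, cos_theta, cos_accept, p_survive=0.1):
            """Photons too oblique to traverse a collimator hole are
            rouletted: most are killed, and the rare survivor's weight is
            boosted by 1/p_survive so the expected weight is unchanged."""
            if cos_theta >= cos_accept:
                return weight              # within acceptance: track normally
            if rng.random() < p_survive:
                return weight / p_survive  # survivor carries extra weight
            return 0.0                     # history terminated

        print(limit_to_collimator(1.0, cos_theta=0.95, cos_accept=0.99))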

  14. The Physical Models and Statistical Procedures Used in the RACER Monte Carlo Code

    International Nuclear Information System (INIS)

    Sutton, T.M.; Brown, F.B.; Bischoff, F.G.; MacMillan, D.B.; Ellis, C.L.; Ward, J.T.; Ballinger, C.T.; Kelly, D.J.; Schindler, L.

    1999-01-01

    This report describes the MCV (Monte Carlo - Vectorized) Monte Carlo neutron transport code [Brown, 1982, 1983; Brown and Mendelson, 1984a]. MCV is a module in the RACER system of codes that is used for Monte Carlo reactor physics analysis. The MCV module contains all of the neutron transport and statistical analysis functions of the system, while other modules perform various input-related functions such as geometry description, material assignment, output edit specification, etc. MCV is very closely related to the 05R neutron Monte Carlo code [Irving et al., 1965] developed at Oak Ridge National Laboratory. 05R evolved into the 05RR module of the STEMB system, which was the forerunner of the RACER system. Much of the overall logic and physics treatment of 05RR has been retained and, indeed, the original verification of MCV was achieved through comparison with STEMB results. MCV has been designed to be very computationally efficient [Brown, 1981; Brown and Martin, 1984b; Brown, 1986]. It was originally programmed to make use of vector-computing architectures such as those of the CDC Cyber-205 and Cray X-MP. MCV was the first full-scale production Monte Carlo code to effectively utilize vector-processing capabilities. Subsequently, MCV was modified to utilize both distributed-memory [Sutton and Brown, 1994] and shared-memory parallelism. The code has been compiled and run on platforms ranging from 32-bit UNIX workstations to clusters of 64-bit vector-parallel supercomputers. The computational efficiency of the code allows the analyst to perform calculations using many more neutron histories than is practical with most other Monte Carlo codes, thereby yielding results with smaller statistical uncertainties. MCV also utilizes variance reduction techniques such as survival biasing, splitting, and rouletting to permit additional reduction in uncertainties. While a general-purpose neutron Monte Carlo code, MCV is optimized for reactor physics calculations. It has the
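
    For readers unfamiliar with the variance-reduction games mentioned here, the following Python fragment sketches survival biasing (implicit capture) with Russian roulette on low-weight histories; the cross sections and thresholds are illustrative, not MCV's defaults:

        import numpy as np

        rng = np.random.default_rng(2)

        def survival_bias_step(weight, sigma_a, sigma_t,
                               w_min=0.05, p_survive=0.5):
            """Instead of killing the neutron at an absorption, multiply
            its weight by the non-absorption probability; roulette
            histories whose weight falls below w_min."""
            weight *= 1.0 - sigma_a / sigma_t     # survival biasing
            if 0.0 < weight < w_min:              # roulette the stragglers
                weight = weight / p_survive if rng.random() < p_survive else 0.0
            return weight

        w = 1.0
        for collision in range(20):
            w = survival_bias_step(w, sigma_a=0.4, sigma_t=1.0)
            if w == 0.0:
                break
        print(collision, w)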

  15. Monte Carlo simulations of quantum systems on massively parallel supercomputers

    International Nuclear Information System (INIS)

    Ding, H.Q.

    1993-01-01

    A large class of quantum physics applications uses operator representations that are discrete integers by nature. This class includes magnetic properties of solids, interacting bosons modeling superfluids and Cooper pairs in superconductors, and Hubbard models for strongly correlated electron systems. This kind of application typically uses integer data representations and the resulting algorithms are dominated entirely by integer operations. The authors implemented an efficient algorithm for one such application on the Intel Touchstone Delta and iPSC/860. The algorithm uses a multispin coding technique which allows significant data compactification and efficient vectorization of Monte Carlo updates. The algorithm regularly switches between two data decompositions, corresponding naturally to different Monte Carlo updating processes and observable measurements such that only nearest-neighbor communications are needed within a given decomposition. On 128 nodes of the Intel Delta, this algorithm updates 183 million spins per second (compared to 21 million on the CM-2 and 6.2 million on a Cray Y-MP). A systematic performance analysis shows a better than 90% efficiency in the parallel implementation
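
    A minimal numpy illustration of the multispin-coding idea (the data compactification only, not the Delta/iPSC update algorithm): one 64-bit word holds the same lattice site of 64 independent replicas, and a single bitwise XOR compares all 64 pairs of neighboring spins at once:

        import numpy as np

        L = 8
        rng = np.random.default_rng(3)
        # Pack 64 replicas: bit b of words[i, j] is the spin at site (i, j)
        # of replica b (0 = down, 1 = up).
        bits = rng.integers(0, 2, size=(L, L, 64), dtype=np.uint64)
        words = np.zeros((L, L), dtype=np.uint64)
        for b in range(64):
            words |= bits[:, :, b] << np.uint64(b)

        # One XOR flags antiparallel horizontal neighbors in all replicas:
        diff = words ^ np.roll(words, -1, axis=1)
        broken = [sum((int(w) >> b) & 1 for w in diff.ravel())
                  for b in range(64)]
        print(broken[:4])   # antiparallel horizontal bonds per replica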

  16. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Y., E-mail: yican.wu@fds.org.cn [Inst. of Nuclear Energy Safety Technology, Hefei, Anhui (China)

    2015-07-01

    'Full text:' Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear systems, making use of hybrid MC-deterministic methods and advanced computer technologies. The main usability features are automatic modeling of geometry and physics, visualization and virtual simulation, and cloud computing services. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculations. SuperMC has been verified by more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of the International Thermonuclear Experimental Reactor (ITER) and the China Lead-based Reactor (CLEAR). The development and applications of SuperMC are introduced in this presentation. (author)

  17. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    International Nuclear Information System (INIS)

    Wu, Y.

    2015-01-01

    'Full text:' Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear systems, making use of hybrid MC-deterministic methods and advanced computer technologies. The main usability features are automatic modeling of geometry and physics, visualization and virtual simulation, and cloud computing services. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculations. SuperMC has been verified by more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of the International Thermonuclear Experimental Reactor (ITER) and the China Lead-based Reactor (CLEAR). The development and applications of SuperMC are introduced in this presentation. (author)

  18. Monte Carlo theory and practice

    International Nuclear Information System (INIS)

    James, F.

    1987-01-01

    Historically, the first large-scale calculations to make use of the Monte Carlo method were studies of neutron scattering and absorption, random processes for which it is quite natural to employ random numbers. Such calculations, a subset of Monte Carlo calculations, are known as direct simulation, since the 'hypothetical population' of the narrower definition above corresponds directly to the real population being studied. The Monte Carlo method may be applied wherever it is possible to establish equivalence between the desired result and the expected behaviour of a stochastic system. The problem to be solved may already be of a probabilistic or statistical nature, in which case its Monte Carlo formulation will usually be a straightforward simulation, or it may be of a deterministic or analytic nature, in which case an appropriate Monte Carlo formulation may require some imagination and may appear contrived or artificial. In any case, the suitability of the method chosen will depend on its mathematical properties and not on its superficial resemblance to the problem to be solved. The authors show how Monte Carlo techniques may be compared with other methods of solution of the same physical problem
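
    The 'direct simulation' notion above is easy to demonstrate: the hypothetical population of random histories corresponds one-to-one with the physical population being studied. A toy Python example (with illustrative numbers) estimates the probability that a neutron survives five collisions, each absorbing with probability 0.1, and compares it with the analytic answer:

        import random

        random.seed(4)
        p_abs, n_coll, trials = 0.1, 5, 100_000
        survived = sum(
            all(random.random() > p_abs for _ in range(n_coll))
            for _ in range(trials)
        )
        print(survived / trials, (1 - p_abs) ** n_coll)  # ~0.590 vs 0.59049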

  19. Verification and validation issues for digitally-based NPP safety systems

    International Nuclear Information System (INIS)

    Ets, A.R.

    1993-01-01

    The trend toward standardization, integration and reduced costs has led to increasing use of digital systems in reactor protection systems. While digital systems provide maintenance and performance advantages, their use also introduces new safety issues, in particular with regard to software. Current practice relies on verification and validation (V and V) to ensure the quality of safety software. However, effective V and V must be done in conjunction with a structured software development process and must consider the context of the safety system application. This paper presents some of the issues and concerns that impact the V and V process. These include documentation of system requirements, common mode failures, hazards analysis and independence. These issues and concerns arose during evaluations of NPP safety systems for advanced reactor designs and digital I and C retrofits for existing nuclear plants in the United States. The pragmatic lessons from actual system reviews can provide a basis for further refinement and development of guidelines for applying V and V to NPP safety systems. (author). 14 refs

  20. Application of verification and validation on safety parameter display systems

    International Nuclear Information System (INIS)

    Thomas, N.C.

    1983-01-01

    Offers some explanation of how verification and validation (V and V) can support development and licensing of the Safety Parameter Display Systems (SPDS). Advocates that V and V can be more readily accepted within the nuclear industry if a better understanding exists of what the objectives of V and V are and should be. Includes a discussion regarding a reasonable balance of costs and benefits of V and V as applied to the SPDS and to other digital systems. Represents the author's perception of the regulator's perspective based on background information and experience, and discussions with regulators about their current concerns and objectives. Suggests that the introduction of the SPDS into the Control Room is a first step towards growing dependency on use of computers

  1. Monte Carlo design and simulation of a grid-type multi-layer pixel collimator for radiotherapy: feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae-Suk [The Catholic University of Korea, Seoul (Korea, Republic of)

    2014-05-15

    In order to confirm the possibility of field application of a collimator of a different type than the multileaf collimator (MLC), we constructed a grid-type multi-layer pixel collimator (GTPC) by using a Monte Carlo n-particle simulation (MCNPX). In this research, a number of factors related to the performance of the GTPC were evaluated using simulated output data of a basic MLC model. A layer was composed of a 1024-pixel collimator (5.0 x 5.0 mm{sup 2}) which could operate individually as a grid-type collimator (32 x 32). A 30-layer collimator was constructed for a specific portal form to pass radiation through the opening and closing of each pixel cover. The radiation attenuation level and the leakage were compared between the GTPC modality simulation and the MLC model (tungsten, 17.50 g/cm{sup 3}, 5.0 x 70.0 x 160.0 mm{sup 3}) currently used for a radiation field. Comparisons of the portal imaging, the lateral dose profile from a virtual water phantom, the dependence of the performance on the increase in the number of layers, the radiation intensity modulation verification, and the geometric error between the GTPC and the MLC were done using the MCNPX simulation data. From the simulation data, the intensity modulation of the GTPC showed a faster response than the MLC's (29.6%). In addition, the agreement between the doses that should be delivered to the target region was measured as 97.0%, and the GTPC system had an error below 0.01%, which is identical to that of the MLC. A Monte Carlo simulation of the GTPC could be useful for verification of application possibilities. Because the line artifact is caused by the grid frame and the folded cover, a lineal dose transfer type is chosen for the operation of this system. However, the results of the GTPC's performance showed that the methods of effective intensity modulation and the specific geometric beam shaping differed from those of the MLC modality.

  2. Monte Carlo design and simulation of a grid-type multi-layer pixel collimator for radiotherapy: Feasibility study

    Science.gov (United States)

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-05-01

    In order to confirm the possibility of field application of a collimator of a different type than the multileaf collimator (MLC), we constructed a grid-type multi-layer pixel collimator (GTPC) by using a Monte Carlo n-particle simulation (MCNPX). In this research, a number of factors related to the performance of the GTPC were evaluated using simulated output data of a basic MLC model. A layer was composed of a 1024-pixel collimator (5.0 × 5.0 mm2) which could operate individually as a grid-type collimator (32 × 32). A 30-layer collimator was constructed for a specific portal form to pass radiation through the opening and closing of each pixel cover. The radiation attenuation level and the leakage were compared between the GTPC modality simulation and the MLC model (tungsten, 17.50 g/cm3, 5.0 × 70.0 × 160.0 mm3) currently used for a radiation field. Comparisons of the portal imaging, the lateral dose profile from a virtual water phantom, the dependence of the performance on the increase in the number of layers, the radiation intensity modulation verification, and the geometric error between the GTPC and the MLC were done using the MCNPX simulation data. From the simulation data, the intensity modulation of the GTPC showed a faster response than the MLC's (29.6%). In addition, the agreement between the doses that should be delivered to the target region was measured as 97.0%, and the GTPC system had an error below 0.01%, which is identical to that of the MLC. A Monte Carlo simulation of the GTPC could be useful for verification of application possibilities. Because the line artifact is caused by the grid frame and the folded cover, a lineal dose transfer type is chosen for the operation of this system. However, the results of the GTPC's performance showed that the methods of effective intensity modulation and the specific geometric beam shaping differed from those of the MLC modality.

  3. Monte Carlo design and simulation of a grid-type multi-layer pixel collimator for radiotherapy: feasibility study

    International Nuclear Information System (INIS)

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae-Suk

    2014-01-01

    In order to confirm the possibility of field application of a collimator of a different type than the multileaf collimator (MLC), we constructed a grid-type multi-layer pixel collimator (GTPC) by using a Monte Carlo n-particle simulation (MCNPX). In this research, a number of factors related to the performance of the GTPC were evaluated using simulated output data of a basic MLC model. A layer was composed of a 1024-pixel collimator (5.0 x 5.0 mm2) which could operate individually as a grid-type collimator (32 x 32). A 30-layer collimator was constructed for a specific portal form to pass radiation through the opening and closing of each pixel cover. The radiation attenuation level and the leakage were compared between the GTPC modality simulation and the MLC model (tungsten, 17.50 g/cm3, 5.0 x 70.0 x 160.0 mm3) currently used for a radiation field. Comparisons of the portal imaging, the lateral dose profile from a virtual water phantom, the dependence of the performance on the increase in the number of layers, the radiation intensity modulation verification, and the geometric error between the GTPC and the MLC were done using the MCNPX simulation data. From the simulation data, the intensity modulation of the GTPC showed a faster response than the MLC's (29.6%). In addition, the agreement between the doses that should be delivered to the target region was measured as 97.0%, and the GTPC system had an error below 0.01%, which is identical to that of the MLC. A Monte Carlo simulation of the GTPC could be useful for verification of application possibilities. Because the line artifact is caused by the grid frame and the folded cover, a lineal dose transfer type is chosen for the operation of this system. However, the results of the GTPC's performance showed that the methods of effective intensity modulation and the specific geometric beam shaping differed from those of the MLC modality.

  4. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  5. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy.

    Science.gov (United States)

    Martinez-Rovira, I; Sempau, J; Prezado, Y

    2012-05-01

    Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeters in volume) from the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Good agreement between MC simulations and experimental results was achieved, even at the interfaces between two
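
    The grid decoupling mentioned above can be sketched as a simple index mapping: each micrometer-scale dose bin looks up its material and density in the coarse CT voxel that contains it. The bin and voxel sizes below are illustrative, not those of the ID17 TPS:

        import numpy as np

        def ct_voxel_of(bin_index, bin_size_mm, voxel_size_mm):
            """Map a fine dose-bin index (ix, iy, iz) to the index of the
            coarse CT voxel whose material and density the bin inherits."""
            centre_mm = (np.asarray(bin_index) + 0.5) * bin_size_mm
            return tuple(int(v) for v in centre_mm // voxel_size_mm)

        # 25 um bins across the microbeams inside 1 mm CT voxels:
        print(ct_voxel_of((80, 3, 3),
                          bin_size_mm=np.array([0.025, 1.0, 1.0]),
                          voxel_size_mm=np.array([1.0, 1.0, 1.0])))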

  6. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)

    2012-05-15

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-{mu}m-wide microbeams spaced by 200-400 {mu}m) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeters in volume) from the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at

  7. SU-G-JeP3-06: Lower KV Image Dose Are Expected From a Limited-Angle Intra-Fractional Verification (LIVE) System for SBRT Treatments

    Energy Technology Data Exchange (ETDEWEB)

    Ding, G [Vanderbilt University, Nashville, TN (United States); Yin, F; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2016-06-15

    Purpose: In order to track tumor movement for patient positioning verification during arc treatment delivery, or in between 3D/IMRT beams for stereotactic body radiation therapy (SBRT), the acquisition of limited-angle kV projections, either simultaneously during arc delivery or in between static treatment beams as the gantry moves to the next beam angle, was proposed. The purpose of this study is to estimate the additional imaging dose resulting from multiple tomosynthesis acquisitions in between static treatment beams and to compare it with that of a conventional kV-CBCT acquisition. Methods: The kV imaging system integrated into Varian TrueBeam accelerators was modeled using the EGSnrc Monte Carlo user code BEAMnrc, and the DOSXYZnrc code was used in dose calculations. The simulated realistic kV beams from the Varian TrueBeam OBI 1.5 system were used to calculate dose to the patient based on CT images. Organ doses were analyzed using DVHs. The imaging dose to the patient resulting from realistic multiple tomosynthesis acquisitions, each with a 25–30 degree kV source rotation, between 6 treatment beam gantry angles was studied. Results: For a typical lung SBRT treatment delivery, the summed kV imaging dose from six realistic tomosynthesis acquisitions, each with a 25–30 degree x-ray source rotation between the six treatment beam gantry angles, was much lower (20–50%) than that from a single CBCT image acquisition. Conclusion: This work indicates that the kV imaging in this proposed Limited-angle Intra-fractional Verification (LIVE) System for SBRT treatments adds a negligible imaging dose. It is worth noting that the MV imaging dose caused by MV projection acquisition in between static beams in LIVE can be minimized by restricting the imaging to the target region and reducing the number of projections acquired. For arc treatments, MV image acquisition in LIVE does not add additional imaging dose, as the MV images are acquired from the treatment beams directly during the

  8. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  9. Neutron point-flux calculation by Monte Carlo

    International Nuclear Information System (INIS)

    Eichhorn, M.

    1986-04-01

    A survey of the usual methods for estimating flux at a point is given. The associated variance-reducing techniques in direct Monte Carlo games are explained. The multigroup Monte Carlo codes MC for critical systems and PUNKT for point-source/point-detector systems are presented, and problems in applying the codes to practical tasks are discussed. (author)
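
    The standard point-detector (next-event) estimator that underlies such codes scores, at each collision, the probability per unit area of reaching the detector uncollided. A minimal Python version for isotropic scattering in a homogeneous medium follows (a textbook sketch, not the MC or PUNKT implementation):

        import math

        def point_detector_score(weight, sigma_t, r):
            """Expected uncollided flux contribution at a point detector a
            distance r (cm) from an isotropically scattering collision, in a
            medium with total macroscopic cross section sigma_t (1/cm)."""
            return weight * math.exp(-sigma_t * r) / (4.0 * math.pi * r * r)

        print(point_detector_score(weight=1.0, sigma_t=0.5, r=2.0))

    The 1/r^2 factor diverges for collisions near the detector, which is precisely why the variance-reducing techniques surveyed in the report are needed.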

  10. Simulation of quantum systems by the tomography Monte Carlo method

    International Nuclear Information System (INIS)

    Bogdanov, Yu I

    2007-01-01

    A new method of statistical simulation of quantum systems is presented which is based on the generation of data by the Monte Carlo method and their purposeful tomography with the energy minimisation. The numerical solution of the problem is based on the optimisation of the target functional providing a compromise between the maximisation of the statistical likelihood function and the energy minimisation. The method does not involve complicated and ill-posed multidimensional computational procedures and can be used to calculate the wave functions and energies of the ground and excited stationary states of complex quantum systems. The applications of the method are illustrated. (Fifth seminar in memory of D.N. Klyshko)
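
    The compromise functional described above can be illustrated with a toy one-qubit version; the Hamiltonian, measurement counts and weight lam below are made-up stand-ins, not the paper's setup:

        import numpy as np

        H = np.array([[0.0, -1.0], [-1.0, 0.0]])   # ground state at t = pi/4
        n0, n1 = 55, 45                            # illustrative 0/1 counts
        lam = 0.05                                 # likelihood weight

        def target(t):
            """Energy minus weighted log-likelihood for the real state
            psi(t) = (cos t, sin t), measured in the computational basis."""
            psi = np.array([np.cos(t), np.sin(t)])
            p = psi ** 2
            loglik = n0 * np.log(p[0]) + n1 * np.log(p[1])
            return psi @ H @ psi - lam * loglik

        ts = np.linspace(0.05, np.pi / 2 - 0.05, 1000)
        # Optimum lies between the maximum-likelihood angle and pi/4:
        print(ts[np.argmin([target(t) for t in ts])])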

  11. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
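
    The detection power of the statistical-sampling component can be quantified with a standard attribute-test calculation (a generic hypergeometric sketch; the inventory and sample sizes below are invented, not taken from the report):

        from math import comb

        def p_detect(N, d, n):
            """Probability that NDA measurement of a random sample of n
            items, out of an inventory of N containing d diverted or
            substituted items, flags at least one of them."""
            return 1.0 - comb(N - d, n) / comb(N, n)

        # e.g. 300 in-core fuel elements, 10 substituted, 30 sampled:
        print(f"{p_detect(300, 10, 30):.3f}")   # ~0.66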

  12. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations held by IAEA officials and outside experts also examined other components of the non-proliferation regime, the current practices and procedures, and the future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation

  13. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations held by IAEA officials and outside experts also examined other components of the non-proliferation regime, the current practices and procedures, and the future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation. Refs, figs, tabs

  14. Verification of failover effects from distributed control system communication networks in digitalized nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Min, Moon Gi; Lee, Jae Ki; Lee, Kwang Hyun; Lee, Dong Il; Lim, Hee Taek [Korea Hydro and Nuclear Power Co., Ltd, Daejeon (Korea, Republic of)

    2017-08-15

    Distributed Control System (DCS) communication networks, which use Fast Ethernet with redundant networks for the transmission of information, have been installed in digitalized nuclear power plants. Normally, failover tests are performed to verify the reliability of redundant networks during the design and manufacturing phases; however, systematic integrity tests of DCS networks cannot be fully performed during these phases because not all relevant equipment is completely installed. In addition, practical verification tests are insufficient, and there is a need to test the actual failover function of DCS redundant networks in the target environment. The purpose of this study is to verify that the failover function works correctly in certain abnormal conditions during the installation and commissioning phase and to identify the influence of network failover on the entire DCS. To quantify the effects of network failover in the DCS, the packets (Protocol Data Units) must be collected, and the resource usage of the system has to be monitored and analyzed. This study introduces a new methodology for verification of DCS network failover during the installation and commissioning phases. This study is expected to provide insight into the verification methodology and the failover effects of DCS redundant networks. It also provides test results of network performance from DCS network failover in digitalized domestic nuclear power plants (NPPs)

  15. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    DEFF Research Database (Denmark)

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand

    2014-01-01

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed-arc Petri nets, we explore new data structures for lowering the used memory: PTries for efficient storing of configurations and time darts for semi-symbolic description of the state-space. Both methods are implemented as a part of the tool TAPAAL and the experiments document at least one order of magnitude of memory savings while preserving comparable verification times.
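
    The memory saving from prefix sharing can be illustrated with a toy trie over explored configurations (a conceptual sketch only, not TAPAAL's PTrie or time-dart structures):

        class ConfigTrie:
            """Store explored configurations (tuples of ints) with shared
            prefixes; insert() doubles as the visited-set membership test."""
            _END = object()

            def __init__(self):
                self.root = {}

            def insert(self, config):
                """Insert config; return True if it was already present."""
                node = self.root
                for value in config:
                    node = node.setdefault(value, {})
                present = self._END in node
                node[self._END] = True
                return present

        trie = ConfigTrie()
        print(trie.insert((1, 2, 3)))  # False: new configuration
        print(trie.insert((1, 2, 3)))  # True: duplicate detected
        print(trie.insert((1, 2, 4)))  # False: shares the (1, 2) prefix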

  16. Penelope-2006: a code system for Monte Carlo simulation of electron and photon transport

    International Nuclear Information System (INIS)

    2006-01-01

    The computer code system PENELOPE (version 2006) performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials for a wide energy range, from a few hundred eV to about 1 GeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A geometry package called PENGEOM permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the PENELOPE code system, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. These proceedings contain the corresponding manual and teaching notes of the PENELOPE-2006 workshop and training course, held on 4-7 July 2006 in Barcelona, Spain. (author)

  17. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  18. TU-FG-BRB-05: A 3 Dimensional Prompt Gamma Imaging System for Range Verification in Proton Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Draeger, E; Chen, H; Polf, J [University of Maryland School of Medicine, Baltimore, MD (United States); Mackin, D; Beddar, S [MD Anderson Cancer Center, Houston, TX (United States); Avery, S [University of Cape Town, Rondebosch (South Africa); Peterson, S

    2016-06-15

    Purpose: To report on the initial development of a clinical 3-dimensional (3D) prompt gamma (PG) imaging system for proton radiotherapy range verification. Methods: The new imaging system under development consists of a prototype Compton camera to measure PG emission during proton beam irradiation and software to reconstruct, display, and analyze 3D images of the PG emission. For an initial test of the system, PGs were measured with a prototype CC during a 200 cGy dose delivery with clinical proton pencil beams (ranging from 100 MeV – 200 MeV) to a water phantom. Measurements were also carried out with the CC placed 15 cm from the phantom for a full range 150 MeV pencil beam and with its range shifted by 2 mm. Reconstructed images of the PG emission were displayed by the clinical PG imaging software and compared to the dose distributions of the proton beams calculated by a commercial treatment planning system. Results: Measurements made with the new PG imaging system showed that a 3D image could be reconstructed from PGs measured during the delivery of 200 cGy of dose, and that shifts in the Bragg peak range of as little as 2 mm could be detected. Conclusion: Initial tests of a new PG imaging system show its potential to provide 3D imaging and range verification for proton radiotherapy. Based on these results, we have begun work to improve the system with the goal that images can be produced from delivery of as little as 20 cGy so that the system could be used for in-vivo proton beam range verification on a daily basis.
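
    Compton-camera imaging of this kind reconstructs each event as a cone whose opening angle follows from the Compton kinematics; a hedged Python sketch for a two-interaction event under the full-absorption assumption (the energies used are illustrative):

        import math

        ME_C2 = 0.511  # electron rest energy in MeV

        def compton_cone_angle(e_scatter, e_absorb):
            """Opening angle (degrees) of the Compton cone for an event
            depositing e_scatter in the first interaction and e_absorb in
            the second, assuming the photon is fully absorbed."""
            e0 = e_scatter + e_absorb
            cos_theta = 1.0 - ME_C2 * (1.0 / e_absorb - 1.0 / e0)
            return math.degrees(math.acos(cos_theta))

        # A 4.44 MeV prompt gamma depositing 1.0 MeV in the scatterer:
        print(f"{compton_cone_angle(1.0, 3.44):.1f} deg")  # ~14.9 deg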

  19. Determinantal and worldline quantum Monte Carlo methods for many-body systems

    International Nuclear Information System (INIS)

    Vekic, M.; White, S.R.

    1993-01-01

    We examine three different quantum Monte Carlo methods for studying systems of interacting particles. The determinantal quantum Monte Carlo method is compared to two different worldline simulations. The first worldline method consists of a simulation carried out in the real-space basis, while the second method is implemented using as basis the eigenstates of the Hamiltonian on blocks of the two-dimensional lattice. We look, in particular, at the Hubbard model on a 4x4 lattice with periodic boundary conditions. The block method is superior to the real-space method in terms of the computational cost of the simulation, but shows a much worse negative sign problem. For larger values of U and away from half-filling it is found that the real-space method can provide results at lower temperatures than the determinantal method. We show that the sign problem in the block method can be slightly improved by an appropriate choice of basis

  20. Compositional Verification of Interlocking Systems for Large Stations

    DEFF Research Database (Denmark)

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Macedo, Hugo Daniel dos Santos

    2017-01-01

    Formal verification of safety properties is infeasible for networks of large size due to the exponential computation time and resources needed. Some recent attempts to address this challenge adopt a compositional approach, targeted to track layouts that are easily decomposable into sub-networks such that a route is almost fully contained in a sub-network: in this way, granting the access to a route is essentially a decision local to the sub-network, and the interfaces with the rest of the network easily abstract away less interesting details related to the external world. Following up on previous work, where we defined a compositional verification method for networks decomposable into sub-networks that are independent to some degree, we study how the division of a complex network into sub-networks, using stub elements to abstract all the routes that are common between sub-networks, may still guarantee compositionality of verification of safety properties.

  1. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov

    2012-05-01

    Full Text Available This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms, the module used to verify the inference engine. This module is designed to build the specification of requirements whose feasibility on the algorithm must be proved (tested).

  2. System design and verification process for LHC programmable trigger electronics

    CERN Document Server

    Crosetto, D

    1999-01-01

    The rapid evolution of electronics has made it essential to design systems in a technology-independent form that will permit their realization in any future technology. This article describes two practical projects that have been developed for fast, programmable, scalable, modular electronics for the first-level trigger of Large Hadron Collider (LHC) experiments at CERN, Geneva. In both projects (one for the front-end electronics, the other for executing first-level trigger algorithms), the whole-system requirements were constrained to two types of replicated components. The overall problem is described, the 3D-Flow design is introduced as a novel solution, and current solutions to the problem are described and compared with the 3D-Flow solution. The design/verification methodology proposed allows the user's real-time system algorithm to be verified down to the gate-level simulation on a technology-independent platform, thus yielding the design for a system that can be implemented with any technology at ...

  3. Methods and practices for verification and validation of programmable systems

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Haapanen, P.; Pulkkinen, U.

    1993-01-01

    Programmable systems deviate in their properties and behaviour from conventional non-programmable systems to such an extent that their verification and validation for safety-critical applications requires new methods and practices. The safety assessment cannot be based on conventional probabilistic methods, owing to the difficulties in quantifying the reliability of the software and hardware; the reliability estimate of the system must instead be based on qualitative arguments linked to a conservative claim limit. Because of the uncertainty of the quantitative reliability estimate, other means must be used to gain more assurance about the safety of the system. Methods and practices based on research done by VTT for STUK are discussed in the paper, as well as the methods applicable in the reliability analysis of software-based safety functions. The most essential concepts and models of quantitative reliability analysis are described, and the application of software models in probabilistic safety analysis (PSA) is evaluated. (author). 18 refs

  4. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  5. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
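
    A minimal sketch of forward-chaining monitoring over an expanding trace, assuming a naive fixed-point engine and an invented two-rule property; this is not the paper's FLTL translation or its implementation, but it shows how Horn rules in implication form can drive a single-state monitor without any branching structure.

    # Horn rules in implication form: frozenset(body) -> head.
    rules = [
        (frozenset({"disarm"}), "unarmed"),
        (frozenset({"alarm", "unarmed"}), "violation"),
    ]

    def forward_chain(facts, rules):
        # Fire rules until a fixed point is reached: no tableaux, no
        # branching, just one growing fact set per monitor.
        changed = True
        while changed:
            changed = False
            for body, head in rules:
                if body <= facts and head not in facts:
                    facts.add(head)
                    changed = True
        return facts

    facts = set()
    for event in ["start", "disarm", "alarm"]:  # finite but expanding trace
        facts.add(event)
        forward_chain(facts, rules)
        print(event, "->", "VIOLATION" if "violation" in facts else "ok")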

  6. Technology verification phase. Dynamic isotope power system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  7. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    Sorenson, R.J.; Kaye, J.H.

    1974-03-01

    A real-time system is proposed which would allow CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate, a specific data handling system and vehicle were required: a Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack-mount the data handling equipment offers a number of benefits, such as control of the equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable "cart" and use of a telephone link to a large computer center.

  8. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  9. Integrated verification and testing system (IVTS) for HAL/S programs

    Science.gov (United States)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  10. Monte Carlo simulation in nuclear medicine

    International Nuclear Information System (INIS)

    Morel, Ch.

    2007-01-01

    The Monte Carlo method allows random processes to be simulated by using series of pseudo-random numbers. It has become an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions into data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will make it possible to tackle imaging and dosimetry issues simultaneously, and Monte Carlo simulations may soon become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)
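
    One of the basic sampling methods alluded to above is inverse-transform sampling; a minimal sketch, assuming a photon free-path draw from the exponential attenuation law (the attenuation coefficient for water at 511 keV is an approximate, illustrative value, and this is not tied to any particular nuclear-medicine code):

    import numpy as np

    rng = np.random.default_rng(42)
    mu = 0.096  # approx. linear attenuation coefficient of water at 511 keV, 1/cm

    u = rng.random(1_000_000)        # pseudo-random numbers in [0, 1)
    paths = -np.log(1.0 - u) / mu    # inverse CDF of p(l) = mu * exp(-mu * l)

    print("sampled mean free path :", paths.mean(), "cm")
    print("theoretical value 1/mu :", 1.0 / mu, "cm")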

  11. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses. Addendum

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguards system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  12. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  13. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  14. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties into the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource-loaded PMCS task-based activities, ensuring all requirements will be verified.
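
    A rough sketch of the verification-planning data model described in this abstract, written as plain Python dataclasses instead of SysML/Enterprise Architect; the field and enum names follow the paper's terminology, while the classes themselves and the example values are hypothetical.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List

    class Method(Enum):
        INSPECTION = "inspection"
        ANALYSIS = "analysis"
        DEMONSTRATION = "demonstration"
        TEST = "test"

    @dataclass
    class VerificationPlan:
        requirement_id: str
        verification_requirement: str
        success_criteria: str
        methods: List[Method]   # each method becomes a Verification Activity
        level: str              # e.g. "component", "subsystem", "system"
        owner: str

    @dataclass
    class VerificationEvent:
        # A collection of activities that can be executed concurrently.
        name: str
        activities: List[VerificationPlan] = field(default_factory=list)

    plan = VerificationPlan(
        requirement_id="REQ-0042",  # hypothetical identifier
        verification_requirement="Telescope meets the image quality budget",
        success_criteria="Measured PSF within the budgeted allocation",
        methods=[Method.TEST, Method.ANALYSIS],
        level="system",
        owner="Systems Engineering",
    )
    print(VerificationEvent(name="Commissioning run 1", activities=[plan]))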

  15. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    Science.gov (United States)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the only goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  16. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    International Nuclear Information System (INIS)

    Magro, G; Molinelli, S; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Valvo, F; Fossati, P; Ciocca, M; Ferrari, A

    2015-01-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5–30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the only goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification. (paper)
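
    As a hedged illustration of the field-to-spot size effect reported in this record (not the TPS algorithm or the FLUKA model), the sketch below sums the lateral Gaussian tails of a grid of identical pencil-beam spots at the field centre; once the field side becomes comparable to the spot sigma, the central dose falls below the broad-field value, which is where analytical lateral-spread models are most stressed. The spot sigma and spacing are assumed values.

    import numpy as np

    sigma = 6.0  # assumed spot size at depth, mm

    def central_dose(field_side_mm, spacing_mm=2.0):
        # Regular grid of identical spots over the field; return the summed
        # Gaussian tails evaluated at the field centre (0, 0).
        n = int(round(field_side_mm / spacing_mm)) + 1
        xs = np.linspace(-field_side_mm / 2, field_side_mm / 2, n)
        X, Y = np.meshgrid(xs, xs)
        return np.exp(-(X**2 + Y**2) / (2.0 * sigma**2)).sum()

    broad = central_dose(100.0)  # quasi-infinite field as reference
    for side in (5, 10, 20, 30):
        print(f"{side:3d} mm field: central dose = {central_dose(side) / broad:.3f} of broad field")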

  17. Dosimetric Uncertainties in Verification of Intensity Modulated Photon Beams

    International Nuclear Information System (INIS)

    Jurkovic, S.

    2010-01-01

    The doctoral thesis presents a method for calculating the shape of the compensators used to modulate linear accelerator beams. A characteristic of the method is a stricter calculation of the scattered radiation in beams with an inhomogeneous cross-section than was available before, and the method can be applied in various clinical situations. Its dosimetric verification was performed in phantoms by measuring dose distributions with ionization chambers as well as radiographic film: ionization chambers were used to evaluate the modulator shape, and film to evaluate two-dimensional dose distributions. It is well known that the dosimetry of intensity modulated photon beams is rather complicated because of the inhomogeneity of the dose distribution; the main reason is the beam modulator, which changes the spectral distribution of the beam. The possibility of using different types of detectors to measure dose distributions in modulated photon beams, and their accuracy, was examined: small-volume ionization chambers, different diodes, an amorphous silicon detector and radiographic film were used. Measured dose distributions were compared with each other as well as with distributions simulated using a Monte Carlo particle transport algorithm. In this way the most accurate method for the verification of modulated photon beams is suggested. (author)
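
    Measured and simulated distributions such as those above are commonly compared with a gamma-index analysis; a minimal 1-D sketch, assuming illustrative 3%/3 mm criteria and toy profiles (the thesis itself does not prescribe this particular implementation):

    import numpy as np

    def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd=0.03):
        # Global gamma: dose difference normalised to the reference maximum,
        # distance normalised to the distance-to-agreement; gamma <= 1 passes.
        d_max = d_ref.max()
        out = []
        for xr, dr in zip(x_ref, d_ref):
            dist2 = ((x_eval - xr) / dta_mm) ** 2
            dose2 = ((d_eval - dr) / (dd * d_max)) ** 2
            out.append(np.sqrt(dist2 + dose2).min())
        return np.array(out)

    x = np.linspace(-50.0, 50.0, 201)                          # position, mm
    measured = np.exp(-x**2 / (2.0 * 15.0**2))                 # toy film profile
    computed = 1.02 * np.exp(-(x - 1.0)**2 / (2.0 * 15.0**2))  # toy simulated profile

    g = gamma_1d(x, measured, x, computed)
    print(f"3%/3 mm gamma pass rate: {np.mean(g <= 1.0):.1%}")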

  18. Portable system for periodical verification of area monitors for neutrons; Sistema portatil para verificacao periodica de monitores de area para neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu, E-mail: rluciane@ird.gov.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Energia Nuclear; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W., E-mail: karla@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI). Lab. de Neutrons

    2009-07-01

    The Neutron Laboratory is developing a project for the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment whose response to the neutron beam is inadequate.

  19. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project `Verification of the thermal design of electronic equipment` studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the `Cool Electronics` research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  20. Monte Carlo - Advances and Challenges

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.

    2008-01-01

    Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating k_eff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as β_eff, l_eff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, but bring new challenges to both developers and users: Convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state-of-the-art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: - Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability, - Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature
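
    A zero-dimensional caricature of the generation-to-generation power iteration behind Monte Carlo k_eff estimation, with an assumed true multiplication factor and no transport physics at all; it only illustrates why early generations are discarded and how the remaining generation estimates are averaged into a confidence interval.

    import numpy as np

    rng = np.random.default_rng(1)
    k_true = 0.98        # assumed multiplication factor of the toy system
    bank_size = 10_000   # source bank renormalised to this size each generation

    estimates = []
    for generation in range(200):
        # Each neutron yields a Poisson-distributed number of fission neutrons;
        # the generation estimate is the offspring-to-parent ratio.
        offspring = rng.poisson(k_true, size=bank_size).sum()
        estimates.append(offspring / bank_size)

    # Discard early generations, as production codes do while the fission
    # source converges (trivial here, since this toy has no space dependence).
    active = np.array(estimates[50:])
    print(f"k_eff = {active.mean():.4f} +/- {active.std() / np.sqrt(active.size):.4f}")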