WorldWideScience

Sample records for carlo verification system

  1. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate the various quantities required in the backend field. For verification of the code system, analyses have been performed using the nuclide compositions, measured under the Actinide Research in a Nuclear Element (ARIANE) program, of fuel rods from fuel assemblies irradiated in a commercial BWR in the Netherlands. The code system developed in this paper has been verified through analyses of MOX and UO2 fuel rods. This system makes it possible to reduce the large margin assumed in present criticality analyses for LWR spent fuels. (J.P.N.)

  2. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    Directory of Open Access Journals (Sweden)

    Iraj Jabbari

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital imaging and communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by MCTPV was compared with that of the TiGRT planning system. The results showed good implementation of the beam configuration and patient information in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for all beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results show that MCTPV can be used very efficiently for additional assessment of complicated plans such as IMRT plans.
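
    The 3%/3 mm gamma criterion quoted above combines a dose-difference test with a distance-to-agreement (DTA) test at every measurement point. Below is a minimal sketch of a global 2D gamma passing rate in Python; it is an illustrative implementation, not the MCTPV code, and the grid spacing, low-dose cutoff and brute-force search are assumptions:

      import numpy as np

      def gamma_passing_rate(ref, ev, spacing_mm=1.0, dd=0.03, dta_mm=3.0, cutoff=0.1):
          """Fraction of reference points with gamma <= 1 (global normalization)."""
          norm = dd * ref.max()                        # dose criterion: 3% of global max
          search = int(np.ceil(dta_mm / spacing_mm))   # DTA search radius in pixels
          ny, nx = ref.shape
          passed = total = 0
          for iy in range(ny):
              for ix in range(nx):
                  if ref[iy, ix] < cutoff * ref.max():
                      continue                         # skip the low-dose region
                  total += 1
                  g2_min = np.inf
                  for dy in range(-search, search + 1):
                      for dx in range(-search, search + 1):
                          jy, jx = iy + dy, ix + dx
                          if 0 <= jy < ny and 0 <= jx < nx:
                              r2 = (dy * dy + dx * dx) * spacing_mm**2 / dta_mm**2
                              d2 = ((ev[jy, jx] - ref[iy, ix]) / norm) ** 2
                              g2_min = min(g2_min, r2 + d2)
                  passed += g2_min <= 1.0
          return passed / max(total, 1)

    A passing rate such as the 98.5% reported above is then this fraction evaluated over the composite of all beams.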

  3. An integrated Monte Carlo dosimetric verification system for radiotherapy treatment planning

    Science.gov (United States)

    Yamamoto, T.; Mizowaki, T.; Miyabe, Y.; Takegawa, H.; Narita, Y.; Yano, S.; Nagata, Y.; Teshima, T.; Hiraoka, M.

    2007-04-01

    An integrated Monte Carlo (MC) dose calculation system, MCRTV (Monte Carlo for radiotherapy treatment plan verification), has been developed for clinical treatment plan verification, especially for routine quality assurance (QA) of intensity-modulated radiotherapy (IMRT) plans. The MCRTV system consists of the EGS4/PRESTA MC codes originally written for particle transport through the accelerator, the multileaf collimator (MLC), and the patient/phantom, which run on a 28-CPU Linux cluster, and the associated software developed for the clinical implementation. MCRTV has an interface with a commercial treatment planning system (TPS) (Eclipse, Varian Medical Systems, Palo Alto, CA, USA) and reads the information needed for MC computation transferred in DICOM-RT format. The key features of MCRTV have been presented in detail in this paper. The phase-space data of our 15 MV photon beam from a Varian Clinac 2300C/D have been developed and several benchmarks have been performed under homogeneous and several inhomogeneous conditions (including water, aluminium, lung and bone media). The MC results agreed with the ionization chamber measurements to within 1% and 2% for homogeneous and inhomogeneous conditions, respectively. The MC calculation for a clinical prostate IMRT treatment plan validated the implementation of the beams and the patient/phantom configuration in MCRTV.

  4. Monte Carlo verification of IMRT dose distributions from a commercial treatment planning optimization system

    Science.gov (United States)

    Ma, C.-M.; Pawlicki, T.; Jiang, S. B.; Li, J. S.; Deng, J.; Mok, E.; Kapur, A.; Xing, L.; Ma, L.; Boyer, A. L.

    2000-09-01

    The purpose of this work was to use Monte Carlo simulations to verify the accuracy of the dose distributions from a commercial treatment planning optimization system (Corvus, Nomos Corp., Sewickley, PA) for intensity-modulated radiotherapy (IMRT). A Monte Carlo treatment planning system has been implemented clinically to improve and verify the accuracy of radiotherapy dose calculations. Further modifications to the system were made to compute the dose in a patient for multiple fixed-gantry IMRT fields. The dose distributions in the experimental phantoms and in the patients were calculated and used to verify the optimized treatment plans generated by the Corvus system. The Monte Carlo calculated IMRT dose distributions agreed with the measurements to within 2% of the maximum dose for all the beam energies and field sizes for both the homogeneous and heterogeneous phantoms. The dose distributions predicted by the Corvus system, which employs a finite-size pencil beam (FSPB) algorithm, agreed with the Monte Carlo simulations and measurements to within 4% in a cylindrical water phantom with various hypothetical target shapes. Discrepancies of more than 5% (relative to the prescribed target dose) in the target region and over 20% in the critical structures were found in some IMRT patient calculations. The FSPB algorithm as implemented in the Corvus system is adequate for homogeneous phantoms (such as prostate) but may result in significant under- or over-estimation of the dose in some cases involving heterogeneities such as the air-tissue, lung-tissue and tissue-bone interfaces.

  5. Development of a Monte Carlo model for treatment planning dose verification of the Leksell Gamma Knife Perfexion radiosurgery system.

    Science.gov (United States)

    Yuan, Jiankui; Lo, Simon S; Zheng, Yiran; Sohn, Jason W; Sloan, Andrew E; Ellis, Rodney; Machtay, Mitchell; Wessels, Barry

    2016-01-01

    Detailed Monte Carlo (MC) modeling of the Leksell Gamma Knife (GK) Perfexion (PFX) collimator system is the only accurate ab initio approach appearing in the literature. As a different approach, in this work we present an MC model based on film measurement. By adjusting the model parameters and fine-tuning the derived fluence map for each individual source to match the manufacturer's ring output factors, we created a reasonable virtual source model for MC simulations to verify treatment planning dose for the GK PFX radiosurgery system. The MC simulation model was commissioned with simple single shots. Dose profiles and both ring and collimator output factors were compared with the treatment planning system (TPS). Good agreement was achieved for the dose profiles, especially in the plateau region (< 2%), while larger differences (< 5%) occurred in the penumbra region. The maximum difference in the calculated output factors was within 0.7%. The model was further validated on a clinical test case, with good agreement. The DVHs for the brainstem and the skull were almost identical and, for the target, the volume covered by the prescription (12.5 Gy to the 50% isodose line) was 95.6% from the MC calculation versus 100% from the TPS. PMID:27455497
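
    Two of the plan metrics compared above, the cumulative DVH and the target volume covered by the 12.5 Gy prescription, reduce to voxel counting once a dose grid and structure masks are available. A hedged sketch, with hypothetical inputs:

      import numpy as np

      def cumulative_dvh(dose_in_structure, bins=200):
          """Return (dose axis, fraction of structure volume receiving >= dose)."""
          d = np.asarray(dose_in_structure, dtype=float)
          axis = np.linspace(0.0, d.max(), bins)
          fraction = np.array([(d >= level).mean() for level in axis])
          return axis, fraction

      def prescription_coverage(dose_in_target, rx_gy=12.5):
          """Fraction of target voxels receiving at least the prescription dose."""
          return (np.asarray(dose_in_target) >= rx_gy).mean()

    A coverage of 0.956 from the MC dose grid versus 1.000 from the TPS grid corresponds to the 95.6% versus 100% figures quoted above.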

  6. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  7. Verification of Monte Carlo transport codes FLUKA, Mars and Shield

    International Nuclear Information System (INIS)

    The present study is a continuation of the project 'Verification of Monte Carlo Transport Codes', which is running at GSI as part of the activation studies of FAIR-relevant materials. It includes two parts: verification of the stopping modules of FLUKA, MARS and SHIELD-A (with the ATIMA stopping module), and verification of their isotope production modules. The first part is based on measurements of the energy deposition function of uranium ions in copper and stainless steel. The irradiation was done at 500 MeV/u and 950 MeV/u; the experiment was carried out at GSI from September 2004 until May 2005. The second part is based on gamma-activation studies of an aluminium target irradiated with an argon beam of 500 MeV/u in August 2009. Experimental depth profiling of the residual activity of the target is compared with the simulations. (authors)

  8. Simulation of digital pixel readout chip architectures with the RD53 SystemVerilog-UVM verification environment using Monte Carlo physics data

    International Nuclear Information System (INIS)

    The simulation and verification framework developed by the RD53 collaboration is a powerful tool for global architecture optimization and design verification of next-generation hybrid pixel readout chips. In this paper the framework is used for studying digital pixel chip architectures at the behavioral level. This is carried out by simulating a dedicated, highly parameterized pixel chip description, which makes it possible to investigate different grouping strategies between pixels and different latency buffering and arbitration schemes. The pixel hit information used as simulation input can be either generated internally in the framework or imported from external Monte Carlo detector simulation data. The latter have been provided by both the CMS and ATLAS experiments, featuring HL-LHC operating conditions and the specifications related to the Phase 2 upgrade. Pixel regions and double columns were simulated using such Monte Carlo data as inputs: the performance of different latency buffering architectures was compared and the compliance of different link speeds with the expected column data rate was verified.
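
    The buffering trade-off studied above can be illustrated with a toy behavioral model: hits arriving in a pixel region wait in a fixed-depth buffer until the trigger latency has elapsed, and hits arriving while the buffer is full are lost. The rate, latency and depth below are illustrative assumptions, not RD53 parameters:

      import random

      def hit_loss_fraction(hit_prob_per_bx=0.01, latency_bx=500, depth=8,
                            n_bx=1_000_000, seed=1):
          """Fraction of hits lost to buffer overflow in one pixel region."""
          rng = random.Random(seed)
          buffer = []                    # release time (bx) of each stored hit
          lost = stored = 0
          for bx in range(n_bx):
              while buffer and buffer[0] <= bx:
                  buffer.pop(0)          # hit read out after the trigger latency
              if rng.random() < hit_prob_per_bx:   # at most one hit per bx here
                  if len(buffer) < depth:
                      buffer.append(bx + latency_bx)
                      stored += 1
                  else:
                      lost += 1
          return lost / max(1, lost + stored)

      print(f"hit-loss fraction: {hit_loss_fraction():.4%}")

    Sweeping the depth and hit probability in such a model is the behavioral-level analogue of the buffering-architecture comparison described above; the real framework drives this kind of model with Monte Carlo detector data instead of a random generator.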

  9. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework was developed, and a cost-sensitive classifier was found to produce the best results. The system was evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  10. Monte Carlo Calculations Supporting Patient Plan Verification in Proton Therapy.

    Science.gov (United States)

    Lima, Thiago V M; Dosanjh, Manjit; Ferrari, Alfredo; Molineli, Silvia; Ciocca, Mario; Mairani, Andrea

    2016-01-01

    Patient treatment plan verification consumes a substantial amount of quality assurance (QA) resources; this is especially true for Intensity-Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed, and several methods have been proposed. In this paper, we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalyzed previously published data (Molinelli et al. (1)) in which 9 patient plans were investigated that crossed the warning QA threshold of 3% mean dose deviation. The possibility that these differences between measured and calculated dose were related to dose modeling (Treatment Planning System (TPS) vs. MC), limitations of the dose delivery system, or detector mispositioning was originally explored, but other factors, such as the geometric description of the detectors, were not ruled out. For the purpose of this work, we compared ionization chamber measurements with different MC simulation results. Physical effects introduced by this new approach, such as inter-detector interference and delta-ray thresholds, were also studied. Simulations accounting for a detailed geometry are typically superior (statistical difference - p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases. But in other cases, position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for therapeutic proton energies, and the results from the current delta threshold
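
    The p-values quoted above imply a paired comparison of per-chamber deviations between two simulation approaches. A minimal sketch of such a test; the deviation values are invented placeholders, and the choice of a paired t-test is an assumption:

      import numpy as np
      from scipy import stats

      # |D_MC - D_measured| / D_measured per ionization chamber (hypothetical)
      dev_detailed_geometry = np.array([0.010, 0.008, 0.012, 0.009, 0.011, 0.007])
      dev_simple_geometry = np.array([0.018, 0.015, 0.020, 0.016, 0.019, 0.014])

      t, p = stats.ttest_rel(dev_detailed_geometry, dev_simple_geometry)
      print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")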

  11. Monte Carlo calculations supporting patient plan verification in proton therapy

    Directory of Open Access Journals (Sweden)

    Thiago Viana Miranda Lima

    2016-03-01

    Patient treatment plan verification consumes a substantial amount of quality assurance (QA) resources; this is especially true for Intensity Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed previously published data (Molinelli et al. 2013) in which 9 patient plans were investigated that crossed the warning QA threshold of 3% mean dose deviation. The possibility that these differences between measured and calculated dose were related to dose modelling (Treatment Planning System (TPS) vs MC), limitations of the dose delivery system or detector mispositioning was originally explored but other factors such as the geometric description of the detectors were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with different MC simulation results. Physical effects introduced by this new approach, for example inter-detector interference and delta-ray thresholds, were also studied. Simulations accounting for a detailed geometry are typically superior (statistical difference - p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV) and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases. But in other cases position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for therapeutic proton energies and the results from the current delta threshold are

  12. Distorted Fingerprint Verification System

    OpenAIRE

    Divya KARTHIKAESHWARAN; Jeyalatha SIVARAMAKRISHNAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment based fingerprint matching, fuzzy clustering and classifier framework. First, an enhanced input fingerprint image has been aligned with the...

  13. Verification of Monte Carlo transport codes by activation experiments

    International Nuclear Information System (INIS)

    With the increasing energies and intensities of heavy-ion accelerator facilities, the problem of excessive activation of accelerator components caused by beam losses becomes more and more important. Numerical experiments using Monte Carlo transport codes are performed in order to assess the levels of activation. The heavy-ion versions of the codes were released approximately a decade ago, so verification is needed to be sure that they give reasonable results. The present work focuses on obtaining experimental data on the activation of targets by heavy-ion beams. Several experiments were performed at GSI Helmholtzzentrum fuer Schwerionenforschung. The interaction of nitrogen, argon and uranium beams with aluminum targets, as well as the interaction of nitrogen and argon beams with copper targets, was studied. After irradiation of the targets by different ion beams from the SIS18 synchrotron at GSI, γ-spectroscopy analysis was done: the γ-spectra of the residual activity were measured, the radioactive nuclides were identified, and their amounts and depth distributions were determined. The experimental results were compared with the results of Monte Carlo simulations using FLUKA, MARS and SHIELD. The discrepancies and agreements between experiment and simulations are pointed out, and the origin of the discrepancies is discussed. The results allow for better verification of the Monte Carlo transport codes and provide information for their further development. The necessity of activation studies for accelerator applications is discussed. The limits of applicability of the heavy-ion beam-loss criteria were studied using the FLUKA code. FLUKA simulations were done to determine the materials most preferable, from the radiation protection point of view, for use in accelerator components.

  14. Generalized coordinate transformations for Monte Carlo (DOSXYZnrc and VMC++) verifications of DICOM compatible radiotherapy treatment plans

    CERN Document Server

    Schmitz, Richard M; Townson, Reid W; Zavgorodni, Sergei

    2014-01-01

    The International Electrotechnical Commission (IEC) has previously defined standard rotation operators for positive gantry, collimator and couch rotations for the radiotherapy DICOM coordinate system that is commonly used by treatment planning systems. Coordinate transformations to the coordinate systems of commonly used Monte Carlo (MC) codes (BEAMnrc/DOSXYZnrc and VMC++) have been derived and published in the literature. However, these coordinate transformations disregard patient orientation during the computed tomography (CT) scan, and assume the most commonly used 'head first, supine' orientation. While less common, other patient orientations are used in clinics - Monte Carlo verification of such treatments can be problematic due to the lack of appropriate coordinate transformations. In this work, a solution has been obtained by correcting the CT-derived phantom orientation and deriving generalized coordinate transformations for field angles in the DOSXYZnrc and VMC++ codes. The rotation operator that inc...

  15. Reconfigurable system design and verification

    CERN Document Server

    Hsiung, Pao-Ann; Huang, Chun-Hsian

    2009-01-01

    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e

  16. IMRT treatment Monitor Unit verification using absolute calibrated BEAMnrc and Geant4 Monte Carlo simulations

    International Nuclear Information System (INIS)

    Intensity Modulated Radiation Therapy (IMRT) treatments are some of the most complex delivered by modern megavoltage radiotherapy accelerators. Verification of the dose, or the prescribed Monitor Units (MU), predicted by the planning system is therefore a key element in ensuring that patients receive an accurate radiation dose during IMRT. One inherently accurate method is comparison with absolutely calibrated Monte Carlo simulations of the IMRT delivery by the linac head and the corresponding delivery of the plan to a patient-based phantom. In this work this approach has been taken using BEAMnrc for simulation of the treatment head, and both DOSXYZnrc and Geant4 for the phantom dose calculation. The two Monte Carlo codes agreed to within 1% of each other, and both matched our planning system very well for IMRT plans of the brain, nasopharynx, and head and neck.
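
    The absolute calibration step ties the MC dose per source particle to the machine calibration conditions, so the MU needed to deliver the prescription can be computed independently of the planning system. A worked sketch with invented numbers:

      # MC dose per particle at the machine calibration point (hypothetical)
      ref_dose_per_particle = 2.0e-16    # Gy/particle
      ref_dose_per_mu = 0.01             # Gy/MU at that point (1 cGy/MU)
      particles_per_mu = ref_dose_per_mu / ref_dose_per_particle

      # MC dose per particle at the plan's dose specification point (hypothetical)
      plan_dose_per_particle = 1.6e-16   # Gy/particle
      dose_per_mu = plan_dose_per_particle * particles_per_mu

      prescribed_dose = 2.0              # Gy per fraction
      mc_mu = prescribed_dose / dose_per_mu
      tps_mu = 248.0                     # hypothetical planning-system value
      print(f"MC MU = {mc_mu:.1f} vs TPS MU = {tps_mu:.1f} "
            f"({100 * (mc_mu - tps_mu) / tps_mu:+.1f}%)")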

  17. Pre-treatment radiotherapy dose verification using Monte Carlo doselet modulation in a spherical phantom

    CERN Document Server

    Townson, Reid W

    2013-01-01

    Due to the increasing complexity of radiotherapy delivery, accurate dose verification has become an essential part of the clinical treatment process. The purpose of this work was to develop an electronic portal image (EPI) based pre-treatment verification technique capable of quickly reconstructing 3D dose distributions from both coplanar and non-coplanar treatments. The dose reconstruction is performed in a spherical water phantom by modulating, based on EPID measurements, pre-calculated Monte Carlo (MC) doselets defined on a spherical coordinate system. This is called the spherical doselet modulation (SDM) method. This technique essentially eliminates the statistical uncertainty of the MC dose calculations by exploiting both azimuthal symmetry in a patient-independent phase-space and symmetry of a virtual spherical water phantom. The symmetry also allows the number of doselets necessary for dose reconstruction to be reduced by a factor of about 250. In this work, 51 doselets were used. The SDM method mitiga...
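
    At its core, the doselet-modulation idea reconstructs the 3D dose as a weighted sum of pre-calculated doselets, D(r) = sum_i w_i d_i(r), with the weights derived from the EPID measurements. A schematic sketch in which the array shapes and random placeholders stand in for the real doselets and weight extraction:

      import numpy as np

      n_doselets = 51                    # as in the work described above
      grid = (64, 64, 64)                # spherical-phantom dose grid (assumed)

      # Pre-calculated MC doselets and EPID-derived weights (placeholders)
      doselets = np.random.rand(n_doselets, *grid)
      weights = np.random.rand(n_doselets)

      # Reconstructed dose: D(r) = sum_i w_i * d_i(r)
      dose = np.tensordot(weights, doselets, axes=1)
      assert dose.shape == grid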

  18. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    International Nuclear Information System (INIS)

    Proton interaction with the material of an exposed object needs to be modeled taking into account three basic processes: electromagnetic stopping of protons in matter, multiple Coulomb scattering, and nuclear interactions. Only the last type of process is the topic of this paper. Monte Carlo codes are often used to simulate high-energy particle interaction with matter. However, the nuclear interaction models implemented in these codes are rather extensive and their use in treatment planning systems requires huge computational resources. We selected the IThMC code for its ability to reproduce experiments which measure the distribution of the projected ranges of nuclear secondary particles generated by proton beams in a multi-layer Faraday cup. Multi-layer Faraday cup detectors measure charge rather than dose and allow distinguishing between electromagnetic and nuclear interactions. The event generator used in the IThMC code is faster, but less accurate, than any other tested. Our model of nuclear reactions demonstrates quite good agreement with experiment in the context of its effect on the Bragg peak in therapeutic applications.

  19. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due to the...

  1. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  2. Measurement comparison and Monte Carlo analysis for volumetric-modulated arc therapy (VMAT) delivery verification using the ArcCHECK dosimetry system.

    Science.gov (United States)

    Lin, Mu-Han; Koren, Sion; Veltchev, Iavor; Li, Jinsheng; Wang, Lu; Price, Robert A; Ma, C-M

    2013-01-01

    The objective of this study is to validate the capabilities of a cylindrical diode array system for volumetric-modulated arc therapy (VMAT) treatment quality assurance (QA). The VMAT plans were generated by the Eclipse treatment planning system (TPS) with the analytical anisotropic algorithm (AAA) for dose calculation. An in-house Monte Carlo (MC) code was utilized as a validation tool for the TPS calculations and the ArcCHECK measurements. The megavoltage computed tomography (MVCT) of the ArcCHECK system was adopted for the geometry reconstruction in the TPS and for MC simulations. A 10 × 10 cm2 open field validation was performed for both the 6 and 10 MV photon beams to validate the absolute dose calibration of the ArcCHECK system and also the TPS dose calculations for this system. The impact of the angular dependency on noncoplanar deliveries was investigated with a series of 10 × 10 cm2 fields delivered with couch rotations of 0° to 40°. The sensitivity for detecting translational (1 to 10 mm) and rotational (1° to 3°) misalignments was tested with a breast VMAT case. Ten VMAT plans (six prostate, H&N, pelvis, liver, and breast) were investigated to evaluate the agreement of the target dose and the peripheral dose among the ArcCHECK measurements and the TPS and MC dose calculations. A customized acrylic plug holding an ion chamber was used to measure the dose at the center of the ArcCHECK phantom. Both the entrance and the exit doses measured by the ArcCHECK system with and without the plug agreed with the MC simulation to within 1.0%. The TPS dose calculation with a 2.5 mm grid overestimated the exit dose by up to 7.2% when the plug was removed. The agreement between the MC and TPS calculations for the ArcCHECK without the plug improved significantly when a 1 mm dose calculation grid was used in the TPS. The noncoplanar delivery test demonstrated that the angular dependency has limited impact on the gamma passing rate (< 1.2% drop) for the 2%-3% dose and 2 mm-3 mm

  3. On Verification Modelling of Embedded Systems

    OpenAIRE

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verification of any non-trivial system. Good verification models, therefore, are lean and mean, and cannot be obtained easily or generated automatically. Current research, however, seems to take the construct...

  4. Enumeration Verification System (EVS)

    Data.gov (United States)

    Social Security Administration — EVS is a batch application that processes for federal, state, local and foreign government agencies, private companies and internal SSA customers and systems. Each...

  5. SU-E-T-578: MCEBRT, A Monte Carlo Code for External Beam Treatment Plan Verifications

    Energy Technology Data Exchange (ETDEWEB)

    Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States); Eldib, A [Fox Chase Cancer Center, Philadelphia, PA (United States); Al-Azhar University, Cairo (Egypt)

    2014-06-01

    Purpose: To present a new Monte Carlo code (MCEBRT) for patient-specific dose calculations in external beam radiotherapy. The code's MLC model is benchmarked and real patient plans are re-calculated using MCEBRT and compared with a commercial TPS. Methods: MCEBRT is based on the GEPTS system (Med. Phys. 29 (2002) 835–846). Phase space data generated for Varian linac photon beams (6-15 MV) are used as the source term. MCEBRT uses a realistic MLC model (tongue and groove, rounded ends). Patient CT and DICOM RT files are used to generate a 3D patient phantom and simulate the treatment configuration (gantry, collimator and couch angles; jaw positions; MLC sequences; MUs). MCEBRT dose distributions and DVHs are compared with those from the TPS in an absolute way (Gy). Results: Calculations based on the developed MLC model closely match transmission measurements (pin-point ionization chamber at selected positions and film for lateral dose profiles). See Fig.1. Dose calculations for two clinical cases (whole brain irradiation with opposed beams and a lung case with eight fields) are carried out and the outcomes compared with the Eclipse AAA algorithm. Good agreement is observed for the brain case (Figs 2-3) except at the surface, where the MCEBRT dose can be higher by 20%. This is due to better modeling of electron contamination by MCEBRT. For the lung case an overall good agreement (91% gamma index passing rate with 3%/3mm DTA criterion) is observed (Fig.4), but dose in lung can be over-estimated by up to 10% by AAA (Fig.5). CTV and PTV DVHs from the TPS and MCEBRT are nevertheless close (Fig.6). Conclusion: A new Monte Carlo code has been developed for plan verification. Contrary to phantom-based QA measurements, MCEBRT simulates the exact patient geometry and tissue composition. MCEBRT can be used as an extra verification layer for plans where surface dose and tissue heterogeneity are an issue.

  6. Central Verification System

    Data.gov (United States)

    US Agency for International Development — CVS is a system managed by OPM that is designed to be the primary tool for verifying whether or not there is an existing investigation on a person seeking security...

  7. Simple dose verification system for radiotherapy radiation

    International Nuclear Information System (INIS)

    The aim of this paper is to investigate an accurate and convenient quality assurance programme that should be included in the dosimetry system for radiotherapy-level radiation. We designed a mailed solid phantom and used TLD-100 chips and a Rexon UL320 reader for the purpose of dosimetry quality assurance in Taiwanese radiotherapy centers. After being assembled, the solid polystyrene phantom weighed only 375 g, which made it suitable for mailing. The Monte Carlo BEAMnrc code was applied to calculate the dose conversion factor between water and the polystyrene phantom; the dose conversion factor measurements were obtained by alternating the TLDs between the same calibration depth in water and in the solid phantom to measure the absorbed dose and verify the accuracy of the theoretical calculation results. The experimental results showed that the dose conversion factors from the TLD measurements and the calculated values from BEAMnrc were in good agreement, with a difference within 0.5%. Ten radiotherapy centers were instructed to deliver an absorbed dose of 2 Gy to the TLDs on the central beam axis. The measured doses were compared with the planned ones. A total of 21 beams were checked. The dose verification differences under reference conditions for 60Co and high-energy X-rays of 6, 10 and 15 MV were within 4%, which proves the feasibility of applying the method suggested in this work to radiotherapy dose verification.
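
    Applying the dose conversion factor is a one-line calculation: the TLD reading at the calibration depth in polystyrene is scaled to dose to water before comparison with the planned dose. The numbers below are invented for illustration; the paper reports calculation and measurement agreeing to within 0.5%:

      tld_dose_polystyrene = 1.987   # Gy, TLD-100 reading in the mailed phantom
      f_water_over_poly = 1.006      # assumed BEAMnrc dose conversion factor

      dose_to_water = tld_dose_polystyrene * f_water_over_poly
      planned = 2.0                  # Gy, dose the center was asked to deliver
      deviation = 100 * (dose_to_water - planned) / planned
      print(f"dose to water = {dose_to_water:.3f} Gy ({deviation:+.2f}%)")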

  8. Verification of SMART Neutronics Design Methodology by the MCNAP Monte Carlo Code

    International Nuclear Information System (INIS)

    SMART is a small advanced integral pressurized water reactor (PWR) of 330 MW(thermal) designed for both electricity generation and seawater desalinization. The CASMO-3/MASTER nuclear analysis system, a design basis of Korean PWR plants, has been employed for the SMART core nuclear design and analysis because the fuel assembly (FA) characteristics and the reactor operating conditions in temperature and pressure are similar to those of PWR plants. However, the SMART FAs are highly poisoned, with more than 20 Al2O3-B4C plus additional Gd2O3/UO2 burnable poison rods (BPRs) in each FA. The reactor is operated with control rods inserted. Therefore, the flux and power distributions may become more distorted than those of commercial PWR plants. In addition, SMART must produce power from room temperature to the hot-power operating condition because it employs nuclear heating from room temperature. This demands reliable predictions of core criticality, shutdown margin, control rod worth, power distributions, and reactivity coefficients at both room temperature and hot operating conditions, yet no such data are available to verify the CASMO-3/MASTER (hereafter MASTER) code system. In the absence of experimental verification data for the SMART neutronics design, the Monte Carlo depletion analysis program MCNAP is adopted as a near-term alternative for qualifying MASTER neutronics design calculations. MCNAP is a personal-computer-based continuous-energy Monte Carlo neutronics analysis program written in the C++ language. We established its qualification by presenting its prediction accuracy on measurements of the VENUS critical facilities and core neutronics analysis of a PWR plant in operation, and on depletion characteristics of integral burnable absorber FAs of current PWRs. Here, we present a comparison of MASTER and MCNAP neutronics design calculations for SMART and establish the qualification of the MASTER system.

  9. Nanodosimetric verification in proton therapy: Monte Carlo Codes Comparison

    International Nuclear Information System (INIS)

    Nanodosimetry strives to develop a novel dosimetry concept suitable for advanced modalities of cancer radiotherapy, such as proton therapy. This project aims to evaluate the plausibility of the physical models implemented in the Geant4 Very Low Energy (Geant4-DNA) extensions by comparing nanodosimetric quantities calculated with Geant4-DNA and the PTB Monte Carlo track structure code. Nanodosimetric track structure parameters were calculated for cylindrical targets representing DNA and nucleosome segments and converted into the probability of producing a DSB using the model proposed by Garty et al. [1]. Monoenergetic protons and electrons of energies typical for delta-electron spectra were considered as primary particles. Good agreement was found between the two codes for electrons of energies above 200 eV. Below this energy Geant4-DNA produced slightly higher numbers of ionisations in the sensitive volumes and higher probabilities of DSB formation. For protons, Geant4-DNA also gave higher numbers of ionisations and DSB probabilities, particularly in the low energy range, while satisfactory agreement was found for energies higher than 1 MeV. Comparing two codes can be useful, as any observed divergence in results between the two codes provides valuable information as to where further consideration of the underlying physical models used in each code may be required. Consistently, the largest difference between the codes was in the low energy range for each particle type. (author)

  10. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  11. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    In this paper, we present a Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IP’s/SoC’s. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog based Verification Environment for IP/Subsystem level verification and a C/C++/Verilog based Directed Verification Environment for SoC Level Verification were used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a Generic System Verilog Universal Verification Methodology (UVM) based Reusable Verification Environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  12. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verificatio

  13. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  14. Verification of Monte Carlo transport codes: FLUKA, MARS and SHIELD-A

    International Nuclear Information System (INIS)

    Monte Carlo transport codes like FLUKA, MARS and SHIELD are widely used for the estimation of radiation hazards in accelerator facilities. Accurate simulations are especially important with increasing energies and intensities of the machines. As the physical models implemented in the codes are constantly being developed further, verification is needed to make sure that the simulations give reasonable results. We report on the verification of the electronic stopping modules and the nuclide production modules of the codes. The verification of the electronic stopping modules is based on the results of irradiation of stainless steel, copper and aluminum by 500 MeV/u and 950 MeV/u uranium ions. The stopping ranges measured experimentally are compared with the simulated ones. The verification of the isotope production modules is done by comparing the experimental depth profiles of residual activity (aluminum targets were irradiated by 500 MeV/u and 950 MeV/u uranium ions) with the results of simulations. Correspondences and discrepancies between the experiment and the simulations are discussed.

  15. Verification of Monte Carlo transport codes: FLUKA, MARS and SHIELD-A

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Mustafin, Edil; Strasik, Ivan [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Ratzinger, Ulrich [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); Latysheva, Ludmila; Sobolevskiy, Nikolai [Institute for Nuclear Research RAS, Moscow (Russian Federation)

    2011-07-01

    Monte Carlo transport codes like FLUKA, MARS and SHIELD are widely used for the estimation of radiation hazards in accelerator facilities. Accurate simulations are especially important with increasing energies and intensities of the machines. As the physical models implemented in the codes are constantly being developed further, verification is needed to make sure that the simulations give reasonable results. We report on the verification of the electronic stopping modules and the nuclide production modules of the codes. The verification of the electronic stopping modules is based on the results of irradiation of stainless steel, copper and aluminum by 500 MeV/u and 950 MeV/u uranium ions. The stopping ranges measured experimentally are compared with the simulated ones. The verification of the isotope production modules is done by comparing the experimental depth profiles of residual activity (aluminum targets were irradiated by 500 MeV/u and 950 MeV/u uranium ions) with the results of simulations. Correspondences and discrepancies between the experiment and the simulations are discussed.

  16. Application of a Monte Carlo linac model in routine verifications of dose calculations

    International Nuclear Information System (INIS)

    The analysis of some parameters of interest in radiotherapy medical physics was performed for 6 and 15 MV photon beams, based on an experimentally validated Monte Carlo model of an Elekta Precise linear accelerator. The simulations were performed using the EGSnrc code, with the previously obtained optimal beam parameter values (energy and FWHM) used as the reference. Deposited dose calculations in water phantoms were done for the typical complex geometries commonly used in acceptance and quality control tests, such as irregular and asymmetric fields. Parameters such as MLC scatter, maximum opening or closing position, and the separation between them were analyzed from the calculations in water. Simulations were likewise performed on phantoms obtained from CT studies of real patients, comparing the dose distribution calculated with EGSnrc against the dose distribution obtained from the computerized treatment planning systems (TPS) used in routine clinical plans. All results showed good agreement with measurements, all of them within tolerance limits. These results allow the developed model to be used as a robust verification tool for validating calculations in very complex situations, where the accuracy of the available TPS could be questionable. (Author)

  17. Code Formal Verification of Operation System

    OpenAIRE

    Yu Zhang; Yunwei Dong; Huo Hong; Fan Zhang

    2010-01-01

    With the increasing pressure on non-functional attribute (security, safety and reliability) requirements of an operating system, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of operating system kernels at the system code level and take theorem proving and model checking as the main technical methods to resolve the key techniques of verifying operatio...

  18. An analytical solution to a simplified EDXRF model for Monte Carlo code verification

    International Nuclear Information System (INIS)

    The objective of this study is to obtain an analytical solution to the scalar photon transport equation that can be used to obtain benchmark results for the verification of energy-dispersive X-ray fluorescence (EDXRF) Monte Carlo simulation codes. The multi-collided flux method (multiple scattering method) is implemented to obtain analytical expressions for the space-, energy-, and angle-dependent scalar photon flux for a one-dimensional EDXRF model problem. In order to obtain benchmark results, higher-order multiple scattering terms are included in the multi-collided flux method. The details of the analytical solution and of the proposed EDXRF model problem are presented. The analytical expressions obtained are then used to calculate the energy-dependent current. The analytically calculated energy-dependent current is compared with Monte Carlo code results. The findings of this study show that analytical solutions to the scalar photon transport equation with the proposed model problem can be used as a verification tool in EDXRF Monte Carlo code development.
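
    In the multi-collided flux method, the flux is expanded in collision orders: the uncollided flux is transported directly from the source, and each higher order is driven by scattering of the previous one. Schematically, in generic one-dimensional notation (not the paper's exact formulation):

      \phi(x, E, \mu) = \sum_{n=0}^{N} \phi_n(x, E, \mu), \qquad
      \mu \frac{\partial \phi_0}{\partial x} + \Sigma_t(E)\,\phi_0 = Q(x, E, \mu),
      \qquad
      \mu \frac{\partial \phi_{n+1}}{\partial x} + \Sigma_t(E)\,\phi_{n+1}
        = \int \Sigma_s(E' \to E, \mu' \to \mu)\, \phi_n(x, E', \mu')\, dE'\, d\mu'

    Truncating at a sufficiently high N retains the higher-order multiple-scattering terms mentioned above, which is what makes the solution usable as a benchmark.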

  19. Automated verification of system configuration

    Science.gov (United States)

    Andrews, W. H., Jr.; Baker, S. P.; Blalock, A. V.

    1991-05-01

    Errors in field wiring can result in significant correction costs (if the errors are discovered prior to use), in erroneous or unusable data (if the errors are not discovered in time), or in serious accidents (if the errors corrupt critical data). Detailed field wiring checkout and rework are tedious and expensive, but they are essential steps in the quality assurance process for large, complex instrumentation and control systems. A recent Oak Ridge National Laboratory (ORNL) development, the CONFiguration IDEntification System (CONFIDES), automates verification of field wiring. In CONFIDES, an identifier module is installed on or integrated into each component (e.g., sensor, actuator, cable, distribution panel) to be verified. Interrogator modules, controlled by a personal computer (PC), are installed at the connections of the field wiring to the inputs of the data acquisition and control system (DACS). Interrogator modules poll the components connected to each channel of the DACS and can determine the path taken by each channel's signal to or from the end device for that channel. The system provides not only the identification (ID) codes for the cables and patch panels in the path to a particular sensor or actuator but also the IDs of individual cable conductors. One version of the system uses existing signal wires for communications between CONFIDES modules. Another, more powerful version requires a single dedicated conductor in each cable. Both versions can operate with or without instrument power applied, and neither interferes with the normal operation of the DACS. Identifier modules can provide a variety of information including status and calibration data.
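
    The verification step that CONFIDES automates amounts to comparing, channel by channel, the chain of component IDs reported by the interrogator modules against the as-designed wiring list. A hedged sketch; the wiring data and the poll function are hypothetical stand-ins for the hardware:

      EXPECTED = {
          "ch01": ["sensor-TE101", "cable-C17", "panel-DP3", "cable-C42"],
          "ch02": ["sensor-PT205", "cable-C18", "panel-DP3", "cable-C43"],
      }

      def poll_channel(channel):
          """Stand-in for interrogating the identifier modules on one channel."""
          measured = {
              "ch01": ["sensor-TE101", "cable-C17", "panel-DP3", "cable-C42"],
              "ch02": ["sensor-PT205", "cable-C19", "panel-DP3", "cable-C43"],
          }
          return measured[channel]

      for channel, expected in EXPECTED.items():
          actual = poll_channel(channel)
          print(channel, "OK" if actual == expected else f"MISWIRED: {actual}")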

  1. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Verification

    Science.gov (United States)

    Hanson, John M.; Beard, Bernard B.

    2010-01-01

    This paper is focused on applying Monte Carlo simulation to probabilistic launch vehicle design and requirements verification. The approaches developed in this paper can be applied to other complex design efforts as well. Typically the verification must show that requirement "x" is met for at least "y" % of cases, with, say, 10% consumer risk or 90% confidence. Two particular aspects of making these runs for requirements verification will be explored in this paper. First, there are several types of uncertainties that should be handled in different ways, depending on when they become known (or not). The paper describes how to handle different types of uncertainties and how to develop vehicle models that can be used to examine their characteristics. This includes items that are not known exactly during the design phase but that will be known for each assembled vehicle (can be used to determine the payload capability and overall behavior of that vehicle), other items that become known before or on flight day (can be used for flight day trajectory design and go/no go decision), and items that remain unknown on flight day. Second, this paper explains a method (order statistics) for determining whether certain probabilistic requirements are met or not and enables the user to determine how many Monte Carlo samples are required. Order statistics is not new, but may not be known in general to the GN&C community. The methods also apply to determining the design values of parameters of interest in driving the vehicle design. The paper briefly discusses when it is desirable to fit a distribution to the experimental Monte Carlo results rather than using order statistics.
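
    The order-statistics argument gives the required sample count directly: if the requirement must hold for at least a fraction y of cases with confidence c, and all N Monte Carlo samples pass, then N must satisfy y**N <= 1 - c; allowing k observed failures generalizes this to a binomial tail. A sketch in generic notation (not the paper's):

      import math
      from scipy import stats

      def n_samples_zero_failures(y=0.90, confidence=0.90):
          """Smallest N such that N passing samples demonstrate P(pass) >= y."""
          return math.ceil(math.log(1.0 - confidence) / math.log(y))

      def demonstrated_confidence(n, k, y=0.90):
          """Confidence that P(pass) >= y after observing k failures in n runs."""
          return 1.0 - stats.binom.cdf(k, n, 1.0 - y)

      print(n_samples_zero_failures())                # 22 all-pass runs
      print(f"{demonstrated_confidence(45, 1):.3f}")  # 1 failure in 45 runs

    The standard result that 22 all-pass samples demonstrate a "90% of cases" requirement at 90% confidence falls out of the first function.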

  2. XSBench. The development and verification of a performance abstraction for Monte Carlo reactor analysis

    International Nuclear Information System (INIS)

    We isolate the most computationally expensive steps of a robust nuclear reactor core Monte Carlo particle transport simulation. The hot kernel is then abstracted into a simplified proxy application, designed to mimic the key performance characteristics of the full application. A series of performance verification tests and analyses are carried out to investigate the low-level performance parameters of both the simplified kernel and the full application. The kernel's performance profile is found to closely match that of the application, making it a convenient test bed for performance analyses on cutting edge platforms and experimental next-generation high performance computing architectures. (author)
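
    The hot kernel isolated by XSBench is the macroscopic cross-section lookup: choose a random energy and material, locate the energy in each constituent nuclide's grid, interpolate, and accumulate the density-weighted sum. A toy Python version, with grid sizes and compositions as small stand-ins for the real data:

      import bisect
      import random

      random.seed(0)
      N_GRID = 1000
      nuclide_grids = {}                       # nuclide id -> (energies, xs)
      for nuc in range(10):
          energies = sorted(random.random() for _ in range(N_GRID))
          xs = [random.random() for _ in range(N_GRID)]
          nuclide_grids[nuc] = (energies, xs)

      materials = {0: [(0, 0.5), (1, 0.3), (2, 0.2)],   # (nuclide, density)
                   1: [(3, 0.7), (4, 0.3)]}

      def macro_xs(material, energy):
          """Density-weighted sum of linearly interpolated microscopic XS."""
          total = 0.0
          for nuc, density in materials[material]:
              e, xs = nuclide_grids[nuc]
              i = min(max(bisect.bisect_left(e, energy), 1), N_GRID - 1)
              f = (energy - e[i - 1]) / (e[i] - e[i - 1])
              total += density * ((1.0 - f) * xs[i - 1] + f * xs[i])
          return total

      print(macro_xs(0, 0.5))

    The random, latency-bound grid searches in this loop are the memory-access pattern the proxy application is designed to preserve.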

  3. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  4. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  5. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
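
    One way to read the proposed method is as an allocation loop: each subsystem carries a residual verification risk that decreases with examination effort, and effort is assigned where the marginal risk reduction is largest until the examination stop criterion is met. A hedged sketch; the exponential risk model and all numbers are assumptions, not the paper's risk function:

      import math

      # subsystem -> (base risk, effort slope); illustrative values only
      subsystems = {"ballast": (0.020, 0.9),
                    "power": (0.035, 0.7),
                    "steering": (0.015, 1.1)}

      def residual_risk(base, slope, effort):
          return base * math.exp(-slope * effort)

      effort = {name: 0 for name in subsystems}
      stop_criterion = 0.01              # acceptable total residual risk

      def total_risk():
          return sum(residual_risk(b, s, effort[n])
                     for n, (b, s) in subsystems.items())

      while total_risk() > stop_criterion:
          gains = {n: residual_risk(b, s, effort[n])
                      - residual_risk(b, s, effort[n] + 1)
                   for n, (b, s) in subsystems.items()}
          best = max(gains, key=gains.get)   # largest marginal risk reduction
          effort[best] += 1

      print(effort, f"total residual risk = {total_risk():.4f}")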

  6. Online Fingerprint Verification Algorithm and Distributed System

    OpenAIRE

    Xi Guo; Jyotirmay Gadedadikar; Ping Zhang

    2011-01-01

    In this paper, a novel online fingerprint verification algorithm and distributed system are proposed. First, fingerprint acquisition, image preprocessing, and feature extraction are conducted on workstations. Then, the extracted features are transmitted over the internet. Finally, fingerprint verification is processed on a server through a web-based database query. For the fingerprint feature extraction, a template is imposed on the fingerprint image to calculate the type and direction...

  7. Code Formal Verification of Operation System

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2010-12-01

    Full Text Available With the increasing pressure on the non-functional attributes (security, safety and reliability) required of an operating system, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of an operating system kernel at the system code level, taking theorem proving and model checking as the main technical methods to resolve the key techniques of verifying an operating system kernel at the C code level. We present a case study of the verification of real-world C systems code derived from an implementation of μC/OS-II.

  8. Attribute Verification Systems: Concepts and Status

    International Nuclear Information System (INIS)

    Verification of the characteristics of large pieces of nuclear material is relatively straightforward if that material is not classified. However, this type of radiation measurement is, almost by definition, very intrusive. An alternative is to measure selected attributes of the material; an attribute of an object is an unclassified characteristic (e.g. exceeding a negotiated mass threshold) that is related to a classified quantity (e.g., the mass of an object). Such an attribute verification system must meet two criteria: 1) classified information cannot be released to the inspecting party, and 2) the inspecting party must be able to reach credible and independent conclusions. The attribute verification system for use in international agreements must satisfy both requirements simultaneously, to the satisfaction of all parties concerned. One key point in the design of such systems is that while the measurement data itself may be classified, the measurement system cannot be. A specific example of a 'three attribute' verification system is the 'Attribute Verification System with Information Barrier for Plutonium with Classified Characteristics utilizing Neutron Multiplicity Counting and High-Resolution Gamma-Ray Spectrometry' (or AVNG), which is currently being designed and built as part of ongoing cooperation within the trilateral format (IAEA, Russia, and USA)
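
    Purely as a schematic of the information-barrier principle described above (not the AVNG design), the sketch below shows the essential contract: classified measurement values enter, and only unclassified boolean attributes leave. The attribute names and threshold values are invented.

```python
# Schematic sketch of an information barrier: the measurement subsystem sees
# classified values, but only unclassified yes/no attributes cross the barrier.
# All thresholds and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class ClassifiedMeasurement:   # never released to the inspecting party
    pu_mass_g: float
    pu240_ratio: float
    multiplicity_rate: float

MASS_THRESHOLD_G = 500.0       # hypothetical negotiated mass threshold
RATIO_THRESHOLD = 0.1          # hypothetical isotopic-ratio bound

def attributes(m: ClassifiedMeasurement) -> dict:
    """Reduce classified data to unclassified boolean attributes."""
    return {
        "mass_exceeds_threshold": m.pu_mass_g > MASS_THRESHOLD_G,
        "isotopic_ratio_below_bound": m.pu240_ratio < RATIO_THRESHOLD,
        "neutron_source_present": m.multiplicity_rate > 0.0,
    }

inside_barrier = ClassifiedMeasurement(742.1, 0.06, 153.2)
print(attributes(inside_barrier))   # only booleans leave the barrier
```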

  9. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of...... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems-without resorting to point...... number of case studies, tackled using a prototypical implementation....

  10. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    International Nuclear Information System (INIS)

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
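
    The abstract gives no formulas, but the time-gating idea — the vehicle speed fixes how long the collimated detector views each canister, which in turn fixes the achievable counting statistics — can be illustrated with a hedged back-of-the-envelope sketch. All numbers (field-of-view width, count rates) are invented, and the Currie-style decision threshold is a standard textbook choice, not necessarily the one used in the MPVS design.

```python
# Hypothetical back-of-the-envelope for a drive-by protocol: how long a
# collimated detector "sees" one canister at a given speed, and whether the
# expected net counts clear a Currie-style decision threshold.
import math

MPH_TO_M_PER_S = 0.44704

def dwell_time_s(fov_width_m: float, speed_mph: float) -> float:
    """Time the collimator field of view overlaps one canister."""
    return fov_width_m / (speed_mph * MPH_TO_M_PER_S)

def detectable(signal_cps: float, background_cps: float, t: float) -> bool:
    """Currie decision rule: net counts vs 2.33 * sqrt(background counts)."""
    net = signal_cps * t
    critical = 2.33 * math.sqrt(background_cps * t)
    return net > critical

t = dwell_time_s(fov_width_m=0.5, speed_mph=2.0)   # invented geometry
print(f"dwell = {t:.2f} s, detected = {detectable(40.0, 25.0, t)}")
```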

  11. Integrating automated verification into interactive systems development

    OpenAIRE

    Campos, J. Creissac

    1998-01-01

    Our field of research is the application of automated reasoning techniques during interactor based interactive systems development. The aim being to ensure that the developed systems embody appropriate properties and principles. In this report we identify some of the pitfalls of current approaches and propose a new way to integrate verification into interactive systems development.

  12. Probabilistic Model for Dynamic Signature Verification System

    Directory of Open Access Journals (Sweden)

    Chai Tong Yuen

    2011-11-01

    Full Text Available This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system is designed to read, analyze and verify signatures from the SUSig online database. First, the testing and reference samples have to be normalized, re-sampled and smoothed in the pre-processing stage. In the verification stage, the difference between reference and testing signatures is calculated based on the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reported a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%. Meanwhile, the classification rate of the system is around 97%.
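
    A hedged sketch of what a thresholded standard-deviation comparison of dynamic signature features might look like is given below. The feature definitions and the acceptance rule are assumptions for illustration, not the paper's exact algorithm or its probabilistic acceptance model.

```python
# Illustrative sketch (not the paper's exact method) of a thresholded
# standard-deviation comparison between enrolment signatures and a test
# signature. Features stand in for pen pressure, velocity and position stats.
import numpy as np

def verify(reference_features: np.ndarray, test_features: np.ndarray,
           k: float = 2.0) -> bool:
    """Accept if every test feature lies within mean +/- k*std of the references.

    reference_features: shape (n_samples, n_features) from enrolment.
    test_features:      shape (n_features,) from the questioned signature.
    """
    mu = reference_features.mean(axis=0)
    sigma = reference_features.std(axis=0, ddof=1)
    return bool(np.all(np.abs(test_features - mu) <= k * sigma))

rng = np.random.default_rng(0)
refs = rng.normal(loc=[3.0, 1.2, 0.5], scale=[0.3, 0.1, 0.05], size=(10, 3))
genuine = np.array([3.1, 1.15, 0.52])
forgery = np.array([4.4, 0.70, 0.80])
print(verify(refs, genuine), verify(refs, forgery))
```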

  13. Fuel Retrieval System Design Verification Report

    International Nuclear Information System (INIS)

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire, Table 1, is included which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No.9 (Miller 2000)

  14. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Vol. 18, No. 6 (2012), pp. 572-587. ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  15. Verification of Autonomous Systems for Space Applications

    Science.gov (United States)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  16. Evaluating software verification systems: benchmarks and competitions

    NARCIS (Netherlands)

    Beyer, Dirk; Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary

    2014-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14171 “Evaluating Software Verification Systems: Benchmarks and Competitions”. The seminar brought together a large group of current and future competition organizers and participants, benchmark maintainers, as well as practitioners

  17. The Spruce System: quality verification of Linux file systems drivers

    OpenAIRE

    Tsirunyan, Karen; Martirosyan, Vahram; Tsyvarev, Andrey

    2012-01-01

    This paper is dedicated to the problem of dynamic verification of Linux file system drivers. Alongside some existing solutions, the Spruce system is presented, which is dedicated to the verification of drivers of certain Linux file systems. This system is being developed in the System Programming Laboratory of the Russian-Armenian (Slavonic) University in Armenia. Spruce provides a large variety of tests for file system drivers. These tests help not only verify the file system functionality, but...

  18. Establishment of verification system for solid waste

    International Nuclear Information System (INIS)

    Solid wastes generated by the MOX Facility have to be verified in the same way as nuclear fuel materials, according to the IAEA safeguards criteria. On the other hand, for storage efficiency, solid waste drums must be piled up in three layers. However, it was very difficult to take out drums randomly selected for verification from the piled-up stacks, so it was necessary to develop a new verification system that measures a selected drum easily and quickly without moving it, i.e., a system that measures the waste drum directly in the narrow space of the pallet left for the forklift nails. The system consists of a NaI(Tl) detector, a collimator with wheels, a PMCA (Portable Multichannel Analyzer), rails and cables. It can confirm the existence of Pu in the drums by counting the 208 keV γ-rays associated with Pu-241. The system is very small and light, allowing easy operation in narrow spaces and at high positions. (author)

  19. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2012-01-01

    The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification...... probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based on...... abstractions computed by tools for the analysis of non-probabilistic hybrid systems, improvements in effectivity of such tools directly carry over to improvements in effectivity of the technique we describe. We demonstrate the applicability of our approach on a number of case studies, tackled using a...

  20. National Verification System of National Meteorological Center, China

    Science.gov (United States)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated to the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them, and I am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles include: 1) verifying the official weather forecasting quality of NMC, China; 2) verifying the official city weather forecasting quality of the Provincial Meteorological Bureaus; 3) evaluating the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further
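
    As an illustration of the kind of arithmetic a QPF verification module performs, the sketch below computes standard contingency-table scores (POD, FAR, equitable threat score) for precipitation forecasts against a threshold. This is generic forecast-verification practice, not NMC's implementation; all values are synthetic.

```python
# Minimal sketch of one standard QPF verification ingredient: contingency-table
# scores for precipitation exceeding a threshold. Generic arithmetic only.
import numpy as np

def categorical_scores(forecast_mm, observed_mm, threshold_mm):
    f = np.asarray(forecast_mm) >= threshold_mm   # forecast "yes" events
    o = np.asarray(observed_mm) >= threshold_mm   # observed "yes" events
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    total = f.size
    hits_random = (hits + misses) * (hits + false_alarms) / total  # chance hits
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return {"POD": pod, "FAR": far, "ETS": ets}

fc = [0.0, 12.0, 3.0, 25.0, 11.0, 0.5]   # synthetic 24 h QPF (mm)
ob = [0.0, 15.0, 0.2, 30.0, 2.0, 1.0]    # synthetic observations (mm)
print(categorical_scores(fc, ob, threshold_mm=10.0))
```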

  1. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  2. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  3. Formal verification of industrial control systems

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  4. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice for ensuring the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building the formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  5. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  6. Parametric Verification of Weighted Systems

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Hansen, Mikkel; Mariegaard, Anders;

    2015-01-01

    This paper addresses the problem of parametric model checking for weighted transition systems. We consider transition systems labelled with linear equations over a set of parameters and we use them to provide semantics for a parametric version of weighted CTL where the until and next operators are...... themselves indexed with linear equations. The parameters change the model-checking problem into a problem of computing a linear system of inequalities that characterizes the parameters that guarantee the satisfiability. To address this problem, we use parametric dependency graphs (PDGs) and we propose a...

  7. Dosimetric accuracy of a deterministic radiation transport based 192Ir brachytherapy treatment planning system. Part II: Monte Carlo and experimental verification of a multiple source dwell position plan employing a shielded applicator

    International Nuclear Information System (INIS)

    Purpose: The aim of this work is the dosimetric validation of a deterministic radiation transport based treatment planning system (BRACHYVISION v. 8.8, referred to as TPS in the following) for multiple 192Ir source dwell position brachytherapy applications employing a shielded applicator in homogeneous water geometries. Methods: TPS calculations for an irradiation plan employing seven VS2000 192Ir high dose rate (HDR) source dwell positions and a partially shielded applicator (GM11004380) were compared to corresponding Monte Carlo (MC) simulation results, as well as experimental results obtained using the VIP polymer gel-magnetic resonance imaging three-dimensional dosimetry method with a custom made phantom. Results: TPS and MC dose distributions were found to be in agreement, mainly within ±2%. Considerable differences between TPS and MC results (greater than 2%) were observed at points in the penumbra of the shields (i.e., close to the edges of the "shielded" segment of the geometries). These differences were experimentally verified and therefore attributed to the TPS. Apart from these regions, experimental and TPS dose distributions were found to be in agreement within 2 mm distance to agreement and 5% dose difference criteria. As shown in this work, these results mark a significant improvement relative to dosimetry algorithms that disregard the presence of the shielded applicator, since the use of the latter leads to dosimetry errors on the order of 20%-30% at the edge of the "unshielded" segment of the geometry and even 2%-6% at points corresponding to the potential location of the target volume in clinical applications using the applicator (points in the unshielded segment at short distances from the applicator). Conclusions: Results of this work attest to the capability of the TPS to accurately account for the scatter conditions and the increased attenuation involved in HDR brachytherapy applications employing multiple source dwell positions and partially

  8. Dosimetric accuracy of a deterministic radiation transport based {sup 192}Ir brachytherapy treatment planning system. Part II: Monte Carlo and experimental verification of a multiple source dwell position plan employing a shielded applicator

    Energy Technology Data Exchange (ETDEWEB)

    Petrokokkinos, L.; Zourari, K.; Pantelis, E.; Moutsatsos, A.; Karaiskos, P.; Sakelliou, L.; Seimenis, I.; Georgiou, E.; Papagiannis, P. [Medical Physics Laboratory, Medical School, University of Athens, 75 Mikras Asias, 115 27 Athens (Greece); Department of Physics, Nuclear and Particle Physics Section, University of Athens, Panepistimioupolis, Ilisia, 157 71 Athens (Greece); Medical Physics Laboratory, Medical School, Democritus University of Thrace, 2nd Building of Preclinical Section, University Campus, Alexandroupolis 68100 (Greece); Medical Physics Laboratory, Medical School, University of Athens, 75 Mikras Asias, 115 27 Athens (Greece)

    2011-04-15

    Purpose: The aim of this work is the dosimetric validation of a deterministic radiation transport based treatment planning system (BRACHYVISION v. 8.8, referred to as TPS in the following) for multiple 192Ir source dwell position brachytherapy applications employing a shielded applicator in homogeneous water geometries. Methods: TPS calculations for an irradiation plan employing seven VS2000 192Ir high dose rate (HDR) source dwell positions and a partially shielded applicator (GM11004380) were compared to corresponding Monte Carlo (MC) simulation results, as well as experimental results obtained using the VIP polymer gel-magnetic resonance imaging three-dimensional dosimetry method with a custom made phantom. Results: TPS and MC dose distributions were found to be in agreement, mainly within ±2%. Considerable differences between TPS and MC results (greater than 2%) were observed at points in the penumbra of the shields (i.e., close to the edges of the "shielded" segment of the geometries). These differences were experimentally verified and therefore attributed to the TPS. Apart from these regions, experimental and TPS dose distributions were found to be in agreement within 2 mm distance to agreement and 5% dose difference criteria. As shown in this work, these results mark a significant improvement relative to dosimetry algorithms that disregard the presence of the shielded applicator, since the use of the latter leads to dosimetry errors on the order of 20%-30% at the edge of the "unshielded" segment of the geometry and even 2%-6% at points corresponding to the potential location of the target volume in clinical applications using the applicator (points in the unshielded segment at short distances from the applicator). Conclusions: Results of this work attest to the capability of the TPS to accurately account for the scatter conditions and the increased attenuation involved in HDR brachytherapy applications

  9. Radiation treatment planning system verification

    International Nuclear Information System (INIS)

    Optimum radiotherapy requires accurate and consistent radiation doses. To fulfil this requirement, it is necessary to make quality checks of the equipment and software included in the planning process. A treatment planning system is used to calculate the monitor units required to deliver the prescribed dose to a designated volume with an acceptable distribution of radiation dose. The aim of this study was to verify the Theraplan Plus treatment planning program used in our Department to calculate treatment times for radiation therapy with a 60Co unit. To run the Theraplan Plus system, it is necessary to input data describing the mechanical and radiation aspects of the treatment unit. One of the checks included a comparison of the measured depth doses and off-axis ratios with those calculated using the treatment program. The second step included the measurement of the dose using an ionisation chamber and thermoluminescent dosimeters (TLD), which were then compared with calculated values for several treatment scenarios (central-axis dose at specified depths of square fields, elongated fields, under blocks and wedges, etc.). The third step involved the comparison of the dose calculated for a specific treatment plan with the doses measured with TLD dosimeters in the Alderson phantom. (author)

  10. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
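
    At its most abstract, integrating detection probabilities across independent sensor technologies follows elementary probability; the sketch below illustrates this with invented per-technology probabilities. IVSEM's actual models (location accuracy, medium interfaces, evasion scenarios) are far more detailed.

```python
# Top-level arithmetic of the sort such an integration implies: if each
# technology independently detects an event with probability p_i, the
# integrated probability of detection is 1 - prod(1 - p_i).
# Probabilities below are invented, not IVSEM outputs.
from math import prod

def integrated_pd(p_subsystems):
    return 1.0 - prod(1.0 - p for p in p_subsystems)

event = {"seismic": 0.80, "infrasound": 0.35, "radionuclide": 0.50,
         "hydroacoustic": 0.05}
print(f"integrated P(detect) = {integrated_pd(event.values()):.3f}")
```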

  11. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level," modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  12. Monte Carlo dose verification of prostate patients treated with simultaneous integrated boost intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    To evaluate the dosimetric differences between Superposition/Convolution (SC) and Monte Carlo (MC) calculated dose distributions for simultaneous integrated boost (SIB) prostate cancer intensity modulated radiotherapy (IMRT), compared to experimental (film) measurements, and the implications for clinical treatments. Twenty-two prostate patients treated with an in-house SIB-IMRT protocol were selected. SC-based plans used for treatment were re-evaluated with EGS4-based MC calculations for treatment verification. Accuracy was evaluated with respect to film-based dosimetry. Comparisons used the gamma (γ) index, distance-to-agreement (DTA), and superimposed dose distributions. The treatment plans were also compared based on dose-volume indices and the 3-D γ index for targets and critical structures. Flat-phantom comparisons demonstrated that the MC algorithm predicted measurements better than the SC algorithm. The average PTVprostate D98 agreement between SC and MC was 1.2% ± 1.1. For rectum, the average differences in SC and MC calculated D50 ranged from -3.6% to 3.4%. For small bowel, there were up to 30.2% ± 40.7 (range: 0.2%, 115%) differences between the SC and MC calculated average D50 index. For femurs, the differences in average D50 reached up to 8.6% ± 3.6 (range: 1.2%, 14.5%). For PTVprostate and PTVnodes, the average gamma scores were >95.0%. MC agrees better with film measurements than SC. Although, on average, SC-calculated doses agreed with MC calculations within the targets to within 2%, there were deviations of up to 5% for some patients' treatment plans. For some patients, the magnitude of such deviations might decrease the intended target dose levels that are required for the treatment protocol, placing the patients in different dose levels that do not satisfy the protocol dose requirements.
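
    The gamma-index evaluation mentioned above can be illustrated with a minimal brute-force Python sketch using 3%/3 mm-style criteria. Real clinical tools use interpolation and optimized search; this is only a conceptual model on a synthetic dose plane.

```python
# Minimal brute-force 2D gamma-index sketch for comparing an evaluated dose
# plane against a reference plane on the same grid. Illustrative only.
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.03, dta_mm=3.0,
                    search_mm=6.0, threshold=0.1):
    ref = np.asarray(ref, float); ev = np.asarray(ev, float)
    dd = dose_tol * ref.max()                  # global dose criterion
    n = int(search_mm / spacing_mm)            # half-width of search window
    ny, nx = ref.shape
    passes, evaluated = 0, 0
    for iy in range(ny):
        for ix in range(nx):
            if ref[iy, ix] < threshold * ref.max():
                continue                       # skip low-dose region
            best = np.inf                      # minimum gamma^2 over window
            for dy in range(-n, n + 1):
                for dx in range(-n, n + 1):
                    jy, jx = iy + dy, ix + dx
                    if not (0 <= jy < ny and 0 <= jx < nx):
                        continue
                    dist2 = (dy * spacing_mm) ** 2 + (dx * spacing_mm) ** 2
                    ddiff = ev[jy, jx] - ref[iy, ix]
                    best = min(best, dist2 / dta_mm**2 + (ddiff / dd) ** 2)
            evaluated += 1
            passes += best <= 1.0              # gamma <= 1 means pass
    return passes / evaluated

ref = np.outer(np.hanning(21), np.hanning(21))  # synthetic dose dome
ev = 1.02 * ref                                 # 2% global scaling error
print(f"gamma pass rate: {gamma_pass_rate(ref, ev, spacing_mm=1.0):.3f}")
```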

  13. Formal Verification of Self-Assembling Systems

    CERN Document Server

    Sterling, Aaron

    2010-01-01

    This paper introduces the theory and practice of formal verification of self-assembling systems. We interpret a well-studied abstraction of nanomolecular self assembly, the Abstract Tile Assembly Model (aTAM), into Computation Tree Logic (CTL), a temporal logic often used in model checking. We then consider the class of "rectilinear" tile assembly systems. This class includes most aTAM systems studied in the theoretical literature, and all (algorithmic) DNA tile self-assembling systems that have been realized in laboratories to date. We present a polynomial-time algorithm that, given a tile assembly system T as input, either provides a counterexample to T's rectilinearity or verifies whether T has a unique terminal assembly. Using partial order reductions, the verification search space for this algorithm is reduced from exponential size to O(n^2), where n x n is the size of the assembly surface. That reduction is asymptotically the best possible. We report on experimental results obtained by translating tile ...

  14. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Full Text Available Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. Due to the development of speech technology, there is now increased interest in expert speaker verification systems that have high reliability and low labour intensiveness thanks to the automation of data processing for the expert analysis. System Description. We present a description of a novel system analyzing the similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed system, based on a fusion of methods, is a weak correlation between the analyzed features, which leads to a decrease in the speaker recognition error rate. The system's advantage is the possibility of rapid analysis of recordings, since the data preprocessing and decision making processes are automated. We describe the functioning of the methods as well as their fusion to combine their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of all the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.

  15. Verification and Validation of Flight Critical Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  16. Verification and validation of control system software

    International Nuclear Information System (INIS)

    The following guidelines are proposed for verification and validation (V ampersand V) of nuclear power plant control system software: (a) use risk management to decide what and how much V ampersand V is needed; (b) classify each software application using a scheme that reflects what type and how much V ampersand V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  17. Skilled Impostor Attacks Against Fingerprint Verification Systems And Its Remedy

    OpenAIRE

    Gottschlich, Carsten

    2015-01-01

    Fingerprint verification systems are becoming ubiquitous in everyday life. This trend is propelled especially by the proliferation of mobile devices with fingerprint sensors such as smartphones and tablet computers, and fingerprint verification is increasingly applied for authenticating financial transactions. In this study we describe a novel attack vector against fingerprint verification systems which we coin skilled impostor attack. We show that existing protocols for performance evaluatio...

  18. Verification of Three Dimensional Triangular Prismatic Discrete Ordinates Transport Code ENSEMBLE-TRIZ by Comparison with Monte Carlo Code GMVP

    Science.gov (United States)

    Homma, Yuto; Moriwaki, Hiroyuki; Ohki, Shigeo; Ikeda, Kazumi

    2014-06-01

    This paper deals with the verification of the three-dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe sodium cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of an initial core and at the beginning and end of cycle of the equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity.

  19. Verification of Monte Carlo calculations of the neutron flux in the carousel channels of the TRIGA Mark II reactor, Ljubljana

    International Nuclear Information System (INIS)

    In this work, experimental verification of Monte Carlo neutron flux calculations in the carousel facility (CF) of the 250 kW TRIGA Mark II reactor at the Jozef Stefan Institute is presented. Simulations were carried out using the Monte Carlo radiation-transport code MCNP4B. The objective of the work was to model and verify experimentally the azimuthal variation of the neutron flux in the CF for core No. 176, set up in April 2002. The 198Au activities of Al-Au(0.1%) disks irradiated in 11 channels of the CF, covering 180° around the perimeter of the core, were measured. The comparison between MCNP calculations and measurements shows relatively good agreement and demonstrates the overall accuracy with which the detailed spectral characteristics can be predicted by calculations. (author)
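
    Verification results of this kind are typically summarized as calculated-to-experiment (C/E) ratios per channel; the sketch below shows the bookkeeping with invented relative activities (the paper's actual data are not reproduced here).

```python
# Illustrative C/E (calculated-to-experiment) comparison of 198Au activities
# across carousel channels. All values are invented placeholders.
measured = {1: 1.00, 3: 0.97, 5: 0.93, 7: 0.95, 9: 0.99, 11: 1.02}    # relative
calculated = {1: 1.02, 3: 0.99, 5: 0.91, 7: 0.97, 9: 1.01, 11: 1.00}  # e.g. MCNP

for ch in sorted(measured):
    ce = calculated[ch] / measured[ch]
    print(f"channel {ch:2d}: C/E = {ce:.3f} ({100 * (ce - 1):+.1f}%)")
```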

  20. Fingerprint verification on medical image reporting system.

    Science.gov (United States)

    Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah

    2008-03-01

    The healthcare industry is currently going through extensive changes through the adoption of robust, interoperable healthcare information technology in the form of electronic medical records (EMR). However, a major concern with EMR is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the possibility of patient data interception. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still providing confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates the techniques of fingerprint verification, DICOM objects, digital signatures and digital envelopes in order to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is granted only to certified parties. PMID:18178287
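
    The digital-signature and digital-envelope techniques mentioned in the abstract follow a standard pattern: sign the object for integrity and authenticity, encrypt it under a fresh symmetric key, and wrap that key for the recipient. The Python sketch below demonstrates this pattern on an opaque byte string standing in for a DICOM object, using the `cryptography` package; it is a generic illustration, not the system's code, and the payload is a placeholder.

```python
# Conceptual sketch of the digital-signature / digital-envelope pattern on an
# opaque byte string standing in for a DICOM file. Not the paper's code.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

dicom_bytes = b"\x02\x00DICM...pixel data..."   # placeholder payload

# Sender signs the object (authenticity/integrity).
sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = sender_key.sign(dicom_bytes, pss, hashes.SHA256())

# Digital envelope: symmetric encryption of the payload, with the session key
# wrapped under the recipient's public key (confidentiality).
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
session_key, nonce = AESGCM.generate_key(bit_length=256), os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, dicom_bytes, signature)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = recipient_key.public_key().encrypt(session_key, oaep)

# Recipient unwraps the key, decrypts, then verifies the signature.
plain = AESGCM(recipient_key.decrypt(wrapped_key, oaep)).decrypt(
    nonce, ciphertext, signature)
sender_key.public_key().verify(signature, plain, pss, hashes.SHA256())
print("decrypted and signature verified:", plain == dicom_bytes)
```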

  1. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  2. A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification

    Science.gov (United States)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2014-01-01

    The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extravehicular activity equipment. A new version (3DHZETRN), capable of transporting high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the software system OLTARIS for shield design and validation, and it provides a basis for personal computer software capable of space shield analysis and optimization.

  3. Spectral history model in DYN3D: Verification against coupled Monte-Carlo thermal-hydraulic code BGCore

    International Nuclear Information System (INIS)

    Highlights:
    • A Pu-239 based spectral history method was tested on a 3D BWR single assembly case.
    • Burnup of a BWR fuel assembly was performed with the nodal code DYN3D.
    • The reference solution was obtained with the coupled Monte-Carlo thermal-hydraulic code BGCore.
    • The proposed method accurately reproduces the moderator density history effect for the BWR test case.
    Abstract: This research focuses on the verification of a recently developed methodology accounting for spectral history effects in 3D full core nodal simulations. The traditional deterministic core simulation procedure includes two stages: (1) generation of homogenized macroscopic cross section sets and (2) application of these sets to obtain a full 3D core solution with nodal codes. The standard approach adopts the branch methodology, in which the branches represent all expected combinations of operational conditions as a function of burnup (main branch). The main branch is produced for constant, usually averaged, operating conditions (e.g. coolant density). As a result, the spectral history effects associated with coolant density variation are not taken into account properly. A number of methods to solve this problem (such as micro-depletion and spectral indexes) have been developed and implemented in modern nodal codes. Recently, we proposed a new and robust method to account for history effects. The methodology was implemented in DYN3D and involves modification of the few-group cross section sets. The method utilizes the local Pu-239 concentration as an indicator of spectral history. The method was verified for PWR and VVER applications. However, the spectrum variation in a BWR core is more pronounced due to the stronger coolant density change. The purpose of the current work is to investigate the applicability of the method to BWR analysis. The proposed methodology was verified against the recently developed BGCore system, which couples Monte Carlo neutron transport with depletion and thermal-hydraulic solvers and
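
    The core idea — using the local Pu-239 concentration as a spectral history indicator to correct the few-group cross sections — can be sketched as follows. The linear correction form, the sensitivity coefficient and all tabulated values are invented for illustration and do not reproduce the DYN3D implementation.

```python
# Schematic of the history-correction idea: use the deviation of the local
# Pu-239 concentration from its nominal-history value at the same burnup as an
# indicator, and correct the few-group cross section linearly. All numbers,
# including the sensitivity coefficient, are invented.
import numpy as np

burnup = np.array([0.0, 10.0, 20.0, 30.0])             # MWd/kg
n_pu239_nom = np.array([0.0, 4.0e-5, 6.5e-5, 8.0e-5])  # nominal-history Pu-239
sigma_nom = np.array([1.020, 1.035, 1.048, 1.060])     # nominal XS (barn)

def corrected_xs(bu, n_pu239_local, d_sigma_d_n=2.0e3):
    """Few-group XS corrected for spectral history via local Pu-239 density."""
    n_nom = np.interp(bu, burnup, n_pu239_nom)
    s_nom = np.interp(bu, burnup, sigma_nom)
    return s_nom + d_sigma_d_n * (n_pu239_local - n_nom)

# A node that operated at lower coolant density (harder spectrum) built up
# more Pu-239 than the nominal history assumed:
print(corrected_xs(bu=20.0, n_pu239_local=7.1e-5))
```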

  4. Performance evaluation of fingerprint verification systems.

    Science.gov (United States)

    Cappelli, Raffaele; Maio, Dario; Maltoni, Davide; Wayman, James L; Jain, Anil K

    2006-01-01

    This paper is concerned with the performance evaluation of fingerprint verification systems. After an initial classification of biometric testing initiatives, we explore both the theoretical and practical issues related to performance evaluation by presenting the outcome of the recent Fingerprint Verification Competition (FVC2004). FVC2004 was organized by the authors of this work for the purpose of assessing the state-of-the-art in this challenging pattern recognition application and making available a new common benchmark for an unambiguous comparison of fingerprint-based biometric systems. FVC2004 is an independent, strongly supervised evaluation performed at the evaluators' site on evaluators' hardware. This allowed the test to be completely controlled and the computation times of different algorithms to be fairly compared. The experience and feedback received from previous, similar competitions (FVC2000 and FVC2002) allowed us to improve the organization and methodology of FVC2004 and to capture the attention of a significantly higher number of academic and commercial organizations (67 algorithms were submitted for FVC2004). A new, "Light" competition category was included to estimate the loss of matching performance caused by imposing computational constraints. This paper discusses data collection and testing protocols, and includes a detailed analysis of the results. We introduce a simple but effective method for comparing algorithms at the score level, allowing us to isolate difficult cases (images) and to study error correlations and algorithm "fusion." The huge amount of information obtained, including a structured classification of the submitted algorithms on the basis of their features, makes it possible to better understand how current fingerprint recognition systems work and to delineate useful research directions for the future. PMID:16402615
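
    Score-level evaluation of the kind such competitions perform reduces to computing error-rate trade-offs over a decision threshold. The sketch below computes the false match rate, false non-match rate and equal error rate on synthetic score distributions; it is generic evaluation arithmetic, not the FVC2004 protocol code.

```python
# Generic score-level evaluation: FMR/FNMR as functions of the decision
# threshold, and the equal error rate (EER). Scores are synthetic.
import numpy as np

def eer(genuine_scores, impostor_scores):
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    fnmr = np.array([(genuine_scores < t).mean() for t in thresholds])
    fmr = np.array([(impostor_scores >= t).mean() for t in thresholds])
    i = int(np.argmin(np.abs(fnmr - fmr)))    # threshold where rates cross
    return thresholds[i], (fnmr[i] + fmr[i]) / 2.0

rng = np.random.default_rng(1)
genuine = rng.normal(0.7, 0.10, 1000)    # matcher scores, genuine attempts
impostor = rng.normal(0.4, 0.12, 1000)   # matcher scores, impostor attempts
t, rate = eer(genuine, impostor)
print(f"EER ~ {100 * rate:.1f}% at threshold {t:.2f}")
```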

  5. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  6. The CTBT verification system. Entering rough waters?

    International Nuclear Information System (INIS)

    Five years after the Comprehensive Nuclear Test Ban Treaty (CTBT) was opened for signature, progress towards entry into force has been slowing. - Political uncertainty about the timing of entry into force is complicating the work of the CTBT Organization's Preparatory Commission (PrepCom). - The US decision, announced on 21 August 2001, not to pay its full share of financial contributions to the PrepCom and withdraw from activities not related to the International Monitoring System (IMS) may put it in non-compliance as a signatory to the treaty. - States need to continue to support the work of the PrepCom by providing it with the necessary financial and technical means. Gaps left by the US decision need to be filled by other states. - Completing the IMS remains a priority task which will need patience and support from all member states of the PrepCom. - Establishing an effective regime for on-site inspections is greatly complicated by the new US policy. Those states in favour of a flexible regime need to redouble their efforts, including increased input into the development of an Operational Manual. - States need to overcome undue concerns about confidentiality and create an open verification regime that makes its data available to scientific and humanitarian relief organisations. - Taken together, these efforts will enable the PrepCom to complete its task of setting up the CTBT's verification system in the foreseeable future. - Washington should live up to its commitment as a signatory to the CTBT and support the whole range of PrepCom activities. - The Article XIV conference should urge the US to reconsider its new policy of reducing support to the PrepCom

  7. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate the verification and validation of on-board satellite software, and it is implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
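
    A minimal sketch of checking a timing constraint against on-target evidence is shown below: instrumented runs yield per-task release and completion timestamps, and each observation is checked against its deadline. Task names, deadlines and timestamps are hypothetical; the actual constraints in the paper are expressed as timed automata.

```python
# Sketch of checking "response time <= deadline" constraints against
# on-target evidence from instrumented points. All values are hypothetical.
from dataclasses import dataclass

@dataclass
class Observation:
    task: str
    release_us: int       # timestamp when the task was released
    completion_us: int    # timestamp when the task completed

DEADLINES_US = {"acquire": 500, "filter": 1200, "telemetry": 5000}

def verify_deadlines(observations):
    violations = []
    for ob in observations:
        response = ob.completion_us - ob.release_us
        if response > DEADLINES_US[ob.task]:
            violations.append((ob.task, response))
    return violations

trace = [Observation("acquire", 0, 430),
         Observation("filter", 500, 1900),      # violates its 1200 us deadline
         Observation("telemetry", 2000, 6400)]
print(verify_deadlines(trace) or "all timing constraints satisfied")
```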

  8. Integrated safety management system verification: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, R.F.

    1998-08-10

    Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalization of an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR, 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System. Guidance and expectations have been provided to PNNL by incorporation into the operating contract (Contract DE-AC06-76RLO 1830) and by letter. The contract requires that the contractor submit a description of its ISMS for approval by DOE. PNNL submitted its proposed Safety Management System Description for approval on November 25, 1997. RL tentatively approved acceptance of the description pursuant to a favorable recommendation from this review. The Integrated Safety Management System Verification is a review of the adequacy of the ISMS description in fulfilling the requirements of the DEAR and the DOE Policy. The purpose of this review is to provide the Richland Operations Office Manager with a recommendation for approval of the ISMS description of the Pacific Northwest Laboratory based upon compliance with the requirements of 48 CFR 970.5204-2 and -78, and to verify the extent and maturity of ISMS implementation within the Laboratory. Further, the review will provide a model for other DOE laboratories managed by the Office of Assistant Secretary for Energy Research.

  9. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  10. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    The Neutron Laboratory is developing a project aimed at the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check that the calibration of their instruments is maintained at the installations where they are used, avoiding the use of equipment with an inadequate response to the neutron beam.

  11. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  12. Creation of a Reference Image with Monte Carlo Simulations for Online EPID Verification of Daily Patient Setup

    Energy Technology Data Exchange (ETDEWEB)

    Descalle, M-A; Chuang, C; Pouliot, J

    2002-01-30

    Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as reference by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study will investigate the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo simulated images exhibit similar qualities to portal images, the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify set-up errors during treatment.
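
    A correlation algorithm of the kind described can be illustrated with a toy sketch: the translational setup shift is estimated as the peak offset of the FFT-based cross-correlation between a simulated reference image and a portal image. This is a generic illustration, not the study's algorithm.

```python
# Toy sketch: estimate a translational setup shift as the peak offset of the
# FFT-based cross-correlation between reference and portal images.
import numpy as np

def estimate_shift(reference, portal):
    """Return the (dy, dx) roll that aligns the portal image to the reference."""
    r = reference - reference.mean()
    p = portal - portal.mean()
    corr = np.fft.ifft2(np.fft.fft2(r) * np.conj(np.fft.fft2(p))).real
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = corr.shape
    # Map wrapped peak indices to signed offsets.
    return ((iy + ny // 2) % ny - ny // 2, (ix + nx // 2) % nx - nx // 2)

ref = np.zeros((64, 64)); ref[20:40, 25:45] = 1.0        # toy "anatomy"
portal = np.roll(ref, shift=(3, -5), axis=(0, 1))        # simulated setup error
print(estimate_shift(ref, portal))                        # prints (-3, 5)
```

    Applying `np.roll` with the returned offsets realigns the portal image to the reference, so the offsets directly quantify the setup deviation in pixels.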

  13. Verification of a 6 MV photon beam model in Monte Carlo planning: comparison with Collapsed Cone in inhomogeneous media

    International Nuclear Information System (INIS)

    We evaluated the Monaco Monte Carlo planning system v2.0.3 for calculations in low-density inhomogeneities (lung-equivalent), as a complement to the verification of the modelling in homogeneous media and prior to the introduction of the SBRT technique. We performed the same tests on Pinnacle v8.0m, with the same purpose. We compare the results obtained with the Monte Carlo algorithm of Monaco and the Collapsed Cone algorithm of Pinnacle. (Author)

  14. Development of Palmprint Verification System Using Biometrics

    Institute of Scientific and Technical Information of China (English)

    G. Shobha; M. Krishna; S.C. Sharma

    2006-01-01

    Palmprint verification using biometrics is one of the emerging technologies; it recognizes a person based on the principal lines, wrinkles and ridges on the surface of the palm. These line structures are stable and remain unchanged throughout the life of an individual. More importantly, no two palmprints from different individuals are the same, and normally people do not feel uneasy having their palmprint images taken for testing. Palmprint recognition therefore offers a promising future for medium-security access control systems. In this paper, a new approach for personal authentication using hand images is discussed. Grayscale palm images are captured using a digital camera at a resolution of 640×480. Each of these grayscale images is aligned and then used to extract palmprint and hand geometry features. These features are then used for authenticating users. The image acquisition setup used here is inherently simple, and it does not employ any special illumination nor does it use any pegs that might cause inconvenience to users. Experimental results show that the designed system achieves an acceptable level of performance.
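
    A toy version of the block-wise feature extraction and matching such a system performs is sketched below; the gradient-energy features, grid size, and cosine-similarity threshold are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch of palmprint feature extraction and verification.
import numpy as np

def line_energy_features(img: np.ndarray, grid: int = 8) -> np.ndarray:
    """Crude palm-line features: mean gradient magnitude per block, L2-normalized."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    h, w = mag.shape
    bh, bw = h // grid, w // grid
    feats = np.array([mag[i*bh:(i+1)*bh, j*bw:(j+1)*bw].mean()
                      for i in range(grid) for j in range(grid)])
    return feats / np.linalg.norm(feats)

def verify(probe: np.ndarray, template: np.ndarray, threshold: float = 0.95) -> bool:
    """Accept if the cosine similarity to the enrolled template is high enough."""
    return float(probe @ template) >= threshold
```

    In practice the threshold would be tuned on genuine/impostor score distributions rather than fixed a priori.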

  15. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (computer-based system). These software modules can be either in-house designs or systems based on commercial modules, such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off-the-shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan

  16. First verification and validation steps of mendel release 1.0 cycle code system

    International Nuclear Information System (INIS)

    For each new code system, a verification and validation process is needed to prove the efficiency and accuracy of the calculated physical quantities. MENDEL is the new CEA depletion code system, whose first release was issued at the end of 2013. It offers capabilities equivalent to those of the already well-established DARWIN. MENDEL is the successor of DARWIN, and can be used as a stand-alone code system for reactor cycle studies to compute output quantities of interest. MENDEL also provides its depletion solvers to both the Monte Carlo TRIPOLI-4® and the deterministic APOLLO3® transport code systems. The purpose of this paper is to present the first contributions to the MENDEL release 1.0 verification and validation process. This first release has been used with nuclear data coming from both the JEFF-3.1.1 and ENDF/B-VII.1 nuclear data evaluations, and its results are compared either with experimental data or with DARWIN results. (author)

  17. Verification of the spectral history correction method with fully coupled Monte Carlo code BGCore

    International Nuclear Information System (INIS)

    Recently, a new method for accounting for burnup history effects on few-group cross sections was developed and implemented in the reactor dynamic code DYN3D. The method relies on the tracking of the local Pu-239 density, which serves as an indicator of burnup spectral history. The validity of the method was demonstrated in PWR and VVER applications. However, the spectrum variation in a BWR core is more pronounced due to the stronger coolant density change. Therefore, the purpose of the current work is to further investigate the applicability of the method to BWR analysis. The proposed methodology was verified against the recently developed BGCore system, which couples Monte Carlo neutron transport with depletion and thermal hydraulic solvers and is thus capable of providing a reference solution for 3D simulations. The results clearly show that neglecting the spectral history effects leads to a very large deviation (e.g. 2000 pcm in reactivity) from the reference solution. However, a very good agreement between DYN3D and BGCore is observed (on the order of 200 pcm in reactivity) when the Pu-correction method is applied. (author)
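
    The correction idea lends itself to a one-line model. The sketch below assumes a linear dependence of the few-group cross section on the deviation of the local Pu-239 density from its nominal-history value; names and the linear form are illustrative, not the DYN3D implementation.

```python
# A minimal sketch of a Pu-239-indexed spectral history correction.
def corrected_xs(sigma_nominal: float,
                 n_pu239_local: float,
                 n_pu239_nominal: float,
                 sensitivity: float) -> float:
    """Few-group cross section corrected for burnup spectral history.

    sigma_nominal   -- cross section tabulated for the nominal history
    n_pu239_local   -- Pu-239 number density tracked in the actual node
    n_pu239_nominal -- Pu-239 number density of the nominal history
    sensitivity     -- d(sigma)/d(N_Pu239), e.g. fitted from branch cases
    """
    return sigma_nominal + sensitivity * (n_pu239_local - n_pu239_nominal)
```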

  18. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  19. On Integrating Deductive Synthesis and Verification Systems

    OpenAIRE

    Kneuss, Etienne; Kuncak, Viktor; Kuraj, Ivan; Suter, Philippe

    2013-01-01

    We describe techniques for synthesis and verification of recursive functional programs over unbounded domains. Our techniques build on top of an algorithm for satisfiability modulo recursive functions, a framework for deductive synthesis, and complete synthesis procedures for algebraic data types. We present new counterexample-guided algorithms for constructing verified programs. We have implemented these algorithms in an integrated environment for interactive verification and synthesis from ...

  20. Monte Carlo simulation as a method of verification of source characterization in ophthalmic brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz Lora, A.; Miras del Rio, H.; Terron Leon, J. A.

    2013-07-01

    Following the recommendations of the IAEA, and as a further check, Monte Carlo simulations have been performed of each of the plaques available at the Hospital. The objective of the work is to verify the calibration certificates, and it intends to establish action criteria for their acceptance. (Author)

  1. Verification of the Shift Monte Carlo code with the C5G7 reactor benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Sly, N. C.; Mervin, B. T. [Dept. of Nuclear Engineering, Univ. of Tennessee, 311 Pasqua Engineering Building, Knoxville, TN 37996-2300 (United States); Mosher, S. W.; Evans, T. M.; Wagner, J. C. [Oak Ridge National Laboratory, PO Box 2008, Oak Ridge, TN 37831 (United States); Maldonado, G. I. [Dept. of Nuclear Engineering, Univ. of Tennessee, 311 Pasqua Engineering Building, Knoxville, TN 37996-2300 (United States)

    2012-07-01

    Shift is a new hybrid Monte Carlo/deterministic radiation transport code being developed at Oak Ridge National Laboratory. At its current stage of development, Shift includes a parallel Monte Carlo capability for simulating eigenvalue and fixed-source multigroup transport problems. This paper focuses on recent efforts to verify Shift's Monte Carlo component using the two-dimensional and three-dimensional C5G7 NEA benchmark problems. Comparisons were made between the benchmark eigenvalues and those output by the Shift code. In addition, mesh-based scalar flux tally results generated by Shift were compared to those obtained using MCNP5 on an identical model and tally grid. The Shift-generated eigenvalues were within three standard deviations of the benchmark and MCNP5-1.60 values in all cases. The flux tallies generated by Shift were found to be in very good agreement with those from MCNP. (authors)
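
    The acceptance criterion quoted above (agreement within three standard deviations) reduces to a one-line check; a small sketch with illustrative values (not taken from the paper) follows.

```python
# Hedged sketch of a 3-sigma eigenvalue consistency check.
def within_three_sigma(k_code: float, sigma_code: float,
                       k_ref: float, sigma_ref: float = 0.0) -> bool:
    """True if the code eigenvalue agrees with the reference within 3 sigma."""
    combined = (sigma_code**2 + sigma_ref**2) ** 0.5
    return abs(k_code - k_ref) <= 3.0 * combined

# Illustrative values only:
print(within_three_sigma(1.18640, 0.00020, 1.18655))  # True
```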

  2. Verification of Monte Carlo transport codes FLUKA, GEANT4 and SHIELD for radiation protection purposes at relativistic heavy ion accelerators

    International Nuclear Information System (INIS)

    The crucial problem for radiation shielding design at heavy ion accelerator facilities with beam energies of several GeV/n is the source term problem. Experimental data on double differential neutron yields from thick targets irradiated with high-energy uranium nuclei are lacking. At present there are not many Monte Carlo multipurpose codes that can work with primary high-energy uranium nuclei. These codes use different physical models for simulating nucleus-nucleus reactions. Therefore, verification of the codes with available experimental data is very important for selection of the most reliable code for practical tasks. This paper presents comparisons of the FLUKA, GEANT4 and SHIELD code simulations with experimental data on neutron production at 1 GeV/n 238U beam interaction with a thick Fe target

  3. Verification of Burned Core Modeling Method for Monte Carlo Simulation of HANARO

    International Nuclear Information System (INIS)

    The reactor core has been managed well by the HANARO core management system called HANAFMS. The heterogeneity of the irradiation device and core made the neutronic analysis difficult and sometimes doubtful. To overcome this deficiency, MCNP was utilized in the neutron transport calculation of HANARO. For the most part, an MCNP model assuming that all fuels are fresh fuel assemblies showed acceptable analysis results for the design of experimental devices and facilities. However, it sometimes revealed insufficient results in designs that require good accuracy, such as neutron transmutation doping (NTD), because it did not consider the flux variation induced by depletion of the fuel. In this study, a previously proposed depleted-core modeling method was applied to build a burned-core model of HANARO and verified through a comparison of the calculated result from the depleted-core model and that from an experiment. The modeling method to establish a depleted-core model for the Monte Carlo simulation was verified by comparing the neutron flux distribution obtained by the zirconium activation method and the reaction rate of 30Si(n, γ)31Si obtained by a resistivity measurement method. As a result, the reaction rate of 30Si(n, γ)31Si agreed well, with about 3% difference. It was therefore concluded that the modeling method and resulting depleted-core model developed in this study can be a very reliable tool for the design of the planned experimental facility and a prediction of its performance in HANARO

  4. Verification of Burned Core Modeling Method for Monte Carlo Simulation of HANARO

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Dongkeun; Kim, Myongseop [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    The reactor core has been managed well by the HANARO core management system called HANAFMS. The heterogeneity of the irradiation device and core made the neutronic analysis difficult and sometimes doubtful. To overcome this deficiency, MCNP was utilized in the neutron transport calculation of HANARO. For the most part, an MCNP model assuming that all fuels are fresh fuel assemblies showed acceptable analysis results for the design of experimental devices and facilities. However, it sometimes revealed insufficient results in designs that require good accuracy, such as neutron transmutation doping (NTD), because it did not consider the flux variation induced by depletion of the fuel. In this study, a previously proposed depleted-core modeling method was applied to build a burned-core model of HANARO and verified through a comparison of the calculated result from the depleted-core model and that from an experiment. The modeling method to establish a depleted-core model for the Monte Carlo simulation was verified by comparing the neutron flux distribution obtained by the zirconium activation method and the reaction rate of 30Si(n, γ)31Si obtained by a resistivity measurement method. As a result, the reaction rate of 30Si(n, γ)31Si agreed well, with about 3% difference. It was therefore concluded that the modeling method and resulting depleted-core model developed in this study can be a very reliable tool for the design of the planned experimental facility and a prediction of its performance in HANARO.

  5. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    International Nuclear Information System (INIS)

    Monte Carlo (MC) methods provide the most accurate to-date dose calculations in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into treatment planning quality assurance process. This involves MC dose calculations for clinically produced treatment plans. To perform these calculations, a number of treatment plan parameters specifying radiation beam

  6. Formal verification of safety protocol in train control system

    OpenAIRE

    Zhang, Yan; TANG, TAO; Li, Keping; Mera Sanchez de Pedro, Jose Manuel; Zhu, Li; Zhao, Lin; Xu, Tianhua

    2011-01-01

    In order to satisfy the safety-critical requirements, the train control system (TCS) often employs a layered safety communication protocol to provide reliable services. However, both description and verification of the safety protocols may be formidable due to the system complexity. In this paper, interface automata (IA) are used to describe the safety service interface behaviors of safety communication protocol. A formal verification method is proposed to describe the safety communication pr...

  7. NES++: number system for encryption based privacy preserving speaker verification

    Science.gov (United States)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (i.e., smartphones, Google Glass), privacy preserving speaker verification receives much attention nowadays. Privacy preserving speaker verification can be achieved in many different ways, such as fuzzy vault and encryption. Encryption-based solutions are promising, as cryptography is based on solid mathematical foundations and the security properties can be easily analyzed in a well-established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, by which the computation overhead is greatly reduced. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system together with the packing technique. Our findings show that the proposed solution can fill the gap between speaker verification and the encryption scheme very well, and the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption-based privacy preserving speaker verification; the privacy protection and accuracy rate are not affected.
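
    The packing idea can be illustrated with plain integer arithmetic. In the sketch below (an assumption-laden toy, not the authors' scheme), several fixed-point values are packed into a single integer plaintext so that one expensive asymmetric encryption covers many numbers; slot widths must leave headroom for subsequent homomorphic additions.

```python
# Hedged sketch: pack fixed-point numbers into one integer plaintext.
SCALE = 10**6        # fixed-point scale for floating-point features (assumed)
SLOT_BITS = 64       # bits reserved per packed value, headroom included
OFFSET = 1 << 32     # shifts signed values into the non-negative range

def pack(values):
    """Pack a list of floats into one non-negative integer plaintext."""
    packed = 0
    for v in values:
        slot = int(round(v * SCALE)) + OFFSET
        assert 0 <= slot < (1 << SLOT_BITS)
        packed = (packed << SLOT_BITS) | slot
    return packed

def unpack(packed, n):
    """Invert pack(); returns the floats in their original order."""
    out = []
    for _ in range(n):
        slot = packed & ((1 << SLOT_BITS) - 1)
        out.append((slot - OFFSET) / SCALE)
        packed >>= SLOT_BITS
    return out[::-1]

print(unpack(pack([0.25, -1.5, 3.141592]), 3))  # [0.25, -1.5, 3.141592]
```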

  8. Verification and Validation Issues in Systems of Systems

    Directory of Open Access Journals (Sweden)

    Eric Honour

    2013-11-01

    The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  9. Verification of Transformer Restricted Earth Fault Protection by using the Monte Carlo Method

    OpenAIRE

    KRSTIVOJEVIC, J. P.; DJURIC, M. B.

    2015-01-01

    The results of a comprehensive investigation of the influence of current transformer (CT) saturation on restricted earth fault (REF) protection during power transformer magnetization inrush are presented. Since the inrush current during switch-on of unloaded power transformer is stochastic, its values are obtained by: (i) laboratory measurements and (ii) calculations based on the input data obtained by the Monte Carlo (MC) simulation. To make a detailed assessment of the curre...

  10. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    OpenAIRE

    Locke, C.; Zavgorodni, S.

    2008-01-01

    Monte Carlo (MC) methods provide the most accurate to-date dose calculations in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into treatment planning quality assurance process. This involves MC dose calculations for clinically produced treatment plans. To perform these calculations, a number of treatment plan parameters specifying radiation beam and patient geometries need to be transferred to MC codes, such as BEAMnrc and DOS...

  11. Safety Verification of Interconnected Hybrid Systems Using Barrier Certificates

    OpenAIRE

    Guobin Wang; Jifeng He; Jing Liu; Haiying Sun; Zuohua Ding; Miaomiao Zhang

    2016-01-01

    Safety verification determines whether any trajectory starting from admissible initial states would intersect with a set of unsafe states. In this paper, we propose a numerical method for verifying safety of a network of interconnected hybrid dynamical systems with a state constraint based on bilinear sum-of-squares programming. The safety verification is conducted by the construction of a function of states called barrier certificate. We consider a finite number of interconnected hybrid syst...

  12. An Optimized Signature Verification System for Vehicle Ad hoc NETwork

    OpenAIRE

    Mamun, Mohammad Saiful Islam; Miyaji, Atsuko

    2012-01-01

    This paper presents an efficient approach to an existing batch verification system on identity-based group signature (IBGS), which can be applied to any mobile ad hoc network device, including Vehicle Ad hoc Networks (VANET). We propose an optimized way to batch signatures in order to get maximum throughput from a device in a runtime environment. In addition, we minimize the number of pairing computations in the batch verification proposed by B. Qin et al. for large-scale VANET. We introduce a batch...

  13. Integrated testing and verification system for research flight software

    Science.gov (United States)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  14. NPP Temelin instrumentation and control system upgrade and verification

    International Nuclear Information System (INIS)

    Two units of the VVER-1000 type at the Czech nuclear power plant Temelin, which are under construction, are being upgraded with the latest instrumentation and control system delivered by WEC. To confirm that the functional designs of the new Reactor Control and Limitation System, Turbine Control System and Plant Control System are in compliance with the Czech customer's requirements, and that these requirements are compatible with the upgraded NPP Temelin technology, verification of the control systems has been performed. The method of transient analysis has been applied. Some details of the NPP Temelin Reactor Control and Limitation System verification are presented. (author)

  15. The EURATOM interim verification system of Natrium declarations

    International Nuclear Information System (INIS)

    Full text: Euratom's inspection scheme at the Thermal Oxide Reprocessing Plant THORP makes use of the Near Real Time Material Accountancy (NRTMA) tool developed by the operator to obtain In-Process Inventory (IPI) information in plutonium containing areas that are not accessible during operation. From a verification point of view, given appropriate authentication of the data provided, NRTMA is sufficient to fulfil the timeliness component of the safeguards approach without further need to stop the process and carry out interim inventories, the flow component being covered by routine verification activities on a continuous inspection basis. Since commissioning time, the validation of the NRTMA process models, data collection methods, computational techniques and data transmission protocols has been the subject of extensive analyses and agreement procedures involving several departments of the Euratom Safeguards Office (ESO). The Terms of Reference regarding the use of NRTMA by ESO are recorded in the NRTMA Framework Document. In this context, the Data Evaluation Sector of ESO was assigned the task of studying the statistical models and anomaly resolution tools used by NRTMA. The analysis rested on the description of a condensed operational process model and a related statistical model for the measurement errors and their propagation. It included the definition of particular testing variables suitable for safeguards anomaly detection and sensitivity studies based on Monte Carlo simulations. The conclusion of Euratom's statistical analysis was that, although BNFL's NRTMA data collection system is perfectly adequate for interim inventory data declaration, its anomaly detection margin, which was primarily designed to provide a very conservative process control tool, is too narrow for safeguards purposes. This high sensitivity, which, from the operator's point of view, may be a desirable feature for taking early corrective action, would create a resource and efficiency problem
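
    The flavour of the Monte Carlo sensitivity studies mentioned above can be conveyed with a toy error-propagation model; every number below is an illustrative assumption, not a THORP or NRTMA parameter.

```python
# Hedged sketch: propagate random and systematic measurement errors through
# a simplified material balance and estimate the false-alarm rate of a
# test statistic under a no-loss hypothesis.
import numpy as np

rng = np.random.default_rng(42)

def simulate_muf(n_trials, inventory=100.0, rel_random=0.003, rel_systematic=0.002):
    """Material-unaccounted-for values under a no-loss hypothesis (arbitrary units)."""
    systematic = rng.normal(0.0, rel_systematic * inventory, size=n_trials)
    random_err = rng.normal(0.0, rel_random * inventory, size=n_trials)
    return systematic + random_err

muf = simulate_muf(100_000)
threshold = 3.0 * muf.std()
print(f"false-alarm rate at 3-sigma threshold: {(abs(muf) > threshold).mean():.4%}")
```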

  16. Environmental radiation measurement in CTBT verification system

    International Nuclear Information System (INIS)

    This paper introduces the technical requirements of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) radionuclide stations, the CTBT-related activities carried out by the Japan Atomic Energy Research Institute (JAERI), and the ripple effects of the acquired radionuclide data on general research. The International Monitoring System (IMS), which is part of the CTBT verification regime, consists of 80 radionuclide air monitoring stations (of those, 40 stations monitor noble gas as well) and 16 certified laboratories that support these stations throughout the world. For radionuclide air monitoring under the CTBT, the stations collect particulates in the atmosphere on a filter and determine by gamma-ray spectrometry the presence or absence of any radionuclides (e.g. 140Ba, 131I, 99Mo, 132Te, 103Ru, 141Ce, 147Nd, 95Zr, etc.) that offer clear evidence of a possible nuclear explosion. Minimum technical requirements are stringently set for the radionuclide air monitoring stations: 500 m3/h air flow rate, 24-hour acquisition time, 10 to 30 µBq/m3 detection sensitivity for 140Ba, and less than 7 consecutive days, or a total of 15 days, a year of shutdown at the stations. For noble gas monitoring, on the other hand, the stations separate Xe from the gas elements in the atmosphere and, after purifying and concentrating it, measure 4 nuclides, 131mXe, 133Xe, 133mXe, and 135Xe, by gamma-ray spectrometry or the beta-gamma coincidence method. Minimum technical requirements are also set for the noble gas measurement: 0.4 m3/h air flow rate, a full capacity of 10 m3, and 1 mBq/m3 detection sensitivity for 133Xe, etc. At the request of the Ministry of Education, Culture, Sports, Science and Technology, JAERI is currently undertaking the establishment of the CTBT radionuclide monitoring stations at Takasaki (both particulate and noble gas) and Okinawa (particulate), the certified laboratory at JAERI Tokai, and the National Data Center (NDC 2) at JAERI Tokai, which handles radionuclide data, as

  17. Professional verification a guide to advanced functional verification

    CERN Document Server

    Wilcox, Paul

    2007-01-01

    The Profession of Verification.- Verification Challenges.- Advanced Functional Verification.- Successful Verification.- Professional Verification.- The Unified Verification Methodology.- UVM System-Level Design.- Control Digital Subsystems.- Algorithmic Digital Subsystems.- Analog/RF Subsystems.- Integration and System Verification.- Tools of the Trade.- System-Level Design.- Formal Verification Tools.- Testbench Development.- Advanced Testbenches.- Hardware-Based Verification.

  18. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    A seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method for constructing function models and the IDEF1x method for making information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  19. Modular Verification of Interactive Systems with an Application to Biology

    Directory of Open Access Journals (Sweden)

    P. Milazzo

    2011-01-01

    We propose sync-programs, an automata-based formalism for the description of biological systems, and a modular verification technique for such a formalism that allows properties expressed in the universal fragment of CTL to be verified on suitably chosen fragments of models, rather than on whole models. As an application we show the modelling of the lac operon regulation process and the modular verification of some properties. Verification of properties is performed by using the NuSMV model checker and we show that by applying our modular verification technique we can verify properties in shorter times than those necessary to verify the same properties in the whole model.

  20. The Design of Computerized Procedure System to Support Concurrent Verification

    International Nuclear Information System (INIS)

    Concurrent verification is a human performance tool for a task or job in which potential adverse outcomes lead to immediate and possibly irreversible harm to the plant or personnel. One of the main objectives of concurrent verification is to prevent human error in selecting control equipment. In the Shin Ulchin 1 and 2 Main Control Room (MCR), a computerized procedure system (CPS) supports the operator's concurrent verification process. The performer and the verifier use the same CPS screen, each at his/her own workstation, and the verifier monitors the performer's control actions from his/her own workstation. The result of the concurrent verification is recorded in the CPS, and the step controller, who executes the procedure, checks the result on his/her CPS screen. The proposed design will be verified and validated during the HFE V and V of the Shin Ulchin 1 and 2 MCR

  1. Probabilistic verification of partially observable dynamical systems

    OpenAIRE

    Gyori, Benjamin M.; Paulin, Daniel; Palaniappan, Sucheendra K.

    2014-01-01

    The construction and formal verification of dynamical models is important in engineering, biology and other disciplines. We focus on non-linear models containing a set of parameters governing their dynamics. The value of these parameters is often unknown and not directly observable through measurements, which are themselves noisy. When treating parameters as random variables, one can constrain their distribution by conditioning on observations and thereby constructing a posterior probability ...

  2. Finger-print based human verification system

    OpenAIRE

    Klopčič, Uroš

    2009-01-01

    The diploma thesis presents an algorithm for verification based on fingerprints. In order to achieve a simple and modular design, the algorithm is divided into a number of steps. As input, the algorithm takes grayscale fingerprint images. First, segmentation is performed, where the background is separated from the area which represents the fingerprint. This is followed by the calculation of the orientation field of the fingerprint using the gradient method, and local frequency estimation. Both val...
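
    The gradient-based orientation field mentioned above has a standard block-wise formulation, sketched below; the block size is an assumption and the thesis' exact parameters are not reproduced.

```python
# Hedged sketch: block-wise ridge orientation from averaged gradient products.
import numpy as np

def orientation_field(img: np.ndarray, block: int = 16) -> np.ndarray:
    """Return one ridge orientation (radians, mod pi) per block of the image."""
    gy, gx = np.gradient(img.astype(float))
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(theta.shape[0]):
        for j in range(theta.shape[1]):
            sx = gx[i*block:(i+1)*block, j*block:(j+1)*block]
            sy = gy[i*block:(i+1)*block, j*block:(j+1)*block]
            # Doubled-angle averaging keeps orientations (mod pi) consistent.
            theta[i, j] = 0.5 * np.arctan2(2 * (sx * sy).sum(),
                                           ((sx**2) - (sy**2)).sum())
    return theta
```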

  3. Verification measurements and clinical evaluation of the iPlan RT Monte Carlo dose algorithm for 6 MV photon energy

    Science.gov (United States)

    Petoukhova, A. L.; van Wingerden, K.; Wiggenraad, R. G. J.; van de Vaart, P. J. M.; van Egmond, J.; Franken, E. M.; van Santvoort, J. P. C.

    2010-08-01

    This study presents data for verification of the iPlan RT Monte Carlo (MC) dose algorithm (BrainLAB, Feldkirchen, Germany). MC calculations were compared with pencil beam (PB) calculations and verification measurements in phantoms with lung-equivalent material, air cavities or bone-equivalent material to mimic head and neck and thorax and in an Alderson anthropomorphic phantom. Dosimetric accuracy of MC for the micro-multileaf collimator (MLC) simulation was tested in a homogeneous phantom. All measurements were performed using an ionization chamber and Kodak EDR2 films with Novalis 6 MV photon beams. Dose distributions measured with film and calculated with MC in the homogeneous phantom are in excellent agreement for oval, C and squiggle-shaped fields and for a clinical IMRT plan. For a field with completely closed MLC, MC is much closer to the experimental result than the PB calculations. For fields larger than the dimensions of the inhomogeneities the MC calculations show excellent agreement (within 3%/1 mm) with the experimental data. MC calculations in the anthropomorphic phantom show good agreement with measurements for conformal beam plans and reasonable agreement for dynamic conformal arc and IMRT plans. For 6 head and neck and 15 lung patients a comparison of the MC plan with the PB plan was performed. Our results demonstrate that MC is able to accurately predict the dose in the presence of inhomogeneities typical for head and neck and thorax regions with reasonable calculation times (5-20 min). Lateral electron transport was well reproduced in MC calculations. We are planning to implement MC calculations for head and neck and lung cancer patients.
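
    Agreement criteria such as the 3%/1 mm quoted above are commonly evaluated with a gamma index. A compact 1D sketch follows (clinical tools use 2D/3D search with interpolation; the profile below is synthetic).

```python
# Hedged sketch of a 1D global gamma-index evaluation.
import numpy as np

def gamma_1d(dose_eval, dose_ref, dx_mm, dose_tol=0.03, dist_tol_mm=1.0):
    """Return the gamma value at each reference point (pass where <= 1)."""
    x = np.arange(len(dose_ref)) * dx_mm
    d_norm = dose_tol * dose_ref.max()           # global dose normalization
    gammas = np.empty(len(dose_ref))
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dist_tol_mm) ** 2
        dd2 = ((dose_eval - di) / d_norm) ** 2
        gammas[i] = np.sqrt((dist2 + dd2).min())
    return gammas

x = np.arange(200) * 0.25                        # 0.25 mm grid
ref = np.exp(-((x - 25.0) / 8.0) ** 2)
ev = np.exp(-((x - 25.5) / 8.0) ** 2)            # profile shifted by 0.5 mm
print(f"pass rate: {(gamma_1d(ev, ref, 0.25) <= 1.0).mean():.1%}")
```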

  4. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    CERN Document Server

    Locke, C

    2008-01-01

    The Monte Carlo (MC) method provides the most accurate dose calculations to date in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into the treatment planning quality assurance process. This process involves MC dose calculations for the treatment plans produced clinically. To perform these calculations, a number of treatment plan parameters specifying radiation beam and patient geometries need to be transferred to MC codes such as BEAMnrc and DOSXYZnrc. Extracting these parameters from DICOM files is not a trivial task that has previously been performed mostly using Matlab-based software. This paper describes the DICOM tags that contain information required for MC modeling of conformal and IMRT plans, and reports the development of an in-house DICOM interface through a library (named Vega) of platform-independent, object-oriented C++ codes. The Vega library is small and succinct, offering just the fundamental functions for reading/modifying/writing DICOM files in a ...

  5. Advancing system-level verification using UVM in SystemC

    OpenAIRE

    Barnasconi, Martin; Pêcheux, François; Vörtler, Thilo

    2014-01-01

    This paper introduces the Universal Verification Methodology (UVM) using SystemC and C++ (UVM-SystemC), to advance system-level verification practices. UVM-SystemC enables the creation of a structured, modular, configurable and reusable test bench environment. Unlike other initiatives to create UVM in SystemC, the presented proof-of-concept class library uses identical constructs as defined in the UVM standard for test and sequence creation, verification component and test bench configuration...

  6. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details such as name, designation, division, etc. All transactions are recorded in the Simputer with time and date for future record. This system finds extensive applications in mobile identity verification in nuclear and other industries. (author)

  7. Prompt γ-ray activation analysis of Martian analogues at the FRM II neutron reactor and the verification of a Monte Carlo planetary radiation environment model

    International Nuclear Information System (INIS)

    Planetary radiation environment modelling is important to assess the habitability of a planetary body. It is also useful when interpreting the γ-ray data produced by natural emissions from radioisotopes or prompt γ-ray activation analysis. γ-ray spectra acquired in orbit or in-situ by a suitable detector can be converted into meaningful estimates of the concentration of certain elements on the surface of a planet. This paper describes the verification of a Monte Carlo model developed using the MCNPX code at University of Leicester. The model predicts the performance of a geophysical package containing a γ-ray spectrometer operating at a depth of up to 5 m. The experimental verification of the Monte Carlo model was performed at the FRM II facility in Munich, Germany. The paper demonstrates that the model is in good agreement with the experimental data and can be used to model the performance of an in-situ γ-ray spectrometer.

  8. Verification for a GEOSHIELD application to the SMART vessel fluence by a Monte Carlo simulation

    International Nuclear Information System (INIS)

    In general, the two-dimensional discrete ordinates transport code DORT has been used to evaluate neutron and gamma fluxes during the shielding design of nuclear reactors. Preparing input data, such as the geometrical model and the source distribution, and processing the output of the shielding analysis are complicated and time-consuming for shielding designers. The GEOSHIELD code was developed to save the time spent preparing a geometrical model and processing the output. The GEOSHIELD code is composed of a module for geometrical modeling using combinatorial geometry, a module for fixed-source redistribution, a module for DORT processing, and a module for graphical processing of the output activities. The evaluation of the fluence of fast neutrons with energies above 1.0 MeV is very important for verifying the integrity of the internal structures, including the pressure vessel. The GEOSHIELD code was applied to evaluate the fast neutron fluence distribution on the internal structures inside the reactor pressure vessel of the SMART reactor, and MCNP was used to verify the result of the GEOSHIELD calculation. The GEOSHIELD and MCNP results showed good agreement with each other. (author)

  9. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Purpose: Prediction of PET images on the basis of an analytical filtering approach for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification with the GATE/GEANT4 Monte Carlo simulation codes. Methods: In this study, we performed two experiments for validation of the β+-isotope yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of predicting β+-yields as a function of irradiated proton energies. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range differences between the filtered β+-yields and the MC-simulated β+-yields at the distal fall-off region are within 1.5 mm for all materials used. The findings validate the usefulness of the analytical filtering model for range verification of proton therapy in GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields with the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies between the MC-simulated and predicted β+-yield distributions were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using

  10. Verification of Transformer Restricted Earth Fault Protection by using the Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    KRSTIVOJEVIC, J. P.

    2015-08-01

    The results of a comprehensive investigation of the influence of current transformer (CT) saturation on restricted earth fault (REF) protection during power transformer magnetization inrush are presented. Since the inrush current during switch-on of an unloaded power transformer is stochastic, its values are obtained by: (i) laboratory measurements and (ii) calculations based on input data obtained by Monte Carlo (MC) simulation. To make a detailed assessment of the current transformer performance, the uncertain input data for the CT model were obtained by applying the MC method. In this way, different levels of remanent flux in the CT core are taken into consideration. With the generated CT secondary currents, the algorithm for REF protection based on phase comparison in the time domain is tested. On the basis of the obtained results, a method of adjusting the triggering threshold in order to ensure safe operation during transients, and thereby improve the algorithm's security, has been proposed. The obtained results indicate that power transformer REF protection would be enhanced by using the proposed adjustment of the triggering threshold in the algorithm based on phase comparison in the time domain.
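
    The role of the MC method here is to supply uncertain CT input data, such as the remanent flux level. A stub-level sketch follows; the sampling range and the decision rule are placeholders, not the paper's models.

```python
# Hedged sketch: Monte Carlo sampling of an uncertain CT parameter and
# collection of the resulting (stubbed) relay decisions.
import random

def run_trial(remanence_pu: float) -> bool:
    # Stub: a real study would drive a CT saturation model and the REF
    # phase-comparison algorithm with an inrush-current waveform here.
    return remanence_pu < 0.6   # placeholder decision: "secure" below 0.6 pu

random.seed(1)
trials = [run_trial(random.uniform(-0.8, 0.8)) for _ in range(10_000)]
print(f"secure operations: {sum(trials) / len(trials):.1%}")
```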

  11. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    Science.gov (United States)

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J.

    2010-08-01

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC

  12. FORMAL VERIFICATION OF REAL TIME DISTRIBUTED SYSTEMS USING B METHOD

    Directory of Open Access Journals (Sweden)

    AYMAN M. WAHBA

    2011-04-01

    Throughout the previous years, the complexity and size of digital systems have increased dramatically, and as a result the phases of the design flow have changed a lot. Simulation used to be the most common procedure to assure the correctness of a system under design, but it cannot exhaustively examine all the execution scenarios of the system. Formal verification is a different approach that validates a system by formally reasoning about its behavior: the system implementation is checked against the requirements or the properties to be satisfied. The most common paradigms are based on theorem proving, model checking and language containment. This paper presents an application of the B method to the formalization and verification of a simplified flight control system, as an example of a system consisting of a number of distributed computing devices that are interconnected through digital communication channels.

  13. Standard guide for acoustic emission system performance verification

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 System performance verification methods launch stress waves into the examination article on which the sensor is mounted. The resulting stress wave travels in the examination article and is detected by the sensor(s) in a manner similar to acoustic emission. 1.2 This guide describes methods which can be used to verify the response of an Acoustic Emission system including sensors, couplant, sensor mounting devices, cables and system electronic components. 1.3 Acoustic emission system performance characteristics, which may be evaluated using this document, include some waveform parameters, and source location accuracy. 1.4 Performance verification is usually conducted prior to beginning the examination. 1.5 Performance verification can be conducted during the examination if there is any suspicion that the system performance may have changed. 1.6 Performance verification may be conducted after the examination has been completed. 1.7 The values stated in SI units are to be regarded as standard. No other u...

  14. Verification and Validation of Model-Based Autonomous Systems

    Science.gov (United States)

    Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2001-01-01

    This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

  15. Diffusion Monte Carlo: Exponentially inefficient for large systems?

    CERN Document Server

    Nemec, Norbert

    2009-01-01

    The computational cost of a Monte Carlo algorithm can only be meaningfully discussed when taking into account the magnitude of the resulting statistical error. Aiming for a fixed error per particle, we study the scaling behavior of the diffusion Monte Carlo method for large quantum systems. We identify the correlation within the population of walkers as the dominant scaling factor for large systems. While this factor is negligible for small and medium sized systems that are typically studied, it ultimately shows exponential scaling beyond system sizes that can be estimated straightforwardly for each specific system.
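
    The scaling argument can be made concrete with the textbook formula for the variance of a mean of equally correlated samples; the sketch below (illustrative, not the paper's analysis) shows how even a small pairwise correlation among walkers inflates the statistical error.

```python
# Hedged sketch: standard error of a mean of n equally correlated samples.
import numpy as np

def error_of_mean(samples: np.ndarray, rho: float) -> float:
    """Std. error of the mean assuming a common pairwise correlation rho."""
    n = len(samples)
    return float(np.sqrt(samples.var(ddof=1) / n * (1.0 + (n - 1) * rho)))

rng = np.random.default_rng(0)
walkers = rng.normal(size=1000)          # stand-in for per-walker estimates
for rho in (0.0, 0.01, 0.1):
    print(f"rho={rho:<5} error={error_of_mean(walkers, rho):.4f}")
```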

  16. Formal Verification of the Danish Railway Interlocking Systems

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method for formal verification of the new Danish railway interlocking systems. We made a generic and reconfigurable model of the behaviors and high-level safety properties of non-collision and nonderailment. This model accommodates sequential release – a new feature in...... railway networks of industrial size....

  17. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems. 1 Project background and state of the art Over the next 10 years all Danish railway...

  18. Verification of DeCART/CAPP code system for VHTR by HTTR core analysis

    International Nuclear Information System (INIS)

    The DeCART/CAPP code system has been developed and verified against numerical benchmark calculations for an HTTR. The reference calculations were carried out with the Monte Carlo code McCARD, in which a double heterogeneity model was used. Verification results show that the DeCART/CAPP code system gives less negative MTC and RTC than the McCARD code, and thus the DeCART code overestimates the multiplication factors at states with high moderator and reflector temperatures. However, the DeCART/CAPP code system predicts a more negative FTC than the McCARD code does. In the depletion calculation for the HTTR single cell and single block, the error of the DeCART/CAPP code system increases with burnup. (authors)

  19. Verification of tritium production evaluation procedure using Monte Carlo code MCNP for in-pile test of fusion blanket with JMTR

    International Nuclear Information System (INIS)

    To accurately evaluate the total amount of tritium production in tritium breeding materials during in-pile tests with the JMTR, a 'tritium monitor' has been produced; the total tritium generation was evaluated using the 'tritium monitor' in a preliminary in-pile mock-up, and the tritium production evaluation procedure was verified by using the Monte Carlo code MCNP and the nuclear cross-section library FSXLIBJ3R2. A Li-Al alloy (Li 3.4 wt.%, 95.5% enrichment of 6Li) was selected as the tritium monitor material for evaluating the total amount of tritium production in highly 6Li-enriched materials. From the results of the preliminary experiment, the calculated amounts of total tritium production at each 'tritium monitor' installed in the preliminary in-pile mock-up were about 50-290% higher than the measured values. Concerning the tritium measurement, an increase of measurement error due to tritium leakage from the measuring system was found when measuring small amounts of tritium (0.2-0.7 mCi in the tritium monitor). A tendency to overestimate the calculated thermal neutron flux in the range of 1-6×1013 n cm-2 s-1 was found in the JMTR, and the reason may be the beryllium cross-section database in JENDL3.2

  20. ECG based biometrics verification system using LabVIEW

    OpenAIRE

    Sunil Kumar Singla; Ankit Sharma

    2010-01-01

    Biometric-based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, finger print, palm print or hand geometry, face, iris etc.) are used to verify the authenticity of a person. These parameters are good to be used as biometric parameters but do not provide the guarantee that the person is present and alive. As voice can be copied, finger print ca...

  1. Learning-Based Compositional Verification for Synchronous Probabilistic Systems

    OpenAIRE

    Feng L.; Han T.; Kwiatkowska M.; Parker D.

    2011-01-01

    We present novel techniques for automated compositional verification of synchronous probabilistic systems. First, we give an assume-guarantee framework for verifying probabilistic safety properties of systems modelled as discrete-time Markov chains. Assumptions about system components are represented as probabilistic finite automata (PFAs) and the relationship between components and assumptions is captured by weak language inclusion. In order to implement this framework, we develop a semi-alg...

  2. SU-E-T-384: Experimental Verification of a Monte Carlo Linear Accelerator Model Using a Radiochromic Film Stack Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    McCaw, T; Culberson, W; DeWerd, L [University of Wisconsin Medical Radiation Research Center, Madison, WI (United States)

    2014-06-01

    Purpose: To experimentally verify a Monte Carlo (MC) linear accelerator model for the simulation of intensity-modulated radiation therapy (IMRT) treatments of moving targets. Methods: A Varian Clinac™ 21EX linear accelerator was modeled using the EGSnrc user code BEAMnrc. The mean energy, radial-intensity distribution, and divergence of the electron beam incident on the bremsstrahlung target were adjusted to achieve agreement between simulated and measured percentage-depth-dose and transverse field profiles for a 6 MV beam. A seven-field step-and-shoot IMRT lung procedure was prepared using Varian Eclipse™ treatment planning software. The plan was delivered using a Clinac™ 21EX linear accelerator and measured with a Gafchromic™ EBT2 film stack dosimeter (FSD) in two separate static geometries: within a cylindrical water-equivalent-plastic phantom and within an anthropomorphic chest phantom. Two measurements were completed in each setup. The dose distribution for each geometry was simulated using the EGSnrc user code DOSXYZnrc. MC geometries of the treatment couch, cylindrical phantom, and chest phantom were developed by thresholding CT data sets using MATLAB™. The FSD was modeled as water. The measured and simulated dose distributions were normalized to the median dose within the FSD. Results: Using an electron beam with a mean energy of 6.05 MeV, a Gaussian radial-intensity distribution with a full width at half maximum of 1.5 mm, and a divergence of 0°, the measured and simulated dose profiles agree within 1.75% and 1 mm. Measured and simulated dose distributions within both the cylindrical and chest phantoms agree within 3% over 94% of the FSD volume. The overall uncertainty in the FSD measurements is 3.1% (k=1). Conclusion: MC simulations agree with FSD measurements within measurement uncertainty, thereby verifying the accuracy of the linear accelerator model for the simulation of IMRT treatments of static geometries. The experimental verification

  3. Morse Monte Carlo Radiation Transport Code System

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    1975-02-01

    The report contains sections containing descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  4. MORSE Monte Carlo radiation transport code system

    International Nuclear Information System (INIS)

    This report is an addendum to the MORSE report, ORNL-4972, originally published in 1975. This addendum contains descriptions of several modifications to the MORSE Monte Carlo Code, replacement pages containing corrections, Part II of the report which was previously unpublished, and a new Table of Contents. The modifications include a Klein-Nishina estimator for gamma rays. Use of such an estimator required changing the cross section routines to process pair production and Compton scattering cross sections directly from ENDF tapes and writing a new version of subroutine RELCOL. Another modification is the use of free-form input for the SAMBO analysis data. This required changing subroutine SCORIN and adding the new subroutine RFRE. References are updated, and errors in the original report have been corrected

  5. Initial performance of the advanced inventory verification sample system (AVIS)

    International Nuclear Information System (INIS)

    This paper describes the requirements, design and initial performance of the Advanced Inventory Verification Sample System (AVIS), a non-destructive assay (NDA) system to measure small samples of bulk mixed uranium-plutonium oxide (MOX) materials (powders and pellets). The AVIS design has evolved from previously developed conceptual physics and engineering designs for the Inventory Sample Verification System (INVS), a safeguards system for nondestructive assay of small samples. The AVIS is an integrated gamma-neutron system. Jointly designed by the Nuclear Material Control Center (NMCC) and the Los Alamos National Laboratory (LANL), AVIS is intended to meet a performance specification of a total measurement uncertainty of less than 0.5% in the neutron (240Pu-effective) measurement. This will allow the AVIS to replace destructive chemical analysis for many samples, with concomitant cost, exposure and waste generation savings for the facility. Data taken to date confirming the performance of the AVIS is presented.
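
    A total-measurement-uncertainty budget of the kind implied by the 0.5% specification can be checked by combining independent relative components in quadrature; the component names and values below are purely illustrative.

```python
# Hedged sketch: combine independent relative uncertainties in quadrature
# and compare the total against a 0.5% specification.
components = {               # relative standard uncertainties (fractions)
    "counting_statistics": 0.0030,
    "calibration":         0.0025,
    "positioning":         0.0020,
    "background":          0.0010,
}
total = sum(u**2 for u in components.values()) ** 0.5
print(f"total = {total:.4f} (spec 0.0050): {'PASS' if total <= 0.005 else 'FAIL'}")
```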

  6. Initial performance of the advanced inventory verification sample system (AVIS)

    Energy Technology Data Exchange (ETDEWEB)

    Marlow, Johnna B [Los Alamos National Laboratory; Swinhoe, Martyn T [Los Alamos National Laboratory; Menlove, Howard O [Los Alamos National Laboratory; Rael, Carlos D [Los Alamos National Laboratory

    2009-01-01

    This paper describes the requirements, design and initial performance of the Advanced Inventory Verification Sample System (AVIS), a non-destructive assay (NDA) system to measure small samples of bulk mixed uranium-plutonium oxide (MOX) materials (powders and pellets). The AVIS design has evolved from previously developed conceptual physics and engineering designs for the Inventory Sample Verification System (INVS), a safeguards system for nondestructive assay of small samples. The AVIS is an integrated gamma-neutron system. Jointly designed by the Nuclear Material Control Center (NMCC) and the Los Alamos National Laboratory (LANL), AVIS is intended to meet a performance specification of a total measurement uncertainty of less than 0.5% in the neutron (240Pu-effective) measurement. This will allow the AVIS to replace destructive chemical analysis for many samples, with concomitant cost, exposure and waste generation savings for the facility. Data taken to date confirming the performance of the AVIS are presented.

  7. Conducting Verification and Validation of Multi- Agent Systems

    Directory of Open Access Journals (Sweden)

    Nedhal Al Saiyd

    2012-10-01

    Full Text Available Verification and Validation (V&V) is a series of technical and managerial activities performed by the system tester, not the system developer, in order to improve system quality and reliability and to assure that the product satisfies the users' operational needs. Verification is the assurance that the products of a particular development phase are consistent with the requirements of that phase and the preceding phase(s), while validation is the assurance that the final product meets the system requirements. An outside agency can be used to perform V&V, which is referred to as Independent V&V (IV&V); alternatively, V&V can be performed by a group within the organization other than the developers, referred to as Internal V&V. Use of V&V often accompanies testing, can improve quality assurance, and can reduce risk. This paper provides guidelines for performing V&V of Multi-Agent Systems (MAS).

  8. Towards Verification of Constituent Systems through Automated Proof

    DEFF Research Database (Denmark)

    Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R

    2014-01-01

    This paper explores verification of constituent systems within the context of the Symphony tool platform for Systems of Systems (SoS). Our SoS modelling language, CML, supports various contractual specification elements, such as state invariants and operation preconditions, which can be used to specify contractual obligations on the constituent systems of a SoS. To support verification of these obligations we have developed a proof obligation generator and theorem prover plugin for Symphony. The latter uses the Isabelle/HOL theorem prover to automatically discharge the proof obligations arising from a CML model. Our hope is that the resulting proofs can then be used to formally verify the conformance of each constituent system, which in turn would result in a dependable SoS.

  9. Verification of heterogeneous multi-agent system using MCMAS

    Science.gov (United States)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model the autonomous behaviours of heterogeneous multi-agent systems such that it can be verified that they will always operate within predefined mission requirements and constraints. This is done by using formal methods, with an abstraction of the behaviour modelling and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system in the case studies is extended gradually by increasing the number of agents and the complexity of their functions. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate multi-agent systems; it successfully verifies the targeted behaviours of the team-level autonomous systems. The verification results retrospectively help improve the design of the decision-making algorithms by considering additional agents and behaviours over the three steps of scenario modification. Consequently, the last scenario deals with a system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication relay capabilities.
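
    At its core, the exhaustive verification a model checker such as MCMAS performs can be illustrated in miniature: enumerate every reachable joint state of the agents and check a property in each. The two-agent convoy abstraction below is invented for illustration and is far simpler than the scenarios in the paper.

```python
from collections import deque

# toy convoy: (leader, follower) positions on a line of cells 0..5;
# the follower may only advance while it stays at least one cell behind
def successors(state):
    leader, follower = state
    nxt = []
    if leader < 5:
        nxt.append((leader + 1, follower))
    if follower < leader - 1:
        nxt.append((leader, follower + 1))
    return nxt

def check_invariant(initial, invariant):
    """Breadth-first search of the reachable state space; returns a
    violating state if one exists, otherwise None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return s
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None

# safety property: the follower never occupies the leader's cell
violation = check_invariant((2, 0), invariant=lambda s: s[0] != s[1])
print("property holds in all reachable states" if violation is None
      else f"counterexample: {violation}")
```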

  10. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  11. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  12. Image-based fingerprint verification system using LabVIEW

    OpenAIRE

    Sunil K. Singla

    2008-01-01

    Biometric-based identification/verification systems provide a solution to the security concerns in the modern world where machine is replacing human in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometrics. Fingerprint biometric systems are either minutiae-based or pattern learning (image) based. The minutiae-based algorithm depends upon the local discontinuities in the ridge flow pattern and is used when template size is important …

  13. A Proof System for Compositional Verification of Probabilistic Concurrent Processes

    OpenAIRE

    Mio, Matteo; Simpson, Alex

    2013-01-01

    We present a formal proof system for compositional verification of probabilistic concurrent processes. Processes are specified using an SOS-style process algebra with probabilistic operators. Properties are expressed using a probabilistic modal μ-calculus. And the proof system is formulated as a sequent calculus in which sequents are given a quantitative interpretation. A key feature is that the probabilistic scenario is handled by introducing the notion of Markov proof, according to which pr...

  14. Measurability and Safety Verification for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    Fränzle, Martin; Hahn, Ernst Moritz; Hermanns, Holger

    2011-01-01

    Dealing with the interplay of randomness and continuous time is important for the formal verification of many real systems. Considering both facets is especially important for wireless sensor networks, distributed control applications, and many other systems of growing importance. […] We enhance tool support to work effectively on such general models. Experimental evidence is provided demonstrating the applicability of our approach on three case studies, tackled using a prototypical implementation.

  15. Advanced NSTS propulsion system verification study

    Science.gov (United States)

    Wood, Charles

    1989-01-01

    The merits of propulsion system development testing are discussed. The existing data base of technical reports and specialists is utilized in this investigation. The study encompassed a review of all available test reports of propulsion system development testing for the Saturn stages, the Titan stages, and the Space Shuttle main propulsion system. The knowledge on propulsion system development and system testing available from specialists and managers was also 'tapped' for inclusion.

  16. Meaningful timescales from Monte Carlo simulations of molecular systems

    CERN Document Server

    Costa, Liborio I

    2016-01-01

    A new Markov Chain Monte Carlo method for simulating the dynamics of molecular systems with atomistic detail is introduced. In contrast to traditional Kinetic Monte Carlo approaches, where the state of the system is associated with minima in the energy landscape, in the proposed method, the state of the system is associated with the set of paths traveled by the atoms and the transition probabilities for an atom to be displaced are proportional to the corresponding velocities. In this way, the number of possible state-to-state transitions is reduced to a discrete set, and a direct link between the Monte Carlo time step and true physical time is naturally established. The resulting rejection-free algorithm is validated against event-driven molecular dynamics: the equilibrium and non-equilibrium dynamics of hard disks converge to the exact results with decreasing displacement size.

  17. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
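
    As a concrete (if much simpler) companion to the abstract, a plain forward Monte Carlo estimate of the mean unavailability of a single repairable component, a two-state continuous-time Markov process, is sketched below. The failure rate, repair rate and mission time are invented illustration values, and none of the paper's adjoint-based weighting is applied.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu = 1e-3, 1e-1      # assumed failure and repair rates (per hour)
T = 1000.0                # mission time (hours)
n_trials = 20_000

downtime = np.empty(n_trials)
for i in range(n_trials):
    t, up, down = 0.0, True, 0.0
    while t < T:
        dwell = rng.exponential(1.0 / (lam if up else mu))
        if not up:
            down += min(dwell, T - t)   # accumulate time spent failed
        t += dwell
        up = not up
    downtime[i] = down

print(f"mean unavailability over [0, T]: {downtime.mean() / T:.3e}")
print(f"steady-state value lam/(lam+mu): {lam / (lam + mu):.3e}")
```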

  18. The verification of neutron activation analysis support system (cooperative research)

    International Nuclear Information System (INIS)

    The neutron activation analysis support system is a system with which even a user who has little experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, the functions, usability, precision and accuracy of the analysis, and other aspects of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer and analyzer installed in the JRR-3M PN-3 facility, and the analysis software KAYZERO/SOLCOI based on the k0 method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field and analysis of three kinds of environmental standard samples were carried out. The k0 method adopted in this system has recently been used primarily in Europe, and it is an analysis method which can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values certified as analytical data of the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed to an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)

  19. Monitoring and Commissioning Verification Algorithms for CHP Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brambley, Michael R.; Katipamula, Srinivas; Jiang, Wei

    2008-03-31

    This document provides the algorithms for CHP system performance monitoring and commissioning verification (CxV). It starts by presenting system-level and component-level performance metrics, followed by descriptions of algorithms for performance monitoring and commissioning verification using the metrics presented earlier. Verification of commissioning is accomplished essentially by comparing actual measured performance to benchmarks for performance provided by the system integrator and/or component manufacturers. The results of these comparisons are then automatically interpreted to provide conclusions regarding whether the CHP system and its components have been properly commissioned, and where problems are found, guidance is provided for corrections. A discussion of uncertainty handling is then provided, which is followed by a description of how simulation models can be used to generate data for testing the algorithms. A model is described for simulating a CHP system consisting of a micro-turbine, an exhaust-gas heat recovery unit that produces hot water, an absorption chiller and a cooling tower. The process for using this model to generate data for testing the algorithms for a selected set of faults is described. The next section applies the algorithms developed to CHP laboratory and field data to illustrate their use. The report then concludes with a discussion of the need for laboratory testing of the algorithms on physical CHP systems and identification of the recommended next steps.
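
    The heart of the commissioning-verification logic described, a measured performance metric compared against a benchmark with an allowance for uncertainty, reduces to a few lines. The efficiency metric, benchmark value and tolerance below are illustrative assumptions, not numbers from the report.

```python
def electrical_efficiency(power_kw: float, fuel_kw: float) -> float:
    """Electrical efficiency = net electric output / fuel energy input."""
    return power_kw / fuel_kw

def cx_verified(measured: float, benchmark: float, tolerance: float = 0.05) -> bool:
    """Pass if measured performance is within `tolerance` (fractional)
    of the benchmark provided by the integrator or manufacturer."""
    return measured >= benchmark * (1.0 - tolerance)

# illustrative micro-turbine reading: 28 kW electric from 112 kW of fuel input
eta = electrical_efficiency(28.0, 112.0)
print(f"measured electrical efficiency: {eta:.3f}")
print("commissioning OK" if cx_verified(eta, benchmark=0.26) else "investigate")
```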

  20. Verification and Validation Plan for Flight Performance Requirements on the CEV Parachute Assembly System

    Science.gov (United States)

    Morris, Aaron L.; Olson, Leah M.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) is engaged in a multi-year design and test campaign aimed at qualifying a parachute recovery system for human use on the Orion Spacecraft. Orion has parachute flight performance requirements that will ultimately be verified through the use of Monte Carlo multi-degree of freedom flight simulations. These simulations will be anchored by real world flight test data and iteratively improved to provide a closer approximation to the real physics observed in the inherently chaotic inflation and steady state flight of the CPAS parachutes. This paper will examine the processes necessary to verify the flight performance requirements of the human rated spacecraft. The focus will be on the requirements verification and model validation planned on CPAS.

  1. System maintenance verification and validation plan for the TWRS controlled baseline database system

    International Nuclear Information System (INIS)

    TWRS Controlled Baseline Database, formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the verification and validation approach for system documentation changes within the database system.

  2. Orion GN&C Fault Management System Verification: Scope And Methodology

    Science.gov (United States)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
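
    One common form of the rare event sequential Monte Carlo mentioned here is multilevel splitting: trajectories that approach the failure region are cloned, and the failure probability is estimated as a product of conditional level-crossing probabilities. The toy version below, estimating the chance that a random walk exceeds a high level within a time horizon, is an invented illustration, not the Orion team's method.

```python
import numpy as np

rng = np.random.default_rng(2)

def run_segment(x, t, level, horizon):
    """Advance a unit-Gaussian random walk from (x, t) until it crosses
    `level` or the horizon runs out; return (crossed, x, t)."""
    while t < horizon:
        x += rng.normal()
        t += 1
        if x >= level:
            return True, x, t
    return False, x, t

def splitting_estimate(levels, n=1000, horizon=100):
    """P(walk exceeds levels[-1] before `horizon`) as a product of
    conditional probabilities, cloning survivors at each level."""
    states = [(0.0, 0)] * n
    prob = 1.0
    for level in levels:
        hits = []
        for x0, t0 in states:
            crossed, x, t = run_segment(x0, t0, level, horizon)
            if crossed:
                hits.append((x, t))
        if not hits:
            return 0.0
        prob *= len(hits) / len(states)
        states = [hits[i] for i in rng.integers(len(hits), size=n)]
    return prob

print(f"rare-event probability: {splitting_estimate([10.0, 20.0, 30.0, 40.0]):.2e}")
```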

  3. Applicability of quasi-Monte Carlo for lattice systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Physics; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Hartung, Tobias [King's College London (United Kingdom). Dept. of Mathematics; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leovey, Hernan; Griewank, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Mathematics; Mueller-Preussker, Michael [Berlin Humboldt-Univ. (Germany). Dept. of Physics

    2013-11-15

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^(-1/2), where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^(-1), or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.

  4. Applicability of quasi-Monte Carlo for lattice systems

    International Nuclear Information System (INIS)

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^(-1/2), where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^(-1), or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.

  5. Applicability of Quasi-Monte Carlo for lattice systems

    CERN Document Server

    Ammon, Andreas; Jansen, Karl; Leovey, Hernan; Griewank, Andreas; Müller-Preussker, Michael

    2013-01-01

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like $N^{-1/2}$, where $N$ is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to $N^{-1}$, or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.
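
    The claimed improvement of the error scaling from N^(-1/2) toward N^(-1) is easy to observe on a smooth toy integral. The comparison below, plain Monte Carlo against a scrambled Sobol sequence from scipy, is an illustration on an invented integrand, not the authors' lattice code.

```python
import numpy as np
from scipy.stats import qmc

def f(x):
    """Smooth test integrand on [0,1]^d whose exact integral is 1."""
    return np.prod(1.0 + 0.5 * (x - 0.5), axis=1)

rng = np.random.default_rng(3)
d = 4
for m in (8, 10, 12, 14):                       # N = 2^m sample points
    n = 2 ** m
    err_mc = abs(f(rng.random((n, d))).mean() - 1.0)
    x_qmc = qmc.Sobol(d, scramble=True, seed=3).random_base2(m)
    err_qmc = abs(f(x_qmc).mean() - 1.0)
    print(f"N = 2^{m:2d}:  MC error {err_mc:.1e}   QMC error {err_qmc:.1e}")
```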

  6. Monte Carlo capabilities of the SCALE code system

    International Nuclear Information System (INIS)

    Highlights: • Foundational Monte Carlo capabilities of SCALE are described. • Improvements in continuous-energy treatments are detailed. • New methods for problem-dependent temperature corrections are described. • New methods for sensitivity analysis and depletion are described. • Nuclear data, users interfaces, and quality assurance activities are summarized. - Abstract: SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2

  7. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions, and by stimulating discussions, feedback, and coordination of key players and allows them to assess the analysis, evaluate the trade-offs, and to address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  8. Simulation of Cone Beam CT System Based on Monte Carlo Method

    CERN Document Server

    Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing

    2014-01-01

    Adaptive Radiation Therapy (ART) was developed from Image-Guided Radiation Therapy (IGRT) and is the trend in photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established with a Monte Carlo program and validated against measurement. The BEAMnrc program was adapted to model the kV x-ray tube. Both ISOURCE-13 and ISOURCE-24 were chosen to simulate the paths of the beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles under 1 cm of water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed to better than 1% within a depth of 10 cm. More than 85% of the points on the calculated lateral dose profiles agreed within 2%. The validated CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.

  9. Hybrid dynamical systems: verification and error trajectory search

    Czech Academy of Sciences Publication Activity Database

    Ratschan, Stefan

    Prague: Institute of Computer Science AS CR, 2010, pp. 125-126. ISBN 978-80-87136-07-2. [SNA ’10. Seminar on Numerical Analysis. Nové Hrady (CZ), 18.01.2010-22.01.2010] R&D Projects: GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: hybrid dynamical systems * verification Subject RIV: BA - General Mathematics

  10. Constraints for Continuous Reachability in the Verification of Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Ratschan, Stefan; She, Z.

    Berlin: Springer, 2006 - (Calmet, J.; Ida, T.; Wang, D.), pp. 196-210 ISBN 3-540-39728-0. - (Lecture Notes in Artificial Intelligence. 4120). [AISC 2006. International Conference on Artificial Intelligence and Symbolic Computation /8./. Beijing (CN), 20.09.2006-22.09.2006] Other grants: AVACS (DE) SFB/TR 14 AVACS Institutional research plan: CEZ:AV0Z10300504 Keywords: hybrid systems * verification * constraint solving Subject RIV: BA - General Mathematics

  11. Timing requirement description diagrams for real-time system verification

    OpenAIRE

    Fontan, Benjamin; De Saqui-Sannes, Pierre; Apvrille, Ludovic

    2008-01-01

    TURTLE is a real-time UML profile introduced a few years ago to address the analysis, design and deployment of time-constrained systems. The profile has a formal semantics. Further, it is supported by an open source toolkit: TTool. The latter enables formal verification of TURTLE models without specific knowledge of mathematical notations or formal languages. This paper proposes to extend TURTLE to cover the requirement capture phase, to check a model against formally expressed temporal requi...

  12. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  13. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric.  Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  14. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  15. On the Symbolic Verification of Timed Systems

    DEFF Research Database (Denmark)

    Moeller, Jesper; Lichtenberg, Jacob; Andersen, Henrik Reif; Hulgaard, Henrik

    1999-01-01

    […] symbolic in the sense that both the discrete and the continuous part of the state space are represented symbolically. Furthermore, both the synchronous and asynchronous concurrent composition of timed systems can be performed symbolically. The symbolic representations are given as formulae expressed in a […]

  16. Monte Carlo calculations of neutron thermalization in a heterogeneous system

    International Nuclear Information System (INIS)

    The slowing down of neutrons in a heterogeneous system (a slab geometry) of uranium and heavy water has been investigated by Monte Carlo methods. Effects on the neutron spectrum due to the thermal motions of the scattering and absorbing atoms are taken into account. It has been assumed that the speed distribution of the moderator atoms is Maxwell-Boltzmann in character.
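
    Under the stated Maxwell-Boltzmann assumption, sampling the thermal velocity of a moderator molecule amounts to drawing each Cartesian component from a Gaussian of standard deviation sqrt(kT/m). A sketch with an assumed temperature and the approximate D2O molecular mass:

```python
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant (J/K)
m = 3.33e-26         # approximate mass of a D2O molecule (kg)
T = 293.0            # assumed moderator temperature (K)

rng = np.random.default_rng(4)
sigma = np.sqrt(k_B * T / m)                 # per-component velocity spread
v = rng.normal(0.0, sigma, size=(100_000, 3))
speed = np.linalg.norm(v, axis=1)

print(f"mean speed (sampled):  {speed.mean():.0f} m/s")
print(f"mean speed (analytic): {np.sqrt(8 * k_B * T / (np.pi * m)):.0f} m/s")
```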

  17. Verification of a Probabilistic Model for A Distribution System with Integration of Dispersed Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte; Waagepetersen, Rasmus; Sörensen, Stefan

    2008-01-01

    In order to assess the present and predict the future distribution system performance using a probabilistic model, verification of the model is crucial. This paper illustrates the error caused by using traditional Monte Carlo (MC) based probabilistic load flow (PLF) when involving tap-changing transformers. This is due to the time-dependent feature of the tap position, which is not taken into account in the traditional MC method. A modified MC method using block sampling is proposed to take into account the time-dependent feature of the tap position. Block sampling estimates the tap position in the load flow calculation at the present hour by using results obtained from the previous hours. Simulation results show a large improvement when block sampling is used as compared to the traditional MC method. Finally, 100 simulation runs of the PLF are performed to further ascertain the accuracy of the results …
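
    The mechanical difference between the two sampling schemes is that block sampling carries the tap position over from the previous hour instead of redrawing the system state independently at every hour. The stand-in below (a made-up voltage/tap rule instead of an actual load flow) shows only that mechanism.

```python
import numpy as np

rng = np.random.default_rng(5)

def tap_update(tap, load):
    """Toy tap-changer: step the tap one position to keep a made-up
    voltage (1 - 0.05*load + 0.01*tap) inside a +/-1% deadband."""
    v = 1.0 - 0.05 * load + 0.01 * tap
    return tap + 1 if v < 0.99 else tap - 1 if v > 1.01 else tap

hours, n_samples = 24, 1000
load_profile = 0.5 + 0.4 * np.sin(np.linspace(0.0, 2.0 * np.pi, hours))

# block sampling: hour h starts from the tap positions left by hour h-1
taps = np.zeros(n_samples, dtype=int)
for h in range(hours):
    loads = load_profile[h] + rng.normal(0.0, 0.05, n_samples)
    taps = np.array([tap_update(t, l) for t, l in zip(taps, loads)])
    if h % 6 == 0:
        print(f"hour {h:2d}: mean tap position {taps.mean():+.2f}")
```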

  18. Nuclear warhead verification system based on local area network

    International Nuclear Information System (INIS)

    The authors built a nuclear warhead verification system based on a Local Area Network (LAN). The hardware of the system consists of an Intranet server, two personal computers and a portable high-purity germanium gamma-ray spectrometer. The software, Blanking-out, is made up of two modules designed for the inspecting and inspected parties respectively. Both modules are interactive and communicate through the Intranet. When the system operates, any sensitive information carried by the high-energy-resolution gamma-ray spectrum collected by the detector is blanked out by the inspected module before the package of spectral data is sent to the inspecting module, to prevent a disclosure of sensitive information. After receiving the spectral data, the inspecting module displays on the screen of the inspecting PC terminal the verification results in the form of a blanked spectrum (spectral blanking-out form) or a short phrase (phrase form). The guideline for spectral blanking-out is that, for those ranges of energy obliged to be inspected, the inspected module should ensure that the inspecting module can display their true and objective spectrum, while for the other ranges of energy, the inspected module can blank out certain parts of the spectra based on its knowledge of the sensitive information. The phrase form is rather straightforward, answering the question whether the inspected warhead is a 'uranium type', a 'plutonium type' or a 'non-nuclear type'. The authors conducted a demonstration on some surrogates for nuclear warheads to see whether the nuclear warhead verification system possesses the capability of identifying the type of a warhead and blanking sensitive spectral information. The demonstration was carried out successfully. The phrase form is especially recommendable due to its stronger capability to prevent sensitive information from disclosure and its higher verification credibility. The demonstration also disclosed some deficiencies of the …

  19. Monte Carlo simulation in systems biology

    OpenAIRE

    Schellenberger, Jan

    2010-01-01

    Constraint Based Reconstruction and Analysis (COBRA) is a framework within the field of Systems Biology which aims to understand cellular metabolism through the analysis of large scale metabolic models. These models are based on meticulously curated reconstructions of all chemical reactions in an organism. Instead of attempting to predict the exact state of the biological system, COBRA describes the physiological constraints that the system must satisfy and studies the range of solutions sati...

  20. Dynamic Verification of a Large Discrete System

    OpenAIRE

    Gunnarsson, Johan; Germundsson, Roger

    1996-01-01

    Symbolic algebraic analysis techniques are applied to the landing gear subsystem in the Swedish fighter aircraft, JAS 39 Gripen. Our methods are based on polynomials over finite fields (with Boolean algebra and propositional logic as special cases). Polynomials are used to represent the basic dynamic equations for the processes (controller and plant) as well as static properties of these. Temporal algebra (or temporal logic) is used to represent specifications of system behaviour. These speci...

  1. Verification station for Sandia/Rockwell Plutonium Protection system

    International Nuclear Information System (INIS)

    A verification station has been designed to confirm the presence of plutonium within a container module. These container modules [about 13 cm (5 in.) in diameter and 23 cm (9 in.) high] hold sealed food-pack cans containing either plutonium oxide or metal and were designed by Sandia Laboratories to provide security and continuous surveillance and safety. After the plutonium is placed in the container module, it is closed with a solder seal. The verification station discussed here is used to confirm the presence of plutonium in the container module before it is placed in a carousel-type storage array inside the plutonium storage vault. This measurement represents the only technique that uses nuclear detectors in the plutonium protection system

  2. Airworthiness Compliance Verification Method Based on Simulation of Complex System

    Institute of Scientific and Technical Information of China (English)

    XU Haojun; LIU Dongliang; XUE Yuan; ZHOU Li; MIN Guilong

    2012-01-01

    A study is conducted on a new airworthiness compliance verification method based on simulation of the pilot-aircraft-environment complex system. Verification scenarios are established by a “block diagram” method based on airworthiness criteria. A pilot-aircraft-environment complex model is set up, and a virtual flight testing method based on connecting MATLAB/Simulink and FlightGear is proposed. Special research is conducted on the modeling of stochastic pilot-manipulation parameters and of manipulation in critical situations. Unfavorable flight factors of a certain scenario are analyzed, and reliability modeling of important systems is researched. A distribution function for small-probability events and the theory of risk probability measurement are studied. A nonlinear function is used to depict the relationship between the cumulative probability and the extremum of the critical parameter. A synthetic evaluation model is set up, a modified genetic algorithm (MGA) is applied to ascertain the distribution parameters in the model, and a more reasonable result is obtained. A clause about vehicle control function (VCF) verification in MIL-HDBK-516B is selected as an example to validate the practicability of the method.

  3. Linear accelerator and MLC Monte Carlo model in EGSnrc/BEAMnrc system

    International Nuclear Information System (INIS)

    In radiotherapy (RT) the Monte Carlo (MC) method is used especially as a gold standard for comparison with measured data and for modelling of a detector response or a treatment planning system (TPS) calculation. In recent years, use of the IMRT technique has been increasing rapidly in developed countries, including the Czech Republic. The benefit of IMRT depends strongly on precise treatment planning and accurate dose delivery to the target volume. Precise dose delivery is interwoven with IGRT, and a phantom measurement is usually performed for dose distribution verification. However, this form of verification may not reveal a possible discrepancy in dose distribution between the TPS and the patient. MC modelling is especially suitable for this situation. Recently, increased interest in this field can be seen in the work of foreign authors. Although the IMRT technique is commonly used in our country, this topic has not been investigated. MC simulation for IMRT is based on a verified model of the linear accelerator treatment head. In our case, a model of the Clinac 2100 C/D (Varian Medical Systems) is created. The EGSnrc/BEAMnrc code is used for MC modelling. This system is tailored specifically for RT simulation. The model can be divided into several subsections. It is necessary to know the precise geometrical parameters of the accelerator treatment head (from the manufacturer) and convert them into EGSnrc/BEAMnrc input. (authors)

  4. Instrument Systems Analysis and Verification Facility (ISAVF) users guide

    Science.gov (United States)

    Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.

    1985-01-01

    The ISAVF facility is primarily an interconnected system of computers, special purpose real time hardware, and associated generalized software systems, which will permit the Instrument System Analysts, Design Engineers and Instrument Scientists to perform trade-off studies, specification development, instrument modeling, and verification of the instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.

  5. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2016-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic safety properties. […] Using bounded model checking (BMC) and inductive reasoning, it is verified that the generated model instance satisfies the generated safety properties. Using this method, we are able to verify the safety properties for model instances corresponding to railway networks of industrial size. Experiments show that BMC is also …

  6. Verification and validation of decision support Expert Systems

    International Nuclear Information System (INIS)

    Expert Systems are being designed or considered for both on-line process control and off-line decision support in the handling of hazardous materials. There are possibilities for both positive and negative impacts on chemical industry fire/explosion loss potentials. The authors review some guidelines and tools available for the verification and validation (V and V) of expert systems. Finally, they illustrate some of the points discussed by describing the development of a prototype expert system for hazardous materials classification and loss prevention engineering decision support

  7. Monte Carlo Capabilities of the SCALE Code System

    Science.gov (United States)

    Rearden, B. T.; Petrie, L. M.; Peplow, D. E.; Bekar, K. B.; Wiarda, D.; Celik, C.; Perfetti, C. M.; Ibrahim, A. M.; Hart, S. W. D.; Dunn, M. E.

    2014-06-01

    SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a "plug-and-play" framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE's graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2, to be released in 2014, will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  8. LHC Beam Loss Monitoring System Verification Applications

    CERN Document Server

    Dehning, B; Zamantzas, C; Jackson, S

    2011-01-01

    The LHC Beam Loss Monitoring (BLM) system is one of the most complex instrumentation systems deployed in the LHC. In addition to protecting the collider, the system also needs to provide a means of diagnosing machine faults and deliver a feedback of losses to the control room as well as to several systems for their setup and analysis. It has to transmit and process signals from almost 4’000 monitors, and has nearly 3 million configurable parameters. The system was designed with reliability and availability in mind. The specified operation and the fail-safety standards must be guaranteed for the system to perform its function in preventing superconductive magnet destruction caused by particle flux. Maintaining the expected reliability requires extensive testing and verification. In this paper we report our most recent additions …

  9. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  10. Proof System for Plan Verification under 0-Approximation Semantics

    CERN Document Server

    Zhao, Xishun

    2011-01-01

    In this paper a proof system is developed for plan verification problems $\{X\}c\{Y\}$ and $\{X\}c\{KW p\}$ under 0-approximation semantics for $\mathcal{A}_K$. Here, for a plan $c$, two sets $X,Y$ of fluent literals, and a literal $p$, $\{X\}c\{Y\}$ (resp. $\{X\}c\{KW p\}$) means that all literals of $Y$ become true (resp. $p$ becomes known) after executing $c$ in any initial state in which all literals in $X$ are true. Then, soundness and completeness are proved. The proof system allows verifying plans and generating plans as well.

  11. Tutorial on validation and verification of knowledge-based systems

    International Nuclear Information System (INIS)

    The validation and verification (V and V) of Knowledge-Based Systems (KBS) becomes of increasing concern as these systems are fielded and embedded in the everyday operations of the electric power and other industries, particularly when life-critical and environment-critical aspects are involved. This tutorial provides a V and V perspective on the nature of KBS components, an appropriate life-cycle, and applicable testing approaches. The authors consider why KBS V and V may be both easier and harder than for traditional software, and they conclude with a series of practical V and V guidelines.

  12. An evaluation of the management system verification pilot at Hanford

    International Nuclear Information System (INIS)

    The Chemical Management System (CMS), currently under development at Hanford, was used as the "test program" for pilot testing the value-added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial-sector "best practice" for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS, including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCs). A semi-structured interview process was employed by the team of three "verifiers", who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An "MSV Pilot Effectiveness Survey" also was completed by each panel participant immediately following the interview.

  13. Active Learning of Markov Decision Processes for System Verification

    DEFF Research Database (Denmark)

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    Formal model verification has proven a powerful tool for verifying and validating the properties of a system. Central to this class of techniques is the construction of an accurate formal model for the system being investigated. Unfortunately, manual construction of such models can be a resource-demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning …

  14. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of the Requirements Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  15. Diffusion Monte Carlo calculations of three-body systems

    Institute of Scientific and Technical Information of China (English)

    LÜ Meng-Jiao; REN Zhong-Zhou; LIN Qi-Hu

    2012-01-01

    The application of the diffusion Monte Carlo algorithm to three-body systems is studied. We develop a program and use it to calculate the properties of various three-body systems. Regular Coulomb systems such as atoms, molecules, and ions are investigated. The calculation is then extended to exotic systems where electrons are replaced by muons. Some nuclei with neutron halos are also calculated as three-body systems consisting of a core and two external nucleons. Our results agree well with experiments and others' work.
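
    The basic diffusion Monte Carlo loop, diffuse the walkers and then branch them according to the local potential relative to a trial energy, can be shown on the simplest possible system, a 1D harmonic oscillator whose exact ground-state energy is 0.5 in natural units. This is a generic textbook sketch, not the authors' three-body program.

```python
import numpy as np

rng = np.random.default_rng(6)

def dmc_harmonic(n_target=2000, n_steps=4000, dt=0.01):
    """Plain (no importance sampling) DMC for V(x) = x^2/2; exact E0 = 0.5."""
    x = rng.normal(0.0, 1.0, n_target)
    e_trial, trace = 0.5, []
    for step in range(n_steps):
        x = x + rng.normal(0.0, np.sqrt(dt), x.size)    # diffusion move
        w = np.exp(-dt * (0.5 * x**2 - e_trial))        # branching weights
        counts = (w + rng.random(x.size)).astype(int)   # stochastic rounding
        x = np.repeat(x, counts)                        # replicate / kill
        e_trial += 0.1 * np.log(n_target / x.size)      # population control
        if step > n_steps // 2:
            trace.append(e_trial)
    return float(np.mean(trace))

print(f"DMC ground-state energy: {dmc_harmonic():.3f}  (exact: 0.5)")
```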

  16. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Biometric-based authentication systems provide solutions to the security problems that remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good biometric parameters but do not guarantee that the person is present and alive: voice can be copied, a fingerprint can be lifted from glass onto synthetic skin, and in a face recognition system identical twins or father and son may, due to genetic factors, have the same facial appearance. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometrics verification system which was developed using Laboratory Virtual Instrument Engineering Workbench (LabVIEW) version 7.1 is discussed. Experiments were conducted on the database stored in the laboratory, comprising 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
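
    The two quoted error rates follow directly from their definitions: FRR is the fraction of genuine attempts falling below the decision threshold, FAR the fraction of impostor attempts at or above it. A score-level sketch with synthetic score distributions (the paper's actual ECG pipeline is in LabVIEW):

```python
import numpy as np

rng = np.random.default_rng(7)
# synthetic matcher scores: genuine comparisons score high, impostors low
genuine = rng.normal(0.80, 0.08, 2000)
impostor = rng.normal(0.50, 0.08, 2000)

threshold = 0.65
frr = np.mean(genuine < threshold)     # genuine users falsely rejected
far = np.mean(impostor >= threshold)   # impostors falsely accepted
print(f"FRR = {frr:.2%}, FAR = {far:.2%}")
```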

  17. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Full Text Available Biometric-based identification/verification systems provide a solution to the security concerns in the modern world where the machine is replacing the human in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometrics. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. The minutiae-based algorithm depends upon the local discontinuities in the ridge flow pattern and is used when template size is important, while the image-based matching algorithm uses both the micro and macro features of a fingerprint and is used if a fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database: datalog files can access and manipulate data and complex data structures quickly and easily, and make writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being the perfect match).
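
    Image-based matching of the kind described, scoring a probe against a template and accepting above a threshold, can be outlined with plain correlation; the pseudo-random sub-sampling is imitated by correlating only a random subset of pixels. The 0-1000 score scale and the 700 threshold come from the abstract; everything else is an invented stand-in for the LabVIEW implementation.

```python
import numpy as np

rng = np.random.default_rng(8)

def match_score(probe, template, subsample=0.25):
    """Correlate a pseudo-random pixel subset and map the result to a
    0-1000 scale (1000 = perfect match, as in the abstract)."""
    idx = rng.random(probe.size) < subsample     # pseudo-random sub-sampling
    a, b = probe.ravel()[idx], template.ravel()[idx]
    r = np.corrcoef(a, b)[0, 1]
    return 500.0 * (r + 1.0)                     # map [-1, 1] onto [0, 1000]

template = rng.random((100, 100))                # enrolled 100 x 100 image
probe = np.clip(template + rng.normal(0.0, 0.05, template.shape), 0.0, 1.0)
score = match_score(probe, template)
print(f"score = {score:.0f} ->", "accept" if score >= 700 else "reject")
```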

  18. Fixed-Node Diffusion Monte Carlo of Lithium Systems

    CERN Document Server

    Rasch, Kevin

    2015-01-01

    We study lithium systems over a range of numbers of atoms, e.g., the atomic anion, the dimer, a metallic cluster, and the body-centered cubic crystal, by the diffusion Monte Carlo method. The calculations include both core and valence electrons in order to avoid any possible impact of pseudopotentials. The focus of the study is the fixed-node errors, and for that purpose we test several orbital sets in order to provide the most accurate nodal hypersurfaces. We compare our results to other high-accuracy calculations wherever available and to experimental results so as to quantify the fixed-node errors. The results for these Li systems show that fixed-node quantum Monte Carlo achieves remarkably accurate total energies and recovers 97-99% of the correlation energy.

  19. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    DEFF Research Database (Denmark)

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand; Sørensen, Mathias Grund; Jensen, Peter Gjøl

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. Explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed-arc Petri nets, we explore new data structures for lowering the memory used: PTries for efficient storing of configurations and time darts for a semi-symbolic description of the state space. Both methods are implemented as part of the tool TAPAAL, and the experiments document at least one order of magnitude of memory savings while preserving comparable verification times.

  20. Recursive and Backward Reasoning in the Verification of Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Ratschan, Stefan; She, Z.

    Setúbal: INSTICC, 2008, pp. 65-71. ISBN 978-989-8111-30-2. [ICINCO 2008. International Conference on Informatics in Control, Automation and Robotics /5./. Funchal (PT), 11.05.2008-15.05.2008] R&D Projects: GA ČR GC201/08/J020. Other grants: National Key Basic Research Program (CN) 2005CB321902; Program for Excellent Talents of Beijing (CN) 20071D1600600410. Institutional research plan: CEZ:AV0Z10300504. Keywords: hybrid systems * verification * constraint propagation. Subject RIV: JC - Computer Hardware; Software

  1. Verification of Opacity and Diagnosability for Pushdown Systems

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    Full Text Available In control theory of discrete event systems (DESs), one of the challenging topics is the extension of the theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems). First, we discuss opacity of pushdown systems and prove that opacity of pushdown systems is in general undecidable. In addition, a decidable class is clarified. Next, for diagnosability, we prove that under a certain assumption, which is different from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language). The obtained results are useful for developing control theory of infinite-state DESs.

  2. An eclectic quadrant of rule based system verification: work grounded in verification of fuzzy rule bases

    OpenAIRE

    Viaene, Stijn; Wets, G.; Vanthienen, Jan; Dedene, Guido

    1999-01-01

    In this paper, we used a research approach based on grounded theory in order to classify methods proposed in the literature that try to extend the verification of classical rule bases to the case of fuzzy knowledge modeling. Within this area of verification we identify two dual lines of thought, leading to what are termed static and dynamic anomaly detection methods, respectively. The major outcome of the confrontation of both approaches is that their results, most often stated in terms...

  3. The Construction of Verification Models for Embedded Systems

    NARCIS (Netherlands)

    Mader, A.H.; Wupper, H.; Boon, M.

    2007-01-01

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, w...

  4. Experimental verification of a commercial Monte Carlo-based dose calculation module for high-energy photon beams

    Science.gov (United States)

    Künzler, Thomas; Fotina, Irina; Stock, Markus; Georg, Dietmar

    2009-12-01

    The dosimetric performance of a Monte Carlo algorithm as implemented in a commercial treatment planning system (iPlan, BrainLAB) was investigated. After commissioning and basic beam data tests in homogeneous phantoms, a variety of single regular beams and clinical field arrangements were tested in heterogeneous conditions (conformal therapy, arc therapy and intensity-modulated radiotherapy including simultaneous integrated boosts). More specifically, a cork phantom containing a concave-shaped target was designed to challenge the Monte Carlo algorithm in more complex treatment cases. All test irradiations were performed on an Elekta linac providing 6, 10 and 18 MV photon beams. Absolute and relative dose measurements were performed with ion chambers and near-tissue-equivalent radiochromic films placed within a transverse plane of the cork phantom. For simple fields, a 1D gamma (γ) procedure with a 2% dose difference and a 2 mm distance to agreement (DTA) was applied to depth-dose curves, as well as to inplane and crossplane profiles. The average gamma value was 0.21 for all energies of simple test cases. For depth-dose curves in asymmetric beams, gamma results similar to those for symmetric beams were obtained. Simple regular fields showed excellent absolute dosimetric agreement with measured values, with a dose difference of 0.1% ± 0.9% (1 standard deviation) at the dose prescription point. A more detailed analysis at tissue interfaces revealed dose discrepancies of 2.9% for an 18 MV 10 × 10 cm² field at the first density interface, from tissue- to lung-equivalent material. Small fields (2 × 2 cm²) showed their largest discrepancy in the re-build-up at the second interface (from lung- to tissue-equivalent material), with a local dose difference of about 9% and a DTA of 1.1 mm for 18 MV. Conformal field arrangements, arc therapy, as well as IMRT beams and simultaneous integrated boosts were in good agreement with absolute dose measurements in the...
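
    The 1D gamma procedure referenced above combines a dose-difference criterion with a distance-to-agreement criterion; a point passes if gamma <= 1. A minimal sketch with the 2%/2 mm criteria, assuming uniform 1D grids, a global dose normalization and toy depth-dose curves:

        import numpy as np

        def gamma_1d(ref_dose, eval_dose, spacing_mm, dd=0.02, dta_mm=2.0):
            """For each reference point, minimize over evaluated points the
            combined dose-difference / distance-to-agreement metric."""
            x = np.arange(ref_dose.size) * spacing_mm
            gammas = np.empty(ref_dose.size)
            for i, (xi, di) in enumerate(zip(x, ref_dose)):
                dose_term = (eval_dose - di) / (dd * ref_dose.max())
                dist_term = (x - xi) / dta_mm
                gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
            return gammas   # a point passes if gamma <= 1

        depth = np.linspace(0, 100, 201)            # 0.5 mm grid
        measured = np.exp(-depth / 80.0)            # toy curves, not real data
        calculated = np.exp(-(depth + 0.3) / 80.0)
        g = gamma_1d(measured, calculated, spacing_mm=0.5)
        print("mean gamma:", g.mean(), "pass rate:", np.mean(g <= 1.0))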

  5. Design and performance verification of a passive propellant management system

    Science.gov (United States)

    Hess, D. A.; Regnier, W. W.

    1978-01-01

    This paper describes the design and verification testing of a reusable passive propellant management system. The system was designed to acquire propellant in low- or zero-g environments and also retain this propellant under the high axially directed accelerations that may be experienced during launch and orbit-to-orbit transfer. The system design requirements were established to broadly satisfy the requirements of a large number of potential NASA and military applications, such as orbit-to-orbit shuttles and satellite vehicles. The resulting concept was a multicompartmented tank with independent surface tension acquisition channels in each compartment. The tank was designed to provide a minimum expulsion efficiency of 98 percent when subjected to the simultaneous conditions of acceleration, vibration, and outflow. The system design has the unique capability to demonstrate low-g performance in a 1-g test environment, and the test program summarized was structured around this capability.

  6. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    International Nuclear Information System (INIS)

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, the clearance level and a procedure for its verification have been introduced under the laws and regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal wastes. The most prevalent type of waste is generated from the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is a part of CLEVES, has been developed to support measurement, evaluation, and the preparation and recording of documents for clearance level verification. In addition, validation of the evaluation results of the CDMS was carried out by inputting data from actual clearance activities in the JAEA. Clearance level verification is easily applied by using the CDMS for clearance activities. (author)

  7. Standard practice for verification and classification of extensometer systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  8. Crew Exploration Vehicle (CEV) Potable Water System Verification Description

    Science.gov (United States)

    Peterson, Laurie; DeVera, Jean; Vega, Leticia; Adam, Nik; Steele, John; Gazda, Daniel; Roberts, Michael

    2009-01-01

    The Crew Exploration Vehicle (CEV), also known as Orion, will ferry a crew of up to six astronauts to the International Space Station (ISS), or a crew of up to four astronauts to the moon. The first launch of the CEV is scheduled for approximately 2014. A stored water system on the CEV will supply the crew with potable water for various purposes: drinking and food rehydration, hygiene, medical needs, sublimation, and various contingency situations. The current baseline biocide for the stored water system is ionic silver, similar in composition to the biocide used to maintain the quality of the water transferred from the Orbiter to the ISS and stored in Contingency Water Containers (CWCs). In the CEV water system, the ionic silver biocide is expected to be depleted from solution due to ionic silver plating onto the surfaces of the materials within the CEV water system, thus negating its effectiveness as a biocide. Since the biocide depletion is expected to occur within a short time after loading the water into the CEV water tanks at the Kennedy Space Center (KSC), an additional microbial control is a 0.1 micron point-of-use filter that will be used at the outlet of the Potable Water Dispenser (PWD). Because this may be the first time NASA is considering a stored water system for long-term missions that does not maintain a residual biocide, a team of experts in materials compatibility, biofilms and point-of-use filters, surface treatment and coatings, and biocides has been created to pinpoint concerns and perform testing to help alleviate those concerns related to the CEV water system. Results from the test plans laid out in the paper presented to SAE last year (Crew Exploration Vehicle (CEV) Potable Water System Verification Coordination, 2008012083) will be detailed in this paper. Additionally, recommendations for the CEV verification will be described for risk mitigation in meeting the physicochemical and microbiological requirements on the CEV PWS.

  9. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates sequential release, a feature in the new Danish interlocking systems. The generic model and safety properties can be instantiated with interlocking configuration data, resulting in a concrete model in the form of a Kripke structure, and in high-level safety properties expressed as state invariants. Using SMT-based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs.

  11. Formal Verification for Embedded Systems Design Based on MDE

    Science.gov (United States)

    Do Nascimento, Francisco Assis Moreira; da Silva Oliveira, Marcio Ferreira; Wagner, Flávio Rech

    This work presents a Model Driven Engineering (MDE) approach for the automatic generation of a network of timed automata from the functional specification of an embedded application described using UML class and sequence diagrams. By means of transformations on the UML model of the embedded system, a MOF-based representation of the network of timed automata is automatically obtained, which can be used as input to formal verification tools, such as the Uppaal model checker, in order to validate desired functional and temporal properties of the embedded system specification. Since the network of timed automata is automatically generated, the methodology can be very useful for the designer, making the debugging and formal validation of the system specification easier. The paper describes the defined transformations between models, which generate the network of timed automata as well as the textual input to the Uppaal model checker, and illustrates the use of the methodology with a case study to show the effectiveness of the approach.

  12. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    OpenAIRE

    Tseng, Kuo-Kun; Zeng, Fufu; Ip, W. H.; Wu, C.H.

    2016-01-01

    With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify a user quickly with high accuracy and consumes only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application for overcoming the issues of a centralized database. For a fair and comprehensive evaluation, the experimental...
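
    The abstract only names the mean-interval idea. One plausible reading, assumed here purely for illustration, is that the average R-R interval serves as a compact template and verification is a tolerance test; the crude peak detector and every threshold below are hypothetical:

        import numpy as np

        def mean_rr(ecg, fs, height=0.6):
            """Mean R-R interval in seconds from a crude threshold-based
            R-peak detector; real systems use far more robust detectors."""
            peaks = [i for i in range(1, ecg.size - 1)
                     if ecg[i] > height and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]]
            return float(np.mean(np.diff(peaks))) / fs

        def verify(template_rr, probe_rr, tol=0.05):
            # Accept if the probe's mean R-R interval is within tol seconds.
            return abs(template_rr - probe_rr) < tol

        fs = 250
        ecg = np.zeros(10 * fs)
        ecg[::int(0.8 * fs)] = 1.0          # idealized beats every 0.8 s
        print(mean_rr(ecg, fs), verify(0.8, mean_rr(ecg, fs)))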

  13. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    Science.gov (United States)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  14. Development of prompt gamma measurement system for in vivo proton beam range verification

    International Nuclear Information System (INIS)

    In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered to be one of the most promising radiation therapy methods because of the physical characteristics of its dose distribution, delivering most of the dose just before the protons come to rest at the so-called Bragg peak; that is, proton therapy allows a very high radiation dose to the tumor volume while effectively sparing adjacent critical organs. However, the uncertainty in the location of the Bragg peak, coming not only from the uncertainty in the beam delivery system and the treatment planning method but also from anatomical changes and organ motion of a patient, can be a critical problem in proton therapy. In spite of the importance of in vivo dose verification to prevent misapplication of the Bragg peak and to guarantee both successful treatment and patient safety, there is no practical methodology to monitor the in vivo dose distribution; only a few attempts have been made so far. The present dissertation suggests the prompt gamma measurement method for monitoring the in vivo proton dose distribution during treatment. As a key part of the process of establishing the utility of this method, the verification of the clear relationship between the prompt gamma distribution and the proton dose distribution was accomplished by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly from proton-induced nuclear interactions and then emitted isotropically in less than 10⁻⁹ sec at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of a human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the...

  15. FAST CONVERGENT MONTE CARLO RECEIVER FOR OFDM SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Wu Lili; Liao Guisheng; Bao Zheng; Shang Yong

    2005-01-01

    The paper investigates the problem of the design of an optimal Orthogonal Frequency Division Multiplexing (OFDM) receiver against unknown frequency-selective fading. A fast convergent Monte Carlo receiver is proposed. In the proposed method, Markov Chain Monte Carlo (MCMC) methods are employed for blind Bayesian detection without channel estimation. Meanwhile, exploiting the characteristics of OFDM systems, two methods are employed to improve the convergence rate and enhance the efficiency of the MCMC algorithms. One is the integration of the posterior distribution function with respect to the associated channel parameters, which is involved in the derivation of the objective distribution function; the other is intra-symbol differential coding for the elimination of the bimodality problem resulting from the presence of unknown fading channels. Moreover, no matrix inversion is needed owing to the orthogonality property of OFDM modulation, and hence the computational load is significantly reduced. Computer simulation results show the effectiveness of the fast convergent Monte Carlo receiver.

  16. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    A real-time system is proposed which would allow the CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate, a specific data handling system and vehicle were required: a Tracor Northern TN-11 data handling system including a PDP-11 minicomputer, and a measurement vehicle similar to the Commission's Regulatory Region I van. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack-mount the data handling equipment offers a number of benefits, such as control of the equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semi-portable ''cart'' and use of a telephone link to a large computer center.

  17. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope-fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test, and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed, and the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique, and an analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total hours of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance.

  18. A new method for commissioning Monte Carlo treatment planning systems

    Science.gov (United States)

    Aljarrah, Khaled Mohammed

    2005-11-01

    The Monte Carlo method is an accurate method for solving numerical problems in different fields. It has been used for accurate radiation dose calculation for radiation treatment of cancer. However, the modeling of an individual radiation beam produced by a medical linear accelerator for Monte Carlo dose calculation, i.e., the commissioning of a Monte Carlo treatment planning system, has been the bottleneck for the clinical implementation of Monte Carlo treatment planning. In this study a new method has been developed to determine the parameters of the initial electron beam incident on the target for a clinical linear accelerator. The interaction of the initial electron beam with the accelerator target produces x-rays and secondary charged particles. After successive interactions in the linac head components, the x-ray photons and the secondary charged particles interact with the patient's anatomy and deliver dose to the region of interest. The determination of the initial electron beam parameters is important for estimating the dose delivered to patients. These parameters, such as beam energy and radial intensity distribution, are usually estimated through a trial-and-error process. In this work an easy and efficient method was developed to determine these parameters. This was accomplished by comparing calculated 3D dose distributions, for a grid of assumed beam energies and radii, with measurements in a water phantom. Different cost functions were studied to choose the appropriate function for the data comparison, and the beam parameters were determined in light of this method. Under the assumption that linacs of the same type are exactly the same in their geometries and differ only in the initial phase-space parameters, the results of this method were considered as source data to commission other machines of the same type.
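
    The trial-and-error process this method replaces is, in essence, a grid search over (energy, radius) minimizing a cost function between calculated and measured dose distributions. A schematic sketch, assuming a mean-squared-difference cost and hypothetical precomputed dose grids:

        import numpy as np

        def commission(measured, dose_grid, energies, radii):
            """Pick the (energy, radius) pair whose precomputed dose
            distribution best matches measurement; dose_grid[(e, r)] holds
            the Monte Carlo dose calculated for that parameter pair."""
            best, best_cost = None, np.inf
            for e in energies:
                for r in radii:
                    cost = np.mean((dose_grid[(e, r)] - measured) ** 2)
                    if cost < best_cost:
                        best, best_cost = (e, r), cost
            return best, best_cost

        # Hypothetical 1D stand-ins for the 3D water-phantom distributions.
        energies, radii = [5.8, 6.0, 6.2], [0.10, 0.15, 0.20]
        rng = np.random.default_rng(3)
        grid = {(e, r): rng.random(50) for e in energies for r in radii}
        measured = grid[(6.0, 0.15)] + 0.01 * rng.random(50)
        print(commission(measured, grid, energies, radii))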

  19. On Sensor Data Verification for Participatory Sensing Systems

    Directory of Open Access Journals (Sweden)

    Diego Mendez

    2013-03-01

    Full Text Available In this paper we study the problem of sensor data verification in Participatory Sensing (PS) systems, using an air quality/pollution monitoring application as a validation example. Data verification, in the context of PS, consists of the process of detecting and removing spatial outliers to properly reconstruct the variables of interest. We propose, implement, and test a hybrid neighborhood-aware algorithm for outlier detection that considers the uneven spatial density of the users, the number of malicious users, the level of conspiracy, and the lack of accuracy and malfunctioning sensors. The algorithm utilizes the Delaunay triangulation and Gaussian Mixture Models to build neighborhoods based on the spatial and non-spatial attributes of each location. This neighborhood definition allows us to demonstrate that it is not necessary to apply accurate but computationally expensive estimators to the entire dataset, as equally accurate but computationally cheaper methods can be applied to part of the data and obtain good results as well. Our experimental results show that our hybrid algorithm performs as well as the best estimator while reducing the execution time considerably.
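
    The full algorithm couples Delaunay triangulation with Gaussian Mixture Models; the sketch below shows only the triangulation half, flagging a reading as a spatial outlier when it deviates strongly from its Delaunay neighbors. The deviation rule and thresholds are illustrative, not the authors':

        import numpy as np
        from scipy.spatial import Delaunay

        def spatial_outliers(points, values, k=3.0):
            """Flag readings more than k median-absolute-deviations away
            from the median of their Delaunay neighbors."""
            tri = Delaunay(points)
            neighbors = [set() for _ in range(len(points))]
            for simplex in tri.simplices:
                for i in simplex:
                    neighbors[i].update(j for j in simplex if j != i)
            flags = np.zeros(len(points), dtype=bool)
            for i, nbrs in enumerate(neighbors):
                nv = values[list(nbrs)]
                med = np.median(nv)
                mad = np.median(np.abs(nv - med)) + 1e-9
                flags[i] = abs(values[i] - med) > k * mad
            return flags

        rng = np.random.default_rng(4)
        pts = rng.random((200, 2))                        # sensor locations
        vals = pts[:, 0] + 0.05 * rng.normal(size=200)    # smooth field
        vals[::40] += 5.0                                 # inject bad readings
        print(np.flatnonzero(spatial_outliers(pts, vals)))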

  20. Verification of Monte-Carlo transport codes FLUKA, GEANT4 and SHIELD for radiation protection purposes at relativistic heavy-ion accelerators

    International Nuclear Information System (INIS)

    The crucial problem for radiation shielding design at heavy-ion accelerator facilities with beam energies up to several GeV/n is the source term problem. Experimental data on double-differential neutron yields from thick targets irradiated with high-energy uranium nuclei are lacking. At present, there are not many Monte Carlo multipurpose codes that can work with primary high-energy uranium nuclei, and these codes use different physical models for the simulation of nucleus-nucleus reactions. Therefore, verification of the codes against available experimental data is very important for selecting the most reliable code for practical tasks. This paper presents comparisons of FLUKA, GEANT4 and SHIELD code simulations with the experimental data on neutron production from 1 GeV/n 238U beam interaction with a thick Fe target.

  1. Development of a practical Monte Carlo based fuel management system for the Penn State University Breazeale Research Reactor (PSBR)

    International Nuclear Information System (INIS)

    A practical fuel management system for the Pennsylvania State University Breazeale Research Reactor (PSBR), based on an advanced Monte Carlo methodology, was developed from the existing fuel management tool in this research. Several modeling improvements were implemented in the old system. The improved fuel management system can now utilize the burnup-dependent cross-section libraries generated specifically for PSBR fuel, and it is also able to update the cross sections of these libraries automatically by Monte Carlo calculation. Considerations were given to balancing the computation time and the accuracy of the cross-section update; thus, a limited number of isotopes of certain types, which are considered 'important', are calculated and updated by the scheme. Moreover, the depletion algorithm of the existing fuel management tool was changed from a predictor-only to a predictor-corrector depletion scheme to account for burnup spectrum changes during the burnup step more accurately. An intermediate verification of the fuel management system was performed to assess the correctness of the newly implemented schemes against HELIOS. It was found that the agreement of both codes is good when the same energy released per fission (Q values) is used. Furthermore, to be able to model the reactor at various temperatures, the fuel management tool can automatically utilize the continuous cross sections generated at different temperatures. Other useful capabilities were also added to the fuel management tool to make it easy to use and practical. As part of the development, a hybrid nodal diffusion/Monte Carlo calculation was devised to speed up the Monte Carlo calculation by providing a more converged initial source distribution from the nodal diffusion calculation. Finally, the fuel management system was validated against measured data using several actual PSBR core loadings. The agreement of the predicted core...

  2. Monte Carlo modeling of neutron and gamma-ray imaging systems

    Science.gov (United States)

    Hall, James M.

    1997-02-01

    Detailed numerical prototypes are essential to the design of efficient and cost-effective neutron and gamma-ray imaging systems. We have exploited the unique capabilities of an LLNL-developed radiation transport code (COG) to develop code modules capable of simulating the performance of neutron and gamma-ray imaging systems over a wide range of source energies. COG allows us to simulate complex, energy-, angle-, and time-dependent radiation sources, model 3D system geometries with 'real world' complexity, specify detailed elemental and isotopic distributions and predict the responses of various types of imaging detectors with full Monte Carlo accuracy. COG references detailed, evaluated nuclear interaction databases allowing users to account for multiple scattering, energy straggling, and secondary particle production phenomena which may significantly affect the performance of an imaging system but may be difficult or even impossible to estimate using simple analytical models. In this work we will present examples illustrating the use of these routines in the analysis of industrial radiographic systems for thick target inspection, non-intrusive luggage and cargo scanning systems, and international treaty verification.

  3. Monte Carlo modeling of neutron and gamma-ray imaging systems

    International Nuclear Information System (INIS)

    Detailed numerical prototypes are essential to the design of efficient and cost-effective neutron and gamma-ray imaging systems. We have exploited the unique capabilities of an LLNL-developed radiation transport code (COG) to develop code modules capable of simulating the performance of neutron and gamma-ray imaging systems over a wide range of source energies. COG allows us to simulate complex, energy-, angle-, and time-dependent radiation sources, model 3-dimensional system geometries with ''real world'' complexity, specify detailed elemental and isotopic distributions and predict the responses of various types of imaging detectors with full Monte Carlo accuracy. COG references detailed, evaluated nuclear interaction databases allowing users to account for multiple scattering, energy straggling, and secondary particle production phenomena which may significantly affect the performance of an imaging system but may be difficult or even impossible to estimate using simple analytical models. This work presents examples illustrating the use of these routines in the analysis of industrial radiographic systems for thick target inspection, nonintrusive luggage and cargo scanning systems, and international treaty verification.

  4. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  5. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    Science.gov (United States)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues, in patients complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) to design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) to verify our MLC model in heterogeneous geometries that mimic an actual patient geometry for IMRT treatment. The measurements were performed using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model was carried out using point doses measured with an A14 slim line (SL) ion chamber inside tissue-equivalent and bone-equivalent materials in the CIRS phantom. Planar doses using lung- and bone-equivalent slabs were measured and compared using EDR films (Kodak, Rochester, NY).

  6. Computer aided production planning - SWZ system of order verification

    Science.gov (United States)

    Krenczyk, D.; Skolud, B.

    2015-11-01

    SWZ (System of Order Verification) is a computer implementation of a methodology that supports fast decision making on the acceptability of a production order. It determines not the best possible solution, but an admissible solution that can be found in an acceptable time (a feasible solution) and is acceptable under the existing constraints. The methodology uses constraint propagation techniques and reduces to testing a sequence of arbitrarily selected conditions; fulfilment of all the conditions (their conjunction) guarantees the ability to perform the production orders. In the paper, examples of the application of the SWZ system comprising the steps of planning and control are presented. The obtained results allow the determination of an acceptable production flow in the system, that is, of the manufacturing system parameters that ensure execution of the orders in time under the resource constraints. SWZ also generates dispatching rules as a sequence of processing operations for each production resource, performed periodically during the production flow in the system. Furthermore, an example of the integration of SWZ with a simulation system is shown: SWZ has been enhanced with a module generating files containing the script code of the system model, written in the internal language of the simulation and visualization system.

  7. Verification of the modeling of a 6 MV photon beam in a Monte Carlo planner. Comparison with Collapsed Cone in a non-homogeneous medium; Verificacion del modelado de un haz de fotones de 6 MV en un planificador Monte Carlo. Comparacion con Collapsed Cone en medio no homogeneo

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez Ros, J. C.; Jerez Sainz, M. I.; Lobato Munoz, M.; Jodar Lopez, C. A.; Ruiz Lopez, M. A.; Carrasco Rodriguez, J. L.; Pamos Urena, M.

    2013-07-01

    We evaluated the Monaco Monte Carlo planner v2.0.3 for calculations in a non-homogeneous, low-density medium (equivalent to lung), as a complement to the verification of the modeling in a homogeneous medium and prior to the introduction of the SBRT technique. We performed the same tests on Pinnacle v8.0m for the same purpose. We compare the results obtained with the Monte Carlo algorithm of Monaco and the Collapsed Cone algorithm of Pinnacle. (Author)

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT; ULTRASONIC AQUEOUS CLEANING SYSTEMS, SMART SONIC CORPORATION, SMART SONIC

    Science.gov (United States)

    This report is a product of the U.S. EPA's Environmental Technology Verification (ETV) Program and is focused on the Smart Sonic Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...

  9. Multi-way Monte Carlo Method for Linear Systems

    OpenAIRE

    Wu, Tao; Gleich, David F.

    2016-01-01

    We study the Monte Carlo method for solving a linear system of the form $x = Hx + b$. A sufficient condition for the method to work is $\|H\| < 1$, which greatly limits the usability of this method. We improve this condition by proposing a new multi-way Markov random walk, which is a generalization of the standard Markov random walk. Under our new framework we prove that the necessary and sufficient condition for our method to work is the spectral radius $\rho(H^{+}) < 1$, which is a weaker...
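
    For context, here is a minimal sketch of the classic single-walk Monte Carlo estimator for x = Hx + b, which the paper generalizes to multi-way walks; this is the standard truncated-walk method with uniform transition probabilities, not the authors' improved scheme:

        import numpy as np

        def mc_solve_entry(H, b, i, n_walks=20000, length=30, seed=0):
            """Estimate x_i of x = Hx + b as the average of random-walk
            scores; uniform transitions P = 1/n give the weight factor n*H[s,t]."""
            rng = np.random.default_rng(seed)
            n = len(b)
            total = 0.0
            for _ in range(n_walks):
                state, weight, score = i, 1.0, b[i]
                for _ in range(length):
                    nxt = rng.integers(n)
                    weight *= n * H[state, nxt]   # importance correction
                    state = nxt
                    score += weight * b[state]    # adds the H^k b term
                total += score
            return total / n_walks

        rng = np.random.default_rng(5)
        H = 0.4 * rng.random((4, 4)) / 4   # small norm: Neumann series converges
        b = rng.random(4)
        print(mc_solve_entry(H, b, 0), np.linalg.solve(np.eye(4) - H, b)[0])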

  10. Dual-use benefits of the CTBT verification system

    International Nuclear Information System (INIS)

    Since its completion in September 1996, the CTBT has been signed by 151 countries. Awaiting the 44 ratifications required for entry into force, all of the nuclear powers have imposed unilateral moratoriums on nuclear test explosions. The end of these weapons development activities is often cited as the principal benefit of the CTBT. As the world begins to implement the Treaty, it has become clear that the development and operation of the CTBT verification system will provide a wide range of additional benefits if the data analysis products are available for dual-purpose applications. As this paper describes, these could have economic and social implications, especially for countries with limited technical infrastructures. They involve seismic monitoring, mineral exploration, and scientific and technical training.

  11. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    Science.gov (United States)

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  12. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, A.; Larsen, K.G.; Møller, M.H.;

    2012-01-01

    We present a case study of a leader election protocol: modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge...

  13. Interacting multiagent systems kinetic equations and Monte Carlo methods

    CERN Document Server

    Pareschi, Lorenzo

    2014-01-01

    The description of emerging collective phenomena and self-organization in systems composed of large numbers of individuals has gained increasing interest from various research communities in biology, ecology, robotics and control theory, as well as sociology and economics. Applied mathematics is concerned with the construction, analysis and interpretation of mathematical models that can shed light on significant problems of the natural sciences as well as our daily lives. To this set of problems belongs the description of the collective behaviour of complex systems composed of a sufficiently large number of individuals. Examples of such systems are interacting agents in a financial market, potential voters during political elections, or groups of animals with a tendency to flock or herd. Among other possible approaches, this book provides a step-by-step introduction to mathematical modelling based on a mesoscopic description and the construction of efficient simulation algorithms by Monte Carlo methods. The ar...

  14. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was completed in March of 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March of 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or an industrial complex. This paper describes the advanced design features adopted in the SMART design and also introduces the design and engineering verification program. In the beginning stage of the SMART development, top-level requirements for safety and economics were imposed on the SMART design. To meet these requirements, highly advanced design features enhancing safety, reliability, performance, and operability were introduced into the SMART design. SMART consists of proven KOFA (Korea Optimized Fuel Assembly), helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. In order to enhance safety characteristics, innovative design features adopted in the SMART system include low core power density, a large negative Moderator Temperature Coefficient (MTC), high natural circulation capability, and an integral arrangement to eliminate large-break loss-of-coolant accidents, etc. The progression of emergency situations into accidents is prevented by a number of advanced engineered safety features such as a passive residual heat removal system, a passive emergency core cooling system, a safeguard vessel, and passive containment over-pressure protection. The preliminary...

  15. Verification of the use of GEANT4 and MCNPX Monte Carlo Codes for Calculations of the Depth-Dose Distributions in Water for the Proton Therapy of Eye Tumours

    Directory of Open Access Journals (Sweden)

    Grządziel Małgorzata

    2014-07-01

    Full Text Available Verification of calculations of the depth-dose distributions in water, using the GEANT4 (version 4.9.3) and MCNPX (version 2.7.0) Monte Carlo codes, was performed for the scatterer-phantom system used in the dosimetry measurements in the proton therapy of eye tumours. The simulated primary proton beam had energy spectra distributed according to the Gauss distribution, with a cut at energies greater than that corresponding to the maximum of the spectrum. The energy spectra of the primary protons were chosen to obtain the best possible agreement between the measured relative depth-dose distributions along the central axis of the proton beam in a water phantom and those derived from the Monte Carlo calculations, separately for each of the two tested codes. The local depth-dose differences between the results of the calculations and the measurements were mostly less than 5% (mean values of 2.1% and 3.6% for the MCNPX and GEANT4 calculations, respectively). In the case of the MCNPX calculations, the best fit to the experimental data was obtained for a spectrum with its maximum at 60.8 MeV (the most probable energy), a FWHM of 0.4 MeV, and an energy cut at 60.85 MeV, whereas in the GEANT4 calculations the most probable energy was 60.5 MeV, the FWHM 0.5 MeV, and the energy cut at 60.7 MeV. Thus, one can say that the results obtained by means of the two considered Monte Carlo codes are similar but not the same. Therefore, the agreement between the calculations and the measurements has to be verified before each application of the MCNPX and GEANT4 codes to the determination of depth-dose curves for therapeutic protons.

  16. IRIS safety system and equipment design verification test plan

    International Nuclear Information System (INIS)

    The International Reactor Innovative and Secure (IRIS) is an advanced, integral, light-water-cooled reactor of medium generating capacity (335 MWe), aimed at near-term deployment (2012-2015). IRIS is an innovative design that features an integral reactor vessel containing all the reactor coolant system components, including the steam generators, coolant pumps, pressurizer and heaters, and control rod drive mechanisms, in addition to the typical core, internals, control rods and neutron reflector. Other IRIS innovations include a small, high-design-pressure, spherical steel containment and a simplified passive safety system concept and equipment features that derive from its unique 'safety-by-design'™ philosophy. The IRIS 'safety-by-design'™ approach not only improves safety but also reduces the overall cost by allowing a significant reduction and simplification in safety systems. Moreover, the improved safety of IRIS supports licensing the power plant without the need for off-site emergency response planning, an objective which is part of the pre-application with the NRC and is also being pursued in collaboration with the IAEA. The IRIS innovative integral reactor coolant system design, as well as its innovative 'safety-by-design'™ features, has resulted in the need for new safety analyses and new equipment design and qualification in order to successfully license the plant. Therefore, the IRIS design team has developed a test plan that will provide the necessary data for safety analysis verification as well as for demonstration of equipment manufacturing feasibility and operation. This paper presents the 'IRIS Safety System and Equipment Design Verification Test Plan', which develops and confirms the operation of all the IRIS unique features, and includes component manufacturing feasibility tests, component separate-effects tests, component qualification tests, and integral-effects tests. These tests will also provide the data necessary to...

  17. A GIS support system for declaration and verification

    International Nuclear Information System (INIS)

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declaration and verification activities. The information dealt with is that requested by both the INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform, and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. The main characteristics of SIT are: capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; selected access to open-source information via the Internet; capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; capability to perform aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; capability to easily add and run external models, e.g. for material data accounting, completeness checking, air dispersion and material flow graph generation, and to describe results in graphical form; and capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, in order to allow open-source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of...

  18. Tracer verification and monitoring of containment systems (II)

    International Nuclear Information System (INIS)

    A tracer verification and monitoring system, SEAtrace™, has been designed and field tested which uses gas tracers to evaluate, verify, and monitor the integrity of subsurface barriers. This is accomplished using an automatic, rugged, autonomous monitoring system combined with an inverse optimization code. A gaseous tracer is injected inside the barrier and an array of wells outside the barrier is monitored. When the tracer gas is detected, a global optimization code is used to calculate the leak parameters, including leak size, location, and when the leak began. The multipoint monitoring system operates in real time, can be used to measure both the tracer gas and soil vapor contaminants, and is capable of unattended operation for long periods of time (months). The global optimization code searches multi-dimensional ''space'' to find the best fit for all of the input parameters. These parameters include tracer gas concentration histories from multiple monitoring points, medium properties, barrier location, and the source concentration. SEAtrace™ does not attempt to model all of the nuances associated with multi-phase, multi-component flow; rather, the inverse code uses a simplistic forward model which can provide results that are reasonably accurate. The system has calculated leak locations to within 0.5 meters and leak radii to within 0.12 meters.

  19. VALGALI: VERIFICATION AND VALIDATION TOOL FOR THE LIBRARIES PRODUCED BY THE GALILEE SYSTEM

    International Nuclear Information System (INIS)

    Full text: In this paper we present VALGALI, the verification and validation tool for the libraries produced by the nuclear data processing system GALILEE. The aim of this system is to provide libraries with consistent physical data for various application codes (the deterministic transport code APOLLO2, the Monte Carlo transport code TRIPOLI-4, the depletion code DARWIN, ...). For each library: are the data stored in the right place, in the right format, and so on? Are the libraries used by the various codes consistent? What is the physical quality of the cross sections and data present in the libraries? These three types of tests correspond to the classic stages of V and V. The great strength of VALGALI is that it is generic and not dedicated to one application code. Consequently, it is based on a common physical validation database whose coverage is regularly extended. For all these test cases, the input data are adapted for each relevant application code; moreover, specific test cases can exist for each application code. At present, VALGALI can check and validate the libraries of APOLLO2 and TRIPOLI-4, but in the near future it will also treat the libraries of DARWIN.

  20. 75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations

    Science.gov (United States)

    2010-01-26

    ... URBAN DEVELOPMENT. Enterprise Income Verification (EIV) System--Debts Owed to PHAs and Terminations. AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY: The proposed information... This notice also lists the following information. Title of Proposal: Enterprise Income Verification...

  1. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    Science.gov (United States)

    1975-01-01

    The findings of investigations on concepts and techniques in automated performance verification are presented. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  2. Monte Carlo simulations of quantum systems on massively parallel supercomputers

    International Nuclear Information System (INIS)

    A large class of quantum physics applications uses operator representations that are discrete integers by nature. This class includes magnetic properties of solids, interacting bosons modeling superfluids and Cooper pairs in superconductors, and Hubbard models for strongly correlated electron systems. This kind of application typically uses integer data representations, and the resulting algorithms are dominated entirely by integer operations. The authors implemented an efficient algorithm for one such application on the Intel Touchstone Delta and iPSC/860. The algorithm uses a multispin coding technique which allows significant data compactification and efficient vectorization of Monte Carlo updates. The algorithm regularly switches between two data decompositions, corresponding naturally to the different Monte Carlo updating processes and observable measurements, such that only nearest-neighbor communications are needed within a given decomposition. On 128 nodes of the Intel Delta, this algorithm updates 183 million spins per second (compared to 21 million on the CM-2 and 6.2 million on a Cray Y-MP). A systematic performance analysis shows better than 90% efficiency in the parallel implementation.
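
    Multispin coding packs one spin from each of many independent replicas into the bits of a single machine word, so one bitwise operation acts on all replicas at once. The sketch below shows only this data-compactification idea, accumulating nearest-neighbor disagreements (Ising bond energies) for 64 replicas of a periodic 1D chain; the paper's full update scheme is not reproduced:

        import numpy as np

        # Bit j of word chain[i] is spin i (0 or 1) of replica j, so 64
        # replicas of an L-site periodic Ising chain share one uint64 array.
        rng = np.random.default_rng(6)
        L = 128
        lo = rng.integers(0, 2**32, size=L, dtype=np.uint64)
        hi = rng.integers(0, 2**32, size=L, dtype=np.uint64)
        chain = (hi << np.uint64(32)) | lo     # 64 random spin bits per site

        # One XOR marks, for all 64 replicas at once, the bonds whose two
        # spins disagree; per-bit counting then gives each replica's energy.
        disagree = chain ^ np.roll(chain, -1)
        broken = np.zeros(64, dtype=np.int64)
        for word in disagree:
            w = int(word)
            for j in range(64):
                broken[j] += (w >> j) & 1

        energy = -(L - 2 * broken)   # Ising energy per replica, with J = 1
        print(energy[:8])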

  3. Algorithm Verification for A TLD Personal Dosimetry System

    International Nuclear Information System (INIS)

    Dose algorithms are used in thermoluminescence personnel dosimetry for the interpretation of the dosimeter response in terms of equivalent dose. In the present study an automated Harshaw 6600 reader was rigorously tested, prior to its use with the dose calculation algorithm, according to the standard established by the US Department of Energy Laboratory Accreditation Program (DOELAP). Also, a manual Harshaw 4500 reader was used along with the ICRU slab phantom and the RANDO phantom in experimentally determining the photon personal doses in terms of deep dose, Hp(10), shallow dose, Hp(0.07), and eye lens dose, Hp(3). In addition, the free Monte Carlo simulation code VMC-dc was used to simulate the RANDO phantom irradiation process. The accuracy of the automated system lies well within DOELAP tolerance limits in all test categories

  4. Development of NSSS Control System Performance Verification Tool

    International Nuclear Information System (INIS)

    Thanks to many control systems and control components, a nuclear power plant can be operated safely and efficiently under transient conditions as well as steady-state conditions. If a fault or an error exists in the control systems, the plant may experience unwanted and unexpected transients. Therefore, the performance of these control systems and control components should be completely verified through the power ascension tests of the startup period. However, there are many occasions when control components must be replaced, control logic modified, or setpoints changed. To make such changes, it is important to be able to verify the performance of the changed control system without redoing the power ascension tests. Up to now, a simulation method using the computer codes employed for nuclear power plant design has commonly been used to verify performance. But if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not effective for verifying the performance of the changed control system. Many tests related to V and V (Verification and Validation) are performed in the factory as well as in the plant to eliminate errors which might be introduced during hardware manufacturing or software coding. Experience shows that these field tests and the simulation method are insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this fact. One occurred at 50% reactor power and caused a reactor trip. The other occurred during the 70% loss of main feedwater pump test and caused an excess turbine runback

  5. Research on Monte Carlo simulation method of industry CT system

    International Nuclear Information System (INIS)

    There are a series of radiation physics problems in the design and production of an industrial CT system (ICTS), including limit quality index analysis and the effects of scattering, detector efficiency, and crosstalk on the system. Usually the Monte Carlo (MC) method is applied to resolve these problems. Most of them involve events of very small probability, so direct simulation is very difficult, and existing MC methods and programs cannot meet the needs. To resolve these difficulties, particle flux point auto-important sampling (PFPAIS) is proposed on the basis of auto-important sampling. Then, on the basis of PFPAIS, a dedicated ICTS simulation method, MCCT, is realized. Compared with existing MC methods, MCCT is shown to simulate the ICTS more exactly and effectively. Furthermore, the effects of various disturbances on the ICTS are simulated and analyzed by MCCT. To some extent, MCCT can guide the research of the radiation physics problems in ICTS. (author)
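
    PFPAIS itself is not publicly documented, but the underlying idea, biasing the sampling toward the rare region and reweighting, can be shown with a generic importance-sampling toy problem (the target probability and the shifted proposal below are illustrative):

```python
import numpy as np
from scipy import stats

# Toy illustration of biased (importance) sampling for a rare event; PFPAIS
# itself is not reproduced here. Target: P(Z > 4) for Z ~ N(0,1), ~3.17e-5.
rng = np.random.default_rng(1)
n = 100_000

naive = (rng.standard_normal(n) > 4).mean()          # analog: almost no hits

y = rng.normal(4.0, 1.0, n)                          # sample near the rare region
w = stats.norm.pdf(y) / stats.norm.pdf(y, loc=4.0)   # likelihood-ratio weights
weighted = np.mean((y > 4) * w)

print(naive, weighted, stats.norm.sf(4))             # weighted ~ exact value
```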

  6. Automated hardware-software system for LED's verification and certification

    Science.gov (United States)

    Chertov, Aleksandr N.; Gorbunova, Elena V.; Peretyagin, Vladimir S.; Vakulenko, Anatolii D.

    2012-10-01

    Scientific and technological progress of recent years in the production of light emitting diodes (LEDs) has led to the expansion of their areas of application, from the simplest systems to high-precision lighting devices used in various fields of human activity. Despite all this technological development, however, at the present time it is very difficult to choose one or another brand of LED for the realization of specific devices designed for high-precision spatial and color measurements of various objects. There are many measurement instruments in the world for determining the various parameters of LEDs, but none of them is capable of comprehensively estimating the spatial, spectral, and color parameters of LEDs with the necessary accuracy and speed. This problem can be solved by using an automated hardware-software system for LED verification and certification, developed by specialists of the OEDS chair of National Research University ITMO in Russia. The paper presents the theoretical aspects of the analysis of LED spatial, spectral and color parameters using the mentioned automated hardware-software system. The article also presents the results of spatial, spectral, and color parameter measurements of some LED brands.

  7. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    International Nuclear Information System (INIS)

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared to a central computer driven system are detailed

  8. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    Energy Technology Data Exchange (ETDEWEB)

    Merillat, P.D.

    1979-03-01

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared to a central computer driven system are detailed.

  9. System verification and validation report for the TMAD code

    International Nuclear Information System (INIS)

    This document serves as the Verification and Validation Report for the TMAD code system, which includes the TMAD code and the LIBMAKR code. The TMAD code was commissioned to facilitate the interpretation of moisture probe measurements in the Hanford Site waste tanks. In principle, the code is an interpolation routine that acts over a library of benchmark data based on two independent variables, typically anomaly size and moisture content. Two additional variables, anomaly type and detector type, can also be considered independent variables, but no interpolation is done over them. The dependent variable is detector response. The intent is to provide the code with measured detector responses from two or more detectors. The code will then interrogate (and interpolate upon) the benchmark data library and find the anomaly-type/anomaly-size/moisture-content combination that provides the closest match to the measured data. The primary purpose of this document is to provide the results of the system testing and the conclusions based thereon. The results of the testing process are documented in the body of the report. Appendix A gives the test plan, including test procedures, used in conducting the tests. Appendix B lists the input data required to conduct the tests, and Appendices C and D list the numerical results of the tests
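
    As a rough sketch of the interpolate-and-match scheme described above (the benchmark grid, response values, and function names are invented for illustration, and a single detector is shown for brevity; TMAD's actual library and matching criterion may differ):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical benchmark library: detector response tabulated on a grid of
# anomaly size (cm) and moisture content (wt%); all values are invented.
sizes = np.array([1.0, 2.0, 5.0, 10.0])
moistures = np.array([0.0, 5.0, 10.0, 20.0])
response = np.outer(sizes, 1.0 + 0.1 * moistures)    # fake monotone responses

interp = RegularGridInterpolator((sizes, moistures), response)

def best_match(measured, candidates):
    # Return the (size, moisture) pair whose interpolated response is closest
    # to the measured detector response.
    preds = interp(candidates)
    return candidates[np.argmin((preds - measured) ** 2)]

grid = np.array([(s, m) for s in np.linspace(1, 10, 50)
                 for m in np.linspace(0, 20, 50)])
print(best_match(7.5, grid))
```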

  10. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud including unauthorized duplication of identification cards and credit cards is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the

  11. Verification of component mode techniques for flexible multibody systems

    Science.gov (United States)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted in the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control Laboratory (MMVC) plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  12. Proof Linking: A Modular Verification Architecture for Mobile Code Systems

    OpenAIRE

    Fong, Philip Wai Leung

    2004-01-01

    This dissertation presents a critical rethinking of the Java bytecode verification architecture from the perspective of a software engineer. In existing commercial implementations of the Java Virtual Machine, there is a tight coupling between the dynamic linking process and the bytecode verifier. This leads to delocalized and interleaving program plans, making the verifier difficult to maintain and comprehend. A modular mobile code verification architecture, called Proof Linking, is proposed....

  13. Monte carlo simulation for designing an explosive-inspection system

    International Nuclear Information System (INIS)

    In order to optimize the design of γ-ray detectors and the data analysis of a system for explosives inspection with the associated alpha particle technique, the Monte Carlo code EGSnrc was used to simulate the detection efficiency and response function of inorganic scintillator detectors for γ-rays, with the aim of choosing the right type of detector. Pulse height spectra of γ-rays in a φ5' x 8' NaI(Tl) detector from graphite, water, ammonium nitrate and simulated explosive induced by 14 MeV neutrons were simulated. The calculated results were analyzed and compared with experimental results, demonstrating that the simulation method is reliable and can be used to obtain the database of response functions for explosives inspection. (authors)

  14. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for a Nuclear Power Plant (NPP) protection system is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. In order to support the flow from modeling in CPN to mathematical proof in PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of the Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  15. The Galileo IOV Dispenser System- Design, Development & Verification

    Science.gov (United States)

    Thompson, S. P.; Andersson, G.; Davies, W.; Plaza, M. A.

    2012-07-01

    On October 21st, 2011, lifting off from the ELS launch site in French Guiana, a Soyuz ST-B with FREGAT upper stage carried the first two Galileo IOV spacecraft on a 3-hour 49-minute flight and successfully injected the two Galileo navigation spacecraft into a circular medium-Earth orbit. The Dispenser System, the subject of this paper, is the equipped launch vehicle hardware mated directly to the FREGAT upper stage and built specifically to carry two Galileo IOV spacecraft during all ground and flight operations up to the moment of separation. The Dispenser System was purposely built for the Galileo IOV missions under European Space Agency and Arianespace contract. The prime contractor was selected to be RUAG Space in Sweden (Linköping) for all Dispenser “System and Management” activities, with subcontracts placed to RUAG Space in Switzerland (Zurich) for the Dispenser “Structure” and EADS CASA Spain (Madrid) for the “Hold Down and Release System” (HRS) hardware. The “Structure” is designed to transfer ground and flight loads between the spacecraft and the launch vehicle. The upper part, an aluminium sandwich box-type structure, interfaces with the satellites, whereas the lower part transitions to a lower frame, via a CFRP strut arrangement, to interface with the FREGAT upper stage. The spacecraft separation sub-system is composed of two sets of four low-shock “HRS” units and four “pushers”, enabling the satellites to be firmly held during ground and flight operations and released when ordered by the launch vehicle. The Dispenser System also comprises an electrical sub-system and MLI. This paper summarises the overall Design, Development and Verification activities leading to the qualification of the Dispenser System hardware, including the Structure and HRS contribution to the overall system qualification. An overview of the system hardware will be described along with DDV logic, some key analysis performed and several of the

  16. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  17. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Directory of Open Access Journals (Sweden)

    Xinyan Gao

    2013-01-01

    Full Text Available We propose a verification solution based on the characteristic set of Wu's method for SystemVerilog assertion checking over digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on algebraic representations using characteristic sets of polynomial systems. This symbolic algebraic approach is a useful supplement to existing verification methods based on simulation.

  18. AVNG SYSTEM SOFTWARE - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    International Nuclear Information System (INIS)

    This report describes the software development for the plutonium attribute verification system--AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated

  19. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    International Nuclear Information System (INIS)

    This report describes the software development for the plutonium attribute verification system - AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated.

  20. Monte Carlo Code System Development for Liquid Metal Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Shim, Hyung Jin; Han, Beom Seok; Park, Ho Jin; Park, Dong Gyu [Seoul National University, Seoul (Korea, Republic of)

    2007-03-15

    We have implemented the composition cell class and the use cell in MCCARD for hierarchical input processing. For the input of the KALIMER-600 core, which consists of 336 assemblies, the geometric data of 91,056 pin cells are required. Using hierarchical input processing, it was observed that the system geometries are correctly handled with the geometric data of 611 cells in total: 2 cells for fuel rods, 2 cells for guide holes, 271 translation cells for rods, and 336 translation cells for assemblies. We have developed Monte Carlo decay-chain models, based on the decay chain model of the REBUS code, for liquid metal reactor analysis. Using the developed decay-chain models, depletion analysis calculations have been performed for the homogeneous and heterogeneous models of KALIMER-600. The k-effective for the depletion analysis agrees well with that of the REBUS code, and the developed decay chain models show more efficient performance in time and memory compared with the existing decay chain model. A chi-square criterion has been developed to diagnose the temperature convergence of the MC T/H feedback calculations. From the application results for the KALIMER pin and fuel assembly problems, it is observed that the new criterion works well. We have applied a high-efficiency variance reduction technique based on splitting and Russian roulette to estimate the PPPF of the KALIMER core at BOC. The PPPF of the KALIMER core at BOC is 1.235(±0.008). The developed technique shows four times faster calculation compared with the existing calculation

  1. Monte Carlo Alpha Iteration Algorithm for a Subcritical System Analysis

    Directory of Open Access Journals (Sweden)

    Hyung Jin Shim

    2015-01-01

    Full Text Available The α-k iteration method, which searches for the fundamental-mode alpha-eigenvalue via iterative updates of the fission source distribution, has been successfully used for Monte Carlo (MC) alpha-static calculations of supercritical systems. However, for deep subcritical systems the α-k iteration method suffers from a gigantic number of neutron generations or a huge neutron weight, which leads to abnormal termination of the MC calculations. In order to stably estimate the prompt neutron decay constant (α) of prompt subcritical systems regardless of subcriticality, we propose a new MC alpha-static calculation method named the α-iteration algorithm. The new method is derived by directly applying the power method to the α-mode eigenvalue equation, and its calculation stability is achieved by controlling the number of time-source neutrons, which are generated in proportion to α divided by the neutron speed in MC neutron transport simulations. The effectiveness of the α-iteration algorithm is demonstrated for two-group homogeneous problems with varying subcriticality by comparison with analytic solutions. The applicability of the proposed method is evaluated for an experimental benchmark of a thorium-loaded accelerator-driven system.
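
    The α-iteration is built on the power method applied to the α-mode eigenvalue equation. A generic, deterministic power-method sketch (with an arbitrary 2x2 matrix standing in for the transport operator; purely illustrative) shows the structure of such an iterative eigenvalue search:

```python
import numpy as np

# Generic power-method sketch: iterate the operator on a "source" vector and
# read off the dominant eigenvalue from the growth of its norm. The matrix is
# illustrative only, not the alpha-mode transport operator.
A = np.array([[0.9, 0.3],
              [0.2, 0.8]])
x = np.ones(2)
for _ in range(200):
    y = A @ x
    alpha = np.linalg.norm(y) / np.linalg.norm(x)   # dominant-eigenvalue estimate
    x = y / np.linalg.norm(y)                       # renormalize the source
print(alpha, np.linalg.eigvals(A).max())            # both ~1.1
```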

  2. Verification of a Monte-Carlo planetary surface radiation environment model using γ-ray data from Lunar Prospector and 2001 Mars Odyssey

    International Nuclear Information System (INIS)

    Characterising a planetary radiation environment is important to: (1) assess the habitability of a planetary body for indigenous life; (2) assess the risks associated with manned exploration missions to a planetary body and (3) predict/interpret the results that remote sensing instrumentation may obtain from a planetary body (e.g. interpret the γ-ray emissions from a planetary surface produced by radioactive decay or via the interaction of galactic cosmic rays to obtain meaningful estimates of the concentration of certain elements on the surface of a planet). The University of Leicester is developing instrumentation for geophysical applications that include γ-ray spectroscopy, γ-ray densitometry and radiometric dating. This paper describes the verification of a Monte-Carlo planetary radiation environment model developed using the MCNPX code. The model is designed to model the radiation environments of Mars and the Moon, but is applicable to other planetary bodies, and will be used to predict the performance of the instrumentation being developed at Leicester. This study demonstrates that the modelled γ-ray data is in good agreement with γ-ray data obtained by the γ-ray spectrometers on 2001 Mars Odyssey and Lunar Prospector, and can be used to accurately model geophysical instrumentation for planetary science applications.

  3. Verification of a Monte-Carlo planetary surface radiation environment model using gamma-ray data from Lunar Prospector and 2001 Mars Odyssey

    Energy Technology Data Exchange (ETDEWEB)

    Skidmore, M.S., E-mail: mss16@star.le.ac.u [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom); Ambrosi, R.M. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom)

    2010-01-01

    Characterising a planetary radiation environment is important to: (1) assess the habitability of a planetary body for indigenous life; (2) assess the risks associated with manned exploration missions to a planetary body and (3) predict/interpret the results that remote sensing instrumentation may obtain from a planetary body (e.g. interpret the gamma-ray emissions from a planetary surface produced by radioactive decay or via the interaction of galactic cosmic rays to obtain meaningful estimates of the concentration of certain elements on the surface of a planet). The University of Leicester is developing instrumentation for geophysical applications that include gamma-ray spectroscopy, gamma-ray densitometry and radiometric dating. This paper describes the verification of a Monte-Carlo planetary radiation environment model developed using the MCNPX code. The model is designed to model the radiation environments of Mars and the Moon, but is applicable to other planetary bodies, and will be used to predict the performance of the instrumentation being developed at Leicester. This study demonstrates that the modelled gamma-ray data is in good agreement with gamma-ray data obtained by the gamma-ray spectrometers on 2001 Mars Odyssey and Lunar Prospector, and can be used to accurately model geophysical instrumentation for planetary science applications.

  4. Embedded systems handbook embedded systems design and verification

    CERN Document Server

    Zurawski, Richard

    2009-01-01

    Considered a standard industry resource, the Embedded Systems Handbook provided researchers and technicians with the authoritative information needed to launch a wealth of diverse applications, including those in automotive electronics, industrial automated systems, and building automation and control. Now a new resource is required to report on current developments and provide a technical reference for those looking to move the field forward yet again. Divided into two volumes to accommodate this growth, the Embedded Systems Handbook, Second Edition presents a comprehensive view on this area

  5. 78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems

    Science.gov (United States)

    2013-01-25

    ... free electronic mail subscription service for industry, trade groups, consumer interest groups, health... Food Safety and Inspection Service Ongoing Equivalence Verifications of Foreign Food Regulatory Systems AGENCY: Food Safety and Inspection Service, USDA. ACTION: Notice. SUMMARY: The Food Safety and...

  6. Results of verifications of the control automatic exposure in equipment of RX with CR systems

    International Nuclear Information System (INIS)

    Following the entry into force of the new Spanish radiology quality control protocol in 2012, this paper lists and discusses the results obtained from verification of the automatic exposure control in computed radiography (CR) systems. (Author)

  7. Analytical Methods for Verification and Validation of Adaptive Systems in Safety-Critical Aerospace Applications Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A major challenge of the use of adaptive systems in safety-critical applications is the software life-cycle: requirement engineering through verification and...

  8. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...

  9. RRB's SVES Input File - Post Entitlement State Verification and Exchange System (PSSVES)

    Data.gov (United States)

    Social Security Administration — Several PSSVES request files are transmitted to SSA each year for processing in the State Verification and Exchange System (SVES). This is a first step in obtaining...

  10. A domain specific language and methodology for control systems GUI specification, verification and prototyping

    OpenAIRE

    risoldi, matteo; Buchs, Didier

    2007-01-01

    A work-in-progress domain-specific language and methodology for modeling complex control systems GUIs is presented. MDA techniques are applied for language design and verification, simulation and prototyping.

  11. Verification of the Monte Carlo code RMC with a whole PWR MOX/UO2 core benchmark

    International Nuclear Information System (INIS)

    Several types of V and V work are being carried out for the Reactor Monte Carlo code RMC, including the heterogeneous whole core configurations. In this paper, a whole PWR MOX/UO2 core benchmark which contains both UO2 and MOX assemblies with different enrichments and various burn-up points is chosen to verify RMC's criticality calculation capability, and the results of RMC and other codes are discussed and compared, such as eigenvalues, assembly power distributions, pin power distributions and so on. The discrepancies in eigenvalues and power distributions are satisfactory, which proves the accuracy of RMC's criticality calculation. Also, the influences of different cross-section libraries are discussed upon the results of RMC. Besides these results, the detailed comparisons between RMC and MCNP with the same ENDF/B-VII.0 cross-section library are carried out in this paper, including the comparisons of control rod worths calculated by both RMC and MCNP. According to the results, RMC and MCNP agree quite well in eigenvalues, power distributions and other results. The discrepancies of eigenvalues and control rod worth are fairly small and the relative differences of assembly and pin power distributions are acceptable. All these results contribute to the conclusion that the criticality calculation performance of RMC is accurate and excellent. (author)

  12. Verification of the AECL total system performance models

    International Nuclear Information System (INIS)

    An agreement between Atomic Energy of Canada, Limited (AECL) and the United States Department of Energy (USDOE) defines eight tasks in the study of topics relating to management of radioactive wastes. One task involves the verification of AECL's performance assessment code for their high level waste disposal program. In the agreement this task is given the title: Performance Assessment Technology Exchange. The quality assurance program established by AECL for this code requires that this task be performed. This paper presents an overview of the methods used for code verification and a progress report on the verification of the code. It describes the tools that have been developed to automatically examine code modules, check physical units, and prepare driver routines to exercise every line of code, and verify that it was executed correctly

  13. Verification and Validation in GERAM Framework for Modeling of Information Systems

    OpenAIRE

    Kashefibagherian, Misam

    2011-01-01

    The main aim of this article is to propose a methodology for using verification and validation tools within a framework for the modeling of industrial enterprise information systems. The first part of this paper introduces the Generalized Enterprise Reference Architecture and Methodology (GERAM) framework and the parts of it that are used for modeling industrial enterprise information systems. The second part introduces verification and validation concepts and tools. The third part of this articl...

  14. System Based on the Platinum Resistance Sensor Metrological Verification

    Directory of Open Access Journals (Sweden)

    Yu Liu

    2012-12-01

    Full Text Available The temperature laboratory of the national weather metering station maintains the air temperature measurement standard, the highest such standard of the Meteorological Department in China, and transfers it to the provincial meteorological stations. The temperature measurement standard at a provincial meteorological station comprises a second-standard mercury thermometer, a temperature test tank and a freezing-point tank, used to carry out verification and calibration of thermometers and platinum resistance thermometers in the meteorological field. This paper introduces the principle of the PT100 platinum thermal resistance and, based on the requirements of the temperature metrological verification code, reports tests on 80 platinum resistances to judge whether each platinum resistance sensor is qualified or not.
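
    For reference, the resistance-temperature relation commonly used when verifying PT100 sensors is the Callendar-Van Dusen equation; a minimal sketch using the standard IEC 60751 coefficients (published standard values, not numbers taken from this paper):

```python
# Callendar-Van Dusen relation for a PT100 with the standard IEC 60751
# coefficients.
R0 = 100.0          # resistance in ohms at 0 degC
A = 3.9083e-3
B = -5.775e-7
C = -4.183e-12      # only used below 0 degC

def pt100_resistance(t):
    """Resistance (ohms) of an ideal PT100 at temperature t (degC)."""
    if t >= 0.0:
        return R0 * (1.0 + A * t + B * t * t)
    return R0 * (1.0 + A * t + B * t * t + C * (t - 100.0) * t ** 3)

print(pt100_resistance(0.0), pt100_resistance(100.0))   # 100.0, ~138.51 ohms
```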

  15. Monte Carlo simulation of nuclear spin relaxation in disordered system

    International Nuclear Information System (INIS)

    Full text: Nuclear spin relaxation is a very useful technique for obtaining information about diffusion in solids. The present work is motivated by relaxation experiments on H diffusing in disordered systems such as metallic glasses or quasicrystalline materials. A theory of the spectral density functions of the magnetic dipolar interactions between diffusing spins is required in order to relate the experimental data to diffusional parameters. In simple ordered systems, the spectral density functions are well understood and a simple BPP (exponential correlation function) model is often used to interpret the data. Diffusion in disordered systems involves a distribution of activation energies, and the simple extension of the BPP model that has been used traditionally is of doubtful validity. A more rigorously based BPP model has been developed, and this model has recently been applied to H diffusion in a metal quasicrystal. The improved BPP model still, however, involves approximations, and the accuracy of the parameters deduced from it is not clear. The present work involves a Monte Carlo simulation of diffusion in disordered systems and the calculation of the spectral density functions and relaxation rates. The simulations use two algorithms (discrete time and continuous time) for the time-development of the system, correctly incorporate the Fermi-Dirac distribution for the equilibrium occupation of sites, as required by the principle of detailed balance, and allow only single occupancy of sites. The results are compared with the BPP models for some site- and barrier-energy distributions arising from the structural disorder of the system. The improved BPP model is found to give reasonable values for the diffusion and disorder parameters. Quantitative estimates of the errors involved are determined
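
    A sketch of the simple BPP quantities referred to above may help: for an exponential correlation function the spectral density is J(ω) = 2τ/(1 + ω²τ²), and the dipolar relaxation rate combines J at the Larmor frequency and at twice the Larmor frequency. This is the textbook form, not the improved model of this work, and the coupling prefactor is left symbolic:

```python
import numpy as np

# Standard BPP (exponential correlation function) quantities; K is a symbolic
# dipolar coupling prefactor, set to 1 here.
def j_bpp(omega, tau_c):
    """Spectral density for an exponential correlation function."""
    return 2.0 * tau_c / (1.0 + (omega * tau_c) ** 2)

def r1_dipolar(omega0, tau_c, K=1.0):
    """Dipolar 1/T1 (like spins), up to the coupling prefactor K."""
    return K * (j_bpp(omega0, tau_c) + 4.0 * j_bpp(2.0 * omega0, tau_c))

# The relaxation-rate maximum occurs where omega0 * tau_c is of order one:
taus = np.logspace(-12, -6, 400)
omega0 = 2 * np.pi * 50e6                                 # 50 MHz Larmor frequency
print(taus[np.argmax(r1_dipolar(omega0, taus))] * omega0)  # ~0.6
```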

  16. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    Full Text Available With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application that overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle heart rate changes during sport. It should be the first work to address the issue of sport in ECG verification.
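
    The paper's mean-interval algorithm is not reproduced here; the following generic sketch only illustrates the template-averaging-and-matching style that such ECG verifiers build on. Beat segmentation is assumed already done, and the waveform, noise level, and threshold are invented:

```python
import numpy as np

# Generic sketch of interval-averaged ECG template matching (illustrative only).
rng = np.random.default_rng(4)

def template(beats):
    """Average a stack of equal-length heartbeat segments into one template."""
    return np.mean(beats, axis=0)

def verify(enrolled, new_beats, threshold=0.9):
    """Accept if the new template correlates strongly with the enrolled one."""
    corr = np.corrcoef(enrolled, template(new_beats))[0, 1]
    return corr >= threshold

beat = np.sin(np.linspace(0, 2 * np.pi, 200))                 # toy beat shape
enrolled = template(beat + 0.05 * rng.standard_normal((30, 200)))
print(verify(enrolled, beat + 0.05 * rng.standard_normal((10, 200))))  # True
```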

  17. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.

  18. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    International Nuclear Information System (INIS)

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  19. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  20. Verification and validation of rulebased systems for Hubble Space Telescope ground support

    Science.gov (United States)

    Vick, Shon; Lindenmayer, Kelly

    1988-01-01

    As rulebase systems become more widely used in operational environments, the focus is on the problems and concerns of maintaining expert systems. In the conventional software model, the verification and validation of a system have two separate and distinct meanings. To validate a system means to demonstrate that the system does what is advertised. The verification process refers to investigating the actual code to identify inconsistencies and redundancies within the logic path. In current literature regarding maintaining rulebased systems, little distinction is made between these two terms. In fact, often the two terms are used interchangeably. Verification and validation of rulebased systems are discussed as separate but equally important aspects of the maintenance phase. Also described are some of the tools and methods that were developed at the Space Telescope Science Institute to aid in the maintenance of the rulebased system.

  1. A System for Deduction-based Formal Verification of Workflow-oriented Software Models

    OpenAIRE

    Klimek Radosław

    2014-01-01

    The work concerns formal verification of workflow-oriented software models using a deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are considered as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, and its architecture, for the deduction-based verification of workflow-oriented models is proposed. The process of inference...

  2. Review of quantum Monte Carlo methods and results for Coulombic systems

    Energy Technology Data Exchange (ETDEWEB)

    Ceperley, D.

    1983-01-27

    The various Monte Carlo methods for calculating ground state energies are briefly reviewed. Then a summary of the charged systems that have been studied with Monte Carlo is given. These include the electron gas, small molecules, a metal slab and many-body hydrogen.

  3. Formulation and Application of Quantum Monte Carlo Method to Fractional Quantum Hall Systems

    OpenAIRE

    Suzuki, Sei; Nakajima, Tatsuya

    2003-01-01

    Quantum Monte Carlo method is applied to fractional quantum Hall systems. The use of the linear programming method enables us to avoid the negative-sign problem in the Quantum Monte Carlo calculations. The formulation of this method and the technique for avoiding the sign problem are described. Some numerical results on static physical quantities are also reported.

  4. Efficiencies of dynamic Monte Carlo algorithms for off-lattice particle systems with a single impurity

    KAUST Repository

    Novotny, M.A.

    2010-02-01

    The efficiency of dynamic Monte Carlo algorithms for off-lattice systems composed of particles is studied for the case of a single impurity particle. The theoretical efficiencies of the rejection-free method and of the Monte Carlo with Absorbing Markov Chains method are given. Simulation results are presented to confirm the theoretical efficiencies. © 2010.

  5. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power system with an organic working fluid. The purposes of the technology verification phase are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  6. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of a system's knowledge base takes an important position. The conventional Petri net approach that has been studied recently for verifying knowledge bases is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, it was confirmed that the tool can successfully check most of the anomalies that can occur in a knowledge base

  7. Use of metaknowledge in the verification of knowledge-based systems

    Science.gov (United States)

    Morell, Larry J.

    1989-01-01

    Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.

  8. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor for creating DSL descriptions, (2) a data validator for checking that DSL descriptions follow the structural rules of the domain, (3) a graphical simulator for simulating the dynamic behaviour of relay interlocking systems, and (4) verification support for deriving and verifying safety properties of relay interlocking systems.

  9. Highly Efficient Monte-Carlo for Estimating the Unavailability of Markov Dynamic System

    Institute of Scientific and Technical Information of China (English)

    XIAO Gang; DENG Li; ZHANG Ben-Ai; ZHU Jian-Shi

    2004-01-01

    Monte Carlo simulation has become an important tool for estimating the reliability and availability of dynamic systems, since conventional numerical methods are no longer efficient when the size of the system to solve is large. However, evaluating by simulation the probability of occurrence of very rare events means playing a very large number of histories of the system, which leads to unacceptable computing time. Highly efficient Monte Carlo schemes should be worked out. In this paper, based on the integral equation describing state transitions of a Markov dynamic system, a uniform Monte Carlo method for estimating unavailability is presented. Using the free-flight estimator, direct statistical estimation Monte Carlo is achieved. Using both the free-flight estimator and a biased sampling probability space, weighted statistical estimation Monte Carlo is also achieved. Five Monte Carlo schemes, including crude simulation, analog simulation, statistical estimation based on crude and analog simulation, and weighted statistical estimation, are used for calculating the unavailability of a repairable Con/3/30:F system. Their efficiencies are compared with each other. The results show the weighted statistical estimation Monte Carlo has the smallest variance and the highest efficiency in very rare events simulation.

  10. Simulating Strongly Correlated Electron Systems with Hybrid Monte Carlo

    Institute of Scientific and Technical Information of China (English)

    LIU Chuan

    2000-01-01

    Using the path integral representation, the Hubbard and the periodic Anderson model on a D-dimensional cubic lattice are transformed into field theories of fermions in D + 1 dimensions. At half-filling these theories possess a positive definite real symmetric fermion matrix and can be simulated using the hybrid Monte Carlo method.
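
    The hybrid (Hamiltonian) Monte Carlo method named above combines molecular-dynamics-style leapfrog trajectories with a Metropolis accept/reject step. A minimal one-dimensional sketch for a Gaussian action (an illustrative toy, not the lattice fermion simulation of the paper):

```python
import numpy as np

# Minimal hybrid Monte Carlo for the action U(q) = q^2/2, i.e. a unit Gaussian.
rng = np.random.default_rng(3)

def grad_U(q):
    return q                                   # dU/dq for U(q) = q^2/2

def hmc_step(q, eps=0.1, n_leap=20):
    p = rng.standard_normal()                  # refresh momentum
    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_U(q_new)         # initial half kick
    for _ in range(n_leap - 1):
        q_new += eps * p_new                   # drift
        p_new -= eps * grad_U(q_new)           # full kick
    q_new += eps * p_new                       # final drift
    p_new -= 0.5 * eps * grad_U(q_new)         # final half kick
    dH = 0.5 * (q_new**2 + p_new**2 - q**2 - p**2)
    return q_new if rng.random() < np.exp(-dH) else q   # Metropolis test

q, samples = 0.0, []
for _ in range(5000):
    q = hmc_step(q)
    samples.append(q)
print(np.mean(samples), np.var(samples))       # ~0 and ~1 for the Gaussian
```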

  11. Monte Carlo Radiation Analysis of a Spacecraft Radioisotope Power System

    Science.gov (United States)

    Wallace, M.

    1994-01-01

    A Monte Carlo statistical computer analysis was used to create neutron and photon radiation predictions for the General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS RTG). The GPHS RTG is being used on several NASA planetary missions. Analytical results were validated using measured health physics data.

  12. Modeling and Verification of Distributed Generation and Voltage Regulation Equipment for Unbalanced Distribution Power Systems; Annual Subcontract Report, June 2007

    Energy Technology Data Exchange (ETDEWEB)

    Davis, M. W.; Broadwater, R.; Hambrick, J.

    2007-07-01

    This report summarizes the development of models for distributed generation and distribution circuit voltage regulation equipment for unbalanced power systems and their verification through actual field measurements.

  13. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  14. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    Science.gov (United States)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuit solutions based on field programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing the most probable errors into it, and, unlike for mutants in high-level programming languages, the corresponding test case is derived effectively, based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
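
    The essence of mutation testing can be shown with a toy example (Python stands in for the HDL setting described above; the mutated operator and the test cases are invented for illustration):

```python
# Toy mutation testing: inject a plausible operator error into a "design" and
# check whether the test suite detects (kills) the mutant.
SRC = "def majority(a, b, c):\n    return (a and b) or (a and c) or (b and c)\n"
MUTANT = SRC.replace("or", "and", 1)     # inject a plausible operator error

def run_tests(src):
    ns = {}
    exec(src, ns)                        # load the (possibly mutated) design
    f = ns["majority"]
    cases = [((1, 1, 0), 1), ((1, 0, 1), 1), ((0, 0, 1), 0), ((1, 1, 1), 1)]
    return all(bool(f(*i)) == bool(o) for i, o in cases)

print(run_tests(SRC))     # True: the reference passes the suite
print(run_tests(MUTANT))  # False: a good suite "kills" this mutant
```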

  15. The inverse method parametric verification of real-time embedded systems

    CERN Document Server

    André, Etienne

    2013-01-01

    This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata.The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solv

  16. Quasi-Monte Carlo methods for lattice systems. A first look

    International Nuclear Information System (INIS)

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^(-1/2), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to N^(-1). We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
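
    The N^(-1/2) versus N^(-1) trend can be checked on a toy integral with scrambled Sobol points; a minimal sketch, assuming SciPy's scipy.stats.qmc module is available (SciPy >= 1.7), with an integrand that is illustrative and not the lattice action:

```python
import numpy as np
from scipy.stats import qmc

# Compare plain Monte Carlo and scrambled Sobol quasi-Monte Carlo on
# int_0^1 x^2 dx = 1/3.
rng = np.random.default_rng(2)
f = lambda x: x**2
for n in (2**8, 2**12, 2**16):
    mc_err = abs(f(rng.random(n)).mean() - 1/3)
    x_qmc = qmc.Sobol(d=1, scramble=True, seed=2).random(n).ravel()
    qmc_err = abs(f(x_qmc).mean() - 1/3)
    print(f"n={n:6d}  MC error={mc_err:.1e}  QMC error={qmc_err:.1e}")
```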

  17. Development and Verification of the Charring Ablating Thermal Protection Implicit System Solver

    Science.gov (United States)

    Amar, Adam J.; Calvert, Nathan D.; Kirk, Benjamin S.

    2010-01-01

    The development and verification of the Charring Ablating Thermal Protection Implicit System Solver is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method with first and second order implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the fully implicit linear system is solved with the Generalized Minimal Residual method. Verification results from exact solutions and the Method of Manufactured Solutions are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.

  18. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    Science.gov (United States)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  19. Subtle Monte Carlo Updates in Dense Molecular Systems

    DEFF Research Database (Denmark)

    Bottaro, Sandro; Boomsma, Wouter; Johansson, Kristoffer E.;

    2012-01-01

    Although Markov chain Monte Carlo (MC) simulation is a potentially powerful approach for exploring conformational space, it has been unable to compete with molecular dynamics (MD) in the analysis of high density structural states, such as the native state of globular proteins. Here, we introduce ... as correlations in a multivariate Gaussian distribution. We demonstrate that our method reproduces structural variation in proteins with greater efficiency than current state-of-the-art Monte Carlo methods and has real-time simulation performance on par with molecular dynamics simulations. The presented results suggest our method as a valuable tool in the study of molecules in atomic detail, offering a potential alternative to molecular dynamics for probing long time-scale conformational transitions.

  20. 24 CFR 5.233 - Mandated use of HUD's Enterprise Income Verification (EIV) System.

    Science.gov (United States)

    2010-04-01

    ...) Project-based Voucher program under 24 CFR part 983; (v) Project-based Section 8 programs under 24 CFR... noncompliance. Failure to use the EIV system in its entirety may result in the imposition of sanctions and/or... Income Verification (EIV) System. 5.233 Section 5.233 Housing and Urban Development Office of...

  1. Proceedings of the 7th International Workshop on Verification of Infinite-State Systems (INFINITY'05)

    DEFF Research Database (Denmark)

    2005-01-01

    The aim of the workshop is to provide a forum for researchers interested in the development of mathematical techniques for the analysis and verification of systems with infinitely many states. Topics: Techniques for modeling and analysis of infinite-state systems; Equivalence-checking and model-...

  2. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are the preparation of the software planning documentation, verification of the software according to the software life cycle, software safety analysis, and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of the nuclear safety-critical software in a DRPS. (authors)

  3. New developments of the MCNP/CTF/NEM/NJOY code system – Monte Carlo based coupled code for high accuracy modeling

    International Nuclear Information System (INIS)

    Highlights: ► New coupled Monte Carlo code system for reference results at operating conditions. ► Automated methodology to create and use temperature-dependent cross section libraries. ► Multi-level coupling scheme between MCNP5 and COBRA-TF with different options. ► Acceleration strategy for coupled Monte Carlo calculations including hybrid approach. ► Sensitivity studies on thermal-scattering models and different sub-channel approaches. -- Abstract: High-accuracy code systems are necessary to model core environments with considerable geometric complexity and great material heterogeneity. These features are typical of current and innovative nuclear reactor core designs. Advanced methodologies and state-of-the-art coupled code systems must be put into practice in order to model these challenging core designs with high accuracy. The presented research comprises the development and implementation of thermal-hydraulic feedback in the Monte Carlo method and of speed-up mechanisms to accelerate the Monte Carlo criticality calculation. Coupled Monte Carlo calculations can serve as reference solutions for verifying high-fidelity coupled deterministic neutron transport methods with detailed and accurate thermal-hydraulic models. The development and verification of such a reference high-fidelity coupled multi-physics scheme is performed at the Pennsylvania State University (PSU) in cooperation with AREVA, AREVA NP GmbH in Erlangen, Germany, on the basis of the MCNP5, NEM, NJOY and COBRA-TF (CTF) computer codes. This paper presents the latest studies and improvements made to this coupled hybrid system, which include a new methodology for the generation and interpolation of temperature-dependent thermal scattering cross section libraries for MCNP5, a comparison between sub-channel approaches, and acceleration schemes.
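
    Coupling a Monte Carlo neutronics solver to a thermal-hydraulics code of this kind typically follows a fixed-point (Picard) pattern. The sketch below shows only that structure, with stand-in solvers and an assumed under-relaxation factor; all names and numbers are illustrative, not the MCNP5/CTF implementation:

```python
# Schematic Picard coupling loop between a neutronics solve and a TH solve.
def solve_neutronics(t_fuel):
    # Stand-in: local power falls slightly as fuel temperature rises (Doppler).
    return [1.0 + 1e-4 * (900.0 - t) for t in t_fuel]

def solve_th(power):
    # Stand-in: fuel temperature responds linearly to local power.
    return [600.0 + 300.0 * p for p in power]

def coupled_iteration(t_fuel, tol=0.1, max_iter=50, relax=0.5):
    for it in range(max_iter):
        power = solve_neutronics(t_fuel)     # transport with T-dependent XS
        t_new = solve_th(power)              # TH solve for new temperatures
        if max(abs(a - b) for a, b in zip(t_new, t_fuel)) < tol:
            return t_new, it
        # Under-relaxation damps oscillations between the two solvers.
        t_fuel = [relax * a + (1 - relax) * b for a, b in zip(t_new, t_fuel)]
    return t_fuel, max_iter

temps, iters = coupled_iteration([600.0] * 4)
print(iters, temps[0])   # converges near 900 in a few sweeps
```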

  4. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  5. Progress of the AVNG System - Attribute Verification System with Information Barriers for Mass Isotopics Measurements

    International Nuclear Information System (INIS)

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency of 50% at 1332 keV) and digital gamma-ray spectrometer DSPECPLUS. The neutron multiplicity counter is a three-ring counter with 164 3He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  6. PROGRESS OF THE AVNG SYSTEM - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    International Nuclear Information System (INIS)

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency of 50% at 1332 keV) and digital gamma-ray spectrometer DSPECPLUS. The neutron multiplicity counter is a three-ring counter with 164 3He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  7. Safety verification of a fault tolerant reconfigurable autonomous goal-based robotic control system

    OpenAIRE

    Braman, Julia M. B.; Murray, Richard M.; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbo...

  8. Optimization of scintillation-detector timing systems using Monte Carlo analysis

    International Nuclear Information System (INIS)

    Monte Carlo analysis is used to model statistical noise associated with scintillation-detector photoelectron emissions and photomultiplier tube operation. Additionally, the impulse response of a photomultiplier tube, front-end amplifier, and constant-fraction discriminator (CFD) is modeled so the effects of front-end bandwidth and constant-fraction delay and fraction can be evaluated for timing-system optimizations. Such timing-system analysis is useful for detectors having low photoelectron-emission rates, including Bismuth Germanate (BGO) scintillation detectors used in Positron Emission Tomography (PET) systems. Monte Carlo timing resolution for a BGO/photomultiplier scintillation-detector CFD timing system is presented as a function of constant-fraction delay for 511-keV coincident gamma rays in the presence of Compton scatter. Monte Carlo results are in good agreement with measured results when a tri-exponential BGO scintillation model is used. Monte Carlo simulation is extended to include CFD energy-discrimination performance. Monte Carlo energy-discrimination performance is experimentally verified along with timing performance (Monte Carlo timing resolution of 3.22 ns FWHM versus measured resolution of 3.30 ns FWHM) for a front-end rise time of 10 ns (10--90%), CFD delay of 8 ns, and CFD fraction of 20%
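
    A minimal sketch of the kind of simulation the abstract describes is given below, under simplifying assumptions: a single-exponential scintillation decay (the paper uses a tri-exponential BGO model), a Gaussian single-photoelectron response standing in for the full PMT/front-end chain, and the stated CFD delay and fraction. The resolution it prints will not match the measured 3.30 ns FWHM; it only demonstrates the method.

```python
import numpy as np

rng = np.random.default_rng(42)
dt = 0.1                                   # time grid step (ns)
t = np.arange(0.0, 400.0, dt)

def detector_pulse(n_pe=300, tau=300.0, sigma=3.0, start=50.0):
    """One scintillation pulse: Poisson-fluctuated photoelectron times from a
    single-exponential decay, each smeared by a Gaussian single-photoelectron
    response (a stand-in for the PMT and front-end amplifier)."""
    pe_times = start + rng.exponential(tau, rng.poisson(n_pe))
    v = np.zeros_like(t)
    for t0 in pe_times:
        v += np.exp(-0.5 * ((t - t0) / sigma) ** 2)
    return v

def cfd_time(v, delay=8.0, fraction=0.2):
    """Constant-fraction discriminator: form f*v(t) - v(t - delay) and return
    its first positive-to-negative zero crossing as the event time."""
    k = int(round(delay / dt))
    s = fraction * v[k:] - v[:-k]          # s[j] corresponds to time t[j + k]
    idx = np.where((s[:-1] > 0) & (s[1:] <= 0))[0]
    return t[idx[0] + k] if idx.size else np.nan

arrivals = np.array([cfd_time(detector_pulse()) for _ in range(200)])
arrivals = arrivals[~np.isnan(arrivals)]
print(f"timing resolution: {2.355 * arrivals.std():.2f} ns FWHM")
```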

  9. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    Science.gov (United States)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs which are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identification of the ones with potential for instability. In this paper we will describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples using applications of DoE to analysis and verification of the ISS power system are provided.
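
    The core DoE idea can be illustrated on a made-up three-factor stability model: a two-level factorial design covers the parameter space in 2^3 runs and exposes the main effect of each factor. The response model and factor names below are invented for illustration; a real study would replace the function with a circuit-simulation run per design point.

```python
from itertools import product
import numpy as np

def phase_margin(src_imp, load_imp, filter_c):
    """Hypothetical stability response (degrees of phase margin) for a
    source/load converter pair; the coefficients are invented."""
    return 45 + 8 * src_imp - 12 * load_imp + 5 * filter_c + 3 * src_imp * load_imp

factors = ["source impedance", "load impedance", "filter capacitance"]
runs = np.array(list(product((-1, +1), repeat=len(factors))))   # 2^3 full factorial
responses = np.array([phase_margin(*run) for run in runs])

# Main effect of each factor: mean response at its high level minus its low level.
for j, name in enumerate(factors):
    effect = responses[runs[:, j] == +1].mean() - responses[runs[:, j] == -1].mean()
    print(f"main effect of {name}: {effect:+.1f} deg")

# The worst-case corner flags the operating scenario with potential for instability.
worst = runs[responses.argmin()]
print(f"lowest margin {responses.min():.1f} deg at levels {worst}")
```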

  10. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    OpenAIRE

    FAHIM AZIZ UMRANI; AHSAN AHMED URSANI; ABDUL WAHEED UMRANI

    2010-01-01

    This paper presents the Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems, and analyse its performance in terms of the BER (Bit Error Rate). The spreading sequence chosen for CDMA is Perfect Difference Codes. Furthermore, this paper derives the expressions of noise variances from first principles to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling required for Monte-Carlo simulation. The simulated res...
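
    A minimal Monte Carlo BER estimation loop in the spirit of the abstract is sketched below for generic unipolar on-off signalling in additive Gaussian noise; the SNR definition and signal levels are assumptions, and the Perfect Difference Code spreading itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_ber(snr_db, n_bits=500_000):
    """Monte Carlo BER for unipolar on-off signalling in additive Gaussian
    noise. SNR is defined here against the decision distance (an assumption;
    the paper instead calibrates optical/electrical noise variances from
    first principles)."""
    bits = rng.integers(0, 2, n_bits)
    sigma = 0.5 / 10 ** (snr_db / 20)            # noise std relative to threshold
    rx = bits * 1.0 + rng.normal(0.0, sigma, n_bits)
    decided = (rx > 0.5).astype(int)             # mid-amplitude threshold detector
    return np.mean(decided != bits)

for snr in (4, 8, 12):
    print(f"SNR {snr:2d} dB -> BER ~ {mc_ber(snr):.2e}")
```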

  11. Neutronic calculations for CANDU thorium systems using Monte Carlo techniques

    Science.gov (United States)

    Saldideh, M.; Shayesteh, M.; Eshghi, M.

    2014-08-01

    In this paper, we have investigated the prospects of exploiting the rich world thorium reserves using Canada Deuterium Uranium (CANDU) reactors. The analysis is performed using the Monte Carlo MCNP code in order to determine how long the reactor remains in a critical condition. Four different fuel compositions have been selected for analysis. We have obtained the infinite multiplication factor, k∞, under full power operation of the reactor over 8 years. The neutron flux distribution in the full reactor core has also been investigated.

  12. A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2009-01-01

    elaborate safety mechanisms in order to keep the risk at the same low level that has been established for European railways until today. The challenge is further increased by the demand for shorter time-to-market periods and higher competition among suppliers of the railway domain; both factors resulting in...... a demand for a higher degree of automation for the development verification, validation and test phases of projects, without impairing the thoroughness of safety-related quality measures and certification activities. Motivated by these considerations, this presentation describes an approach for...... automated construction and verification of railway control systems....

  13. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    The analysis of existing problems in the reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, based on a Simulator having the properties of continuous, decomposition-free, three-phase EPS simulation in real time over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as a standard model for the verification of any EPS simulation tool.

  14. Towards the Formal Verification of a Distributed Real-Time Automotive System

    Science.gov (United States)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, formally and pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  15. Novel hybrid Monte Carlo/deterministic technique for shutdown dose rate analyses of fusion energy systems

    International Nuclear Information System (INIS)

    Highlights: •Develop the novel Multi-Step CADIS (MS-CADIS) hybrid Monte Carlo/deterministic method for multi-step shielding analyses. •Accurately calculate shutdown dose rates using full-scale Monte Carlo models of fusion energy systems. •Demonstrate the dramatic efficiency improvement of the MS-CADIS method for the rigorous two step calculations of the shutdown dose rate in fusion reactors. -- Abstract: The rigorous 2-step (R2S) computational system uses three-dimensional Monte Carlo transport simulations to calculate the shutdown dose rate (SDDR) in fusion reactors. Accurate full-scale R2S calculations are impractical in fusion reactors because they require calculating space- and energy-dependent neutron fluxes everywhere inside the reactor. The use of global Monte Carlo variance reduction techniques was suggested for accelerating the R2S neutron transport calculation. However, the prohibitive computational costs of these approaches, which increase with the problem size and amount of shielding materials, inhibit their ability to accurately predict the SDDR in fusion energy systems using full-scale modeling of an entire fusion plant. This paper describes a novel hybrid Monte Carlo/deterministic methodology that uses the Consistent Adjoint Driven Importance Sampling (CADIS) method but focuses on multi-step shielding calculations. The Multi-Step CADIS (MS-CADIS) methodology speeds up the R2S neutron Monte Carlo calculation using an importance function that represents the neutron importance to the final SDDR. Using a simplified example, preliminary results showed that the use of MS-CADIS enhanced the efficiency of the neutron Monte Carlo simulation of an SDDR calculation by a factor of 550 compared to standard global variance reduction techniques, and that the efficiency enhancement compared to analog Monte Carlo is higher than a factor of 10,000
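
    For orientation, the sketch below shows standard one-group CADIS source biasing on a toy 1-D grid: the adjoint flux acts as an importance map, the source is biased toward it, and birth weights are set so the game stays fair. The adjoint shape and source are invented; MS-CADIS extends this idea to the multi-step R2S calculation and is not reproduced here.

```python
import numpy as np

# Toy one-group, 1-D CADIS setup. The adjoint flux is a made-up exponential
# standing in for a deterministic adjoint solution; it measures how important
# each source cell is to the detector response.
cells = np.arange(10)
q = np.ones(10) / 10.0               # analog source pdf (uniform, assumed)
phi_adj = np.exp(-0.5 * cells)       # adjoint "importance" toward the detector

R = np.sum(q * phi_adj)              # first estimate of the detector response
q_biased = q * phi_adj / R           # CADIS-biased source pdf (sums to 1)
w_birth = q / q_biased               # birth weight = R / phi_adj, keeps the game fair

for i in cells:
    print(f"cell {i}: biased prob {q_biased[i]:.4f}, birth weight {w_birth[i]:.3f}")
```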

  16. Kinetic Monte Carlo with fields: diffusion in heterogeneous systems

    Science.gov (United States)

    Caro, Jose Alfredo

    2011-03-01

    It is commonly perceived that to achieve breakthrough scientific discoveries in the 21st century an integration of world leading experimental capabilities with theory, computational modeling and high performance computer simulations is necessary. Lying between the atomic and the macro scales, the meso scale is crucial for advancing materials research. Deterministic methods are computationally too heavy to cover the length and time scales relevant to the meso scale. Therefore, stochastic approaches are one of the options of choice. In this talk I will describe recent progress in efficient parallelization schemes for Metropolis and kinetic Monte Carlo [1-2], and the combination of these ideas into a new hybrid Molecular Dynamics-kinetic Monte Carlo algorithm developed to study the basic mechanisms taking place in diffusion in concentrated alloys under the action of chemical and stress fields, incorporating in this way the actual driving force emerging from chemical potential gradients. Applications are shown on precipitation and segregation in nanostructured materials. Work in collaboration with E. Martinez, LANL, and with B. Sadigh, P. Erhart and A. Stukowsky, LLNL. Supported by the Center for Materials at Irradiation and Mechanical Extremes, an Energy Frontier Research Center funded by the U.S. Department of Energy (Award # 2008LANL1026) at Los Alamos National Laboratory
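
    The residence-time (BKL/Gillespie-type) step at the heart of kinetic Monte Carlo can be written in a few lines; the sketch below uses invented barrier heights and a standard 10^13 /s attempt frequency, and omits the field coupling and MD hybridization the talk describes.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def kmc_step(rates, clock):
    """One residence-time KMC step: choose an event with probability
    proportional to its rate, then advance the clock by an exponentially
    distributed waiting time with mean 1/sum(rates)."""
    total = rates.sum()
    event = rng.choice(rates.size, p=rates / total)
    clock += -math.log(1.0 - rng.random()) / total
    return event, clock

kT = 0.025                                   # eV, roughly room temperature
barriers = np.array([0.50, 0.60, 0.80])      # eV, invented hop barriers
rates = 1e13 * np.exp(-barriers / kT)        # harmonic attempt frequency assumed

clock, counts = 0.0, np.zeros(3, dtype=int)
for _ in range(10_000):
    event, clock = kmc_step(rates, clock)
    counts[event] += 1
print(f"simulated time {clock:.3e} s, event counts {counts}")
```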

  17. Potential applications of neural networks to verification and validation of complex systems

    International Nuclear Information System (INIS)

    This paper presents conceptual methodology for the verification and validation of complex and integrated human-machine systems and in this context introduces related potential application of an artificial intelligence based information processing paradigm known as artificial neural networks. (author). 6 refs., 1 fig

  18. A method of knowledge base verification for nuclear power plant expert systems using extended Petri Nets

    International Nuclear Information System (INIS)

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. The verification phase of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is extended to chained errors, unlike previous studies that assume error incidence to be limited to rule pairs only. In addition, we consider certainty factors in the checking, because most knowledge bases have certainty factors. 8 refs., 2 figs., 4 tabs. (author)
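
    COKEP itself works on an extended Petri net; the toy sketch below only illustrates the notion of a chained error, detecting a circular rule chain (an error no rule-pair check would catch) by a depth-first search over the fact dependency graph implied by an invented three-rule base.

```python
# Invented three-rule toy base: antecedent facts -> consequent fact.
rules = {
    "r1": ({"valve_open"}, "flow_high"),
    "r2": ({"flow_high"}, "alarm"),
    "r3": ({"alarm"}, "valve_open"),   # r1 -> r2 -> r3 closes a circular chain
}

def find_circular_chains(rules):
    """Depth-first search over the fact dependency graph implied by the
    rules; a cycle is a chained inconsistency that no pairwise rule check
    can see. (Exponential in the worst case; fine for a toy base.)"""
    edges = {}
    for antecedents, consequent in rules.values():
        for fact in antecedents:
            edges.setdefault(fact, set()).add(consequent)

    cycles = []
    def dfs(node, path):
        for nxt in sorted(edges.get(node, ())):
            if nxt in path:
                cycles.append(path[path.index(nxt):] + [nxt])
            else:
                dfs(nxt, path + [nxt])
    for start in sorted(edges):
        dfs(start, [start])
    return cycles   # rotations of one cycle appear once per entry fact

print(find_circular_chains(rules))
```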

  19. Safety verification of non-linear hybrid systems is quasi-decidable

    Czech Academy of Sciences Publication Activity Database

    Ratschan, Stefan

    2014-01-01

    Roč. 44, č. 1 (2014), s. 71-90. ISSN 0925-9856 R&D Projects: GA ČR GCP202/12/J060 Institutional support: RVO:67985807 Keywords : hybrid systems * safety verification * decidability * robustness Subject RIV: IN - Informatics, Computer Science Impact factor: 0.875, year: 2014

  20. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    Energy Technology Data Exchange (ETDEWEB)

    ERMI, A.M.

    2000-09-05

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report on the results of the software qualification.

  1. Calculational verification and process control applications utilizing the high sensitivity of noise measurement parameters to fissile system configuration

    International Nuclear Information System (INIS)

    The 252Cf-source-driven noise analysis method has been used in measurements for subcritical configurations of fissile systems for a variety of applications. Measurements and sensitivity studies with the KENO-NR Monte Carlo neutron transport code showed that this measurement method has the potential to monitor many dynamic situations in processing plants, such as in casting facilities, in a continuous dissolver, or in batch dissolvers, either to be used as a signature to verify that various processes are occurring in a repeatable or bounded way, or to obtain the neutron multiplication factor k. This verification of normal operation would be performed by comparing noise analysis signatures for the process with reference signatures. Abnormal operation could be ascertained if the signature deviates by some specified amount from the reference signatures for normal operation. The deviations from normal could be specified by measurements or by Monte Carlo neutron transport theory methods directly calculating the measured parameters for the processing plant applications. Measurements with enriched uranyl nitrate solutions are presented as an example to demonstrate the high measured sensitivity of noise-measured parameters. To evaluate this high sensitivity, KENO-NR was used to investigate changes in the noise-measured parameters due to variation in fissile system parameters, using neutron transport calculations for three aqueous solutions, uranyl nitrate, uranyl fluoride, and plutonium nitrate, and also for an array of light water reactor spent fuel. This high sensitivity has also allowed this measurement method to be used to identify nuclear weapons and/or weapons components in shipping containers by comparing with reference signatures obtained from measurements or calculations, and for nondestructive assay of special nuclear materials

  2. New developments of the MCNP/CTF/NEM/NJOY code system - Monte Carlo based coupled code for high accuracy modeling - 277

    International Nuclear Information System (INIS)

    High accuracy code systems are necessary to model core environments with considerable geometry complexity and great material heterogeneity. These features are typical of current and innovative nuclear reactor core designs. Advanced methodologies and state-of-the-art coupled code systems must be put into practice in order to model these challenging core designs with high accuracy. The presented research comprises the development and implementation of the thermal-hydraulic feedback to the Monte Carlo method and of speed-up mechanisms to accelerate the Monte Carlo criticality calculation. Coupled Monte Carlo calculations can serve as reference solutions for verifying high-fidelity coupled deterministic neutron transport methods with detailed and accurate thermal-hydraulic models. The development and verification of such a reference high-fidelity coupled multi-physics scheme is performed at the Pennsylvania State University (PSU) in cooperation with AREVA, AREVA NP GmbH in Erlangen, Germany, on the basis of the MCNP5, NEM, NJOY and COBRA-TF (CTF) computer codes. This paper presents the latest studies and improvements made to this coupled hybrid system, which include a new methodology for generation and interpolation of Temperature-Dependent Thermal Scattering Cross Section Libraries for MCNP5, a comparison between sub-channel approaches, and acceleration schemes. (authors)

  3. Development of an analysis software for comparison between proton treatment planning system and Monte Carlo simulation

    International Nuclear Information System (INIS)

    Currently, many proton therapy facilities are used for radiotherapy to treat cancer. The main advantage of proton therapy is the absence of exit dose, which offers a highly conformal dose to the treatment target as well as better normal organ sparing. Most treatment planning systems (TPS) in proton therapy calculate dose distributions using a pencil beam algorithm (PBA). The PBA is suitable for clinical proton therapy because of its fast computation time. However, the PBA shows accuracy limitations, mainly because of the one-dimensional density scaling of proton pencil beams in water. Recently, we developed Monte Carlo simulation tools for the design of the proton therapy facility at the National Cancer Center (NCC) using the GEANT4 toolkit (version GEANT4.9.2p02). Monte Carlo simulation is expected to reproduce precisely the influences of complex geometry and material variety, which are difficult to introduce into the PBA. The data format of the Monte Carlo simulation results differs from DICOM-RT; consequently, we need analysis software for comparison between the TPS and the Monte Carlo simulation. The main objective of this research is to develop an analysis toolkit for verifying the precision and accuracy of the proton treatment planning system and to analyze the dose calculation algorithm of proton therapy using Monte Carlo simulation. In this work, we developed analysis software for GEANT4-based medical applications. This toolkit is capable of evaluating the accuracy of the dose calculated by the TPS against Monte Carlo simulation.
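
    A typical building block of such a comparison toolkit is a gamma-index evaluation between the TPS and Monte Carlo dose distributions. The 1-D sketch below (brute-force, global 3%/3 mm criterion, synthetic profiles) illustrates the technique; it is not the NCC toolkit's implementation.

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, coords, dta=3.0, dd=0.03):
    """Brute-force 1-D global gamma index: for each reference point, take the
    minimum combined distance-to-agreement / dose-difference metric over all
    evaluated points (no interpolation, for brevity)."""
    d_norm = dd * dose_ref.max()                # global dose criterion
    gammas = np.empty_like(dose_ref)
    for i, (x, d) in enumerate(zip(coords, dose_ref)):
        dist2 = ((coords - x) / dta) ** 2
        diff2 = ((dose_eval - d) / d_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + diff2))
    return gammas

x = np.linspace(0.0, 100.0, 201)                       # position (mm)
tps_dose = np.exp(-((x - 50.0) / 20.0) ** 2)           # synthetic TPS profile
mc_dose = 1.01 * np.exp(-((x - 50.5) / 20.0) ** 2)     # synthetic MC profile
g = gamma_index(tps_dose, mc_dose, x)
print(f"gamma passing rate (3%/3 mm): {100.0 * np.mean(g <= 1.0):.1f}%")
```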

  4. Development of an analysis software for comparison between proton treatment planning system and Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae Hyun; Suh, Tae Suk [Dept. of Biomedical Engineering, College of Medicine, The Catholic University of Korea, Seoul (Korea, Republic of); Park, Sey Joon; Yoo, Seung Hoon; Lee, Se Byeong [Proton Therapy Center, National Cancer Center, Goyang (Korea, Republic of); Shin, Jung Wook [Dept. of Radiation Oncology, University of California, SanFrancisco (United States)

    2011-11-15

    Currently, many proton therapy facilities are used for radiotherapy to treat cancer. The main advantage of proton therapy is the absence of exit dose, which offers a highly conformal dose to the treatment target as well as better normal organ sparing. Most treatment planning systems (TPS) in proton therapy calculate dose distributions using a pencil beam algorithm (PBA). The PBA is suitable for clinical proton therapy because of its fast computation time. However, the PBA shows accuracy limitations, mainly because of the one-dimensional density scaling of proton pencil beams in water. Recently, we developed Monte Carlo simulation tools for the design of the proton therapy facility at the National Cancer Center (NCC) using the GEANT4 toolkit (version GEANT4.9.2p02). Monte Carlo simulation is expected to reproduce precisely the influences of complex geometry and material variety, which are difficult to introduce into the PBA. The data format of the Monte Carlo simulation results differs from DICOM-RT; consequently, we need analysis software for comparison between the TPS and the Monte Carlo simulation. The main objective of this research is to develop an analysis toolkit for verifying the precision and accuracy of the proton treatment planning system and to analyze the dose calculation algorithm of proton therapy using Monte Carlo simulation. In this work, we developed analysis software for GEANT4-based medical applications. This toolkit is capable of evaluating the accuracy of the dose calculated by the TPS against Monte Carlo simulation.

  5. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and its modeling flexibility under changing system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: the dead markings are correct, there are no dead transitions, and the net is fair. Further analysis results have been obtained using a formal and simulation-based verification approach. The timed models for the open transmission system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also provide the relevant timed and stochastic factors, such as time delay and packet loss, which may influence the time needed to establish a safety connection. The time for establishment of a safety connection in the normal state is verified by formal verification, and the time for establishment under different probabilities of packet loss is then simulated. After verification it is found that the time for establishment of a safety connection satisfies the safety requirements.
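
    The packet-loss simulation idea can be conveyed in a few lines: each handshake attempt is lost with probability p, every loss costs a timeout before retransmission, and the mean establishment time is estimated over many trials. All timing constants below are invented for illustration and are not ETCS parameters.

```python
import random

random.seed(7)

def establishment_time(p_loss, t_round=0.5, t_timeout=1.5, max_tries=10):
    """Time to establish the safety connection when each handshake attempt
    is lost with probability p_loss and every loss costs one timeout before
    retransmission (constants assumed, not ETCS values)."""
    elapsed = 0.0
    for _ in range(max_tries):
        if random.random() > p_loss:
            return elapsed + t_round          # attempt succeeded
        elapsed += t_timeout                  # lost: wait out the timer, retry
    return float("inf")                       # connection attempt abandoned

for p in (0.0, 0.05, 0.2):
    times = [establishment_time(p) for _ in range(10_000)]
    ok = [x for x in times if x != float("inf")]
    print(f"p_loss = {p:.2f}: mean establishment time {sum(ok) / len(ok):.2f} s")
```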

  6. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z; Thomas, A; Newton, J; Ibbott, G; Deasy, J; Oldham, M, E-mail: Zhiheng.wang@duke.ed

    2010-11-01

    Achieving adequate verification and quality-assurance (QA) for radiosurgery treatment of trigeminal-neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high-resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological-Physics-Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  7. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    Science.gov (United States)

    Wang, Z.; Thomas, A.; Newton, J.; Ibbott, G.; Deasy, J.; Oldham, M.

    2010-11-01

    Achieving adequate verification and quality-assurance (QA) for radiosurgery treatment of trigeminal-neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high-resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological-Physics-Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  8. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    International Nuclear Information System (INIS)

    Achieving adequate verification and quality-assurance (QA) for radiosurgery treatment of trigeminal-neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high-resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological-Physics-Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  9. Advances in SVM-Based System Using GMM Super Vectors for Text-Independent Speaker Verification

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jian; DONG Yuan; ZHAO Xianyu; YANG Hao; LU Liang; WANG Haila

    2008-01-01

    For text-independent speaker verification, the Gaussian mixture model (GMM) using a universal background model strategy and the GMM using support vector machines are the two most commonly used methodologies. Recently, a new SVM-based speaker verification method using GMM super vectors has been proposed. This paper describes the construction of a new speaker verification system and investigates the use of nuisance attribute projection and test normalization to further enhance performance. Experiments were conducted on the core test of the 2006 NIST speaker recognition evaluation corpus. The experimental results indicate that an SVM-based speaker verification system using GMM super vectors can achieve appealing performance. With the use of nuisance attribute projection and test normalization, the system performance can be significantly improved, with improvements in the equal error rate from 7.78% to 4.92% and detection cost function from 0.0376 to 0.0251.
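
    A condensed sketch of the GMM super vector pipeline is given below: a universal background model (UBM) is trained, each utterance's component means are relevance-MAP adapted and stacked into a super vector, and an SVM separates target from impostor vectors. The synthetic features, relevance factor, and linear kernel are assumptions, and nuisance attribute projection and test normalization are omitted.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Train a small "UBM" on pooled background features (synthetic stand-ins).
background = rng.normal(size=(2000, 10))
ubm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(background)

def gmm_supervector(utterance, ubm, relevance=16.0):
    """Relevance-MAP adaptation of the UBM means, stacked into one vector."""
    resp = ubm.predict_proba(utterance)                       # (frames, components)
    n = resp.sum(axis=0)                                      # soft counts per component
    ex = resp.T @ utterance / np.maximum(n[:, None], 1e-8)    # per-component frame means
    alpha = (n / (n + relevance))[:, None]                    # adaptation coefficients
    adapted = alpha * ex + (1.0 - alpha) * ubm.means_
    return adapted.ravel()

def utterances(shift, n=20):
    """Synthetic 'speakers' distinguished by a shift in feature statistics."""
    return [rng.normal(loc=shift, size=(200, 10)) for _ in range(n)]

X = [gmm_supervector(u, ubm) for u in utterances(0.3) + utterances(-0.3)]
y = [1] * 20 + [0] * 20
clf = LinearSVC(C=1.0).fit(X, y)

test = gmm_supervector(rng.normal(loc=0.3, size=(200, 10)), ubm)
print("target-speaker decision:", clf.predict([test])[0])
```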

  10. The factorization method for Monte Carlo simulations of systems with a complex action

    Science.gov (United States)

    Ambjørn, J.; Anagnostopoulos, K. N.; Nishimura, J.; Verbaarschot, J. J. M.

    2004-03-01

    We propose a method for Monte Carlo simulations of systems with a complex action. The method has the advantages of being in principle applicable to any such system and provides a solution to the overlap problem. In some cases, like in the IKKT matrix model, a finite size scaling extrapolation can provide results for systems whose size would make it prohibitive to simulate directly.

  11. Synthetic Stimuli for the Steady-State Verification of Modulation-Based Noise Reduction Systems

    Directory of Open Access Journals (Sweden)

    Jesko G. Lamm

    2009-01-01

    Hearing instrument verification involves measuring the performance of noise reduction systems. Synthetic stimuli are proposed as test signals, because they can be tailored to the parameter space of the noise reduction system under test. The article presents stimuli targeted at steady-state measurements in modulation-based noise reduction systems. It shows possible applications of these stimuli and measurement results obtained with an exemplary hearing instrument.
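
    A steady-state stimulus of the kind proposed can be as simple as sinusoidally amplitude-modulated white noise, with the modulation frequency and depth chosen to match the parameter space of the noise reduction algorithm under test. The sketch below (sample rate and parameter values assumed) generates one such signal.

```python
import numpy as np

fs = 44_100                                   # sample rate (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)

def am_noise(mod_freq=4.0, mod_depth=0.8, seed=0):
    """White-noise carrier with a sinusoidal amplitude envelope; modulation
    frequency and depth are the knobs matched to the algorithm under test."""
    rng = np.random.default_rng(seed)
    carrier = rng.normal(size=t.size)
    envelope = 1.0 + mod_depth * np.sin(2.0 * np.pi * mod_freq * t)
    x = envelope * carrier
    return x / np.abs(x).max()                # normalize to full scale

stimulus = am_noise()
print(f"{stimulus.size} samples, peak {np.abs(stimulus).max():.2f}")
```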

  12. A Tool for Automatic Verification of Real-Time Expert Systems

    Science.gov (United States)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  13. A Classifier Fusion System with Verification Module for Improving Recognition Reliability

    OpenAIRE

    Zhang, Ping

    2010-01-01

    In this paper, we propose a novel classifier fusion system that aggregates the recognition results of an ANN classifier and a modified KNN classifier. The recognition results are verified against the recognition results of an SVM. As two entirely different classification techniques (image-based OCR and 1-D digital-signal SVM classification) are applied in the system, experiments have demonstrated that the proposed classifier fusion system with an SVM verification module can significantly increase the sys...

  14. Verification and disarmament

    International Nuclear Information System (INIS)

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  15. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  16. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 2. Assessment of MCNP Statistical Analysis of keff Eigenvalue Convergence with an Analytical Criticality Verification Test Set

    International Nuclear Information System (INIS)

    Monte Carlo simulations of nuclear criticality eigenvalue problems are often performed by general purpose radiation transport codes such as MCNP. MCNP performs detailed statistical analysis of the criticality calculation and provides feedback to the user with warning messages, tables, and graphs. The purpose of the analysis is to provide the user with sufficient information to assess spatial convergence of the eigenfunction and thus the validity of the criticality calculation. As a test of this statistical analysis package in MCNP, analytic criticality verification benchmark problems have been used for the first time to assess the performance of the criticality convergence tests in MCNP. The MCNP statistical analysis capability has been recently assessed using the 75 multigroup criticality verification analytic problem test set. MCNP was verified with these problems at the 10^-4 to 10^-5 statistical error level using 40 000 histories per cycle and 2000 active cycles. In all cases, the final boxed combined keff answer was given with the standard deviation and three confidence intervals that contained the analytic keff. To test the effectiveness of the statistical analysis checks in identifying poor eigenfunction convergence, ten problems from the test set were deliberately run incorrectly using 1000 histories per cycle, 200 active cycles, and 10 inactive cycles. Six problems with large dominance ratios were chosen from the test set because they do not achieve the normal spatial mode in the beginning of the calculation. To further stress the convergence tests, these problems were also started with an initial fission source point 1 cm from the boundary thus increasing the likelihood of a poorly converged initial fission source distribution. The final combined keff confidence intervals for these deliberately ill-posed problems did not include the analytic keff value. In no case did a bad confidence interval go undetected. Warning messages were given signaling that the
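
    One widely used spatial-convergence diagnostic of this kind is the Shannon entropy of the fission-source distribution on a mesh, which flattens out once the source shape has converged. The sketch below demonstrates the statistic on synthetic source points; it is illustrative only and is not MCNP's actual test suite.

```python
import numpy as np

def source_entropy(sites, edges):
    """Shannon entropy (bits) of the fission-source distribution binned on a
    spatial mesh; a flat entropy trace over cycles indicates the source
    shape has converged. (MCNP prints a similar diagnostic.)"""
    counts, _ = np.histogramdd(sites, bins=edges)
    p = counts.ravel() / counts.sum()
    p = p[p > 0.0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(3)
edges = [np.linspace(-10.0, 10.0, 9)] * 3        # 8x8x8 mesh over the core
for cycle in range(0, 60, 10):
    spread = 2.0 - 1.7 * np.exp(-cycle / 15.0)   # source relaxing from a poor initial guess
    sites = rng.normal(0.0, spread, size=(1000, 3))
    print(f"cycle {cycle:2d}: H = {source_entropy(sites, edges):.3f} bits")
```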

  17. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    Science.gov (United States)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems were compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use with the BrainSCAN system requiring higher resolution data compared to Helios. This difference was found to impact on the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this

  18. Development of CANDU spent fuel verification system using optical fiber scintillator

    International Nuclear Information System (INIS)

    In a CANDU reactor, 16∼24 spent fuel bundles are discharged from the reactor core every day. These are placed on trays, and the trays are stacked in the spent fuel bay. Currently, the Agency uses the CANDU Bundle Verifier for Stack (CBVS). It consists of a CZT gamma spectrometric probe which moves vertically along the space between the columns of trays. To date, spent fuel verification by non-destructive assay has been implemented for safeguards purposes using various radiation detectors, such as gas-type detectors, semiconductor detectors, and so on. However, due to the severe environment of spent fuel storage, with high temperature, high radiation intensity, and difficult-to-access areas, the applicable radiation detectors and measurement techniques are very limited. Optical fiber scintillators are known to have good radiation hardness and to retain their physical properties under high temperature and humidity. In order to verify spent fuel stored in difficult-to-access areas, KINAC designed and developed a prototype spent fuel verification system using an optical fiber scintillator. A field test was performed at the Wolsung NPP (Nuclear Power Plant) pond storage area, and this system will be proposed for entry as IAEA verification equipment. For registration, KINAC will follow the IAEA's QA procedure. So far, KINAC and the IAEA have developed the user and functional requirements and the design specifications for the system, hardware, and software separately. After the procedure is finished, the equipment will be used for verification of spent fuel in lieu of the CANDU Bundle Verifier for Baskets (CBVS). (author)

  19. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, as committed to be complete by the end of FY 2000

  20. Simulation-based design process for the verification of ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, Romain, E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, FI-33101 Tampere (Finland); Määttä, Timo; Siuko, Mikko [VTT Technical Research Centre of Finland, P.O. Box 1300, FI-33101 Tampere (Finland); Mattila, Jouni [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland)

    2014-10-15

    Highlights: •Verification and validation process for ITER remote handling system. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around simulation lifecycle management system. •Verification and validation roadmap for digital modelling phase. -- Abstract: The work behind this paper takes place in the EFDA's European Goal Oriented Training programme on Remote Handling (RH) “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMU). The purpose of this project is to study and develop an efficient approach to using DMUs in the V and V process of ITER RH system design, utilizing a System Engineering (SE) framework. Complex engineering systems such as the ITER facilities lead to a substantial rise in cost when manufacturing the full-scale prototype. In the V and V process for ITER RH equipment, physical tests are a requirement to ensure the compliance of the system with the required operation. Therefore it is essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of the current trends in using digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and V and V processes, in order to increase their cost efficiency and reliability.

  1. Application of Photon Transport Monte Carlo Module with GPU-based Parallel System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je [Sejong University, Seoul (Korea, Republic of); Shon, Heejeong [Golden Eng. Co. LTD, Seoul (Korea, Republic of); Lee, Donghak [CoCo Link Inc., Seoul (Korea, Republic of)

    2015-05-15

    In general, it takes a lot of computing time to get reliable results in Monte Carlo simulations, especially in deep penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. At the same time, advanced computing hardware such as GPU (Graphics Processing Units)-based parallel machines is used to get better performance from the Monte Carlo simulation. A GPU is much easier to access and to manage than a CPU cluster system, and it has also become less expensive due to advances in computer technology; consequently, many engineering areas have adopted GPU-based massively parallel computation techniques. The GPU-based photon transport Monte Carlo module developed here provides almost a 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of Monte Carlo modules which require quick and accurate simulations.
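
    The data-parallel flavor of such a module can be conveyed with a vectorized toy problem: estimating uncollided transmission through a slab, where every history is independent, exactly the structure a GPU kernel exploits. The attenuation coefficient and slab thicknesses below are invented; note how the analog estimate becomes rare-event territory as the slab thickens, which is why the variance reduction mentioned above matters.

```python
import numpy as np

rng = np.random.default_rng(0)

def transmission_mc(mu, thickness, n=1_000_000):
    """Vectorized analog Monte Carlo estimate of uncollided transmission
    through a slab: sample each photon's distance to first collision and
    count those that clear the slab. Every history is independent, which
    is exactly the data parallelism a GPU kernel exploits."""
    free_paths = rng.exponential(1.0 / mu, size=n)
    return float(np.mean(free_paths > thickness))

mu = 0.2                                   # total attenuation coefficient (1/cm), invented
for t_cm in (5.0, 10.0, 20.0):
    est = transmission_mc(mu, t_cm)
    print(f"{t_cm:4.1f} cm slab: MC {est:.4e} vs analytic {np.exp(-mu * t_cm):.4e}")
```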

  2. Application of Photon Transport Monte Carlo Module with GPU-based Parallel System

    International Nuclear Information System (INIS)

    In general, it takes a lot of computing time to get reliable results in Monte Carlo simulations, especially in deep penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. At the same time, advanced computing hardware such as GPU (Graphics Processing Units)-based parallel machines is used to get better performance from the Monte Carlo simulation. A GPU is much easier to access and to manage than a CPU cluster system, and it has also become less expensive due to advances in computer technology; consequently, many engineering areas have adopted GPU-based massively parallel computation techniques. The GPU-based photon transport Monte Carlo module developed here provides almost a 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of Monte Carlo modules which require quick and accurate simulations

  3. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    International Nuclear Information System (INIS)

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at the

  4. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)

    2012-05-15

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at

  5. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre;

    2010-01-01

    Abstract This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a...... separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA modeled real-time system, the...... problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one...

  6. Measurement and verification of load shifting interventions for a fridge plant system in South Africa

    OpenAIRE

    Gouws, Rupert

    2013-01-01

    In this paper, the author presents the measurement and verification methodology used to quantify the impacts of load shifting measures that are implemented on large industrial fridge plant systems in South Africa. A summary on the operation of fridge plant systems and the data typically available for baseline development is provided. The author discusses issues surrounding baseline development and service level adjustments for the following two scenarios: 1) the electrical data is available f...

  7. Method of γ-peak removal in coincidence measurement system for verification of nuclear arms control

    International Nuclear Information System (INIS)

    The paper analyzes the influence of γ-rays on the performance of a coincidence measurement system, based on the principle of coincidence measurement in nuclear arms control verification technology, and proposes a refusing-window method to remove the γ-peak. Experiments show that this method significantly reduces the γ-peak, with a γ-ray removal rate of 99%. The refusing-window method thus improves the precision of the coincidence measurement system. (authors)

  8. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    OpenAIRE

    José Meseguer; Peter Csaba Ölveczky

    2010-01-01

    Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonom...

  9. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Science.gov (United States)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  10. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data... European standardized railway control systems ERTMS/ETCS Level 2. Experiments showed that the method can be used for specification, verification and validation of systems of industrial size.

  11. Monte Carlo simulations of lattice models for single polymer systems

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, Hsiao-Ping, E-mail: hsu@mpip-mainz.mpg.de [Max-Planck-Institut für Polymerforschung, Ackermannweg 10, D-55128 Mainz (Germany)

    2014-10-28

    Single linear polymer chains in dilute solutions under good solvent conditions are studied by Monte Carlo simulations with the pruned-enriched Rosenbluth method up to the chain length N ∼ O(10^4). Based on the standard simple cubic lattice model (SCLM) with fixed bond length and the bond fluctuation model (BFM) with bond lengths in a range between 2 and √(10), we investigate the conformations of polymer chains described by self-avoiding walks on the simple cubic lattice, and by random walks and non-reversible random walks in the absence of excluded volume interactions. In addition to flexible chains, we also extend our study to semiflexible chains with different stiffnesses controlled by a bending potential. The persistence lengths of the chains, extracted from the orientational correlations, are estimated for all cases. We show that chains based on the BFM are more flexible than those based on the SCLM for a fixed bending energy. The microscopic differences between these two lattice models are discussed, and the theoretical predictions of scaling laws given in the literature are checked and verified. Our simulations clarify that a different mapping ratio between the coarse-grained models and the atomistically realistic description of polymers is required in a coarse-graining approach due to the different crossovers to the asymptotic behavior.
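
    As a minimal illustration of the lattice models discussed above, the sketch below grows self-avoiding walks on the simple cubic lattice by plain simple sampling and estimates the mean-square end-to-end distance. It is only a toy: the record's chain lengths of N ~ 10^4 are reachable only with the pruned-enriched Rosenbluth method, which this sketch does not implement.

```python
import random

# Toy simple-sampling illustration of self-avoiding walks (SAWs) on the
# simple cubic lattice.  Attrition makes this hopeless for long chains;
# it only demonstrates the model itself.
STEPS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def sample_saw(n):
    """Try to grow an n-step SAW; return the end point or None on overlap."""
    pos = (0, 0, 0)
    visited = {pos}
    for _ in range(n):
        dx, dy, dz = random.choice(STEPS)
        pos = (pos[0] + dx, pos[1] + dy, pos[2] + dz)
        if pos in visited:          # excluded-volume condition violated
            return None
        visited.add(pos)
    return pos

def mean_square_end_to_end(n, tries=50_000):
    acc, hits = 0.0, 0
    for _ in range(tries):
        end = sample_saw(n)
        if end is not None:
            acc += end[0] ** 2 + end[1] ** 2 + end[2] ** 2
            hits += 1
    return acc / hits if hits else float("nan")

for n in (5, 10, 20):
    # For 3D SAWs, <R^2> grows roughly like n^(2 * 0.588).
    print(n, mean_square_end_to_end(n))
```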

  12. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain-gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user-applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy, which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  13. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    Directory of Open Access Journals (Sweden)

    FAHIM AZIZ UMRANI

    2010-10-01

    This paper presents a Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems and analyses its performance in terms of the BER (Bit Error Rate). The spreading sequences chosen for CDMA are perfect difference codes. Furthermore, this paper derives the expressions of noise variances from first principles to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling required for Monte-Carlo simulation. The simulated results conform to the theory and show that the receiver gain mismatch and splitter loss at the transceiver degrade the system performance.
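
    The core of such a Monte-Carlo BER study is a count of bit errors checked against a theoretical reference. The sketch below assumes plain bipolar antipodal signalling in additive Gaussian noise rather than the optical PDC system of the record, and compares the counted BER with the corresponding erfc expression.

```python
import math
import random

def ber_monte_carlo(snr_db: float, n_bits: int = 200_000) -> float:
    """Count bit errors for bipolar +/-1 signalling in Gaussian noise."""
    snr = 10.0 ** (snr_db / 10.0)
    sigma = math.sqrt(1.0 / (2.0 * snr))    # noise std for unit-energy bits
    errors = 0
    for _ in range(n_bits):
        bit = random.choice((-1.0, 1.0))
        received = bit + random.gauss(0.0, sigma)
        if (received >= 0.0) != (bit > 0.0):
            errors += 1
    return errors / n_bits

def ber_theory(snr_db: float) -> float:
    """Theoretical BER = Q(sqrt(2*SNR)) = 0.5 * erfc(sqrt(SNR))."""
    snr = 10.0 ** (snr_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(snr))

for snr_db in (0, 2, 4, 6, 8):
    print(snr_db, ber_monte_carlo(snr_db), ber_theory(snr_db))
```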

  14. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  15. Sensitivity Verification of PWR Monitoring System Using Neuro-Expert For LOCA Detection

    International Nuclear Information System (INIS)

    The present research verifies a previously developed method for loss-of-coolant accident (LOCA) detection and performs simulations to determine the sensitivity of a PWR monitoring system that applies the neuro-expert method. Continuing earlier work, the neuro-expert method had been developed and tested for several anomaly detections in nuclear power plants (NPPs) of the pressurized water reactor (PWR) type. The neuro-expert system can detect a LOCA anomaly at a primary coolant leakage of 7 gallons/min, whereas the conventional method could not detect a primary coolant leakage of 30 gallons/min. The neuro-expert method also detects a LOCA anomaly significantly faster than the conventional system in the Surry-1 NPP, so that the impact risk is reduced. (author)

  16. Advanced control and instrumentation systems in nuclear power plants. Design, verification and validation

    International Nuclear Information System (INIS)

    The Technical Committee Meeting on design, verification and validation of advanced control and instrumentation systems in nuclear power plants was held in Espoo, Finland, on 20-23 June 1994. The meeting was organized by the International Atomic Energy Agency's (IAEA) International Working Groups (IWGs) on Nuclear Power Plant Control and Instrumentation (NPPCI) and on Advanced Technologies for Water Cooled Reactors (ATWR). VTT Automation, together with Imatran Voima Oy and Teollisuuden Voima Oy, was responsible for the practical arrangements of the meeting. In total, 96 participants from 21 countries and the Agency took part in the meeting, and 34 full papers and 8 posters were presented. The following topics were covered in the papers: (1) experience with advanced and digital systems, (2) safety and reliability analysis, (3) advanced digital systems under development and implementation, (4) verification and validation methods and practices, (5) future development trends. (orig.)

  17. Prototype test article verification of the Space Station Freedom active thermal control system microgravity performance

    Science.gov (United States)

    Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.

    1993-01-01

    To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.

  18. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  19. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  20. Abstractions for Fault-Tolerant Distributed System Verification

    Science.gov (United States)

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.

  1. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system to...

  2. Quasi-Monte Carlo methods for lattice systems. A first look

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Leovey, H.; Griewank, A. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Mathematik; Nube, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Mueller-Preussker, M. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2013-02-15

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^(-1/2), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to N^(-1). We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
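
    The N^(-1/2) versus N^(-1) scaling can be seen on a one-dimensional toy integral: pseudorandom sampling is compared with a van der Corput low-discrepancy sequence. This is only a sketch of the general idea, not the lattice path-integral application of the record.

```python
import math
import random

def van_der_corput(i: int, base: int = 2) -> float:
    """Radical-inverse (van der Corput) low-discrepancy point in [0, 1)."""
    x, denom = 0.0, 1.0
    while i > 0:
        i, digit = divmod(i, base)
        denom *= base
        x += digit / denom
    return x

f = lambda x: math.exp(x)        # smooth test integrand
exact = math.e - 1.0             # integral of exp(x) over [0, 1]

for n in (2**8, 2**12, 2**16):
    mc = sum(f(random.random()) for _ in range(n)) / n
    qmc = sum(f(van_der_corput(i + 1)) for i in range(n)) / n
    # The QMC error shrinks roughly like 1/N for this smooth integrand,
    # versus 1/sqrt(N) for plain Monte Carlo.
    print(n, abs(mc - exact), abs(qmc - exact))
```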

  3. Monte Carlo-derived TLD cross-calibration factors for treatment verification and measurement of skin dose in accelerated partial breast irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Garnica-Garza, H M [Centro de Investigacion y de Estudios Avanzados del Instituto Politecnico Nacional Unidad Monterrey, VIa del Conocimiento 201 Parque de Investigacion e Innovacion Tecnologica, Apodaca NL C.P. 66600 (Mexico)], E-mail: hgarnica@cinvestav.mx

    2009-03-21

    Monte Carlo simulation was employed to calculate the response of TLD-100 chips under irradiation conditions such as those found during accelerated partial breast irradiation with the MammoSite radiation therapy system. The absorbed dose versus radius in the last 0.5 cm of the treated volume was also calculated, employing a resolution of 20 μm, and a function that fits the observed data was determined. Several clinically relevant irradiation conditions were simulated for different combinations of balloon size, balloon-to-surface distance and contents of the contrast solution used to fill the balloon. The thermoluminescent dosemeter (TLD) cross-calibration factors were derived assuming that the calibration of the dosemeters was carried out using a Cobalt 60 beam, and in such a way that they provide a set of parameters that reproduce the function that describes the behavior of the absorbed dose versus radius curve. Such factors may also prove to be useful for those standardized laboratories that provide postal dosimetry services.

  4. Monte Carlo-derived TLD cross-calibration factors for treatment verification and measurement of skin dose in accelerated partial breast irradiation

    International Nuclear Information System (INIS)

    Monte Carlo simulation was employed to calculate the response of TLD-100 chips under irradiation conditions such as those found during accelerated partial breast irradiation with the MammoSite radiation therapy system. The absorbed dose versus radius in the last 0.5 cm of the treated volume was also calculated, employing a resolution of 20 μm, and a function that fits the observed data was determined. Several clinically relevant irradiation conditions were simulated for different combinations of balloon size, balloon-to-surface distance and contents of the contrast solution used to fill the balloon. The thermoluminescent dosemeter (TLD) cross-calibration factors were derived assuming that the calibration of the dosemeters was carried out using a Cobalt 60 beam, and in such a way that they provide a set of parameters that reproduce the function that describes the behavior of the absorbed dose versus radius curve. Such factors may also prove to be useful for those standardized laboratories that provide postal dosimetry services.

  5. Development of Monte Carlo decay gamma-ray transport calculation system

    International Nuclear Information System (INIS)

    In a DT fusion reactor, it is a critical concern to evaluate accurately the decay gamma-ray biological dose rates after reactor shutdown. For this purpose, a three-dimensional Monte Carlo decay gamma-ray transport calculation system has been developed by coupling a three-dimensional Monte Carlo particle transport calculation code with an induced activity calculation code. The developed calculation system consists of the following four functions. (1) The operational neutron flux distribution is calculated by the three-dimensional Monte Carlo particle transport calculation code. (2) The induced activities are calculated by the induced activity calculation code. (3) The decay gamma-ray source distribution is obtained from the induced activities. (4) The decay gamma-rays are generated by using the decay gamma-ray source distribution, and the decay gamma-ray transport calculation is conducted by the three-dimensional Monte Carlo particle transport calculation code. In order to reduce the calculation time drastically, a biasing system for the decay gamma-ray source distribution has been developed, and this function is also included in the present system. In this paper, the outline and details of the system and an execution example are reported. An evaluation of the effect of the biasing system is also reported. (author)

  6. A complementary dual-modality verification for tumor tracking on a gimbaled linac system

    International Nuclear Information System (INIS)

    Background and purpose: For dynamic tracking of moving tumors, robust intra-fraction verification was required, to assure that tumor motion was properly managed during the course of radiotherapy. A dual-modality verification system, consisting of an on-board orthogonal kV and planar MV imaging device, was validated and applied retrospectively to patient data. Methods and materials: Real-time tumor tracking (RTTT) was managed by applying PAN and TILT angular corrections to the therapeutic beam using a gimbaled linac. In this study, orthogonal X-ray imaging and MV EPID fluoroscopy was acquired simultaneously. The tracking beam position was derived from respectively real-time gimbals log files and the detected field outline on EPID. For both imaging modalities, the moving target was localized by detection of an implanted fiducial. The dual-modality tracking verification was validated against a high-precision optical camera in phantom experiments and applied to clinical tracking data from a liver and two lung cancer patients. Results: Both verification modalities showed a high accuracy (<0.3 mm) during validation on phantom. Marker detection on EPID was influenced by low image contrast. For the clinical cases, gimbaled tracking showed a 90th percentile error (E90) of 3.45 (liver), 2.44 (lung A) and 3.40 mm (lung B) based on EPID fluoroscopy and good agreement with XR-log file data by an E90 of 3.13, 1.92 and 3.33 mm, respectively, during beam on. Conclusion: Dual-modality verification was successfully implemented, offering the possibility of detailed reporting on RTTT performance

  7. Verification and validation of new operation supports systems for Beznau NPP

    International Nuclear Information System (INIS)

    This article describes the activities associated with the Verification and Validation works performed by Tecnatom on a computerised Advanced Alarm System (AAS) and a Computer Based Procedure System (CBP), for the licensing of these systems to be used in the control rooms of Beznau NPP (property of NOK). In this process Tecnatom acted as an independent company in the evaluation of the new systems, supporting Beznau NPP to obtain the approval from the HSK (Swiss Federal Nuclear Safety Inspectorate) for the implementation of these systems into the training and operating concepts of the plant. (Author)

  8. Nuclear instrumentation. Liquid-scintillation systems. Performance verification

    International Nuclear Information System (INIS)

    The standard is designed to serve as a tool for assessing the operability of typical liquid scintillation systems. The operability is assessed in terms of the detection efficiency of the system and reproducibility of the background and sample pulse frequencies. (P.A.)

  9. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  10. Development of computerized patient setup verification and correction system in radiotherapy

    International Nuclear Information System (INIS)

    Visual comparison of a reference image with a verification image is commonly used for setup verification in external beam radiation therapy. However, it sometimes lacks reproducibility and provides insufficient quantitative evidence. The present study was performed to develop computerized methods for determining landmarks to verify a portal image against a digitally reconstructed radiograph (DRR), and to investigate the clinical effectiveness of our method. Our computer algorithm consists of three main procedures (preprocessing, determination of landmarks, and verification), none of which requires manual operation. Finally, our system indicates the distance for setup correction. We evaluated the accuracy of our system using pelvic phantom images, and the maximum magnitude of error was shown to be 1.12 (n=9). The results indicated that the error range of our system is sufficiently small for examining patient positioning error, which should be less than 5 mm, as described in AAPM report TG-40. Our system will aid operators in positioning patients accurately for external radiation therapy. (author)
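
    The final verification step, turning matched landmarks into a setup-correction distance, reduces to averaging landmark displacement vectors. A minimal 2D sketch follows; the landmark coordinates are invented for illustration.

```python
import math

# Matched landmark positions (mm) on the reference DRR and the portal
# image; the coordinates below are invented for illustration.
drr_landmarks = [(12.0, 30.5), (48.2, 31.0), (30.1, 55.4)]
portal_landmarks = [(13.1, 31.2), (49.0, 31.9), (31.0, 56.1)]

# Mean displacement vector = suggested couch-shift correction.
n = len(drr_landmarks)
dx = sum(p[0] - d[0] for d, p in zip(drr_landmarks, portal_landmarks)) / n
dy = sum(p[1] - d[1] for d, p in zip(drr_landmarks, portal_landmarks)) / n
shift = math.hypot(dx, dy)

print(f"correction: dx={dx:.2f} mm, dy={dy:.2f} mm, |shift|={shift:.2f} mm")
# AAPM TG-40 suggests keeping the positioning error below 5 mm.
print("within tolerance" if shift < 5.0 else "reposition patient")
```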

  11. Risk Analysis of Tilapia Recirculating Aquaculture Systems: A Monte Carlo Simulation Approach

    OpenAIRE

    Kodra, Bledar

    2007-01-01

    The purpose of this study is to modify an existing static analytical model developed for re-circulating aquaculture systems through incorporation of risk considerations to evaluate the economic viability of the system. In addition, the objective of this analysis is to provide a well-documented, risk-based analytical system so that individuals (investors/lenders) c...

  12. Monte Carlo study of a high-sensitivity gamma-ray detection system

    International Nuclear Information System (INIS)

    The authors use Monte Carlo calculations to study a new design for a high-sensitivity gamma-ray detection system. The system uses an array of high-purity germanium detectors operating with an event-mode data acquisition system. The calculations show that the proposed design could produce a factor of 10 increase in the sensitivity of these measurements compared to currently employed systems

  13. Sampling and verification methods for the uncertainty analysis of NDA and NDE waste characterization systems

    International Nuclear Information System (INIS)

    Use of nondestructive assay (NDA) and evaluation (NDE) systems in critical waste characterization requires a realistic assessment of the uncertainty in the measurements. The stated uncertainty must include potential effects of a variety of complicating external factors on the expected bias and precision. These factors include material heterogeneity (matrix effects), fluctuating background levels, and other variable operating conditions. Uncertainty figures from application of error propagation methods to data from controlled laboratory experiments using standard test materials can grossly underestimate the expected error. This paper reviews the standard error propagation method of uncertainty analysis, discusses some of its limitations, and presents an alternative approach based on sampling and verification. Examples of application of sampling and verification methods to measurement systems at INEL are described
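
    The contrast between the two approaches can be shown on a toy measurement model: first-order error propagation uses local derivatives, while the sampling approach pushes whole input distributions through the model and so captures nonlinearity that propagation misses. The model and numbers below are invented for illustration.

```python
import math
import random

# Toy assay model (invented): response r = k * m * exp(-a*t), so the
# inferred mass is m = r / (k * exp(-a*t)).  k and a carry uncertainty.
def mass(r, k, a, t=1.0):
    return r / (k * math.exp(-a * t))

r0, k0, a0 = 100.0, 2.0, 0.30    # nominal values
sr, sk, sa = 2.0, 0.10, 0.06     # 1-sigma uncertainties

# First-order error propagation: quadrature sum of (dm/dx * sigma_x).
m0 = mass(r0, k0, a0)
dm_dr = m0 / r0          # dm/dr
dm_dk = -m0 / k0         # dm/dk
dm_da = m0 * 1.0         # dm/da = m * t, with t = 1
sigma_prop = math.sqrt((dm_dr * sr) ** 2 + (dm_dk * sk) ** 2
                       + (dm_da * sa) ** 2)

# Sampling approach: draw inputs, inspect the output distribution.
samples = [mass(random.gauss(r0, sr), random.gauss(k0, sk),
                random.gauss(a0, sa)) for _ in range(100_000)]
mean = sum(samples) / len(samples)
sigma_mc = math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))

print(f"propagation: m = {m0:.2f} +/- {sigma_prop:.2f}")
print(f"sampling:    m = {mean:.2f} +/- {sigma_mc:.2f}")
```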

  14. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    International Nuclear Information System (INIS)

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied for PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic (Monte Carlo) nuclear calculation code.

  15. VERIFICATION OF TORSIONAL OSCILLATING MECHANICAL SYSTEM DYNAMIC CALCULATION RESULTS

    Directory of Open Access Journals (Sweden)

    Peter KAŠŠAY

    2014-09-01

    At our department we deal with optimization and tuning of torsional oscillating mechanical systems. When solving these problems we often use the results of dynamic calculations. The goal of this article is to compare values obtained by computation with those obtained experimentally. For this purpose, a mechanical system built in our laboratory was used. At first, a classical HARDY-type flexible coupling was applied in the system; then we used a pneumatic flexible shaft coupling developed by us. The main difference of these couplings from conventional flexible couplings is that they can change their dynamic properties during operation, by changing the pressure of the gaseous medium in their flexible elements.
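
    For the simplest case of such a dynamic calculation, a two-inertia torsional system joined by a flexible coupling of torsional stiffness k, the nonzero natural frequency has a closed form. The sketch below (all parameter values invented) also shows how a pressure-dependent coupling stiffness retunes the system.

```python
import math

def torsional_natural_frequency(J1, J2, k):
    """Nonzero natural frequency (Hz) of a two-inertia torsional system.

    omega^2 = k * (J1 + J2) / (J1 * J2), for inertias J [kg*m^2]
    coupled by a torsional stiffness k [N*m/rad].
    """
    omega = math.sqrt(k * (J1 + J2) / (J1 * J2))
    return omega / (2.0 * math.pi)

J1, J2 = 0.15, 0.40    # invented drive/load inertias, kg*m^2

# A pneumatic flexible coupling changes its torsional stiffness with the
# gas pressure in its flexible elements, retuning the system in operation.
for k in (400.0, 800.0, 1600.0):    # invented stiffness values, N*m/rad
    f_n = torsional_natural_frequency(J1, J2, k)
    print(f"k = {k:6.0f} N*m/rad -> f_n = {f_n:6.2f} Hz")
```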

  16. A Verification and Validation Tool for Diagnostic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced diagnostic systems have the potential to improve safety, increase availability, and reduce maintenance costs in aerospace vehicle and a variety of other...

  17. ASSIMILATION SYSTEM AT DHMZ: DEVELOPMENT AND FIRST VERIFICATION RESULTS

    OpenAIRE

    Stanešić, Antonio

    2011-01-01

    In this paper, a description of the setup of a local assimilation system for a limited area model, ALADIN (Aire Limitée Adaptation Dynamique développement InterNational), is given, with a comprehensive description of the assimilation techniques used. The assimilation system at DHMZ (Meteorological and Hydrological Service of Croatia) consisted of two parts: the surface assimilation, which was used to change the state of the model land-surface variables, and the upper air assimilatio...

  18. Advanced orbiting systems test-bedding and protocol verification

    Science.gov (United States)

    Noles, James; De Gree, Melvin

    1989-01-01

    The Consultative Committee for Space Data Systems (CCSDS) has begun the development of a set of protocol recommendations for Advanced Orbiting Systems (AOS). The AOS validation program and formal definition of AOS protocols are reviewed, and the configuration control of the AOS formal specifications is summarized. Independent implementations of the AOS protocols by NASA and ESA are discussed, and cross-support/interoperability tests which will allow the space agencies of various countries to share AOS communication facilities are addressed.

  19. Quantum Mechanics and locality in the K0 K-bar0 system experimental verification possibilities

    International Nuclear Information System (INIS)

    It is shown that elementary Quantum Mechanics, applied to the K0 K-bar0 system, predicts peculiar long range EPR correlations. Possible experimental verifications are discussed, and a concrete experiment with antiproton annihilations at rest is proposed. A pedestrian approach to local models shows that K0 K-bar0 experimentation could provide arguments in the local realism versus quantum theory controversy. (author). 17 refs., 23 figs

  20. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    OpenAIRE

    Joseph, S.; Herold, M; Sunderlin, W.D.; L. V. Verchot

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three ca...

  1. Revisiting the security of speaker verification systems against imposture using synthetic speech

    OpenAIRE

    De Leon, P.L.; Apsingekar, V. R.; Pucher, M.; Yamagishi, J

    2010-01-01

    In this paper, we investigate imposture using synthetic speech. Although this problem was first examined over a decade ago, dramatic improvements in both speaker verification (SV) and speech synthesis have renewed interest in this problem. We use an HMM-based speech synthesizer which creates synthetic speech for a targeted speaker through adaptation of a background model. We use two SV systems: a standard GMM-UBM-based system and a newer SVM-based system. Our results show when the syst...

  2. THE APPLICATION OF COLOURED PETRI NETS TO VERIFICATION OF DISTRIBUTED SYSTEMS SPECIFIED BY MESSAGE SEQUENCE CHARTS

    OpenAIRE

    CHERNENOK S.A.; NEPOMNIASCHY V.A.

    2015-01-01

    The language of message sequence charts (MSC) is a popular scenario-based specification language used to describe the interaction of components in distributed systems. However, the methods for validation of MSC diagrams are underdeveloped. This paper describes a method for translation of MSC diagrams into coloured Petri nets (CPN). The method is applied to the property verification of these diagrams. The considered set of diagram elements is extended by the elements of UML sequence diagrams a...

  3. Crew Exploration Vehicle Potable Water System Verification Description

    Science.gov (United States)

    Tuan, George; Peterson, Laurie J.; Vega, Leticia M.

    2010-01-01

    A stored water system on the crew exploration vehicle (CEV) will supply the crew with potable water for drinking and food rehydration, hygiene, medical needs, sublimation, and various contingency situations. The current baseline biocide for the stored water system is ionic silver, similar in composition to the biocide used to maintain the quality of the water that is transferred from the orbiter to the International Space Station and stored in contingency water containers. In the CEV water system, a depletion of the ionic silver biocide is expected due to ionic silver plating onto the surfaces of materials within the CEV water system, thus negating its effectiveness as a biocide. Because this may be the first time NASA is considering a stored water system without a residual biocide for long-term missions, a team of experts in materials compatibility, biofilms and point-of-use filters, surface treatment and coatings, and biocides has been created to pinpoint concerns and perform the testing that will help alleviate concerns related to the CEV water system.

  4. A new DNB design method using the system moment method combined with Monte Carlo simulation

    International Nuclear Information System (INIS)

    A new statistical method of core thermal design for pressurized water reactors is presented. It not only quantifies the DNBR parameter uncertainty by the system moment method, but also combines the DNBR parameter uncertainty with the correlation uncertainty using a Monte Carlo technique. The randomizing function for the Monte Carlo simulation was expressed as a reciprocal multiplication of the DNBR parameter and correlation uncertainty factors. Comparisons with conventional methods show that the DNBR limit calculated by this method is in good agreement with that of the SCU method, with less computational effort, and the method is considered applicable to current DNB design.
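
    The combination step can be sketched generically: sample the parameter and correlation uncertainties as multiplicative factors, form their reciprocal product as described above, and read a one-sided limit from the resulting distribution. All distributions and numbers below are invented; the actual method obtains the parameter spread from the system moment method and plant-specific data.

```python
import random

# Generic sketch of combining DNBR parameter uncertainty with correlation
# uncertainty by Monte Carlo.  Distributions and values are invented.
random.seed(1)

def sample_dnbr_factor():
    f_param = random.gauss(1.0, 0.04)    # parameter uncertainty factor
    f_corr = random.gauss(1.0, 0.06)     # correlation uncertainty factor
    return 1.0 / (f_param * f_corr)      # reciprocal-multiplication form

samples = sorted(sample_dnbr_factor() for _ in range(200_000))

# 95th-percentile factor -> multiplier on the nominal correlation limit.
f95 = samples[int(0.95 * len(samples))]
print(f"95% DNBR uncertainty factor: {f95:.3f}")
print(f"design limit = nominal limit * {f95:.3f}")
```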

  5. Systematic study of finite-size effects in quantum Monte Carlo calculations of real metallic systems

    Energy Technology Data Exchange (ETDEWEB)

    Azadi, Sam, E-mail: s.azadi@imperial.ac.uk; Foulkes, W. M. C. [Department of Physics, Imperial College London, Exhibition Road, London SW7 2AZ (United Kingdom)

    2015-09-14

    We present a systematic and comprehensive study of finite-size effects in diffusion quantum Monte Carlo calculations of metals. Several previously introduced schemes for correcting finite-size errors are compared for accuracy and efficiency, and practical improvements are introduced. In particular, we test a simple but efficient method of finite-size correction based on an accurate combination of twist averaging and density functional theory. Our diffusion quantum Monte Carlo results for lithium and aluminum, as examples of metallic systems, demonstrate excellent agreement between all of the approaches considered.

  6. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    Science.gov (United States)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is addressed through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from this process are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models. These models are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem proving assistant.

  7. Dynamic Isotope Power System: technology verification phase. Test plan. 79-KIPS-6

    International Nuclear Information System (INIS)

    The objective of this document is to outline the test plan for the KIPS Technology Verification Program. This test plan covers component-simulating (rig) testing, component testing and system testing. Rig testing will prove concept feasibility, measure basic performance and develop the hardware necessary prior to initiation of GDS component part manufacture. Component testing will measure basic performance and verify component integrity prior to GDS assembly. The GDS system testing will: simulate the flight system operation; determine the life-limiting components; measure performance and relate it to potential system lifetime; demonstrate 18+% DC generating efficiency; and perform a 5000 h endurance test with final-configuration hardware.

  8. A Formal Approach for the Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian

    2011-01-01

    This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed ... safety properties are verified for the SystemC model by means of bounded model checking. (2) The object code is verified to be I/O behaviourally equivalent to the SystemC model from which it was compiled.

  9. Acceptance and implementation of a system of planning computerized based on Monte Carlo

    International Nuclear Information System (INIS)

    Acceptance for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy yield of the head of the linear electron accelerator and performs the dose calculation with an X-ray algorithm (XVMC) based on Monte Carlo. (Author)

  10. Stability of few-body systems and quantum Monte-Carlo methods

    International Nuclear Information System (INIS)

    Quantum Monte-Carlo methods are well suited to study the stability of few-body systems. Their capabilities are illustrated by studying the critical stability of the hydrogen molecular ion whose nuclei and electron interact through the Yukawa potential, and the stability of small helium clusters. Refs. 16 (author)
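
    A stability study of this kind can be caricatured with a one-particle variational Monte Carlo calculation in a Yukawa potential (atomic units): as the screening parameter grows, the variational energy rises toward zero and binding is lost. The trial function and parameter values below are illustrative, not those of the record.

```python
import math
import random

def local_energy(r, a, mu):
    """E_L for trial psi = exp(-a*r) in V(r) = -exp(-mu*r)/r (a.u.)."""
    kinetic = -0.5 * (a * a - 2.0 * a / r)
    return kinetic - math.exp(-mu * r) / r

def vmc_energy(a, mu, steps=100_000, delta=0.6):
    """Metropolis sampling of |psi|^2; returns the mean local energy."""
    x, y, z = 1.0, 0.0, 0.0
    acc = 0.0
    for _ in range(steps):
        xn = x + random.uniform(-delta, delta)
        yn = y + random.uniform(-delta, delta)
        zn = z + random.uniform(-delta, delta)
        r_old = math.sqrt(x * x + y * y + z * z)
        r_new = math.sqrt(xn * xn + yn * yn + zn * zn)
        # Acceptance ratio |psi_new/psi_old|^2 = exp(-2*a*(r_new - r_old)).
        if random.random() < math.exp(-2.0 * a * (r_new - r_old)):
            x, y, z, r_old = xn, yn, zn, r_new
        acc += local_energy(r_old, a, mu)
    return acc / steps

# Bound (E < 0) for weak screening; binding disappears as mu grows.
for mu in (0.0, 0.5, 1.0, 1.5):
    print(f"mu = {mu:3.1f}  E ~ {vmc_energy(0.9, mu):7.4f} hartree")
```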

  11. Refinement and Verification of Real-Time Systems

    CERN Document Server

    Kolano, Paul Z; Kemmerer, Richard A; Mandrioli, Dino

    2010-01-01

    This paper discusses highly general mechanisms for specifying the refinement of a real-time system as a collection of lower level parallel components that preserve the timing and functional requirements of the upper level specification. These mechanisms are discussed in the context of ASTRAL, which is a formal specification language for real-time systems. Refinement is accomplished by mapping all of the elements of an upper level specification into lower level elements that may be split among several parallel components. In addition, actions that can occur in the upper level are mapped to actions of components operating at the lower level. This allows several types of implementation strategies to be specified in a natural way, while the price for generality (in terms of complexity) is paid only when necessary. The refinement mechanisms are first illustrated using a simple digital circuit; then, through a highly complex phone system; finally, design guidelines gleaned from these specifications are presented.

  12. Verification of uranium 238 quantity calculated using waste assay systems

    International Nuclear Information System (INIS)

    The amount of 238U in uranium-contaminated waste drums generated in the decommissioning of nuclear facilities is evaluated from γ-ray measurements. We measured the waste drums with the CANBERRA γ-ray measurement system (Qualitative and Quantitative (Q2) Low Level Waste Assay Systems). This system assumes a uniform distribution of uranium, but homogeneity cannot be checked for real waste drums. The authors developed a new analysis technique that calculates the amount of uranium by correcting for the influence of an uneven uranium distribution. As a result of evaluation using the new analysis technique, the error that influences the quantitative value of 238U has been assessed. (author)

  13. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm, and slice thickness 1 mm. A treatment plan with two 8 mm collimator shots and three sectors blocking in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates (>90%) for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.

  14. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    International Nuclear Information System (INIS)

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm, and slice thickness 1 mm. A treatment plan with two 8 mm collimator shots and three sectors blocking in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates (>90%) for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.
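
    The pass rates quoted in the two records above come from the gamma-index test, which combines a dose-difference and a distance-to-agreement criterion. A minimal 1D sketch with the 1%/1 mm criterion follows; the dose profiles are invented stand-ins for film and TPS data.

```python
import math

def gamma_index(measured, calculated, spacing_mm,
                dose_tol=0.01, dist_tol_mm=1.0):
    """1D gamma index of a measured profile against a calculated one.

    dose_tol is a fraction of the (global) maximum calculated dose;
    dist_tol_mm is the distance-to-agreement criterion.
    """
    d_norm = dose_tol * max(calculated)
    gammas = []
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dc in enumerate(calculated):
            dist = (i - j) * spacing_mm
            g2 = (dist / dist_tol_mm) ** 2 + ((dm - dc) / d_norm) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

# Invented profiles on a 0.5 mm grid standing in for film vs TPS dose.
calc = [100.0 * math.exp(-((i - 40) / 12.0) ** 2) for i in range(81)]
meas = [d * 1.005 for d in calc]            # 0.5% scaling error

g = gamma_index(meas, calc, spacing_mm=0.5)
pass_rate = 100.0 * sum(gi <= 1.0 for gi in g) / len(g)
print(f"gamma pass rate (1%/1 mm): {pass_rate:.1f}%")
```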

  15. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    In this article we develop a model for applications running on multiprocessor platforms. An application is modelled by task graphs and a multiprocessor system is modelled by a number of processing elements, each capable of executing tasks according to a given scheduling discipline. We present a d...... could verify a smart-phone application consisting of 103 tasks executing on 4 processing elements....

  16. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that suggest traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct

  17. Verification of the model of a photon beam of 6 MV in a Monte Carlo planning comparison with collapsed cone in homogeneous medium

    International Nuclear Information System (INIS)

    We evaluated the Monaco v2.0.3 Monte Carlo planning system, following the SEFM protocol [1], for the modeling of the 6 MV photon beam of an Elekta Synergy linear accelerator with the Beam Modulator MLC. We compare the Monte Carlo calculations with profiles measured in water at SSD = 100 cm, and with absorbed doses and dose levels for rectangular and asymmetric fields and different SSDs. We also compare the results with those obtained with the Collapsed Cone algorithm of the Pinnacle v8.0m planning system. (Author)

  18. Dosimetric Verification Using Monte Carlo Calculations for Tissue Heterogeneity-Corrected Conformal Treatment Plans Following RTOG 0813 Dosimetric Criteria for Lung Cancer Stereotactic Body Radiotherapy

    International Nuclear Information System (INIS)

    Purpose: The recently activated Radiation Therapy Oncology Group (RTOG) studies of stereotactic body radiation therapy (SBRT) for non-small-cell lung cancer (NSCLC) require tissue density heterogeneity correction, where the high and intermediate dose compliance criteria were established based on superposition algorithm dose calculations. The study was aimed at comparing superposition algorithm dose calculations with Monte Carlo (MC) dose calculations for SBRT for NSCLC and at evaluating whether compliance criteria need to be adjusted for MC dose calculations. Methods and Materials: Fifteen RTOG 0236 study sets were used. The planning target volumes (PTVs) ranged from 10.7 to 117.1 cm3. SBRT conformal treatment plans were generated using XiO (CMS Inc.) treatment planning software with the superposition algorithm to meet the dosimetric high and intermediate compliance criteria recommended by the RTOG 0813 protocol. Plans were recalculated using the MC algorithm of a Monaco (CMS, Inc.) treatment planning system. Tissue density heterogeneity correction was applied in both calculations. Results: Overall, the dosimetric quantities of the MC calculations have larger magnitudes than those of the superposition calculations. On average, R100% (ratio of prescription isodose volume to PTV), R50% (ratio of 50% prescription isodose volume to PTV), D2cm (maximal dose 2 cm from the PTV in any direction as a percentage of prescription dose), and V20 (percentage of lung receiving a dose equal to or larger than 20 Gy) increased by 9%, 12%, 7%, and 18%, respectively. In the superposition plans, 3 cases did not meet criteria for R50% or D2cm. In the MC-recalculated plans, 8 cases did not meet criteria for R100%, R50%, or D2cm. After reoptimization with MC calculations, 5 cases did not meet the criteria for R50% or D2cm. Conclusions: Results indicate that the dosimetric criteria, e.g., the criteria for R50% recommended by RTOG 0813 protocol, may need to be adjusted when the MC dose calculation
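
    The compliance metrics compared in this record are simple functions of the 3D dose grid and structure masks. The sketch below computes R100%, R50%, and lung V20 on an invented spherical-dose toy case (D2cm, which needs a distance expansion of the PTV, is omitted).

```python
import numpy as np

def sbrt_metrics(dose, ptv_mask, lung_mask, rx_dose, voxel_cc):
    """R100%, R50% and lung V20 from a dose grid and structure masks."""
    ptv_cc = ptv_mask.sum() * voxel_cc
    v100_cc = (dose >= rx_dose).sum() * voxel_cc        # prescription isodose
    v50_cc = (dose >= 0.5 * rx_dose).sum() * voxel_cc   # 50% isodose
    v20 = 100.0 * ((dose >= 20.0) & lung_mask).sum() / lung_mask.sum()
    return v100_cc / ptv_cc, v50_cc / ptv_cc, v20

# Invented spherical-dose toy case on a 3 mm grid (voxel = 0.027 cc).
shape = (60, 60, 60)
z, y, x = np.indices(shape)
r = np.sqrt((x - 30) ** 2 + (y - 30) ** 2 + (z - 30) ** 2) * 3.0   # mm
dose = 60.0 * np.exp(-(r / 35.0) ** 2)      # Gy, peaked at the center
ptv = r <= 15.0                             # spherical PTV stand-in
lung = x >= 30                              # crude stand-in structure

r100, r50, v20 = sbrt_metrics(dose, ptv, lung, rx_dose=48.0, voxel_cc=0.027)
print(f"R100% = {r100:.2f}, R50% = {r50:.2f}, lung V20 = {v20:.1f}%")
```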

  19. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    Energy Technology Data Exchange (ETDEWEB)

    Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); University of California, San Diego, La Jolla, CA (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States); Tian, Z; Gu, X; Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform an MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a “delivered dose” calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  20. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    International Nuclear Information System (INIS)

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform an MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a “delivered dose” calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  1. Simulated coal gas MCFC power plant system verification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-30

    The objective of the main project is to identify the current developmental status of MCFC systems and address those technical issues that need to be resolved to move the technology from its current status to the demonstration stage in the shortest possible time. The specific objectives are separated into five major tasks as follows: Stack research; Power plant development; Test facilities development; Manufacturing facilities development; and Commercialization. This Final Report discusses the M-C Power Corporation effort, which is part of a general program for the development of commercial MCFC systems. This final report covers the entire subject of the Unocal 250-cell stack. Certain project activities have been funded by organizations other than DOE and are included in this report to provide a comprehensive overview of the work accomplished.

  2. Nondestructive verification and assay systems for spent fuels

    International Nuclear Information System (INIS)

    This is an interim report of a study concerning the potential application of nondestructive measurements on irradiated light-water-reactor (LWR) fuels at spent-fuel storage facilities. It describes nondestructive measurement techniques and instruments that can provide useful data for more effective in-plant nuclear materials management, better safeguards and criticality safety, and more efficient storage of spent LWR fuel. In particular, several nondestructive measurement devices are already available so that utilities can implement new fuel-management and storage technologies for better use of existing spent-fuel storage capacity. The design of an engineered prototype in-plant spent-fuel measurement system is approx. 80% complete. This system would support improved spent-fuel storage and also efficient fissile recovery if spent-fuel reprocessing becomes a reality

  3. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    based on the Stålmarck algorithm. While some requirements are easily proved, others are virtually impossible to manage due to a very large potential state space. We present what has been done in order to get, at least, an idea of whether or not such difficult requirements are fulfilled, and we...... express thoughts on what is needed in order to be able to successfully verify large real-life systems....

  4. Dosimetric verification of a commercial inverse treatment planning system

    Science.gov (United States)

    Xing, Lei; Curran, Bruce; Hill, Robert; Holmes, Tim; Ma, Lijun; Forster, Kenneth M.; Boyer, Arthur L.

    1999-02-01

    A commercial three-dimensional (3D) inverse treatment planning system, Corvus (Nomos Corporation, Sewickley, PA), was recently made available. This paper reports our preliminary results and experience with commissioning this system for clinical implementation. This system uses a simulated annealing inverse planning algorithm to calculate intensity-modulated fields. The intensity-modulated fields are divided into beam profiles that can be delivered by means of a sequence of leaf settings by a multileaf collimator (MLC). The treatments are delivered using a computer-controlled MLC. To test the dose calculation algorithm used by the Corvus software, the dose distributions for single rectangularly shaped fields were compared with water phantom scan data. The dose distributions predicted to be delivered by multiple fields were measured using an ion chamber that could be positioned in a rotatable cylindrical water phantom. Integrated charge collected by the ion chamber was used to check the absolute dose of single- and multifield intensity-modulated treatments at various spatial points. The measured and predicted doses were found to agree to within 4% at all measurement points. Another set of measurements used a cubic polystyrene phantom with radiographic film to record the radiation dose distribution. The films were calibrated and scanned to yield two-dimensional isodose distributions. Finally, a beam imaging system (BIS) was used to measure the intensity-modulated x-ray beam patterns in the beam's-eye view. The BIS-measured images were then compared with a theoretical calculation based on the MLC leaf sequence files to verify that the treatment would be executed accurately and without machine faults. Excellent correlation was found for all cases. Treatment plans generated using intensity-modulated beams appear to be suitable for treatment of

  5. The SAMS: Smartphone Addiction Management System and verification.

    Science.gov (United States)

    Lee, Heyoung; Ahn, Heejune; Choi, Samwook; Choi, Wanbok

    2014-01-01

    While the popularity of smartphones has given enormous convenience to our lives, their pathological use has created a new mental health concern in the community. Hence, intensive research is being conducted on the etiology and treatment of the condition. However, the traditional clinical approach based on surveys and interviews has serious limitations: health professionals cannot perform continual assessment and intervention for the affected group, and the subjectivity of assessment is questionable. To cope with these limitations, a comprehensive ICT (Information and Communications Technology) system called SAMS (Smartphone Addiction Management System) is developed for objective assessment and intervention. The SAMS system consists of an Android smartphone application and a web application server. The SAMS client monitors the user's application usage together with GPS location and Internet access location, and transmits the data to the SAMS server. The SAMS server stores the usage data and performs key statistical data analysis and usage intervention according to the clinicians' decision. To verify the reliability and efficacy of the developed system, a comparison study with survey-based screening using the K-SAS (Korean Smartphone Addiction Scale), as well as self-field trials, is performed. The comparison study is done using usage data from 14 users, 19- to 50-year-old adults, who left at least 1 week of usage logs and completed the survey questionnaires. The field trial fully verified the accuracy of the time, location, and Internet access information in the usage measurement and the reliability of the system operation over more than 2 weeks. The comparison study showed that daily use count has a strong correlation with K-SAS scores, whereas daily use times do not strongly correlate for potentially addicted users. The correlation coefficients of count and times with total K-SAS score are CC = 0.62 and CC = 0.07, respectively, and the t-test analysis for the

  6. VMAT planning and verification of delivery and dosimetry using the 3-D delta4 dosimetry system

    International Nuclear Information System (INIS)

    VMAT can provide advantages over IMRT and 3DCRT by reducing treatment delivery time and the total number of monitor units, as well as improving dose conformity. VMAT plans produced using Oncentra MasterPlan (Nucletron) and delivered with an Elekta Synergy linac were evaluated. Verification dosimetry measurements were carried out as part of a pre-clinical commissioning programme, using a Semiflex ionisation chamber in a CIRS humanoid pelvis phantom, and using a ScandiDos Delta4 system. The Delta4 incorporates two orthogonal 2D arrays of semiconductor diodes, enabling assessment of delivered dose distributions in 3D. These techniques have been previously verified during an on-going development of IMRT pre-treatment verification dosimetry that has evolved from film to the Delta4 arrays

  7. Integrated testing and verification system for research flight software design document

    Science.gov (United States)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  8. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314, Tank Farm Restoration and Safe Operations

    International Nuclear Information System (INIS)

    This Requirements Verification Report (RVR) for the Project W-314 ''AN Farm to 200E Waste Transfer System'' package provides documented verification of design compliance with all the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate

  9. PET-COMPTON System. Comparative evaluation with PET System using Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Positron Emission Tomography (PET) in small animals has currently achieved a spatial resolution of about 1 mm, and different approaches to improve this spatial resolution further are under study. One of them combines PET technology with Compton cameras. This paper presents the idea of the so-called PET-Compton systems and includes a comparative evaluation of spatial resolution and global efficiency in both PET and PET-Compton systems by means of Monte Carlo simulations using the Geant4 code. The simulation was done on a PET-Compton system made up of the LYSO-LuYAP scintillating detectors of a particular small-animal PET scanner named Clear-PET, with Compton detectors based on CdZnTe semiconductors. A group of radionuclides that emit a positron (e+) and a quantum almost simultaneously, and that fulfill some selection criteria for their possible use in PET-Compton systems for medical and biological applications, were studied under simulation conditions. By means of analytical reconstruction using the SSRB (Single Slice Rebinning) method, superior spatial resolution was obtained in the PET-Compton system for all tested radionuclides (reaching sub-millimeter values for the 22Na source). However, this analysis by simulation has shown limited global efficiency values in the PET-Compton system (on the order of 10-5-10-6 %) instead of the values around 5*10-1 % that have been achieved in the PET system. (author)

  10. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results

    Science.gov (United States)

    Burken, John J.; Larson, Richard R.

    2009-01-01

    F-15 IFCS project goals are: a) Demonstrate Control Approaches that can Efficiently Optimize Aircraft Performance in both Normal and Failure Conditions ([A] & [B] failures). b) Advance Neural Network-Based Flight Control Technology for New Aerospace Systems Designs with a Pilot in the Loop. Gen II objectives include: a) Implement and Fly a Direct Adaptive Neural Network Based Flight Controller; b) Demonstrate the Ability of the System to Adapt to Simulated System Failures: 1) Suppress Transients Associated with Failure; 2) Re-Establish Sufficient Control and Handling of Vehicle for Safe Recovery; c) Provide Flight Experience for Development of Verification and Validation Processes for Flight Critical Neural Network Software.

  11. Preliminary Verification Calculation of DeCART/CAPP System by HTTR Core Analysis

    International Nuclear Information System (INIS)

    In this study, the DeCART/CAPP system verification calculations have been performed against Japan's HTTR (High Temperature Engineering Test Reactor) configurations. The calculations are carried out for single-cell and single-block models. The reference calculations are performed by the McCARD code. The two-step core analysis system HELIOS/CAPP or DeCART/CAPP has been developed for VHTR core analysis by KAERI. In the system, the HELIOS or DeCART code is first used for homogenized cross-section generation, and then CAPP is used to calculate the core physics parameters

  12. Preliminary Verification Calculation of DeCART/CAPP System by HTTR Core Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Chang Joon; Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In this study, the DeCART/CAPP system verification calculations have been performed against Japan's HTTR (High Temperature Engineering Test Reactor) configurations. The calculations are carried out for single-cell and single-block models. The reference calculations are performed by the McCARD code. The two-step core analysis system HELIOS/CAPP or DeCART/CAPP has been developed for VHTR core analysis by KAERI. In the system, the HELIOS or DeCART code is first used for homogenized cross-section generation, and then CAPP is used to calculate the core physics parameters.

  13. Integration of model verification, validation, and calibration for uncertainty quantification in engineering systems

    International Nuclear Information System (INIS)

    This paper proposes a Bayesian methodology to integrate model verification, validation, and calibration activities for the purpose of overall uncertainty quantification in different types of engineering systems. The methodology is first developed for single-level models, and then extended to systems that are studied using multi-level models that interact with each other. Two types of interactions amongst multi-level models are considered: (1) Type-I, where the output of a lower-level model (component and/or subsystem) becomes an input to a higher level system model, and (2) Type-II, where parameters of the system model are inferred using lower-level models and tests (that describe simplified components and/or isolated physics). The various models, their inputs, parameters, and outputs, experimental data, and various sources of model error are connected through a Bayesian network. The results of calibration, verification, and validation with respect to each individual model are integrated using the principles of conditional probability and total probability, and propagated through the Bayesian network in order to quantify the overall system-level prediction uncertainty. The proposed methodology is illustrated with numerical examples that deal with heat conduction and structural dynamics. - Author-Highlights: • A Bayesian approach is used to integrate verification, validation, and calibration. • Single-level models and systems with multiple component-level models are presented. • System-level configurations with two types of model-interactions are considered. • A Bayesian network connects multiple models, inputs, parameters, and outputs. • Total probability theorem is used to quantify system-level prediction uncertainty
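
    To make the integration idea concrete, the following toy sketch (entirely our construction: the models, data and noise level are illustrative, not the paper's) calibrates a single parameter from component-level data by Bayes' rule on a grid and then propagates the posterior through a system model with a Type-I interaction, applying the total probability theorem in discrete form:

```python
import math

# Toy sketch of the integration idea (our construction): a component-level
# test constrains a parameter k, whose posterior is then propagated through
# a system model, i.e. a Type-I interaction where k flows upward.

def component_model(k):            # lower-level (component) model
    return 2.0 * k

def system_model(k):               # higher-level (system) model
    return 5.0 + 3.0 * k

data, sigma = [4.1, 3.9, 4.2], 0.2            # component test data, noise sd
ks = [1.5 + 0.001 * i for i in range(1000)]   # discretized parameter range

# Posterior weights on the grid by Bayes' rule (flat prior)
like = [math.exp(sum(-0.5 * ((d - component_model(k)) / sigma) ** 2
                     for d in data)) for k in ks]
norm = sum(like)
post = [w / norm for w in like]

# System-level prediction uncertainty by the total probability theorem:
# E[Y] = sum_k E[Y | k] p(k | data), and likewise for the variance.
mean_y = sum(system_model(k) * p for k, p in zip(ks, post))
var_y = sum((system_model(k) - mean_y) ** 2 * p for k, p in zip(ks, post))
print(mean_y, math.sqrt(var_y))
```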

  14. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    International Nuclear Information System (INIS)

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of “ad hoc testing.” There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task

  15. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of “ad hoc testing.” There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  16. Guidelines for the verification and validation of expert system software and conventional software. Volume 3: Survey and documentation of expert system verification and validation methods. Final report

    International Nuclear Information System (INIS)

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of “ad hoc testing.” There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task

  17. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. This practice is not intended to be a complete purchase specification for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  18. Verification of ARES transport code system with TAKEDA benchmarks

    International Nuclear Information System (INIS)

    Neutron transport modeling and simulation are central to many areas of nuclear technology, including reactor core analysis, radiation shielding and radiation detection. In this paper the series of TAKEDA benchmarks are modeled to verify the critical calculation capability of ARES, a discrete ordinates neutral particle transport code system. The SALOME platform is coupled with ARES to provide geometry modeling and mesh generation functions. The Koch–Baker–Alcouffe parallel sweep algorithm is applied to accelerate the traditional transport calculation process. The results show that the eigenvalues calculated by ARES are in excellent agreement with the reference values presented in NEACRP-L-330, with a difference of less than 30 pcm except for the first case of model 3. Additionally, ARES provides accurate flux distributions compared to reference values, with a deviation of less than 2% for region-averaged fluxes in all cases. All of this confirms the feasibility of the ARES–SALOME coupling and demonstrates that ARES performs well in criticality calculations

  19. Robust control design verification using the modular modeling system

    International Nuclear Information System (INIS)

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers which are targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant; i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple input multiple output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem

  20. Robust control design verification using the modular modeling system

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers which are targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant; i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple input multiple output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem.

  1. Verification of ARES transport code system with TAKEDA benchmarks

    Science.gov (United States)

    Zhang, Liang; Zhang, Bin; Zhang, Penghe; Chen, Mengteng; Zhao, Jingchang; Zhang, Shun; Chen, Yixue

    2015-10-01

    Neutron transport modeling and simulation are central to many areas of nuclear technology, including reactor core analysis, radiation shielding and radiation detection. In this paper the series of TAKEDA benchmarks are modeled to verify the critical calculation capability of ARES, a discrete ordinates neutral particle transport code system. The SALOME platform is coupled with ARES to provide geometry modeling and mesh generation functions. The Koch-Baker-Alcouffe parallel sweep algorithm is applied to accelerate the traditional transport calculation process. The results show that the eigenvalues calculated by ARES are in excellent agreement with the reference values presented in NEACRP-L-330, with a difference of less than 30 pcm except for the first case of model 3. Additionally, ARES provides accurate flux distributions compared to reference values, with a deviation of less than 2% for region-averaged fluxes in all cases. All of this confirms the feasibility of the ARES-SALOME coupling and demonstrates that ARES performs well in criticality calculations.

  2. Application of the peregrine Monte Carlo dose calculation system to stereotactic radiosurgery

    International Nuclear Information System (INIS)

    Purpose/Objective: This work describes the capability to perform Monte Carlo dose calculations for stereotactic radiosurgery within the framework of the PEREGRINE dose calculation system. A future study will use this capability to assess the clinical benefits to this technique of higher accuracy in dose calculation. Materials and Methods: PEREGRINE is a first-principles 3D Monte Carlo dose calculation system for clinical radiation therapy treatment planning (RTP) systems. By taking advantage of recent advances in low-cost computer commodity hardware, modern symmetric multiprocessor architectures and state-of-the-art Monte Carlo transport algorithms, PEREGRINE performs high-resolution (1 mm), high-accuracy Monte Carlo RTP calculations in times that are reasonable for clinical use (< 30 minutes). The PEREGRINE source model provides a compact, accurate representation of the radiation source and the effects of beam modifiers. Our experience in implementing blocks, wedges, and static MLC ports in PEREGRINE as beam modifiers provides physics models that accurately reproduce the transmitted and scattered fluence at the patient surface. Adapting PEREGRINE to calculate stereotactic radiosurgery dose distributions requires extending the PEREGRINE source model to include stereotactic apertures and treatment arcs. The physics models used for other modifiers will accurately determine stereotactic aperture effects. We only need to provide a new geometry module to describe the physical properties of the apertures. Treatment arcs are easily implemented as a probability distribution in beam direction as a function of delivered dose. Results: A comparison of results from PEREGRINE calculations and experimental measurements made at the University of Wisconsin/Madison is presented. The distribution of direct, transmitted and scattered radiation and the resulting contributions to dose from stereotactic apertures are shown. The accuracy and calculational efficiency of the physics

  3. Acceptance and commissioning of a Monte Carlo-based computerized treatment planning system; Aceptacion y puesta en marcha de un sistema de planificacion computarizada basado en Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-07-01

    The acceptance for clinical use of the Monaco computerized treatment planning system has been carried out. The system is based on a virtual model of the energy yield of the head of the linear electron accelerator, and it performs the dose calculation with an X-ray algorithm (XVMC) based on the Monte Carlo method. (Author)

  4. Fuzzy Controllers for a Gantry Crane System with Experimental Verifications

    Directory of Open Access Journals (Sweden)

    Naif B. Almutairi

    2016-01-01

    Full Text Available The control problem of gantry cranes has attracted the attention of many researchers because of the various applications of these cranes in the industry. In this paper we propose two fuzzy controllers to control the position of the cart of a gantry crane while suppressing the swing angle of the payload. Firstly, we propose a dual PD fuzzy controller where the parameters of each PD controller change as the cart moves toward its desired position, while maintaining a small swing angle of the payload. This controller uses two fuzzy subsystems. Then, we propose a fuzzy controller which is based on heuristics. The rules of this controller are obtained taking into account the knowledge of an experienced crane operator. This controller is unique in that it uses only one fuzzy system to achieve the control objective. The validity of the designed controllers is tested through extensive MATLAB simulations as well as experimental results on a laboratory gantry crane apparatus. The simulation results as well as the experimental results indicate that the proposed fuzzy controllers work well. Moreover, the simulation and the experimental results demonstrate the robustness of the proposed control schemes against output disturbances as well as against uncertainty in some of the parameters of the crane.

  5. A Markov Chain Monte Carlo Based Method for System Identification

    Energy Technology Data Exchange (ETDEWEB)

    Glaser, R E; Lee, C L; Nitao, J J; Hanley, W G

    2002-10-22

    This paper describes a novel methodology for the identification of mechanical systems and structures from vibration response measurements. It combines prior information, observational data and predictive finite element models to produce configurations and system parameter values that are most consistent with the available data and model. Bayesian inference and a Metropolis simulation algorithm form the basis for this approach. The resulting process enables the estimation of distributions of both individual parameters and system-wide states. Attractive features of this approach include its ability to: (1) provide quantitative measures of the uncertainty of a generated estimate; (2) function effectively when exposed to degraded conditions including: noisy data, incomplete data sets and model misspecification; (3) allow alternative estimates to be produced and compared, and (4) incrementally update initial estimates and analysis as more data becomes available. A series of test cases based on a simple fixed-free cantilever beam is presented. These results demonstrate that the algorithm is able to identify the system, based on the stiffness matrix, given applied force and resultant nodal displacements. Moreover, it effectively identifies locations on the beam where damage (represented by a change in elastic modulus) was specified.
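
    As a minimal illustration of the Metropolis step named in the abstract (a one-parameter toy, not the paper's stiffness-matrix identification; the load, noise level and proposal width are invented for the example), consider inferring a cantilever stiffness from noisy tip-displacement measurements:

```python
import math, random

# One-parameter toy version of the approach: infer a scalar stiffness k
# from noisy tip displacements d = F/k + noise under a known load F.
# All numbers here are illustrative, not from the paper.

F, sigma, true_k = 10.0, 1e-4, 2.0e4
random.seed(1)
meas = [F / true_k + random.gauss(0.0, sigma) for _ in range(20)]

def log_post(k):                   # flat prior on k > 0 plus Gaussian likelihood
    if k <= 0.0:
        return -math.inf
    return sum(-0.5 * ((d - F / k) / sigma) ** 2 for d in meas)

k, chain = 1.5e4, []
for _ in range(50000):
    prop = k + random.gauss(0.0, 200.0)        # random-walk proposal
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(k))):
        k = prop                               # Metropolis acceptance
    chain.append(k)

post = chain[10000:]                           # discard burn-in
print(sum(post) / len(post))                   # posterior mean stiffness
```

    The retained chain approximates the posterior distribution of the parameter, which is what yields the quantitative uncertainty measures the abstract emphasizes.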

  6. Dynamic phase transitions in a ferromagnetic thin film system: A Monte Carlo simulation study

    International Nuclear Information System (INIS)

    Dynamic phase transition properties of a ferromagnetic thin-film system under the influence of both bias and time-dependent magnetic fields have been elucidated by means of kinetic Monte Carlo simulation with a local spin-update Metropolis algorithm. The obtained results, after a detailed analysis, suggest that the bias field is the conjugate field to the dynamic order parameter, and it also appears to define a phase line between two antiparallel dynamic ordered states, depending on the considered system parameters. Moreover, the data presented in this study qualitatively reproduce the recently published experimental findings in which the time-dependent magnetic behavior of uniaxial cobalt films is studied in the neighborhood of the dynamic phase transition point. - Highlights: • A ferromagnetic thin film system is examined. • The system is exposed to both bias and time-dependent magnetic fields. • Kinetic Monte Carlo simulation technique is used. • Bias field is the conjugate field to the dynamic order parameter
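
    For orientation, a stripped-down version of such a simulation (a single 2D lattice with periodic boundaries rather than a multilayer film; all parameters are illustrative, not those of the study) with local Metropolis spin updates under a biased oscillating field could look like:

```python
import math, random

# Metropolis spin dynamics on a 2D Ising lattice in a field
# h(t) = h_bias + h0*sin(2*pi*t/P); the dynamic order parameter Q is the
# magnetization averaged over one field period.

L, J, T = 16, 1.0, 1.5                  # lattice size, coupling, temperature
h_bias, h0, P = 0.1, 0.5, 200           # bias, field amplitude, period (sweeps)
random.seed(0)
s = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

def sweep(h):
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        nn = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
              + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        dE = 2.0 * s[i][j] * (J * nn + h)
        if dE <= 0.0 or random.random() < math.exp(-dE / T):
            s[i][j] *= -1               # Metropolis acceptance

m = 0.0
for t in range(P):                      # one field period
    sweep(h_bias + h0 * math.sin(2.0 * math.pi * t / P))
    m += sum(map(sum, s)) / (L * L)
print("dynamic order parameter Q ~", m / P)
```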

  7. HERMES - a Monte Carlo program system for beam-materials interaction studies

    International Nuclear Information System (INIS)

    HERMES (High Energy Radiation Monte Carlo Elaborate System) is a system of Monte Carlo computer codes that are necessary to treat the different physics to be considered in the computer simulation of radiation transport and interaction problems. The HERMES collection of physics programs permits the simulation of secondary particle histories induced by primary particles of any energy, up to the regime of high-energy physics and down to thermal energies, e.g. for neutrons. The particles that are considered by the programs of the HERMES system are p, n, π+, π-, π0, π±, e+, e-, γ, and light ions to A=10. The programs of the HERMES system have been taken as original codes as far as possible. To satisfy the needs of some applications, extensions and changes became necessary. The interfacing technique by HERMES submission files also needs some additional programming. All changes made to the original codes are documented. (orig./DG)

  8. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modelling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification; a small sketch of this pattern-to-formula step is given below. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
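
    The sketch below illustrates the pattern-to-formula idea in miniature (our construction: the pattern set and formula shapes are illustrative, not the paper's exact definitions); each workflow pattern is a template over temporal-logic operators, instantiated bottom-up into one specification:

```python
# Each workflow pattern is a (logical) primitive with an LTL-style template;
# a model is translated by instantiating templates and conjoining the results.

def seq(a, b):            # sequence: once a holds, b eventually follows
    return f"G({a} -> F({b}))"

def xor_split(c, a, b):   # exclusive choice on condition c
    return f"G(({c} -> F({a})) & (!{c} -> F({b})))"

def par_split(a, b):      # parallel split/join: both branches complete
    return f"F({a}) & F({b})"

# Toy order-handling workflow: receive, then approve or reject, then archive.
spec = " & ".join([
    seq("received", xor_split("valid", "approved", "rejected")),
    seq("approved", "archived"),
])
print(spec)
```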

  9. Systems guide to MCNP (Monte Carlo Neutron and Photon Transport Code)

    International Nuclear Information System (INIS)

    The subject of this report is the implementation of the Los Alamos National Laboratory Monte Carlo Neutron and Photon Transport Code - Version 3 (MCNP) on different types of computer systems, especially the IBM MVS system. The report supplements the documentation of the RSIC computer code package CCC-200/MCNP. Details of the procedure to follow in executing MCNP on the IBM computers, either in batch mode or interactive mode, are provided

  10. MONTE CARLO METHOD AND APPLICATION IN @RISK SIMULATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Gabriela Ižaríková

    2015-12-01

    Full Text Available The article is an example of using the simulation software @Risk, designed for simulation in Microsoft Excel spreadsheets, and demonstrates its usage as a universal method of solving problems. Simulation means experimenting with computer models based on a real production process in order to optimize the production processes or the system. The simulation model allows performing a number of experiments, analysing them, evaluating, optimizing and afterwards applying the results to the real system. In general, a simulation model represents the modelled system by using mathematical formulations and logical relations. In the model it is possible to distinguish controlled inputs (for instance, investment costs) and random (stochastic) inputs (for instance, demand), which are transformed by the model into outputs (for instance, the mean value of profit). In a simulation experiment, the controlled inputs are chosen at the beginning and the random inputs are generated randomly. Simulations belong among the quantitative tools which can be used as a support for decision making.
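
    A minimal spreadsheet-free analogue of such an experiment (all figures invented for the example) shows the same structure: controlled inputs and a random input are transformed by the model into an output distribution:

```python
import random, statistics

# Controlled inputs (price, costs) and a random input (demand) are
# transformed by the model into an output distribution (profit).

random.seed(0)
price, unit_cost, fixed_cost = 12.0, 7.0, 2000.0   # controlled inputs

def profit(demand):                                # the simulation model
    return (price - unit_cost) * demand - fixed_cost

# Random input: demand drawn from an assumed distribution, many replications
runs = [profit(max(0.0, random.gauss(600.0, 120.0))) for _ in range(10000)]
print("mean profit:", round(statistics.mean(runs), 1))
print("5th percentile:", round(sorted(runs)[len(runs) // 20], 1))
```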

  11. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences, both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  12. Verification of operational weather forecasts from the POSEIDON system across the Eastern Mediterranean

    Directory of Open Access Journals (Sweden)

    A. Papadopoulos

    2009-07-01

    Full Text Available The POSEIDON weather forecasting system became operational at the Hellenic Centre for Marine Research (HCMR) in October 1999. The system, with its nesting capability, provided 72-h forecasts in two different model domains, i.e. 25- and 10-km grid spacing. The lower-resolution domain covered an extended area that included most of Europe, the Mediterranean Sea and N. Africa, while the higher-resolution domain focused on the Eastern Mediterranean. A major upgrade of the system was recently implemented in the framework of the POSEIDON-II project (2005–2008). The aim was to enhance the forecasting skill of the system through improved model parameterization schemes and advanced numerical techniques for assimilating available observations to produce high-resolution analysis fields. The configuration of the new system is applied on a horizontal resolution of 1/20°×1/20° (~5 km) covering the Mediterranean basin, the Black Sea and part of the North Atlantic, providing up to 5-day forecasts. This paper reviews and compares the current and the previous weather forecasting systems at HCMR, presenting quantitative verification statistics from the pre-operational period (from mid-November 2007 to October 2008). The statistics are based on verification against surface observations from the World Meteorological Organization (WMO) network across the Eastern Mediterranean region. The results indicate that the use of the new system can significantly improve the weather forecasts.
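
    Station-by-station verification of this kind ultimately reduces to summary scores over paired forecast/observation values; a minimal sketch of two common scores follows (the station values are invented, and the study uses the full WMO network and more statistics):

```python
import math

# Bias and RMSE over (forecast, observation) pairs, e.g. 2 m temperature.
pairs = [(21.3, 20.8), (18.9, 19.5), (25.1, 24.2), (16.4, 17.0)]  # (fcst, obs)

bias = sum(f - o for f, o in pairs) / len(pairs)
rmse = math.sqrt(sum((f - o) ** 2 for f, o in pairs) / len(pairs))
print(f"bias = {bias:+.2f}, RMSE = {rmse:.2f}")
```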

  13. Verification of secure distributed systems in higher order logic: A modular approach using generic components

    Energy Technology Data Exchange (ETDEWEB)

    Alves-Foss, J.; Levitt, K.

    1991-01-01

    In this paper we present a generalization of McCullough's restrictiveness model as the basis for proving security properties about distributed system designs. We mechanize this generalization and an event-based model of computer systems in the HOL (Higher Order Logic) system to prove the composability of the model and several other properties about the model. We then develop a set of generalized classes of system components and show for which families of user views they satisfied the model. Using these classes we develop a collection of general system components that are instantiations of one of these classes and show that the instantiations also satisfied the security property. We then conclude with a sample distributed secure system, based on the Rushby and Randell distributed system design and designed using our collection of components, and show how our mechanized verification system can be used to verify such designs. 16 refs., 20 figs.

  14. Preparation of a program for the independent verification of the brachytherapy planning systems calculations

    International Nuclear Information System (INIS)

    In this work, a program is presented that independently checks, for each patient, the treatment planning system calculations in low-dose-rate, high-dose-rate and pulsed-dose-rate brachytherapy. The treatment planning system output text files are automatically loaded into this program in order to get the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are introduced by the user. The program allows implementing the recommendations on independent verification of clinical brachytherapy dosimetry in a simple and accurate way, within a few minutes. (Author).
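
    A highly simplified sketch of such an independent point-dose check follows (our construction, not the authors' program: it sums a bare inverse-square point-source term over the dwell positions, and the lumped constant is illustrative; a clinical tool would apply the full TG-43 formalism with radial dose and anisotropy functions):

```python
# Independent point-dose estimate from dwell positions and times parsed
# from the planning-system export (simplified: inverse-square only).

K = 4.0e-11          # lumped source-strength/dose-rate constant (illustrative)

dwells = [((0.0, 0.0, 0.0), 12.3),   # ((x, y, z) in cm, dwell time in s)
          ((0.0, 0.0, 0.5), 10.1),
          ((0.0, 0.0, 1.0), 8.7)]
point = (2.0, 0.0, 0.5)              # desired calculation point (cm)

dose = 0.0
for (x, y, z), t in dwells:
    r2 = (point[0] - x) ** 2 + (point[1] - y) ** 2 + (point[2] - z) ** 2
    dose += K * t / r2               # inverse-square point-source term
print("independent point-dose estimate:", dose)
```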

  15. Proceedings 11th International Workshop on Automated Specification and Verification of Web Systems

    DEFF Research Database (Denmark)

    2015-01-01

    These proceedings contain the papers presented at the 11th International Workshop on Automated Specification and Verification of Web Systems (WWV 2015), which was held on 23 June 2015 in Oslo, Norway, as a satellite workshop of the 20th International Symposium on Formal Methods (FM 2015). WWV is a...... yearly interdisciplinary forum for researchers originating from the following areas: declarative, rule-based programming, formal methods, software engineering and web-based systems. The workshop fosters the cross-fertilisation and advancement of hybrid methods from such areas....

  16. Research on database realization technology of seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Developing CTBT verification technology has become the most important means of making sure that the CTBT is fulfilled conscientiously. Seismic analysis based on a seismic information system (SIS) is playing an important role in this field. Based on GIS, the SIS is efficient and powerful in spatial analysis, topological analysis and visualization. However, the critical issue in implementing the whole system's functionality is the performance of the SIS DB. Based on the ArcSDE Geodatabase data model, seamless integrated management of spatial data and attribute data has been realized with the RDBMS ORACLE, while most functions of ORACLE have been retained. (authors)

  17. Verification of hyperbolicity for attractors of some mechanical systems with chaotic dynamics

    Science.gov (United States)

    Kuznetsov, Sergey P.; Kruglov, Vyacheslav P.

    2016-03-01

    Computer verification of hyperbolicity is provided based on statistical analysis of the angles of intersection of stable and unstable manifolds for mechanical systems with hyperbolic attractors of Smale-Williams type: (i) a particle sliding on a plane under periodic kicks, (ii) interacting particles moving on two alternately rotating disks, and (iii) a string with parametric excitation of standing-wave patterns by a modulated pump. The examples are of interest as contributing to filling the hyperbolic theory of dynamical systems with physical content.

  18. A capability of the ultrasonic inspection system - part of RPV ISI 1995/2000 verification

    International Nuclear Information System (INIS)

    The evaluation of the Ultrasonic Inspection System was performed at 'Kozloduy' NPP, unit 1, during the 1999/2000 ISI of the RPV, which is also a contribution to the Technical Justification. Calibration blocks 02A and 02 were examined, and the method uncertainties were evaluated and found to be acceptable by the Inspection Qualification Centre for the applied procedure. The calibration of the Ultrasonic Inspection System contributes to the verification of the unit 1 RPV inspection results obtained in 2000 vs 1995 and to the assessment of possible flaw growth over the corresponding operation period

  19. Monte Carlo simulation of pulse pile-up effect in gamma spectrum of a PGNAA system

    Science.gov (United States)

    Mowlavi, Ali Asghar; Hadizadeh Yazdi, Mohammad Hadi

    2011-12-01

    We have applied a pile-up Monte Carlo simulation code to the gamma spectrum of a prompt gamma neutron activation analysis (PGNAA) system. The code has been run in nonparalyzable mode for a specific geometry of a PGNAA system with a 241Am-9Be source and a NaI(Tl) detector to obtain the distortion due to “pile-up” in the pulse-height gamma spectrum. The results show that the main background in the nitrogen region of interest (ROI) is due to two pile-ups. We have also evaluated the variation of count rate and total photon sampling over the Monte Carlo spectra. At high count rates, not only the nitrogen ROI but also the carbon ROI and the hydrogen peak are disturbed strongly. Comparison between the results of the simulations and the experimental spectra has shown good agreement. The code could be used for other source setups and different gamma detection systems.
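
    A toy version of a nonparalyzable pile-up model conveys the distortion mechanism (our construction, not the authors' code: a fixed resolving window opens at the first pulse of a group, and later pulses falling inside it sum onto its amplitude; the rate, window and line energies are illustrative, not those of the PGNAA setup):

```python
import random

random.seed(2)
rate, window = 5.0e4, 2.0e-6        # mean count rate (1/s), resolving time (s)

def pulse():                        # toy single-event energy spectrum (MeV)
    return random.choice((0.511, 1.173, 1.332, 2.223))

t, events = 0.0, []
while t < 1.0:                      # one second of Poisson arrivals
    t += random.expovariate(rate)
    e = pulse()
    if events and t - events[-1][0] < window:
        t0, e0 = events[-1]
        events[-1] = (t0, e0 + e)   # pile-up: amplitudes sum into one event
    else:
        events.append((t, e))       # window closed: new recorded event

piled = sum(1 for _, e in events if e > 2.3)   # only summed pulses exceed 2.3
print(len(events), "recorded events,", piled, "pile-up counts above 2.3 MeV")
```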

  20. Monte Carlo simulation of pulse pile-up effect in gamma spectrum of a PGNAA system

    Energy Technology Data Exchange (ETDEWEB)

    Mowlavi, Ali Asghar, E-mail: amowlavi@sttu.ac.ir [Physics Department, School of Sciences, Sabzevar Tarbiat Moallem University, Sabzevar (Iran, Islamic Republic of); TRIL, ICTP, Trieste (Italy); Hadizadeh Yazdi, Mohammad Hadi [Physics Department, School of Sciences, Ferdowsi University of Mashhad, Mashhad (Iran, Islamic Republic of)

    2011-12-21

    We have applied a pile-up Monte Carlo simulation code to the gamma spectrum of a prompt gamma neutron activation analysis (PGNAA) system. The code has been run in nonparalyzable mode for a specific geometry of a PGNAA system with a 241Am-9Be source and a NaI(Tl) detector to obtain the distortion due to 'pile-up' in the pulse-height gamma spectrum. The results show that the main background in the nitrogen region of interest (ROI) is due to two pile-ups. We have also evaluated the variation of count rate and total photon sampling over the Monte Carlo spectra. At high count rates, not only the nitrogen ROI but also the carbon ROI and the hydrogen peak are disturbed strongly. Comparison between the results of the simulations and the experimental spectra has shown good agreement. The code could be used for other source setups and different gamma detection systems.

  1. Monte Carlo simulation of pulse pile-up effect in gamma spectrum of a PGNAA system

    International Nuclear Information System (INIS)

    We have applied a pile-up Monte Carlo simulation code to the gamma spectrum of a prompt gamma neutron activation analysis (PGNAA) system. The code has been run in nonparalyzable mode for a specific geometry of a PGNAA system with a 241Am-9Be source and a NaI(Tl) detector to obtain the distortion due to “pile-up” in the pulse-height gamma spectrum. The results show that the main background in the nitrogen region of interest (ROI) is due to two pile-ups. We have also evaluated the variation of count rate and total photon sampling over the Monte Carlo spectra. At high count rates, not only the nitrogen ROI but also the carbon ROI and the hydrogen peak are disturbed strongly. Comparison between the results of the simulations and the experimental spectra has shown good agreement. The code could be used for other source setups and different gamma detection systems.

  2. Multi-level Monte Carlo for stochastically modeled chemical kinetic systems

    CERN Document Server

    Anderson, David F

    2011-01-01

    A chemical reaction network involves multiple reactions and species. The simplest stochastic models of such networks treat the system as a continuous time Markov chain with the state being the number of molecules of each species and with reactions modeled as possible transitions of the chain. While there are methods that generate exact sample paths of the Markov chain, their computational cost scales linearly with the number of reaction events. Therefore, such methods become computationally intense for even moderately sized systems. This drawback is greatly exacerbated when such simulations are performed in conjunction with Monte Carlo techniques, as is the norm, which require the generation of many paths. We show how to extend a recently proposed multi-level Monte Carlo approach to this stochastic chemical kinetic setting, lowering the computational complexity needed to compute expected values of functions of the state of the system to a specified accuracy. The extension is non-trivial and a novel coupling o...
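
    For orientation, the exact path simulation whose linear per-event cost the paper sets out to beat is Gillespie's stochastic simulation algorithm (SSA). A minimal version for a toy birth-death network, with a plain single-level Monte Carlo estimator, is sketched below; the paper's contribution, the coupling of coarse and fine approximate paths, is not reproduced here, and all rates are illustrative:

```python
import random

def ssa_final_state(x0, birth, death, T):
    """Exact SSA path for X' with constant birth rate and per-capita death."""
    x, t = x0, 0.0
    while True:
        a1, a2 = birth, death * x          # reaction propensities
        a0 = a1 + a2
        if a0 == 0.0:
            return x
        t += random.expovariate(a0)        # exponential time to next event
        if t > T:
            return x
        if random.random() < a1 / a0:
            x += 1                          # birth reaction fires
        else:
            x -= 1                          # death reaction fires

random.seed(3)
n = 5000                                    # number of Monte Carlo paths
est = sum(ssa_final_state(10, 2.0, 0.1, 5.0) for _ in range(n)) / n
print("E[X(5)] ~", est)                     # analytic mean: 20 - 10*exp(-0.5)
```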

  3. Kinetic Monte Carlo and Cellular Particle Dynamics Simulations of Multicellular Systems

    CERN Document Server

    Flenner, Elijah; Barz, Bogdan; Neagu, Adrian; Forgacs, Gabor; Kosztin, Ioan

    2011-01-01

    Computer modeling of multicellular systems has been a valuable tool for interpreting and guiding in vitro experiments relevant to embryonic morphogenesis, tumor growth, angiogenesis and, lately, structure formation following the printing of cell aggregates as bioink particles. Computer simulations based on Metropolis Monte Carlo (MMC) algorithms were successful in explaining and predicting the resulting stationary structures (corresponding to the lowest adhesion energy state). Here we introduce two alternatives to the MMC approach for modeling cellular motion and self-assembly: (1) a kinetic Monte Carlo (KMC), and (2) a cellular particle dynamics (CPD) method. Unlike MMC, both KMC and CPD methods are capable of simulating the dynamics of the cellular system in real time. In the KMC approach a transition rate is associated with possible rearrangements of the cellular system, and the corresponding time evolution is expressed in terms of these rates. In the CPD approach cells are modeled as interacting cellular ...

  4. Penelope - A code system for Monte Carlo simulation of electron and photon transport

    International Nuclear Information System (INIS)

    The computer code system PENELOPE (version 2001) performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials for a wide energy range, from a few hundred eV to about 1 GeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A geometry package called PENGEOM permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the PENELOPE code system, but also to provide the user with the necessary information to understand the details of the Monte-Carlo algorithm. (authors)

  5. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  6. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  7. Audio-Visual Based Multi-Sample Fusion to Enhance Correlation Filters Speaker Verification System

    Directory of Open Access Journals (Sweden)

    Dzati Athiar Ramli

    2010-07-01

    Full Text Available In this study, we propose a novel approach for a speaker verification system that uses a spectrogram image as features and Unconstrained Minimum Average Correlation Energy (UMACE) filters as classifiers. Since speech is a behavioral signal, speech data tend not to be reproduced consistently, owing to changes in speaking rate, health, emotional condition, temperature and humidity. In order to overcome this problem, a modification of the UMACE filter architecture is proposed that executes a multi-sample fusion using speech and lip-reading data. To determine the best fusion scheme, five multi-sample fusion strategies, i.e. maximum, minimum, median, average and majority vote, are first evaluated using the speech signal data. Afterward, the performance of the audio-visual system using the enhanced UMACE filters is tested. Here, lip-reading data are added to the audio sample pool, and the best fusion scheme found in the prior experiment is used as the multi-sample fusion scheme. The Digit Database was used for performance evaluation, and a performance of up to 99.64% was achieved by using the enhanced UMACE filters for the speech-only system, a 6.89% improvement compared with the baseline approach. Subsequently, the audio-visual implementation is observed to significantly widen the PSR score interval between authentic and impostor data, as well as to further improve the performance of the audio-only system, leading toward a robust verification system.
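
    The five fusion strategies are simple reductions over per-sample matcher scores; a sketch follows (the PSR values and the vote threshold are invented for the example):

```python
import statistics

def fuse(scores, rule, threshold=10.0):
    """Fuse per-sample scores into one verification score or decision."""
    if rule == "max":     return max(scores)
    if rule == "min":     return min(scores)
    if rule == "median":  return statistics.median(scores)
    if rule == "average": return statistics.mean(scores)
    if rule == "vote":    # majority vote over per-sample accept decisions
        return sum(s > threshold for s in scores) > len(scores) / 2
    raise ValueError(rule)

psr = [12.4, 9.8, 15.1, 11.0]      # per-sample PSR scores from the filter
for rule in ("max", "min", "median", "average", "vote"):
    print(rule, fuse(psr, rule))
```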

  8. Validation of variance reduction techniques in Mediso (SPIRIT DH-V) SPECT system by Monte Carlo

    International Nuclear Information System (INIS)

    Monte Carlo simulation of nuclear medical imaging systems is a widely used method for reproducing their operation in a real clinical environment. There are several Single Photon Emission Tomography (SPECT) systems in Cuba. For this reason it is clearly necessary to introduce a reliable and fast simulation platform in order to obtain consistent image data; these data will reproduce the original measurement conditions. In order to fulfill these requirements, the Monte Carlo platform GAMOS (Geant4 Medicine Oriented Architecture for Applications) has been used. Due to the sheer size and complex configuration of parallel-hole collimators in real clinical SPECT systems, Monte Carlo simulation usually consumes excessively high time and computing resources. The main goal of the present work is to optimize the efficiency of the calculation by means of new GAMOS functionality. Two GAMOS variance reduction techniques were developed and validated to speed up the calculations. These procedures focus and limit the transport of gamma quanta inside the collimator. The obtained results were assessed experimentally in the Mediso (SPIRIT DH-V) SPECT system. The main quality control parameters, such as sensitivity and spatial resolution, were determined. Differences of 4.6% in sensitivity and 8.7% in spatial resolution were reported against manufacturer values. Simulation time was decreased by a factor of up to 650. Using these techniques it was possible to perform several studies in almost 8 hours each. (Author)

  9. Verification and testing of the RTOS for safety-critical embedded systems

    International Nuclear Information System (INIS)

    Developments in Instrumentation and Control (I and C) technology provide more convenience and better performance and are thus adopted in many fields. To adopt newly developed technology, the nuclear industry requires a rigorous V and V procedure and tests to assure reliable operation. Adoption of a digital system requires verification and testing of the OS for licensing. A commercial real-time operating system (RTOS) is targeted at various, unpredictable needs, which makes it difficult to verify. For this reason, a simple, application-oriented real-time OS was developed for nuclear application. In this work, we show how to verify the developed RTOS at each stage of the development lifecycle. A commercial formal tool is used in the specification and verification of the system. Based on the developed model, software in the C language is automatically generated. Tests are performed for two purposes: one is to identify consistency between the verified model and the generated code, the other is to find errors in the generated code. The former assumes that the verified model is correct, and the latter that it is incorrect. Test data are generated separately to satisfy each purpose. After testing the RTOS software, we implement a test board embedded with the developed RTOS and the application software, which simulates the safety-critical plant protection function. Testing to identify whether the reliability criteria are satisfied is also designed in this work. The result is that the developed RTOS software works well when embedded in the system.

  10. Monte Carlo simulation of a digital coincidence system applied to 60Co standardization

    International Nuclear Information System (INIS)

    The Laboratorio de Metrologia Nuclear (LMN) at the Instituto de Pesquisas Energeticas e Nucleares (IPEN) is developing a Digital Coincidence System (DCS), including the design of the proper acquisition electronics and analysis software. A brief discussion about the measurement methodology and the electronics operation is presented. This work is focused on the results of the designed software (the Monte Carlo simulation of 60Co decay data and the Coincidence Data Analysis), which are in good agreement with the experimental data. (author)

  11. A Monte Carlo computer program for analysis of backscattering and sputtering in practical vacuum systems

    International Nuclear Information System (INIS)

    A Monte Carlo computer program originally developed for analysis of molecular gas flow in axi-symmetric vacuum systems has been extended to include modelling of high energy backscattering and sputtering processes. This report describes the input data required by the computer program together with the results produced. A general description is given of the program operation and the backscattering and sputtering modelling used. An example calculation is included to illustrate practical application of the program. (author)

  12. PARTICLE SWARM OPTIMIZATION OF SOLAR CENTRAL RECEIVER SYSTEMS FROM A MONTE CARLO DIRECT MODEL

    OpenAIRE

    Farges, Olivier; Bézian, Jean-Jacques; El Hafi, Mouna; Fudym, Olivier; Bru, Hélène

    2013-01-01

    Considering the investment needed to build a solar concentrating facility, the performance of such an installation has to be maximized. This is why the preliminary design step is one of the most important stages of the project process. This paper presents an optimization approach coupling a Particle Swarm Optimization algorithm with a Monte Carlo algorithm, applied to the design of Central Receiver Solar systems. After the validation of the direct model from experimental data, severa...
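
    A generic particle swarm loop of the kind being coupled to a Monte Carlo direct model might look as follows. This is a sketch only: the Monte Carlo model is replaced by a stub objective, and all parameter values (inertia w, acceleration constants c1/c2, swarm size) are illustrative assumptions, not those of the cited work:

    ```python
    # Minimal particle swarm optimization with a stub standing in for the
    # expensive Monte Carlo direct model. All constants are illustrative.
    import random

    def mc_performance(x):
        """Stand-in for the Monte Carlo direct model (illustrative)."""
        return -((x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2)   # to be maximized

    def pso(obj, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        pos = [[random.uniform(-10, 10) for _ in range(dim)]
               for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]                    # personal bests
        pbest_val = [obj(p) for p in pos]
        g = max(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                val = obj(pos[i])                      # one (costly) MC run
                if val > pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val > gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    print(pso(mc_performance))   # should approach (3, -1)
    ```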

  13. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    Science.gov (United States)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  14. Specification of Selected Performance Monitoring and Commissioning Verification Algorithms for CHP Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brambley, Michael R.; Katipamula, Srinivas

    2006-10-06

    Pacific Northwest National Laboratory (PNNL) is assisting the U.S. Department of Energy (DOE) Distributed Energy (DE) Program by developing advanced control algorithms that would lead to development of tools to enhance performance and reliability, and reduce emissions of distributed energy technologies, including combined heat and power technologies. This report documents phase 2 of the program, providing a detailed functional specification for algorithms for performance monitoring and commissioning verification, scheduled for development in FY 2006. The report identifies the systems for which algorithms will be developed, the specific functions of each algorithm, metrics which the algorithms will output, and inputs required by each algorithm.

  15. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling systems. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: This paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. The paper is based on the results of a project “verification and validation (V and V) of the ITER RH system using digital mock-ups (DMUs)”, whose purpose is to study an efficient approach to using DMUs for the V and V of the ITER RH system design within a systems engineering (SE) framework. The paper reviews the definitions of the DMU and the virtual prototype, and surveys current trends in the use of virtual prototyping in industry during the early design phase. Based on this survey of best industrial practices, the paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  16. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    International Nuclear Information System (INIS)

    Highlights: • Verification and validation process for ITER remote handling systems. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: This paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. The paper is based on the results of a project “verification and validation (V and V) of the ITER RH system using digital mock-ups (DMUs)”, whose purpose is to study an efficient approach to using DMUs for the V and V of the ITER RH system design within a systems engineering (SE) framework. The paper reviews the definitions of the DMU and the virtual prototype, and surveys current trends in the use of virtual prototyping in industry during the early design phase. Based on this survey of best industrial practices, the paper proposes ways to improve the V and V process for the ITER RH system utilizing DMUs.

  17. Hybrid Decompositional Verification for Discovering Failures in Adaptive Flight Control Systems

    Science.gov (United States)

    Thompson, Sarah; Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    Adaptive flight control systems hold tremendous promise for maintaining the safety of a damaged aircraft and its passengers. However, most currently proposed adaptive control methodologies rely on online learning neural networks (OLNNs), which necessarily have the property that the controller changes during the flight. These changes tend to be highly nonlinear and difficult or impossible to analyze using standard techniques. In this paper, we approach the problem with a variant of compositional verification. The overall system is broken into components, and undesirable behavior is propagated backwards through the system. Components whose safe and unsafe input ranges can be solved for explicitly with formal methods techniques are treated as white-box components. The remaining black-box components are analyzed with heuristic techniques that try to predict a range of component inputs that may lead to unsafe behavior. The composition of these component inputs throughout the system leads to overall system test vectors that may elucidate the undesirable behavior.

  18. RIACS Workshop on the Verification and Validation of Autonomous and Adaptive Systems

    Science.gov (United States)

    Pecheur, Charles; Visser, Willem; Simmons, Reid

    2001-01-01

    The long-term future of space exploration at NASA is dependent on the full exploitation of autonomous and adaptive systems: careful monitoring of missions from Earth, as is the norm now, will be infeasible due to the sheer number of proposed missions and the communication lag for deep-space missions. Mission managers are, however, worried about the reliability of these more intelligent systems. The main focus of the workshop was to address these worries, and hence we invited NASA engineers working on autonomous and adaptive systems and researchers interested in the verification and validation (V&V) of software systems. The dual purpose of the meeting was to: (1) make NASA engineers aware of the V&V techniques they could be using; and (2) make the V&V community aware of the complexity of the systems NASA is developing.

  19. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    International Nuclear Information System (INIS)

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single-processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared-responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above will have a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes the results of efforts to identify, describe, contrast, and compare these issues.

  20. A total verification and validation test of the instrument and control system for the Ohi Power Station's PWR No. 3

    International Nuclear Information System (INIS)

    The Corporation has completed verification and validation of an instrumentation and control system for pressurized water reactor No. 3 at Ohi Power Station. The state-of-the-art system, developed and manufactured by Mitsubishi Electric, employs the latest in device, system, user-interface, fault-diagnosis, and simulation technologies. Each subsystem was linked to a sophisticated plant simulator during verification, and the tests included complicated interactions between system components. The results satisfied the specifications. On-site testing was simplified by verifying the performance of the signal-interface units in the factory. (author)

  1. Verification of Monte Carlo transport codes against measured small angle p-, d-, and t-emission in carbon fragmentation at 600 MeV/nucleon

    Energy Technology Data Exchange (ETDEWEB)

    Abramov, B. M. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Alekseev, P. N. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Borodin, Yu. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Bulychjov, S. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Dukhovskoy, I. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Krutenkova, A. P. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Martemianov, M. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Matsyuk, M. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Turdakina, E. N. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Khanov, A. I. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Mashnik, Stepan Georgievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-03

    Momentum spectra of hydrogen isotopes have been measured at 3.5° from 12C fragmentation on a Be target. Momentum spectra cover both the region of fragmentation maximum and the cumulative region. Differential cross sections span five orders of magnitude. The data are compared to predictions of four Monte Carlo codes: QMD, LAQGSM, BC, and INCL++. There are large differences between the data and predictions of some models in the high momentum region. The INCL++ code gives the best and almost perfect description of the data.

  2. Investigation of gamma and neutron energy fluences in iron-water benchmark configurations for the verification of Monte Carlo calculations and their application in reactor material dosimetry

    International Nuclear Information System (INIS)

    Recent findings indicate that gamma radiation can contribute to the embrittlement of reactor materials. Against this background, an experimental benchmark programme at two low-power reactors was started to measure both neutron and gamma spectral fluences behind and inside transmission modules consisting of variable iron and water slabs, using an NE213 scintillation spectrometer and partly an HPGe detector. The experimental results are used to validate Monte Carlo calculation methods for coupled neutron/gamma problems. The experiment and the results of a first series of measurements and comparisons with MCNP calculations for neutron and gamma energy spectra are presented. (author)

  3. Evaluation of IMRT plans of prostate carcinoma from four treatment planning systems based on Monte Carlo

    International Nuclear Information System (INIS)

    Objective: To recalculate, with the Monte Carlo method, the IMRT dose distributions from four TPSs, providing a platform for independent comparison and evaluation of plan quality. These results will help in making a clinical decision as to which TPS will be used for prostate IMRT planning. Methods: Eleven prostate cancer cases were planned with the Corvus, Xio, Pinnacle and Eclipse TPSs. The plans were recalculated by Monte Carlo using the leaf sequences and MUs of the individual plans. Dose-volume histograms and isodose distributions were compared. Other quantities such as Dmin (the minimum dose received by 99% of the CTV/PTV), Dmax (the maximum dose received by 1% of the CTV/PTV), V110%, V105%, V95% (the volume of the CTV/PTV receiving 110%, 105% and 95% of the prescription dose), the volume of the rectum and bladder receiving >65 Gy and >40 Gy, and the volume of the femur receiving >50 Gy were evaluated. Total segments and MUs were also compared. Results: The Monte Carlo results agreed with the dose distributions from the TPSs to within 3%/3 mm. The Xio, Pinnacle and Eclipse plans showed less target dose heterogeneity and lower V65 and V40 for the rectum and bladder compared with the Corvus plans. The PTV Dmin was about 2 Gy lower for the Xio plans than for the others, while the Corvus plans had slightly lower femoral head V50 (0.03% and 0.58%) than the others. The Corvus plans required by far the most segments (187.8) and MUs (1264.7) to deliver, and the Pinnacle plans required the fewest segments (82.4) and MUs (703.6). Conclusions: We have tested an independent Monte Carlo dose calculation system for dose reconstruction and plan evaluation. This system provides a platform for the fair comparison and evaluation of treatment plans to facilitate clinical decision making in selecting a TPS and beam delivery system for particular treatment sites. (authors)
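
    The plan-quality metrics listed above are all read off a dose-volume histogram. A hedged sketch with synthetic voxel doses, using the near-minimum/near-maximum definitions given in the abstract (dose to 99% and to the hottest 1% of the volume); the dose values themselves are illustrative:

    ```python
    # DVH metrics (Dmin/Dmax as D99/D1, and Vx%) from synthetic voxel doses.
    import numpy as np

    rng = np.random.default_rng(0)
    prescription = 76.0                           # Gy, illustrative
    doses = rng.normal(76.0, 2.0, size=100_000)   # synthetic PTV voxel doses

    def D(doses, volume_percent):
        """Minimum dose received by the hottest volume_percent of the volume."""
        return float(np.percentile(doses, 100.0 - volume_percent))

    def V(doses, dose_level):
        """Percent of the volume receiving at least dose_level."""
        return 100.0 * float(np.mean(doses >= dose_level))

    print("Dmin (D99):", D(doses, 99.0))          # dose to 99% of the PTV
    print("Dmax (D1): ", D(doses, 1.0))           # dose to the hottest 1%
    print("V95%: ", V(doses, 0.95 * prescription))
    print("V105%:", V(doses, 1.05 * prescription))
    print("V110%:", V(doses, 1.10 * prescription))
    ```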

  4. Development and preliminary verification of the PWR on-line core monitoring software system. SOPHORA

    International Nuclear Information System (INIS)

    This paper presents an introduction to the development and preliminary verification of a new on-line core monitoring software system (CMSS), named SOPHORA, for the fixed in-core detector (FID) system of a PWR. Developed at China General Nuclear Power Corporation (CGN), SOPHORA integrates CGN's advanced PWR core simulator COCO and the thermal-hydraulic sub-channel code LINDEN to manage real-time core calculation and analysis. Currents measured by the FID are re-evaluated and used as the basis to reconstruct the 3-D core power distribution. Key parameters such as the peak local power margin and the minimum DNBR margin are obtained by comparison with operation limits. Pseudo FID signals generated from movable in-core detector (MID) data are used to verify the SOPHORA system. Comparison between the predicted power peak and the corresponding MID in-core flux map results shows that the SOPHORA results are reasonable and satisfactory. Further verification and validation of SOPHORA is under way and will be reported later. (author)

  5. Performance test results of the advanced verification for inventory sample system (AVIS) (2)

    International Nuclear Information System (INIS)

    The Advanced Verification for Inventory sample System (AVIS) is a nondestructive assay (NDA) system to verify with high accuracy the plutonium mass in small plutonium-uranium mixed oxide (MOX) powder and pellet samples at the Japan Nuclear Fuel Limited MOX fuel fabrication plant (J-MOX), now under construction. The AVIS was designed by Los Alamos National Laboratory (LANL) under the auspices of the Secretariat of the Nuclear Regulation Authority. The AVIS will fulfill an important role in the safeguards approach for J-MOX because it will be used as a verification tool, in place of destructive analysis, for a part of the bias-defect samples. The Japan Atomic Energy Agency (JAEA), which has experience and knowledge in developing NDA systems and in plutonium handling, was entrusted with performance testing of the AVIS by the Nuclear Material Control Center of Japan. As a first phase, JAEA conducted performance tests using 252Cf neutron sources. In these tests, the neutron detection efficiency, die-away time and other detector properties were evaluated, and it was confirmed that the AVIS had the performance designed by LANL. As a second phase, JAEA conducted performance tests using MOX samples, in which the measurement accuracy of the AVIS was evaluated. From the results, it was confirmed that the AVIS could essentially satisfy the IAEA performance requirements on accuracy. (author)

  6. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption, where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in the database, and verification is performed via homomorphically randomized templates; thus the original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot fully decrypt the encrypted templates of the users in the database. In addition, the security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form but needs to prove her identity by using biometrics. The system can be used with any biometric modality where the feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC with quad-core 3.2 GHz CPUs and a 10 Mbit/s up/down link. Consequently, the proposed system can be used efficiently in real-life applications.
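
    Stripped of the encryption layer, the final decision rule is a Hamming-distance threshold test on binary templates. The sketch below shows only that plaintext rule; in the actual THRIVE protocol the distance is evaluated over threshold-homomorphically encrypted templates. The 256-bit size matches the abstract, while the threshold and noise level are illustrative assumptions:

    ```python
    # Plaintext Hamming-distance decision rule (the encrypted protocol itself
    # is not reproduced here). Threshold and bit-flip count are illustrative.
    import secrets

    BITS = 256
    THRESHOLD = 40      # maximum Hamming distance accepted as a match

    def hamming(a, b):
        """Hamming distance between two BITS-bit integers."""
        return bin(a ^ b).count("1")

    enrolled = secrets.randbits(BITS)           # stored binary template
    query = enrolled
    for _ in range(25):                         # a genuine but noisy re-capture:
        query ^= 1 << secrets.randbelow(BITS)   # flip up to 25 random bits

    d = hamming(enrolled, query)
    print("distance", d, "->", "accept" if d <= THRESHOLD else "reject")
    ```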

  7. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment problem of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: a low-penetration simulation and a high-penetration simulation. The load shedding strategy and the simulation process are introduced in detail for each FMEA process. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
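
    A minimal sequential Monte Carlo reliability loop in the spirit of such studies can be sketched as follows. This is a sketch under strong simplifications: component data are illustrative, every component failure is assumed to interrupt supply (a simple series system), and the DG modelling, load shedding strategy and FMEA process of the paper are omitted:

    ```python
    # Sequential Monte Carlo sampling of component failure/repair histories
    # to estimate simple interruption indices. All component data illustrative.
    import random

    YEARS = 1000.0                     # simulated observation period
    components = [                     # (name, failures/yr, mean repair [h])
        ("feeder-1", 0.10, 4.0),
        ("feeder-2", 0.15, 6.0),
        ("transformer", 0.02, 24.0),
    ]

    def simulate(components, years):
        interruptions, outage_hours = 0, 0.0
        for _name, rate, mean_repair in components:
            t = 0.0
            while True:
                t += random.expovariate(rate)      # years to next failure
                if t > years:
                    break
                interruptions += 1                 # series system: all fail
                outage_hours += random.expovariate(1.0 / mean_repair)
        return interruptions / years, outage_hours / years

    freq, duration = simulate(components, YEARS)
    print(f"~{freq:.3f} interruptions/yr, ~{duration:.2f} outage hours/yr")
    ```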

  8. Comparing Subspace Methods for Closed Loop Subspace System Identification by Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    David Di Ruscio

    2009-10-01

    Full Text Available A novel and promising bootstrap subspace system identification algorithm for both open and closed loop systems is presented. An outline of the SSARX algorithm by Jansson (2003) is given, and a modified SSARX algorithm is presented. Some methods from the literature which are consistent for closed loop subspace system identification are discussed and compared with a recently published subspace algorithm which works for both open and closed loop data, i.e., the DSR_e algorithm, as well as with the bootstrap method. Experimental comparisons are performed by Monte Carlo simulations.

  9. Formal Modeling and Verification of Context-Aware Systems using Event-B

    Directory of Open Access Journals (Sweden)

    Hong Anh Le

    2014-12-01

    Full Text Available Context awareness is a computing paradigm that makes applications responsive and adaptive to their environment. Formal modeling and verification of context-aware systems are challenging issues in their development, as such systems are complex and uncertain. In this paper, we propose an approach that uses the formal method Event-B to model and verify such systems. First, we specify a context-aware system's components, such as context data entities, context rules and context relations, with Event-B notions. In the next step, we use the Rodin platform to verify the system's desired properties, such as context constraint preservation. The approach aims to benefit from the natural representation of context-awareness concepts in Event-B and from the proof obligations generated by the refinement mechanism to ensure the correctness of systems. We illustrate the use of our approach on a scenario of an Adaptive Cruise Control system.

  10. A verification and validation methodology for expert systems in nuclear power applications

    International Nuclear Information System (INIS)

    The potential for expert system applications in the nuclear power industry is widely recognized. The benefits of these systems include the retention of specialized human expertise, improved equipment reliability through enhanced diagnostics, and consistency of reasoning during off-normal situations when operators are under great stress. However, before any of these benefits can be realized in critical nuclear power applications a careful and comprehensive Verification and Validation (V and V) program must be applied to ensure the quality of the application. This paper provides a summary of a methodology for the V and V of expert systems developed for nuclear power applications. The similarities and differences of expert system and conventional software techniques are identified and analyzed, and conventional V and V approaches are advocated where applicable

  11. IMRT verification with a camera-based electronic portal imaging system

    International Nuclear Information System (INIS)

    An evaluation of the capabilities of a commercially available camera-based electronic portal imaging system for intensity-modulated radiotherapy verification is presented. Two modifications to the system are demonstrated which use a novel method to tag each image acquired with the delivered dose measured by the linac monitor chamber and reduce optical cross-talk in the imager. A detailed performance assessment is presented, including measurements of the optical decay characteristics of the system. The overall geometric accuracy of the system is determined to be ±2.0 mm, with a dosimetric accuracy of ±1.25 MU. Finally a clinical breast IMRT treatment, delivered by dynamic multileaf collimation, is successfully verified both by tracking the position of each leaf during beam delivery and recording the integrated intensity observed over the entire beam. (author)

  12. Monte Carlo simulations of star clusters - II. Tidally limited, multi-mass systems with stellar evolution

    CERN Document Server

    Giersz, M

    2000-01-01

    A revision of Stodółkiewicz's Monte Carlo code is used to simulate the evolution of large star clusters. A survey of the evolution of N-body systems influenced by the tidal field of a parent galaxy and by stellar evolution is presented. The results presented are in good agreement with theoretical expectations and the results of other methods (Fokker-Planck, Monte Carlo and N-body). The initial rapid mass loss, due to stellar evolution of the most massive stars, causes expansion of the whole cluster and eventually leads to the disruption of less bound systems ($W_0=3$). Models with larger $W_0$ survive this phase of evolution and then undergo core collapse and subsequent post-collapse expansion, like isolated models. The expansion phase is eventually reversed when tidal limitation becomes important. The results presented are the first major step in the direction of simulating the evolution of real globular clusters by means of the Monte Carlo method.

  13. Clinical use of a commercial Monte Carlo treatment planning system for electron beams

    International Nuclear Information System (INIS)

    In 2002 we fully implemented a commercial Monte Carlo based treatment planning system for electron beams in our clinic. The software, developed by MDS Nordion (presently Nucletron), is based on Kawrakow's VMC++ algorithm. The Monte Carlo module is integrated with our Theraplan Plus™ treatment planning system. An extensive commissioning process preceded the clinical implementation of this software. Using a single virtual 'machine' for each electron beam energy, we can now calculate very accurately the dose distributions and the number of MU for any arbitrary field shape and SSD. This new treatment planning capability has significantly impacted our clinical practice. Since we are more confident of the actual dose delivered to a patient, we now calculate accurate three-dimensional (3D) dose distributions for a greater variety of techniques and anatomical sites than we have in the past. We use the Monte Carlo module to calculate dose for head and neck, breast, chest wall and abdominal treatments with electron beams applied either alone or in conjunction with photons. In some cases patient treatment decisions have been changed, compared with how such patients would have been treated in the past. In this paper, we present the planning procedure and some clinical examples.

  14. Cluster-Event Biasing in Monte Carlo Applications to Systems Reliability

    International Nuclear Information System (INIS)

    Estimation of the probabilities of rare events with significant consequences, e.g., disasters, is one of the most difficult problems in Monte Carlo applications to systems engineering and reliability. The Bernoulli-type estimator used in analog Monte Carlo is characterized by extremely high variance when applied to the estimation of rare events. Variance reduction methods are, therefore, of importance in this field. The present work suggests a parametric nonanalog probability measure based on the superposition of transition biasing and forced-event biasing. The cluster-event model is developed, providing an effective and reliable approximation for the second moment and the benefit, along with a methodology for selecting near-optimal biasing parameters. Numerical examples show a considerable benefit when the method is applied to problems of particular difficulty for the analog Monte Carlo method. The suggested model is applicable for reliability assessment of stochastic networks of complicated topology and high redundancy with component-level repair (i.e., repair applied to an individual failed component while the system is operational).
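
    The gain from forcing events can be illustrated on a toy problem: estimating the probability that two independent components both fail before a mission time T, where the first failure is forced into [0, T] and reweighted. Rates, times and history counts below are illustrative assumptions; the cluster-event model itself is far richer than this sketch:

    ```python
    # Analog scoring vs. forced-event biasing for a rare two-failure event.
    import math
    import random

    LAM_A = LAM_B = 1e-3     # component failure rates [1/h] (illustrative)
    T, N = 1.0, 100_000      # mission time [h] and number of histories
    p_true = (1 - math.exp(-LAM_A * T)) * (1 - math.exp(-LAM_B * T))

    def analog():
        """Plain Bernoulli scoring: both components must fail before T."""
        hits = sum(1 for _ in range(N)
                   if random.expovariate(LAM_A) < T
                   and random.expovariate(LAM_B) < T)
        return hits / N

    def forced():
        """Force A's failure into [0, T]; each history carries that weight."""
        w_a = 1 - math.exp(-LAM_A * T)          # P(A fails before T)
        total = sum(w_a for _ in range(N)
                    if random.expovariate(LAM_B) < T)
        return total / N

    # analog() almost always returns 0 at this sample size; forced() does not
    print(f"true {p_true:.2e}  analog {analog():.2e}  forced {forced():.2e}")
    ```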

  15. Study on a re-verification detection system for IAEA safeguard to a consolidated spent fuel storage system

    International Nuclear Information System (INIS)

    A modular storage system with an enhanced storage density has been developed; its storage density is more than twice that of the stand-alone silo type. The storage module accommodates 40 cylinders in 4 rows of 10, with 24 located close to the periphery of the module and 16 located internally, at some distance from the peripheral walls. As for IAEA Safeguards, the module is equipped with a comprehensive set of receptacles into which the IAEA can install seals and verification equipment. The Safeguards equipment, coupled with related measures for physical protection, will facilitate the timely detection of any diversion of significant quantities of nuclear material from the spent fuel stored in the modules. Nevertheless, the generic design makes it very difficult to re-verify the loaded cylinders inside the module. Along with an unattended monitoring system, re-verification is an IAEA Safeguards requirement: the gamma dose rate and spectrum of each irradiated fuel basket must be measured once the storage cylinders are loaded with spent fuel. The gamma profile is read by lowering a detector inside the tube so that it can be registered at the level of each basket. For the 24 peripheral storage cylinders this method of measurement is retained on the side wall of the module. However, an alternate method is required for the 16 internal dry fuel storage cylinders, since they are located some distance from the module walls and are thus surrounded by other storage cylinders. (author)

  16. River Protection Project Integrated safety management system phase II verification report, volumes I and II - 8/19/99

    Energy Technology Data Exchange (ETDEWEB)

    SHOOP, D.S.

    1999-09-10

    The Department of Energy policy (DOE P 450.4) is that safety is integrated into all aspects of the management and operations of its facilities. In simple and straightforward terms, the Department will "Do work safely." The purpose of this River Protection Project (RPP) Integrated Safety Management System (ISMS) Phase II Verification was to determine whether ISMS programs and processes are implemented within RPP to accomplish the goal of "Do work safely." The goal of an implemented ISMS is to have a single integrated system that includes Environment, Safety, and Health (ES&H) requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the RPP life cycle. The ISMS is comprised of (1) the described functions, components, processes, and interfaces (system map or blueprint) and (2) the personnel who are executing their assigned roles and responsibilities to manage and control the ISMS. Therefore, this review evaluated both the "paper" and "people" aspects of the ISMS to ensure that the system is implemented within RPP. The Richland Operations Office (RL) conducted an ISMS Phase I Verification of the TWRS from September 28 to October 9, 1998. The resulting verification report recommended that TWRS-RL and the contractor proceed with Phase II of ISMS verification, given that the concerns identified in the Phase I verification review are incorporated into the Phase II implementation plan.

  17. Proceedings 7th International Workshop on Automated Specification and Verification of Web Systems

    CERN Document Server

    Kovacs, Laura; Tiezzi, Francesco

    2011-01-01

    This volume contains the final and revised versions of the papers presented at the 7th International Workshop on Automated Specification and Verification of Web Systems (WWV 2011). The workshop was held in Reykjavik, Iceland, on June 9, 2011, as part of DisCoTec 2011. The aim of the WWV workshop series is to provide an interdisciplinary forum to facilitate the cross-fertilization and the advancement of hybrid methods that exploit concepts and tools drawn from rule-based programming, software engineering, formal methods and Web-oriented research. Nowadays, indeed, many companies and institutions have turned their Web sites into interactive, completely automated, Web-based applications for, e.g., e-business, e-learning, e-government, and e-health. The increased complexity and the explosive growth of Web systems have made their design and implementation a challenging task. Systematic, formal approaches to their specification and verification make it possible to address the problems of this specific domain by means o...

  18. Determination of Pavement Elevations by the 3D Scanning System and Its Verification

    Directory of Open Access Journals (Sweden)

    Tomáš Křemen

    2014-06-01

    Full Text Available It is necessary to pay attention to the geometric accuracy of roadways when constructing them. Correct thickness of the individual construction layers, together with the roughness of the pavement, is among the important factors ensuring the lifetime of roadways and vehicles and a comfortable and safe car ride. Among other things, a reliable check measurement method must be available so as to ensure that the required accuracy of the individual construction layers is achieved. The check measurement method must be able to measure a checked construction component with the required accuracy and with sufficiently high density, describing not only global deviations but also local ones. The highest accuracy requirements are placed on the final construction layer of the roadway, where layer thickness and pavement roughness are evaluated. The 3D terrestrial scanning method is currently offered for geometric checking of its realization. The article deals with testing a procedure for pavement roughness measurement with a 3D terrestrial scanning system and with its verification by total station measurement. Emphasis is put on verifying the accuracy of the absolute heights of points in the 3D model of the pavement and on the size of random errors in the elevation component. The results of the testing clarify the use of 3D terrestrial scanning systems and their accuracy for checking the roadway surface.

  19. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.

  20. A Multitier System for the Verification, Visualization and Management of CHIMERA

    International Nuclear Information System (INIS)

    CHIMERA is a multi-dimensional radiation hydrodynamics code designed to study core-collapse supernovae. The code is made up of three essentially independent parts: a hydrodynamics module, a nuclear burning module, and a neutrino transport solver combined within an operator-split approach. Given CHIMERA's complexity and pace of ongoing development, a new support system, Bellerophon, has been designed and implemented to perform automated verification, visualization and management tasks while integrating with other workflow systems utilized by CHIMERA's development group. In order to achieve these goals, a multitier approach has been adopted. By integrating supercomputing platforms, visualization clusters, a dedicated web server and a client-side desktop application, this system attempts to provide an encapsulated, end-to-end solution to these needs.

  1. Interim report on the performance test results for the Advanced Verification for Inventory sample System (AVIS)

    International Nuclear Information System (INIS)

    The AVIS (Advanced Verification for Inventory sample System) is a high-accuracy nondestructive assay (NDA) system for Safeguards, intended to measure the plutonium mass of plutonium-uranium mixed oxide (MOX) powder and pellet samples at the large-scale MOX fuel fabrication plant (J-MOX). It is intended that AVIS measurements will be substituted for a fraction of the DA samples to reduce the amount of DA from J-MOX; the AVIS therefore has a crucial role in attaining effective safeguards for J-MOX. JAEA, which has experience and knowledge in developing NDA systems and in plutonium handling, was entrusted with performance testing of the AVIS by NMCC. JAEA has conducted performance testing of the AVIS using standard radiation sources (neutron and gamma). As a result of the tests, it was confirmed that the AVIS has the designed performance. This paper reports the performance test results and the evaluation of the measurement accuracy of the AVIS. (author)

  2. Verification of Monte Carlo Calculations by Means of Neutron and Gamma Fluence Spectra Measurements behind and inside of Iron-Water Configurations

    International Nuclear Information System (INIS)

    Neutron and gamma spectra were measured behind and inside modules consisting of variable iron and water slabs that were installed in radial beams of the zero-power training and research reactors AKR of the Technical University Dresden and ZLFR of the University of Applied Sciences Zittau/Goerlitz. The NE-213 scintillation spectrometer used allowed the measurement of gamma and neutron fluence spectra in the energy regions 0.3-10 MeV for photons and 1.0-20 MeV for neutrons. The paper describes the experiments and presents important results of the measurements, which are compared with the results of Monte Carlo transport calculations made by means of the codes MCNP and TRAMO on an absolute scale of fluences.

  3. Development of decommissioning management system. 9. Remodeling to PC system and system verification by evaluation of real work

    International Nuclear Information System (INIS)

    When the decommissioning of facilities such as nuclear fuel cycle facilities and small-scale research reactors is planned, it is necessary to select the technology and the process of the work procedure, and to optimize the indices (such as radiation dose, cost, amount of waste, number of workers, and term of works) concerning dismantling of the facility. Our waste management section is developing the decommissioning management system 'DECMAN', which supports the preparation of decommissioning plans. DECMAN automatically calculates the indices by using facility data and the dismantling method. This paper describes the porting of the program to the personal computer and the verification of the system by evaluation of real work (dismantling of the liquor dissolver in the old JOYO Waste Treatment Facility (the old JWTF), the glove boxes in the Deuterium Critical Assembly (DCA), and the incinerator in the Waste Dismantling Facility (WDF)). The outline of the remodeling and verification is as follows. (1) Additional functions: 1) equipment arrangement mapping; 2) evaluation of the radiation dose by using the air dose rate; 3) I/O of data using EXCEL (software). (2) Comparison of work amount between calculated and actual values: the calculated value is 222.67 man·hour against the actual value of 249.40 man·hour in the old JWTF evaluation. (3) The amount of accompanying work can be forecast by multiplying the calculated value by a certain coefficient. (4) A new idea for estimating the amount of work was constructed using the calculated values of DECMAN. (author)

  4. Design and evaluation of a Monte Carlo based model of an orthovoltage treatment system

    International Nuclear Information System (INIS)

    The aim of this study was to develop a flexible framework of an orthovoltage treatment system capable of calculating and visualizing dose distributions in different phantoms and CT datasets. The framework provides a complete set of various filters, applicators and X-ray energies and therefore can be adapted to varying studies or be used for educational purposes. A dedicated user friendly graphical interface was developed allowing for easy setup of the simulation parameters and visualization of the results. For the Monte Carlo simulations the EGSnrc Monte Carlo code package was used. Building the geometry was accomplished with the help of the EGSnrc C++ class library. The deposited dose was calculated according to the KERMA approximation using the track-length estimator. The validation against measurements showed a good agreement within 4-5% deviation, down to depths of 20% of the depth dose maximum. Furthermore, to show its capabilities, the validated model was used to calculate the dose distribution on two CT datasets. Typical Monte Carlo calculation time for these simulations was about 10 minutes achieving an average statistical uncertainty of 2% on a standard PC. However, this calculation time depends strongly on the used CT dataset, tube potential, filter material/thickness and applicator size.
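
    The KERMA/track-length estimator mentioned above scores, for every voxel a photon crosses, the product of track length, photon energy and an energy-absorption coefficient. A hedged one-dimensional sketch for a primary pencil beam (no scatter modelled, illustrative coefficients, not the cited framework's code):

    ```python
    # One-dimensional track-length KERMA estimator for a primary pencil beam.
    # Coefficients, energy and geometry are illustrative assumptions.
    import math
    import random

    MU = 0.03          # total attenuation coefficient [1/mm] (illustrative)
    MU_EN = 0.025      # energy-absorption coefficient [1/mm] (illustrative)
    VOXEL, N_VOX = 1.0, 50        # voxel size [mm] and number of voxels
    E0 = 0.1                      # photon energy [MeV] (illustrative)

    def track_length_kerma(n_photons):
        """Score KERMA (up to constant factors) along +z, voxel by voxel."""
        kerma = [0.0] * N_VOX
        for _ in range(n_photons):
            # distance to the first interaction; no scatter is modelled here
            path = min(-math.log(random.random()) / MU, N_VOX * VOXEL)
            full = int(path // VOXEL)          # voxels crossed completely
            for v in range(full):
                kerma[v] += VOXEL * E0 * MU_EN
            if full < N_VOX:                   # partial track in final voxel
                kerma[full] += (path - full * VOXEL) * E0 * MU_EN
        return [k / n_photons for k in kerma]  # mean score per photon

    print([round(k, 5) for k in track_length_kerma(100_000)[:5]])
    ```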

  5. Design and evaluation of a Monte Carlo based model of an orthovoltage treatment system

    Energy Technology Data Exchange (ETDEWEB)

    Penchev, Petar; Maeder, Ulf; Fiebich, Martin [IMPS University of Applied Sciences, Giessen (Germany). Inst. of Medical Physics and Radiation Protection; Zink, Klemens [IMPS University of Applied Sciences, Giessen (Germany). Inst. of Medical Physics and Radiation Protection; University Hospital Marburg (Germany). Dept. of Radiotherapy and Oncology

    2015-07-01

    The aim of this study was to develop a flexible framework of an orthovoltage treatment system capable of calculating and visualizing dose distributions in different phantoms and CT datasets. The framework provides a complete set of various filters, applicators and X-ray energies and therefore can be adapted to varying studies or be used for educational purposes. A dedicated user friendly graphical interface was developed allowing for easy setup of the simulation parameters and visualization of the results. For the Monte Carlo simulations the EGSnrc Monte Carlo code package was used. Building the geometry was accomplished with the help of the EGSnrc C++ class library. The deposited dose was calculated according to the KERMA approximation using the track-length estimator. The validation against measurements showed a good agreement within 4-5% deviation, down to depths of 20% of the depth dose maximum. Furthermore, to show its capabilities, the validated model was used to calculate the dose distribution on two CT datasets. Typical Monte Carlo calculation time for these simulations was about 10 minutes achieving an average statistical uncertainty of 2% on a standard PC. However, this calculation time depends strongly on the used CT dataset, tube potential, filter material/thickness and applicator size.

  6. Dose perturbation in the presence of metallic implants: treatment planning system versus Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Wieslander, Elinore; Knoeoes, Tommy [Radiation Physics, Lund University Hospital, SE-221 85 Lund (Sweden)

    2003-10-21

    An increasing number of patients receiving radiation therapy have metallic implants such as hip prostheses. Therefore, beams are normally set up to avoid irradiation through the implant; however, this cannot always be accomplished. In such situations, knowledge of the accuracy of the used treatment planning system (TPS) is required. Two algorithms, the pencil beam (PB) and the collapsed cone (CC), are implemented in the studied TPS. Comparisons are made with Monte Carlo simulations for 6 and 18 MV. The studied materials are steel, CoCrMo, Orthinox (a stainless steel alloy and registered trademark of Stryker Corporation), TiAlV and Ti. Monte Carlo simulated depth dose curves and dose profiles are compared to CC and PB calculated data. The CC algorithm shows overall a better agreement with Monte Carlo than the PB algorithm. Thus, it is recommended to use the CC algorithm to get the most accurate dose calculation both for the planning target volume and for tissues adjacent to the implants when beams are set up to pass through implants.

  7. Dose perturbation in the presence of metallic implants: treatment planning system versus Monte Carlo simulations

    International Nuclear Information System (INIS)

    An increasing number of patients receiving radiation therapy have metallic implants such as hip prostheses. Therefore, beams are normally set up to avoid irradiation through the implant; however, this cannot always be accomplished. In such situations, knowledge of the accuracy of the used treatment planning system (TPS) is required. Two algorithms, the pencil beam (PB) and the collapsed cone (CC), are implemented in the studied TPS. Comparisons are made with Monte Carlo simulations for 6 and 18 MV. The studied materials are steel, CoCrMo, Orthinox (a stainless steel alloy and registered trademark of Stryker Corporation), TiAlV and Ti. Monte Carlo simulated depth dose curves and dose profiles are compared to CC and PB calculated data. The CC algorithm shows overall a better agreement with Monte Carlo than the PB algorithm. Thus, it is recommended to use the CC algorithm to get the most accurate dose calculation both for the planning target volume and for tissues adjacent to the implants when beams are set up to pass through implants.

  8. CARMEN: a Monte Carlo system based on linear programming from direct apertures

    International Nuclear Information System (INIS)

    The use of Monte Carlo (MC) has shown an improvement in the accuracy of dose calculation compared with the other analytical algorithms installed on commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, called CARMEN, is based on full Monte Carlo simulation of both the beam transport in the head of the accelerator and in the patient, and is designed for efficient operation in terms of the accuracy of the estimate and the required computation times. (Author)

  9. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, H.; Nagata, T.; Yamada, M. [Nuclear Power Engineering Corp. (Japan); Kasahara, K.; Tsuruta, T.; Nishimura, T. [Mitsubishi Heavy Industries, Ltd. (Japan); Ishigure, K. [Saitama Inst. of Tech. (Japan)

    2002-07-01

    Zinc injection is known to be an effective method for reducing the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection operation on radiation source reduction and on the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a verification test, a national program sponsored by the Ministry of Economy, Trade and Industry (METI), was started in 1995 as a 7-year program and will be finished by the end of March 2002. This program consists of an irradiation test and a material integrity test. The irradiation test, an in-pile test managed by AEAT plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an out-of-pile test using a film adding unit was also performed to obtain supplemental data for the in-pile test at the Takasago Engineering Laboratory of NUPEC. The material integrity test was planned to perform constant load tests, constant strain tests and corrosion tests at the same time, using a large-scale loop and slow strain rate testing (SSRT), at the Takasago Engineering Laboratory of NUPEC. In this paper, the present results of the zinc verification test are discussed. (authors)

  10. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    International Nuclear Information System (INIS)

    Zinc injection is known to be an effective method for reducing the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection operation on radiation source reduction and on the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a verification test, a national program sponsored by the Ministry of Economy, Trade and Industry (METI), was started in 1995 as a 7-year program and will be finished by the end of March 2002. This program consists of an irradiation test and a material integrity test. The irradiation test, an in-pile test managed by AEAT plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an out-of-pile test using a film adding unit was also performed to obtain supplemental data for the in-pile test at the Takasago Engineering Laboratory of NUPEC. The material integrity test was planned to perform constant load tests, constant strain tests and corrosion tests at the same time, using a large-scale loop and slow strain rate testing (SSRT), at the Takasago Engineering Laboratory of NUPEC. In this paper, the present results of the zinc verification test are discussed. (authors)

  11. An independent system for real-time dynamic multileaf collimation trajectory verification using EPID

    International Nuclear Information System (INIS)

    A new tool has been developed to verify the trajectory of dynamic multileaf collimators (MLCs) used in advanced radiotherapy techniques, using only the measured image frames provided by the electronic portal imaging device (EPID). The prescribed leaf positions are resampled to a higher resolution in a pre-processing stage to improve the verification precision. Measured MLC positions are extracted from the EPID frames using a template matching method. A cosine similarity metric is then applied to synchronise measured and planned leaf positions for comparison, and three additional comparison functions are incorporated to ensure robust synchronisation. MLC leaf trajectory error detection was simulated for both intensity modulated radiation therapy (IMRT) (prostate) and volumetric modulated arc therapy (VMAT) (head-and-neck) deliveries with anthropomorphic phantoms in the beam. The overall accuracy of the MLC positions automatically extracted from EPID image frames was approximately 0.5 mm. The MLC leaf trajectory verification system can detect leaf position errors during IMRT and VMAT with a tolerance of 3.5 mm within 1 s. (paper)
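
    The synchronisation step can be pictured as matching each measured leaf-position vector (one entry per leaf, extracted from an EPID frame) to the planned control point that maximises cosine similarity. Leaf counts, control-point data and the noise level below are illustrative assumptions, not the paper's data:

    ```python
    # Synchronise a measured MLC frame to planned control points by cosine
    # similarity. All dimensions and values are illustrative.
    import numpy as np

    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    rng = np.random.default_rng(1)
    # 50 planned control points x 80 leaves, positions advancing monotonically
    planned = np.cumsum(rng.uniform(0.0, 2.0, (50, 80)), axis=0)
    # a measured frame: control point 23 plus extraction noise (mm)
    frame = planned[23] + rng.normal(0.0, 0.3, 80)

    best = max(range(len(planned)), key=lambda k: cosine(frame, planned[k]))
    errors = frame - planned[best]          # per-leaf position errors (mm)
    print("synchronised to control point", best,
          "| max |error| =", round(float(np.max(np.abs(errors))), 2), "mm")
    ```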

  12. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. Recently Ohio State University received a grant from the Department of Energy's Special Research Grant Program to utilize the methodologies developed for the Operator Advisor for Heavy Water Reactor (HWR) malfunction root cause diagnosis. To aid in the development of this new system, a standardized Verification and Validation (V ampersand V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V ampersand V phases from concept to operation and maintenance. Each phase has specific V ampersand V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V ampersand V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle. 10 refs., 1 fig

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, C. LEE COOK DIVISION, DOVER CORPORATION, STATIC PAC (TM) SYSTEM, PHASE II REPORT

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Static Pac System, Phase II, natural gas reciprocating compressor rod packing manufactured by the C. Lee Cook Division, Dover Corporation. The Static Pac System is designed to seal th...

  14. Reverse Monte Carlo ray-tracing for radiative heat transfer in combustion systems

    Science.gov (United States)

    Sun, Xiaojing

    Radiative heat transfer is a dominant heat transfer phenomenon in high temperature systems. With the rapid development of massive supercomputers, the Monte Carlo ray tracing (MCRT) method has started to see application in combustion systems. This research asks whether Monte Carlo ray tracing can offer more accurate and efficient calculations than the discrete ordinates method (DOM). The Monte Carlo ray tracing method is a statistical method that traces the histories of bundles of rays and solves radiative heat transfer with almost no approximation. It can handle non-isotropic scattering and non-gray gas mixtures with relative ease compared with conventional methods such as DOM and the spherical harmonics method. There are two schemes in Monte Carlo ray tracing: forward and backward/reverse. Case studies and the governing equations demonstrate the advantages of the reverse Monte Carlo ray tracing (RMCRT) method, which can be easily implemented for domain-decomposition parallelism. In this dissertation, different efficiency improvement techniques for RMCRT are introduced and implemented: the random number generator, stratified sampling, ray-surface intersection calculation, Russian roulette, and importance sampling. There are two major modules in solving radiative heat transfer problems: the RMCRT RTE solver and the optical property models. RMCRT is first fully verified in gray, scattering, absorbing and emitting media with black/nonblack, diffuse/nondiffuse bounded surface problems. Sensitivity analysis is carried out with regard to the number of rays, the mesh resolution of the computational domain, the optical thickness of the media, and the effects of variance reduction techniques (stratified sampling, Russian roulette). Results are compared with either analytical solutions or benchmark results. The efficiency (the product of error and computation time) of RMCRT has been compared with DOM and suggests great potential for RMCRT's application.
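
    The reverse scheme is easy to see in one dimension: rays start at the detector and march backwards, sampling where they were "born" by emission. A minimal sketch for a gray, absorbing-emitting isothermal slab bounded by a black wall, checked against the analytic solution I = Ib,gas(1 - exp(-kL)) + Ib,wall*exp(-kL); all property values are illustrative assumptions:

    ```python
    # Reverse Monte Carlo ray tracing for a gray 1-D slab, compared with the
    # analytic intensity at the detector. All property values illustrative.
    import math
    import random

    SIGMA = 5.670374419e-8     # Stefan-Boltzmann constant [W m^-2 K^-4]
    KAPPA = 0.5                # absorption coefficient [1/m]
    L = 2.0                    # slab thickness [m]
    T_GAS, T_WALL = 1500.0, 300.0
    IB_GAS = SIGMA * T_GAS ** 4 / math.pi    # blackbody intensity of the gas
    IB_WALL = SIGMA * T_WALL ** 4 / math.pi  # blackbody intensity of the wall

    def reverse_ray():
        """Trace one ray backwards from the detector along the slab normal."""
        s = -math.log(random.random()) / KAPPA   # sampled absorption distance
        if s < L:
            return IB_GAS      # ray was emitted by the gas inside the slab
        return IB_WALL         # ray reaches the black far wall

    N = 200_000
    estimate = sum(reverse_ray() for _ in range(N)) / N
    tau = KAPPA * L
    analytic = IB_GAS * (1 - math.exp(-tau)) + IB_WALL * math.exp(-tau)
    print(f"RMCRT {estimate:.1f}  analytic {analytic:.1f}  [W m^-2 sr^-1]")
    ```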

  15. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    International Nuclear Information System (INIS)

    The Reactor Control System at Bruce-A Nuclear Generating Station is undergoing design modifications that involve a rigorous design process, including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware, and software. The design and verification process includes a design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effects analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  16. Remaining Sites Verification Package for the 1607-F3 Sanitary Sewer System, Waste Site Reclassification Form 2006-047

    Energy Technology Data Exchange (ETDEWEB)

    L. M. Dittmer

    2007-04-26

    The 1607-F3 waste site is the former location of the sanitary sewer system that supported the 182-F Pump Station, the 183-F Water Treatment Plant, and the 151-F Substation. The sanitary sewer system included a septic tank, drain field, and associated pipeline, all in use between 1944 and 1965. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  17. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement assurance is an act of requirement verification that assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question: did the product meet the stated specification, performance, or design documentation? To ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts," which are the objective evidence needed to prove that the product requirements meet the verification success criteria. Institutional direction is given to the system engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement-verification best practices.
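
    To make the bookkeeping concrete, the sketch below records a requirement together with its verification method and closure artifacts in Python. The field names, example identifiers, and the closure rule are illustrative assumptions, not prescriptions from NPR 7123.1A.

```python
from dataclasses import dataclass, field
from enum import Enum

class Method(Enum):
    INSPECTION = "inspection"
    ANALYSIS = "analysis"
    DEMONSTRATION = "demonstration"
    TEST = "test"

@dataclass
class Requirement:
    req_id: str                 # e.g. "SYS-042" (hypothetical numbering)
    text: str                   # the "shall" statement being verified
    method: Method              # one of the four verification methods
    success_criteria: str       # what the conclusive evidence must show
    closure_artifacts: list = field(default_factory=list)  # objective evidence

    def is_verified(self) -> bool:
        # Verified only once at least one closure artifact is recorded.
        return bool(self.closure_artifacts)

req = Requirement("SYS-042", "The pump shall deliver 10 kg/s.",
                  Method.TEST, "Measured flow >= 10 kg/s at rated head")
req.closure_artifacts.append("Test report TR-0137")  # hypothetical artifact
assert req.is_verified()
```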

  18. Monte Carlo Application ToolKit (MCATK)

    International Nuclear Information System (INIS)

    Highlights: • Component-based Monte Carlo radiation transport parallel software library. • Designed to build specialized software applications. • Provides new functionality for existing general-purpose Monte Carlo transport codes. • Time-independent and time-dependent algorithms with population control. • Algorithm verification and validation results are provided. - Abstract: The Monte Carlo Application ToolKit (MCATK) is a component-based software library designed to build specialized applications and to provide new functionality for existing general-purpose Monte Carlo radiation transport codes. We describe MCATK and its capabilities and present some verification and validation results.
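
    Population control in time-dependent Monte Carlo keeps the particle census near a target size between time steps while conserving total statistical weight. The sketch below shows one common splitting/roulette scheme with stochastic rounding; it is an assumed, generic technique offered for illustration, not MCATK's actual algorithm.

```python
import random

def comb_population(particles, target):
    """Weight-preserving population control: resample (state, weight)
    pairs to roughly `target` particles of equal weight. A sketch of a
    common splitting/roulette scheme, not MCATK's algorithm."""
    total_w = sum(w for _, w in particles)
    avg_w = total_w / target               # post-control weight per particle
    out = []
    for state, w in particles:
        expected = w / avg_w               # expected number of copies
        n = int(expected)
        if random.random() < expected - n: # stochastic rounding keeps the
            n += 1                         # total weight unbiased
        out.extend((state, avg_w) for _ in range(n))
    return out

# Example: steer a census of 1000 unequal-weight particles toward 500.
particles = [(("cell", i), 0.5 + 0.001 * i) for i in range(1000)]
controlled = comb_population(particles, target=500)
```

    Heavy particles are split into several average-weight copies while light ones survive only probabilistically, so the expected total weight is unchanged as the census is steered toward the target size.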

  19. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    OpenAIRE

    Tuo Ming Fu; Zhou Xing She; Guo Zheng Xin; Shan Li Jun

    2016-01-01

    The safety of a cyber-physical system (CPS) depends on its behavior, and safety is a key property for CPS deployed in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described by an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is verified by inputting the...
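
    A hybrid program couples discrete mode switches with continuous flows. As a generic illustration of that idea in Python (not the paper's EHYSDEL notation or its HP definition), the fragment below simulates a two-mode system and checks a simple safety invariant along the sampled trace; the dynamics, guards, and bounds are invented.

```python
# Invented two-mode hybrid model (a thermostat): continuous temperature
# flow per mode, discrete jumps at guard conditions, and a safety
# invariant checked along the sampled trace.
def simulate(t_end=60.0, dt=0.01, temp=20.0, mode="heat"):
    t = 0.0
    while t < t_end:
        dtemp = 1.5 if mode == "heat" else -0.8   # mode-dependent flow
        temp += dtemp * dt                        # continuous evolution
        if mode == "heat" and temp >= 24.0:       # discrete jump guards
            mode = "cool"
        elif mode == "cool" and temp <= 18.0:
            mode = "heat"
        assert 15.0 <= temp <= 25.0, "safety invariant violated"
        t += dt
    return temp, mode

print(simulate())  # the invariant holds along this sampled trace
```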

  20. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state-space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model-checking features.
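
    To give a feel for such a safety check (in Python rather than Real-Time Maude, and over a deliberately tiny invented model), the sketch below enumerates the reachable states of two signal heads that coordinate by passing a token message and verifies the property "the two cross directions are never green at the same time". The model is not taken from the paper and abstracts away all timing.

```python
from collections import deque

# Invented toy model: two signals pass a token; a signal may be green
# only while holding the token. Safety: never both green at once.
def transitions(state):
    g_ns, g_ew, token = state          # green flags and token holder
    if token == "NS":
        return [(True, g_ew, "NS")] if not g_ns else [(False, g_ew, "EW")]
    else:
        return [(g_ns, True, "EW")] if not g_ew else [(g_ns, False, "NS")]

def check_safety(init=(False, False, "NS")):
    """Breadth-first reachability check of the safety property."""
    seen, frontier = {init}, deque([init])
    while frontier:
        state = frontier.popleft()
        g_ns, g_ew, _ = state
        if g_ns and g_ew:
            return False, state        # counterexample found
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None                  # property holds on all reachable states

print(check_safety())  # (True, None) for this toy model
```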