WorldWideScience

Sample records for carlo verification system

  1. An integrated Monte Carlo dosimetric verification system for radiotherapy treatment planning

    Science.gov (United States)

    Yamamoto, T.; Mizowaki, T.; Miyabe, Y.; Takegawa, H.; Narita, Y.; Yano, S.; Nagata, Y.; Teshima, T.; Hiraoka, M.

    2007-04-01

    An integrated Monte Carlo (MC) dose calculation system, MCRTV (Monte Carlo for radiotherapy treatment plan verification), has been developed for clinical treatment plan verification, especially for routine quality assurance (QA) of intensity-modulated radiotherapy (IMRT) plans. The MCRTV system consists of the EGS4/PRESTA MC codes originally written for particle transport through the accelerator, the multileaf collimator (MLC), and the patient/phantom, which run on a 28-CPU Linux cluster, and the associated software developed for the clinical implementation. MCRTV has an interface with a commercial treatment planning system (TPS) (Eclipse, Varian Medical Systems, Palo Alto, CA, USA) and reads the information needed for MC computation transferred in DICOM-RT format. The key features of MCRTV have been presented in detail in this paper. The phase-space data of our 15 MV photon beam from a Varian Clinac 2300C/D have been developed and several benchmarks have been performed under homogeneous and several inhomogeneous conditions (including water, aluminium, lung and bone media). The MC results agreed with the ionization chamber measurements to within 1% and 2% for homogeneous and inhomogeneous conditions, respectively. The MC calculation for a clinical prostate IMRT treatment plan validated the implementation of the beams and the patient/phantom configuration in MCRTV.
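
    As a rough illustration of the benchmark comparison described above (a hedged sketch, not code from the paper), the point-by-point deviation of MC doses from ionization chamber readings, and a check against the 1%/2% agreement criteria quoted in the abstract, could look as follows in Python; all dose values are hypothetical.

        import numpy as np

        chamber_dose = np.array([2.002, 1.514, 1.021, 0.746])  # Gy, measured (hypothetical)
        mc_dose      = np.array([1.998, 1.507, 1.012, 0.752])  # Gy, simulated (hypothetical)

        # Local percent deviation of the simulation from the measurement
        deviation = 100.0 * (mc_dose - chamber_dose) / chamber_dose
        for point, dev in enumerate(deviation):
            print(f"point {point}: {dev:+.2f}%")

        # Tolerance check, e.g. the 2% criterion for inhomogeneous conditions
        assert np.all(np.abs(deviation) < 2.0)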

  2. Monte Carlo verification of IMRT dose distributions from a commercial treatment planning optimization system

    Science.gov (United States)

    Ma, C.-M.; Pawlicki, T.; Jiang, S. B.; Li, J. S.; Deng, J.; Mok, E.; Kapur, A.; Xing, L.; Ma, L.; Boyer, A. L.

    2000-09-01

    The purpose of this work was to use Monte Carlo simulations to verify the accuracy of the dose distributions from a commercial treatment planning optimization system (Corvus, Nomos Corp., Sewickley, PA) for intensity-modulated radiotherapy (IMRT). A Monte Carlo treatment planning system has been implemented clinically to improve and verify the accuracy of radiotherapy dose calculations. Further modifications to the system were made to compute the dose in a patient for multiple fixed-gantry IMRT fields. The dose distributions in the experimental phantoms and in the patients were calculated and used to verify the optimized treatment plans generated by the Corvus system. The Monte Carlo calculated IMRT dose distributions agreed with the measurements to within 2% of the maximum dose for all the beam energies and field sizes for both the homogeneous and heterogeneous phantoms. The dose distributions predicted by the Corvus system, which employs a finite-size pencil beam (FSPB) algorithm, agreed with the Monte Carlo simulations and measurements to within 4% in a cylindrical water phantom with various hypothetical target shapes. Discrepancies of more than 5% (relative to the prescribed target dose) in the target region and over 20% in the critical structures were found in some IMRT patient calculations. The FSPB algorithm as implemented in the Corvus system is adequate for homogeneous phantoms (such as prostate) but may result in significant under- or over-estimation of the dose in some cases involving heterogeneities such as the air-tissue, lung-tissue and tissue-bone interfaces.

  3. Development of a Monte Carlo model for treatment planning dose verification of the Leksell Gamma Knife Perfexion radiosurgery system.

    Science.gov (United States)

    Yuan, Jiankui; Lo, Simon S; Zheng, Yiran; Sohn, Jason W; Sloan, Andrew E; Ellis, Rodney; Machtay, Mitchell; Wessels, Barry

    2016-01-01

    Detailed Monte Carlo (MC) modeling of the Leksell Gamma Knife (GK) Perfexion (PFX) collimator system is the only accurate ab initio approach appearing in the literature. As a different approach, in this work we present an MC model based on film measurement. By adjusting the model parameters and fine-tuning the derived fluence map of each individual source to match the manufacturer's ring output factors, we created a reasonable virtual source model for MC simulations to verify treatment planning doses for the GK PFX radiosurgery system. The MC simulation model was commissioned with simple single shots. Dose profiles and both ring and collimator output factors were compared with the treatment planning system (TPS). Good agreement was achieved for dose profiles, especially in the plateau region (< 2%), while larger differences (< 5%) were found in the penumbra region. The maximum difference in the calculated output factors was within 0.7%. The model was further validated on a clinical test case, again with good agreement: the DVHs for the brainstem and the skull were almost identical and, for the target, the volume covered by the prescription (12.5 Gy to the 50% isodose line) was 95.6% from the MC calculation versus 100% from the TPS. PMID:27455497
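
    As context for the DVH comparison above, a cumulative dose-volume histogram can be computed from a structure's voxel doses in a few lines. This is a generic sketch with toy data, independent of the paper's software; only the 12.5 Gy level matches the prescription quoted in the abstract.

        import numpy as np

        def cumulative_dvh(voxel_doses, dose_bins):
            """Fraction of the structure volume receiving at least each bin dose."""
            voxel_doses = np.asarray(voxel_doses)
            return np.array([(voxel_doses >= d).mean() for d in dose_bins])

        doses = np.random.gamma(shape=20.0, scale=0.7, size=10_000)  # Gy, toy voxel doses
        bins = np.linspace(0.0, 25.0, 251)
        dvh = cumulative_dvh(doses, bins)

        # Target coverage at the 12.5 Gy prescription level
        print("V(12.5 Gy) =", 100.0 * dvh[np.searchsorted(bins, 12.5)], "%")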

  4. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  5. Verification of Monte Carlo transport codes FLUKA, Mars and Shield

    International Nuclear Information System (INIS)

    The present study is a continuation of the project 'Verification of Monte Carlo Transport Codes', which is running at GSI as part of the activation studies of FAIR-relevant materials. It includes two parts: verification of the stopping modules of FLUKA, MARS and SHIELD-A (SHIELD with the ATIMA stopping module), and verification of their isotope production modules. The first part is based on measurements of the energy deposition function of uranium ions in copper and stainless steel. The irradiations were performed at 500 MeV/u and 950 MeV/u in an experiment carried out at GSI from September 2004 until May 2005. The second part is based on gamma-activation studies of an aluminium target irradiated with a 500 MeV/u argon beam in August 2009. Experimental depth profiling of the residual activity of the target is compared with the simulations. (authors)

  6. Simulation of digital pixel readout chip architectures with the RD53 SystemVerilog-UVM verification environment using Monte Carlo physics data

    International Nuclear Information System (INIS)

    The simulation and verification framework developed by the RD53 collaboration is a powerful tool for global architecture optimization and design verification of next generation hybrid pixel readout chips. In this paper the framework is used for studying digital pixel chip architectures at behavioral level. This is carried out by simulating a dedicated, highly parameterized pixel chip description, which makes it possible to investigate different grouping strategies between pixels and different latency buffering and arbitration schemes. The pixel hit information used as simulation input can be either generated internally in the framework or imported from external Monte Carlo detector simulation data. The latter have been provided by both the CMS and ATLAS experiments, featuring HL-LHC operating conditions and the specifications related to the Phase 2 upgrade. Pixel regions and double columns were simulated using such Monte Carlo data as inputs: the performance of different latency buffering architectures was compared and the compliance of different link speeds with the expected column data rate was verified.
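
    The buffering question studied above can be illustrated with a toy Monte Carlo model, sketched below in Python rather than the SystemVerilog-UVM of the actual framework: hits from a pixel region wait in a buffer of fixed depth for the trigger latency, and hits arriving while the buffer is full are lost. All parameters (hit rate, latency, depth) are invented, not RD53 values.

        from collections import deque
        import numpy as np

        def buffer_loss(mean_hits_per_bx, latency_bx, depth, n_bx=200_000, seed=1):
            """Fraction of hits lost to buffer overflow (toy model)."""
            rng = np.random.default_rng(seed)
            stored = deque()                    # arrival times of buffered hits, oldest first
            lost = total = 0
            for bx in range(n_bx):
                while stored and bx - stored[0] >= latency_bx:
                    stored.popleft()            # trigger latency elapsed: hit leaves the buffer
                for _ in range(rng.poisson(mean_hits_per_bx)):
                    total += 1
                    if len(stored) < depth:
                        stored.append(bx)
                    else:
                        lost += 1               # buffer full: hit lost
            return lost / max(total, 1)

        print("loss fraction:", buffer_loss(mean_hits_per_bx=0.2, latency_bx=500, depth=128))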

  7. Monte Carlo Calculations Supporting Patient Plan Verification in Proton Therapy.

    Science.gov (United States)

    Lima, Thiago V M; Dosanjh, Manjit; Ferrari, Alfredo; Molineli, Silvia; Ciocca, Mario; Mairani, Andrea

    2016-01-01

    Patient treatment plan verification consumes a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity-Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed, and several methods have been proposed. In this paper, we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalyzed previously published data (Molinelli et al. (1)), covering 9 patient plans in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modeling (Treatment Planning System (TPS) vs. MC), limitations of the dose delivery system, or detector mispositioning was originally explored, but other factors, such as the geometric description of the detectors, were not ruled out. For the purpose of this work, we compared ionization chamber measurements with different MC simulation results. Physical effects introduced by this new approach, for example inter-detector interference and the delta-ray production threshold, were also studied. The simulations accounting for a detailed geometry are typically superior (statistical difference: p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the delta-ray threshold currently used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, the detailed geometrical description improves the agreement between measurement and MC calculations in some cases; in other cases, position uncertainty represents the dominant uncertainty. Inter-chamber disturbance was not detected for therapeutic proton energies, and the results from the current delta threshold

  8. Monte Carlo calculations supporting patient plan verification in proton therapy

    Directory of Open Access Journals (Sweden)

    Thiago Viana Miranda Lima

    2016-03-01

    Patient treatment plan verification covers a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed the previously published data (Molinelli et al. 2013), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modelling (Treatment Planning System (TPS) vs MC), limitations of the dose delivery system or detector mispositioning was originally explored, but other factors such as the geometric description of the detectors were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with different MC simulation results. Physical effects introduced by this new approach, for example inter-detector interference and the delta-ray thresholds, were also studied. The simulations accounting for a detailed geometry are typically superior (statistical difference: p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV) and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, the detailed geometrical description improves the agreement between measurement and MC calculations in some cases, but in other cases position uncertainty represents the dominant uncertainty. Inter-chamber disturbance was not detected for therapeutic proton energies, and the results from the current delta threshold are
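
    For readers unfamiliar with the statistics quoted above (p-values around 0.01 and 0.81), the comparison boils down to a paired test on per-chamber dose deviations from two simulation set-ups. The sketch below uses SciPy and invented numbers; it is not the study's analysis script, and the choice of a paired t-test is an assumption.

        import numpy as np
        from scipy import stats

        # |calculated - measured| per ionization chamber, in % of mean dose (toy data)
        dev_default  = np.array([2.9, 3.4, 2.7, 3.1, 3.6, 2.8, 3.2, 3.0, 3.5])
        dev_detailed = np.array([1.8, 2.6, 2.1, 2.4, 2.9, 2.0, 2.5, 2.2, 2.7])

        t, p = stats.ttest_rel(dev_default, dev_detailed)  # paired test across chambers
        print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")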

  9. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during image acquisition. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is applied, and the cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.

  10. Distorted Fingerprint Verification System

    OpenAIRE

    Divya KARTHIKAESHWARAN; Jeyalatha SIVARAMAKRISHNAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment based fingerprint matching, fuzzy clustering and classifier framework. First, an enhanced input fingerprint image has been aligned with the...

  11. Generalized coordinate transformations for Monte Carlo (DOSXYZnrc and VMC++) verifications of DICOM compatible radiotherapy treatment plans

    CERN Document Server

    Schmitz, Richard M; Townson, Reid W; Zavgorodni, Sergei

    2014-01-01

    The International Electrotechnical Commission (IEC) has previously defined standard rotation operators for positive gantry, collimator and couch rotations for the radiotherapy DICOM coordinate system that is commonly used by treatment planning systems. Coordinate transformations to the coordinate systems of commonly used Monte Carlo (MC) codes (BEAMnrc/DOSXYZnrc and VMC++) have been derived and published in the literature. However, these coordinate transformations disregard patient orientation during the computed tomography (CT) scan, and assume the most commonly used 'head first, supine' orientation. While less common, other patient orientations are used in clinics - Monte Carlo verification of such treatments can be problematic due to the lack of appropriate coordinate transformations. In this work, a solution has been obtained by correcting the CT-derived phantom orientation and deriving generalized coordinate transformations for field angles in the DOSXYZnrc and VMC++ codes. The rotation operator that inc...
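
    As background to the rotation operators discussed above, the sketch below composes gantry, collimator and couch rotations as 3x3 matrices. The axis assignments and composition order are common IEC-style conventions assumed here for illustration; they are not the generalized transformations derived in the paper.

        import numpy as np

        def rot(axis, angle_deg):
            """Right-handed rotation matrix about a coordinate axis."""
            c, s = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
            return {"x": np.array([[1, 0, 0], [0, c, -s], [0, s, c]]),
                    "y": np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]),
                    "z": np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])}[axis]

        def beam_rotation(gantry, collimator, couch):
            # One conventional order: couch about the vertical axis, gantry about y,
            # collimator about the beam axis. Verify against your TPS conventions.
            return rot("z", couch) @ rot("y", gantry) @ rot("z", collimator)

        print(beam_rotation(gantry=30.0, collimator=15.0, couch=10.0))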

  12. Pre-treatment radiotherapy dose verification using Monte Carlo doselet modulation in a spherical phantom

    CERN Document Server

    Townson, Reid W

    2013-01-01

    Due to the increasing complexity of radiotherapy delivery, accurate dose verification has become an essential part of the clinical treatment process. The purpose of this work was to develop an electronic portal image (EPI) based pre-treatment verification technique capable of quickly reconstructing 3D dose distributions from both coplanar and non-coplanar treatments. The dose reconstruction is performed in a spherical water phantom by modulating, based on EPID measurements, pre-calculated Monte Carlo (MC) doselets defined on a spherical coordinate system. This is called the spherical doselet modulation (SDM) method. This technique essentially eliminates the statistical uncertainty of the MC dose calculations by exploiting both azimuthal symmetry in a patient-independent phase-space and symmetry of a virtual spherical water phantom. The symmetry also allows the number of doselets necessary for dose reconstruction to be reduced by a factor of about 250. In this work, 51 doselets were used. The SDM method mitiga...
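
    The core of the doselet-modulation idea described above is a weighted superposition, D(v) = Σ_i w_i d_i(v), of pre-calculated Monte Carlo doselets with measurement-derived weights. A schematic numpy version with invented shapes and weights (the weighting scheme here is an assumption, not the paper's EPID calibration):

        import numpy as np

        n_doselets, n_voxels = 51, 80_000      # 51 doselets, as in the abstract
        rng = np.random.default_rng(0)
        doselets = rng.random((n_doselets, n_voxels))  # pre-calculated MC doselets (toy)
        weights = rng.random(n_doselets)               # EPID-derived modulation weights (toy)

        dose = weights @ doselets              # D(v) = sum_i w_i * d_i(v)
        print(dose.shape)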

  13. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    International Nuclear Information System (INIS)

    Proton interaction with the material of an exposed object needs to be modeled taking into account three basic processes: electromagnetic stopping of protons in matter, multiple Coulomb scattering, and nuclear interactions. The last of these is the topic of this paper. Monte Carlo codes are often used to simulate high-energy particle interactions with matter. However, the nuclear interaction models implemented in these codes are rather extensive, and their use in treatment planning systems requires huge computational resources. We selected the IThMC code for its ability to reproduce experiments which measure the distribution of the projected ranges of nuclear secondary particles generated by proton beams in a multi-layer Faraday cup. Multi-layer Faraday cup detectors measure charge rather than dose and allow distinguishing between electromagnetic and nuclear interactions. The event generator used in the IThMC code is faster, but less accurate, than any other tested. Our model of nuclear reactions shows quite good agreement with experiment in terms of its effect on the Bragg peak in therapeutic applications.

  14. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  15. SU-E-T-578: MCEBRT, A Monte Carlo Code for External Beam Treatment Plan Verifications

    Energy Technology Data Exchange (ETDEWEB)

    Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States); Eldib, A [Fox Chase Cancer Center, Philadelphia, PA (United States); Al-Azhar University, Cairo (Egypt)

    2014-06-01

    Purpose: To present a new Monte Carlo code (MCEBRT) for patient-specific dose calculations in external beam radiotherapy. The code's MLC model is benchmarked and real patient plans are re-calculated using MCEBRT and compared with a commercial TPS. Methods: MCEBRT is based on the GEPTS system (Med. Phys. 29 (2002) 835-846). Phase space data generated for Varian linac photon beams (6-15 MV) are used as the source term. MCEBRT uses a realistic MLC model (tongue and groove, rounded leaf ends). Patient CT and DICOM RT files are used to generate a 3D patient phantom and simulate the treatment configuration (gantry, collimator and couch angles; jaw positions; MLC sequences; MUs). MCEBRT dose distributions and DVHs are compared with those from the TPS in absolute dose (Gy). Results: Calculations based on the developed MLC model closely match transmission measurements (pin-point ionization chamber at selected positions and film for lateral dose profiles). See Fig.1. Dose calculations for two clinical cases (whole brain irradiation with opposed beams and a lung case with eight fields) were carried out and the outcomes compared with the Eclipse AAA algorithm. Good agreement is observed for the brain case (Figs 2-3) except at the surface, where the MCEBRT dose can be higher by 20%; this is due to better modeling of electron contamination by MCEBRT. For the lung case an overall good agreement (91% gamma index passing rate with 3%/3mm DTA criterion) is observed (Fig.4), but the dose in lung can be over-estimated by up to 10% by AAA (Fig.5). CTV and PTV DVHs from the TPS and MCEBRT are nevertheless close (Fig.6). Conclusion: A new Monte Carlo code has been developed for plan verification. Contrary to phantom-based QA measurements, MCEBRT simulates the exact patient geometry and tissue composition. MCEBRT can be used as an extra verification layer for plans where surface dose and tissue heterogeneity are an issue.
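
    For reference, the 3%/3 mm gamma criterion quoted in the results can be sketched in one dimension as below. Real gamma tools operate on 2-D/3-D grids with interpolation; this simplified version with invented profiles only illustrates the metric.

        import numpy as np

        def gamma_1d(x, ref, x_eval, eval_dose, dd=0.03, dta=3.0):
            """Gamma value at each evaluated point against a reference profile."""
            d_max = ref.max()
            gammas = []
            for xe, de in zip(x_eval, eval_dose):
                dist2 = ((x - xe) / dta) ** 2                 # distance-to-agreement term
                dose2 = ((ref - de) / (dd * d_max)) ** 2      # dose-difference term
                gammas.append(np.sqrt((dist2 + dose2).min()))
            return np.array(gammas)

        x = np.linspace(0.0, 100.0, 201)                      # position (mm)
        ref = np.exp(-((x - 50.0) / 20.0) ** 2)               # toy reference profile
        ev = 1.01 * np.exp(-((x - 50.5) / 20.0) ** 2)         # toy evaluated profile
        g = gamma_1d(x, ref, x, ev)
        print("passing rate:", 100.0 * (g <= 1.0).mean(), "%")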

  16. Enumeration Verification System (EVS)

    Data.gov (United States)

    Social Security Administration — EVS is a batch application that processes for federal, state, local and foreign government agencies, private companies and internal SSA customers and systems. Each...

  17. Central Verification System

    Data.gov (United States)

    US Agency for International Development — CVS is a system managed by OPM that is designed to be the primary tool for verifying whether or not there is an existing investigation on a person seeking security...

  18. Simple dose verification system for radiotherapy radiation

    International Nuclear Information System (INIS)

    The aim of this paper is to investigate an accurate and convenient quality assurance programme that should be included in the dosimetry system for radiotherapy-level radiation. We designed a mailed solid phantom and used TLD-100 chips and a Rexon UL320 reader for dosimetry quality assurance in Taiwanese radiotherapy centers. Once assembled, the solid polystyrene phantom weighed only 375 g, which made it suitable for mailing. The Monte Carlo BEAMnrc code was used to calculate the dose conversion factor between water and the polystyrene phantom; the dose conversion factor measurements were obtained by placing the TLDs at the same calibration depth in water and in the solid phantom to measure the absorbed dose and verify the accuracy of the theoretical calculation. The experimental results showed that the dose conversion factors from the TLD measurements and the calculated values from BEAMnrc were in good agreement, with a difference within 0.5%. Ten radiotherapy centers were instructed to deliver an absorbed dose of 2 Gy to the TLDs on the central beam axis. The measured doses were compared with the planned ones; a total of 21 beams were checked. The dose verification differences under reference conditions for 60Co and high-energy X-rays of 6, 10 and 15 MV were within 4%, which proves the feasibility of applying the method suggested in this work to radiotherapy dose verification.

  19. A Correlation-Based Fingerprint Verification System

    NARCIS (Netherlands)

    Bazen, Asker M.; Verwaaijen, Gerben T.B.; Gerez, Sabih H.; Veelenturf, Leo P.J.; Zwaag, van der Berend Jan

    2000-01-01

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates i
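
    The gray-scale correlation at the heart of such a system can be illustrated by the normalized cross-correlation of two image patches. The sketch below uses random stand-ins for fingerprint regions and is not the paper's template-selection algorithm.

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation of two equally sized patches."""
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())

        rng = np.random.default_rng(0)
        template = rng.random((32, 32))
        genuine  = template + 0.05 * rng.random((32, 32))  # noisy copy of the template
        impostor = rng.random((32, 32))                    # unrelated patch

        print("genuine score :", ncc(template, genuine))   # close to 1
        print("impostor score:", ncc(template, impostor))  # close to 0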

  1. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables ..., an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow ... is developed based on sum of squares programming. In addition, a necessary and sufficient condition is provided for identifying the subdivisioning functions that allow the generation of complete abstractions. A complete abstraction makes it possible to verify and falsify timed temporal properties of continuous...

  2. Verification of Monte Carlo transport codes: FLUKA, MARS and SHIELD-A

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Mustafin, Edil; Strasik, Ivan [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Ratzinger, Ulrich [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); Latysheva, Ludmila; Sobolevskiy, Nikolai [Institute for Nuclear Research RAS, Moscow (Russian Federation)

    2011-07-01

    Monte Carlo transport codes like FLUKA, MARS and SHIELD are widely used for the estimation of radiation hazards in accelerator facilities. Accurate simulations are especially important with the increasing energies and intensities of the machines. As the physical models implemented in the codes are constantly being developed further, verification is needed to make sure that the simulations give reasonable results. We report on the verification of the electronic stopping modules and of the nuclide production modules of the codes. The verification of the electronic stopping modules is based on the results of irradiation of stainless steel, copper and aluminum by 500 MeV/u and 950 MeV/u uranium ions; the stopping ranges obtained experimentally are compared with the simulated ones. The verification of the isotope production modules is done by comparing the experimental depth profiles of residual activity (aluminum targets were irradiated by 500 MeV/u and 950 MeV/u uranium ions) with the results of simulations. Correspondences and discrepancies between the experiment and the simulations are discussed.

  3. Verification of Monte Carlo transport codes: FLUKA, MARS and SHIELD-A

    International Nuclear Information System (INIS)

    Monte Carlo transport codes like FLUKA, MARS and SHIELD are widely used for the estimation of radiation hazards in accelerator facilities. Accurate simulations are especially important with the increasing energies and intensities of the machines. As the physical models implemented in the codes are constantly being developed further, verification is needed to make sure that the simulations give reasonable results. We report on the verification of the electronic stopping modules and of the nuclide production modules of the codes. The verification of the electronic stopping modules is based on the results of irradiation of stainless steel, copper and aluminum by 500 MeV/u and 950 MeV/u uranium ions; the stopping ranges obtained experimentally are compared with the simulated ones. The verification of the isotope production modules is done by comparing the experimental depth profiles of residual activity (aluminum targets were irradiated by 500 MeV/u and 950 MeV/u uranium ions) with the results of simulations. Correspondences and discrepancies between the experiment and the simulations are discussed.

  4. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    In this paper, we present a generic SystemVerilog Universal Verification Methodology (UVM) based reusable verification environment for efficient verification of image signal processing IPs/SoCs. With the tight schedules on all projects, it is important to have a strong verification methodology which contributes to first-silicon success. Methodologies which enforce full functional coverage and verification of corner cases through pseudo-random test scenarios must be deployed, and the verification flow needs to be standardized. Previously, inside the imaging group of ST, a Specman (e)/Verilog based verification environment was used for IP/subsystem level verification and a C/C++/Verilog based directed verification environment for SoC level verification. Different verification environments were used at the IP level and the SoC level, and different verification/validation methodologies were used for SoC verification across multiple sites. Verification teams were also looking for ways to catch bugs earlier in the design cycle. Thus, a generic SystemVerilog UVM based reusable verification environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  5. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> over-interpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated; it is worth considering at key points in the process.

  6. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  7. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verification...

  8. On the verification of polynomial system solvers

    Institute of Scientific and Technical Information of China (English)

    Changbo CHEN; Marc MORENO MAZA; Wei PAN; Yuzhen XI

    2008-01-01

    We discuss the verification of mathematical software solving polynomial systems symbolically by way of triangular decomposition. Standard verification techniques are highly resource consuming and apply only to polynomial systems which are easy to solve. We exhibit a new approach which manipulates constructible sets represented by regular systems. We provide comparative benchmarks of different verification procedures applied to four solvers on a large set of well-known polynomial systems. Our experimental results illustrate the high efficiency of our new approach. In particular, we are able to verify triangular decompositions of polynomial systems which are not easy to solve.

  9. XSBench. The development and verification of a performance abstraction for Monte Carlo reactor analysis

    International Nuclear Information System (INIS)

    We isolate the most computationally expensive steps of a robust nuclear reactor core Monte Carlo particle transport simulation. The hot kernel is then abstracted into a simplified proxy application, designed to mimic the key performance characteristics of the full application. A series of performance verification tests and analyses are carried out to investigate the low-level performance parameters of both the simplified kernel and the full application. The kernel's performance profile is found to closely match that of the application, making it a convenient test bed for performance analyses on cutting edge platforms and experimental next-generation high performance computing architectures. (author)
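
    The hot kernel abstracted by XSBench is, in essence, a flood of random macroscopic cross-section lookups: a binary search on each nuclide's energy grid, interpolation, and a sum over the nuclides of a material. A pared-down Python rendering with randomly generated stand-in data (the real proxy app is written in C) might look like this; the problem sizes are loose assumptions, not XSBench's exact defaults.

        import numpy as np

        rng = np.random.default_rng(0)
        n_nuclides, n_grid = 68, 10_000                  # assumed problem size
        energy = np.sort(rng.random((n_nuclides, n_grid)), axis=1)  # per-nuclide energy grids
        xs = rng.random((n_nuclides, n_grid))            # microscopic cross sections (toy)
        density = rng.random(n_nuclides)                 # nuclide number densities (toy)

        def macro_xs(e):
            """Macroscopic cross section at energy e for one material."""
            total = 0.0
            for n in range(n_nuclides):
                i = min(max(np.searchsorted(energy[n], e), 1), n_grid - 1)  # binary search
                f = (e - energy[n, i - 1]) / (energy[n, i] - energy[n, i - 1])
                total += density[n] * ((1.0 - f) * xs[n, i - 1] + f * xs[n, i])
            return total

        print(sum(macro_xs(e) for e in rng.random(1_000)))   # 1000 random lookups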

  10. Code Formal Verification of Operation System

    OpenAIRE

    Yu Zhang; Yunwei Dong; Huo Hong; Fan Zhang

    2010-01-01

    With the increasing pressure on non-functional attributes (security, safety and reliability) of operating systems, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of operating system kernels at the system code level and take theorem proving and model checking as the main technical methods to resolve the key techniques of verifying operating...

  11. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2012-01-01

    The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification ... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based...

  12. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  13. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  14. Code Formal Verification of Operation System

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2010-12-01

    With the increasing pressure on non-functional attributes (security, safety and reliability) of operating systems, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of operating system kernels at the system code level and take theorem proving and model checking as the main technical methods to resolve the key techniques of verifying operating system kernels at the C code level. Finally, we present a case study of the verification of real-world C systems code derived from an implementation of μC/OS-II.

  15. Verification of Monte Carlo calculations of the neutron flux in typical irradiation channels of the TRIGA reactor, Ljubljana

    NARCIS (Netherlands)

    Jacimovic, R; Maucec, M; Trkov, A

    2003-01-01

    An experimental verification of Monte Carlo neutron flux calculations in typical irradiation channels of the TRIGA Mark II reactor at the Jozef Stefan Institute is presented. It was found that the flux, as well as its spectral characteristics, depends rather strongly on the position of the irradiation...

  16. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification ... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems without resorting to point...

  17. Probabilistic Model for Dynamic Signature Verification System

    Directory of Open Access Journals (Sweden)

    Chai Tong Yuen

    2011-11-01

    This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system is designed to read, analyze and verify signatures from the SUSig online database. First, the test and reference samples are normalized, re-sampled and smoothed in a pre-processing stage. In the verification stage, the difference between the reference and test signatures is calculated based on the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reports a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%, while the classification rate of the system is around 97%.
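
    For context, FRR and FAR figures such as the 14.8% and 2.64% quoted above fall out of the verification scores at a chosen decision threshold. A generic sketch with invented score distributions, unrelated to the SUSig data:

        import numpy as np

        rng = np.random.default_rng(0)
        genuine_scores  = rng.normal(0.80, 0.10, 500)   # scores of genuine attempts (toy)
        impostor_scores = rng.normal(0.40, 0.12, 500)   # scores of forgery attempts (toy)
        threshold = 0.62                                # decision threshold (toy)

        frr = (genuine_scores < threshold).mean()       # genuine signatures rejected
        far = (impostor_scores >= threshold).mean()     # forgeries accepted
        print(f"FRR = {100 * frr:.1f}%, FAR = {100 * far:.2f}%")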

  18. Verification and Validation of MERCURY: A Modern, Monte Carlo Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Procassini, R J; Cullen, D E; Greenman, G M; Hagmann, C A

    2004-12-09

    Verification and Validation (V&V) is a critical phase in the development cycle of any scientific code. The aim of the V&V process is to determine whether or not the code fulfills and complies with the requirements that were defined prior to the start of the development process. While code V&V can take many forms, this paper concentrates on validation of the results obtained from a modern code against those produced by a validated, legacy code. In particular, the neutron transport capabilities of the modern Monte Carlo code MERCURY are validated against those in the legacy Monte Carlo code TART. The results from each code are compared for a series of basic transport and criticality calculations which are designed to check a variety of code modules. These include the definition of the problem geometry, particle tracking, collisional kinematics, sampling of secondary particle distributions, and nuclear data. The metrics that form the basis for comparison of the codes include both integral quantities and particle spectra. The use of integral results, such as eigenvalues obtained from criticality calculations, is shown to be necessary, but not sufficient, for a comprehensive validation of the code. This process has uncovered problems in both the transport code and the nuclear data processing codes which have since been rectified.

  19. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  20. Verification of Autonomous Systems for Space Applications

    Science.gov (United States)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  1. Evaluating software verification systems: benchmarks and competitions

    NARCIS (Netherlands)

    Beyer, Dirk; Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary

    2014-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14171 “Evaluating Software Verification Systems: Benchmarks and Competitions”. The seminar brought together a large group of current and future competition organizers and participants, benchmark maintainers, as well as practitioners...

  2. Range verification methods in particle therapy: underlying physics and Monte Carlo modelling

    Directory of Open Access Journals (Sweden)

    Aafke Christine Kraan

    2015-07-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as a function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in-vivo monitoring of the particle range can be performed by detecting secondary radiation emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including beta+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modelling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects of modelling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.
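
    One recurring ingredient of the techniques reviewed above is comparing a measured depth profile of secondary emission against an MC prediction to extract a range shift. A toy 1-D sketch with invented Gaussian profiles, standing in for real activity or prompt-gamma profiles:

        import numpy as np

        depth = np.linspace(0.0, 200.0, 401)                    # depth (mm)
        predicted = np.exp(-((depth - 150.0) / 25.0) ** 2)      # MC-predicted profile (toy)
        measured  = np.exp(-((depth - 153.0) / 25.0) ** 2)      # measured profile, 3 mm deeper

        shifts = np.linspace(-10.0, 10.0, 201)                  # candidate range shifts (mm)
        overlap = [np.sum(measured * np.interp(depth, depth + s, predicted)) for s in shifts]
        print("estimated range shift:", shifts[int(np.argmax(overlap))], "mm")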

  3. National Verification System of National Meteorological Center , China

    Science.gov (United States)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated to the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them and am in charge of the Product Quality Verification Division at NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; 3) to evaluate the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further

  4. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy, through the National Low-Level Waste Management Program, and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator's shipping manifests and accompanying records to ensure that the low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  5. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  6. Automated Formal Verification for PLC Control Systems

    CERN Document Server

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice for ensuring the correctness of safety-critical software. However, these techniques are still not widely applied in industry, due to the complexity of building the formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
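
    At its core, the model checking relied on above is exhaustive exploration of a finite-state model. The self-contained toy below checks a safety property of an invented two-variable "program" by breadth-first search; it stands in for what a model-checking tool does at scale, not for the methodology's actual toolchain.

        from collections import deque

        def step(state):
            """Successors of a tiny two-variable program (invented for illustration)."""
            x, y = state
            yield ((x + 1) % 4, y)        # a modulo-4 counter
            yield (x, not y)              # an output toggle

        def violates(state):
            x, y = state
            return x == 3 and y           # the (invented) unsafe condition

        def check_safety(initial):
            seen, frontier = {initial}, deque([initial])
            while frontier:
                s = frontier.popleft()
                if violates(s):
                    return False, s       # counterexample found
                for t in step(s):
                    if t not in seen:
                        seen.add(t)
                        frontier.append(t)
            return True, None             # all reachable states are safe

        print(check_safety((0, False)))   # -> (False, (3, True)) for this toy model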

  7. Formal verification of industrial control systems

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  8. Parametric Verification of Weighted Systems

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Hansen, Mikkel; Mariegaard, Anders;

    2015-01-01

    This paper addresses the problem of parametric model checking for weighted transition systems. We consider transition systems labelled with linear equations over a set of parameters, and we use them to provide semantics for a parametric version of weighted CTL where the until and next operators are themselves indexed with linear equations. The parameters change the model-checking problem into a problem of computing a linear system of inequalities that characterizes the parameters that guarantee the satisfiability. To address this problem, we use parametric dependency graphs (PDGs), and we propose a global update function that yields an assignment to each node in a PDG. For an iterative application of the function, we prove that a fixed point assignment to PDG nodes exists and the set of assignments constitutes a well-quasi ordering, thus ensuring that the fixed point assignment can be found after...

  9. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  10. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  11. Formal Verification of Self-Assembling Systems

    CERN Document Server

    Sterling, Aaron

    2010-01-01

    This paper introduces the theory and practice of formal verification of self-assembling systems. We interpret a well-studied abstraction of nanomolecular self-assembly, the Abstract Tile Assembly Model (aTAM), into Computation Tree Logic (CTL), a temporal logic often used in model checking. We then consider the class of "rectilinear" tile assembly systems. This class includes most aTAM systems studied in the theoretical literature, and all (algorithmic) DNA tile self-assembling systems that have been realized in laboratories to date. We present a polynomial-time algorithm that, given a tile assembly system T as input, either provides a counterexample to T's rectilinearity or verifies whether T has a unique terminal assembly. Using partial order reductions, the verification search space for this algorithm is reduced from exponential size to O(n^2), where n x n is the size of the assembly surface. That reduction is asymptotically the best possible. We report on experimental results obtained by translating tile ...

  12. Verification of Three Dimensional Triangular Prismatic Discrete Ordinates Transport Code ENSEMBLE-TRIZ by Comparison with Monte Carlo Code GMVP

    Science.gov (United States)

    Homma, Yuto; Moriwaki, Hiroyuki; Ohki, Shigeo; Ikeda, Kazumi

    2014-06-01

    This paper deals with the verification of the three-dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe sodium-cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of the initial core and at the beginning and end of cycle of the equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity.

  13. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Subject of Research. The paper presents a semi-automatic speaker verification system based on the comparison of formant values, statistics of phone lengths, and melodic characteristics. Owing to the development of speech technology, there is now increased interest in expert speaker verification systems that offer high reliability and low labour intensity thanks to the automation of data processing for the expert analysis. System Description. We present a description of a novel system that analyzes the similarity or distinctness of speakers' voices by comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed fusion-based system is the weak correlation between the analyzed features, which leads to a decrease in the speaker recognition error rate. An advantage of the system is the possibility of rapid analysis of recordings, since data preprocessing and decision making are automated. We describe the individual methods as well as their fusion for combining their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.
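
    A minimal sketch of the score-fusion idea in Python (the scores, weights and decision threshold below are hypothetical placeholders; the paper does not publish its fusion parameters):

    ```python
    # Score-level fusion of three weakly correlated verifiers. All scores,
    # weights and the threshold are illustrative, not the paper's values.
    def fuse(scores, weights):
        """Weighted-sum fusion of per-method similarity scores in [0, 1]."""
        return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

    formant, phone_len, melodic = 0.91, 0.78, 0.85   # one trial's method scores
    fused = fuse([formant, phone_len, melodic], weights=[0.5, 0.2, 0.3])
    print(f"fused score {fused:.3f}:",
          "same speaker" if fused >= 0.8 else "different speakers")
    ```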

  14. Verification and Validation of Flight Critical Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  15. A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification

    Science.gov (United States)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2014-01-01

    The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extravehicular activity equipment. A new version (3DHZETRN), capable of transporting high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the OLTARIS software system for shield design and validation, and provides a basis for personal computer software capable of space shield analysis and optimization.
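
    The same verification pattern, checking a deterministic transport solution against an independent Monte Carlo estimate, can be illustrated in miniature. The Python sketch below compares a sampled uncollided transmission through a slab with the analytic exponential attenuation; the cross section and slab depth are arbitrary illustrative numbers, unrelated to 3DHZETRN's physics:

    ```python
    # Toy cross-check of a deterministic transport result against Monte Carlo:
    # uncollided transmission through a slab is exp(-sigma_t * depth).
    import math, random

    sigma_t, depth, n = 0.5, 3.0, 200_000   # 1/cm, cm, number of histories
    hits = sum(1 for _ in range(n)
               if -math.log(1.0 - random.random()) / sigma_t > depth)
    mc = hits / n
    analytic = math.exp(-sigma_t * depth)   # 0.2231...
    print(f"MC {mc:.4f} vs analytic {analytic:.4f}")
    ```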

  16. Verification and validation of control system software

    International Nuclear Information System (INIS)

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  17. Skilled Impostor Attacks Against Fingerprint Verification Systems And Its Remedy

    OpenAIRE

    Gottschlich, Carsten

    2015-01-01

    Fingerprint verification systems are becoming ubiquitous in everyday life. This trend is propelled especially by the proliferation of mobile devices with fingerprint sensors such as smartphones and tablet computers, and fingerprint verification is increasingly applied for authenticating financial transactions. In this study we describe a novel attack vector against fingerprint verification systems which we coin skilled impostor attack. We show that existing protocols for performance evaluatio...

  18. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure suffers from many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  19. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  20. Creation of a Reference Image with Monte Carlo Simulations for Online EPID Verification of Daily Patient Setup

    Energy Technology Data Exchange (ETDEWEB)

    Descalle, M-A; Chuang, C; Pouliot, J

    2002-01-30

    Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as references by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study investigates the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo-simulated images exhibit similar qualities to portal images; the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify set-up errors during treatment.
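
    The abstract does not spell out its correlation algorithm; a standard choice for quantifying a rigid shift between a reference and a portal image is FFT phase correlation, sketched below in Python with synthetic data standing in for the EPID images:

    ```python
    # Phase correlation between a simulated reference image and a portal
    # image; the images here are synthetic stand-ins for the EPID data.
    import numpy as np

    def phase_correlation_shift(reference, portal):
        """Return the (row, col) translation that best aligns portal to reference."""
        F1 = np.fft.fft2(reference)
        F2 = np.fft.fft2(portal)
        cross_power = F1 * np.conj(F2)
        cross_power /= np.abs(cross_power) + 1e-12
        corr = np.fft.ifft2(cross_power).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Map peak indices to signed shifts.
        return tuple(p if p <= s // 2 else p - s
                     for p, s in zip(peak, corr.shape))

    # Synthetic test: shift a random "anatomy" by (7, -8) pixels, mimicking
    # the 7-8 mm displacements applied to the phantom.
    rng = np.random.default_rng(0)
    ref = rng.random((128, 128))
    moved = np.roll(ref, shift=(7, -8), axis=(0, 1))
    print(phase_correlation_shift(moved, ref))   # expected: (7, -8)
    ```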

  1. Integrated safety management system verification: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, R.F.

    1998-08-10

    Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalization of an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR, 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System (ISMS). Guidance and expectations have been provided to PNNL by incorporation into the operating contract (Contract DE-AC06-76RLO 1830) and by letter. The contract requires that the contractor submit a description of their ISMS for approval by DOE. PNNL submitted their proposed Safety Management System Description for approval on November 25, 1997. RL tentatively approved acceptance of the description pursuant to a favorable recommendation from this review. The Integrated Safety Management System Verification is a review of the adequacy of the ISMS description in fulfilling the requirements of the DEAR and the DOE Policy. The purpose of this review is to provide the Richland Operations Office Manager with a recommendation for approval of the ISMS description of the Pacific Northwest Laboratory, based upon compliance with the requirements of 48 CFR 970.5204(-2 and -78), and to verify the extent and maturity of ISMS implementation within the Laboratory. Further, the review will provide a model for other DOE laboratories managed by the Office of Assistant Secretary for Energy Research.

  2. Integrated safety management system verification: Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, R.F.

    1998-08-12

    Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System. The Manager, Richland Operations Office (RL), initiated a combined Phase 1 and Phase 2 Integrated Safety Management Verification review to confirm that PNNL had successfully submitted a description of their ISMS and had implemented ISMS within the laboratory facilities and processes. A combined review was directed by the Manager, RL, based upon the progress PNNL had made in the implementation of ISM. This report documents the results of the review conducted to verify: (1) that the PNNL integrated safety management system description and enabling documents and processes conform to the guidance provided by the Manager, RL; (2) that corporate policy is implemented by line managers; (3) that PNNL has provided tailored direction to the facility management; and (4) that the Manager, RL, has documented processes that integrate their safety activities and oversight with those of PNNL. The general conduct of the review was consistent with the direction provided by the Under Secretary's Draft Safety Management System Review and Approval Protocol. The purpose of this review was to provide the Manager, RL, with a recommendation as to the adequacy of the ISMS description of the Pacific Northwest Laboratory, based upon compliance with the requirements of 48 CFR 970.5204(-2 and -78), and to provide an evaluation of the extent and maturity of ISMS implementation within the Laboratory. Further, this review was intended to provide a model for other DOE Laboratories. In an effort to reduce the time and travel costs associated with ISM verification the team agreed to conduct preliminary training and orientation electronically and by phone. These

  3. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  4. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  5. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    The Neutron Laboratory is developing a project aimed at the construction of a portable test system for verifying the functioning of neutron area monitors. This device will allow users to verify the calibration status of their instruments at the installations where they are used, avoiding the use of equipment whose response to the neutron beam is inadequate.

  6. Monte Carlo simulation as a method of verification of the characterization of sources in ophthalmic brachytherapy; Simulacion Monte Carlo como metodo de verificacion de la caracterizacion de fuentes en braquiterapia oftalmica

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz Lora, A.; Miras del Rio, H.; Terron Leon, J. A.

    2013-07-01

    Following the recommendations of the IAEA, and as a further check, Monte Carlo simulations have been performed for each of the plaques available at the hospital. The objective of the work is to verify the calibration certificates, and it intends to establish action criteria for their acceptance. (Author)

  7. Development of Palmprint Verification System Using Biometrics

    Institute of Scientific and Technical Information of China (English)

    G. Shobha; M. Krishna; S.C. Sharma

    2006-01-01

    Palmprint verification using biometrics is one of the emerging technologies; it recognizes a person based on the principal lines, wrinkles and ridges on the surface of the palm. These line structures are stable and remain unchanged throughout the life of an individual. More importantly, no two palmprints from different individuals are the same, and normally people do not feel uneasy about having their palmprint images taken for testing. Palmprint recognition therefore offers a promising future for medium-security access control systems. In this paper, a new approach for personal authentication using hand images is discussed. Grayscale palm images are captured using a digital camera at a resolution of 640×480. Each of these grayscale images is aligned and then used to extract palmprint and hand geometry features, which are then used for authenticating users. The image acquisition setup used here is inherently simple; it employs neither special illumination nor pegs that might cause inconvenience to users. Experimental results show that the designed system achieves an acceptable level of performance.

  8. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer-based system). These software modules can be either in-house designs or systems based on commercial modules such as latest-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off-the-shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, along with the advantages of the preliminary validation and verification plan.

  9. Verification of Transformer Restricted Earth Fault Protection by using the Monte Carlo Method

    OpenAIRE

    KRSTIVOJEVIC, J. P.; DJURIC, M. B.

    2015-01-01

    The results of a comprehensive investigation of the influence of current transformer (CT) saturation on restricted earth fault (REF) protection during power transformer magnetization inrush are presented. Since the inrush current during switch-on of unloaded power transformer is stochastic, its values are obtained by: (i) laboratory measurements and (ii) calculations based on the input data obtained by the Monte Carlo (MC) simulation. To make a detailed assessment of the curre...

  10. Formal verification of safety protocol in train control system

    OpenAIRE

    Zhang, Yan; TANG, TAO; Li, Keping; Mera Sanchez de Pedro, Jose Manuel; Zhu, Li; Zhao, Lin; Xu, Tianhua

    2011-01-01

    In order to satisfy the safety-critical requirements, the train control system (TCS) often employs a layered safety communication protocol to provide reliable services. However, both description and verification of the safety protocols may be formidable due to the system complexity. In this paper, interface automata (IA) are used to describe the safety service interface behaviors of safety communication protocol. A formal verification method is proposed to describe the safety communication pr...

  11. NES++: number system for encryption based privacy preserving speaker verification

    Science.gov (United States)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (e.g., smartphones, Google Glass), privacy preserving speaker verification is receiving much attention nowadays. Privacy preserving speaker verification can be achieved in many different ways, such as fuzzy vaults and encryption. Encryption-based solutions are promising, as cryptography rests on solid mathematical foundations and the security properties can be analyzed in a well-established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, which greatly reduces the computation overhead. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system together with the packing technique. Our findings show that the proposed solution fills the gap between speaker verification and the encryption scheme very well, and that the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption-based privacy preserving speaker verification; the privacy protection and accuracy rate are not affected.
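
    A minimal Python sketch of the two ingredients the abstract relies on, Paillier's additive homomorphism and packing several fixed-point numbers into one plaintext. The key size, slot layout and scaling below are deliberately tiny, insecure toy choices for illustration, not the paper's construction:

    ```python
    # Paillier encryption with two fixed-point values packed into one plaintext.
    import math, random

    p, q = 1000003, 1000033            # toy primes; real keys use ~1024-bit primes
    n = p * q
    n2 = n * n
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                          # a standard simple generator choice

    def encrypt(m):
        r = random.randrange(1, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        L = lambda u: (u - 1) // n
        mu = pow(L(pow(g, lam, n2)), -1, n)
        return (L(pow(c, lam, n2)) * mu) % n

    SLOT = 10**6                       # each packed value must stay below SLOT
    pack = lambda a, b: a * SLOT + b
    unpack = lambda m: divmod(m, SLOT)

    # Two fixed-point features per ciphertext, scaled by 1000 (3 decimals).
    c1 = encrypt(pack(int(1.234 * 1000), int(2.500 * 1000)))
    c2 = encrypt(pack(int(0.766 * 1000), int(0.500 * 1000)))
    # Additive homomorphism: multiplying ciphertexts adds both slots at once.
    a, b = unpack(decrypt((c1 * c2) % n2))
    print(a / 1000, b / 1000)          # 2.0 3.0
    ```

    One homomorphic multiplication here adds both packed slots simultaneously, which is the source of the overhead reduction the abstract describes.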

  12. Verification and Validation Issues in Systems of Systems

    Directory of Open Access Journals (Sweden)

    Eric Honour

    2013-11-01

    The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  13. Environmental radiation measurement in CTBT verification system

    International Nuclear Information System (INIS)

    This paper introduces the technical requirements for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) radionuclide stations, the CTBT-related activities carried out by the Japan Atomic Energy Research Institute (JAERI), and the ripple effects of the acquired radionuclide data on general research. The International Monitoring System (IMS), which is part of the CTBT verification regime, consists of 80 radionuclide air monitoring stations (40 of which also monitor noble gases) and 16 certified laboratories that support these stations throughout the world. For radionuclide air monitoring under the CTBT, the stations collect particulates in the atmosphere on a filter and determine by gamma-ray spectrometry the presence or absence of any radionuclides (e.g. 140Ba, 131I, 99Mo, 132Te, 103Ru, 141Ce, 147Nd, 95Zr, etc.) that offer clear evidence of a possible nuclear explosion. Minimum technical requirements are stringently set for the radionuclide air monitoring stations: 500 m3/h air flow rate, 24-hour acquisition time, 10 to 30 µBq/m3 detection sensitivity for 140Ba, and less than 7 consecutive days, or 15 days in total, of downtime per year at each station. For noble gas monitoring, on the other hand, the stations separate xenon from the other gases in the atmosphere and, after purifying and concentrating it, measure four nuclides, 131mXe, 133Xe, 133mXe, and 135Xe, by gamma-ray spectrometry or the beta-gamma coincidence method. Minimum technical requirements are also set for the noble gas measurement: 0.4 m3/h air flow rate, a full capacity of 10 m3, and 1 mBq/m3 detection sensitivity for 133Xe, etc. At the request of the Ministry of Education, Culture, Sports, Science and Technology, the JAERI is currently undertaking the establishment of the CTBT radionuclide monitoring stations at Takasaki (both particle and noble gas) and Okinawa (particle), the certified laboratory at JAERI Tokai, and the National Data Center (NDC 2) at JAERI Tokai, which handles radionuclide data, as

  14. Verification measurements and clinical evaluation of the iPlan RT Monte Carlo dose algorithm for 6 MV photon energy

    Science.gov (United States)

    Petoukhova, A. L.; van Wingerden, K.; Wiggenraad, R. G. J.; van de Vaart, P. J. M.; van Egmond, J.; Franken, E. M.; van Santvoort, J. P. C.

    2010-08-01

    This study presents data for verification of the iPlan RT Monte Carlo (MC) dose algorithm (BrainLAB, Feldkirchen, Germany). MC calculations were compared with pencil beam (PB) calculations and verification measurements in phantoms with lung-equivalent material, air cavities or bone-equivalent material to mimic head-and-neck and thorax regions, and in an Alderson anthropomorphic phantom. Dosimetric accuracy of MC for the micro-multileaf collimator (MLC) simulation was tested in a homogeneous phantom. All measurements were performed using an ionization chamber and Kodak EDR2 films with Novalis 6 MV photon beams. Dose distributions measured with film and calculated with MC in the homogeneous phantom are in excellent agreement for oval, C- and squiggle-shaped fields and for a clinical IMRT plan. For a field with completely closed MLC, MC is much closer to the experimental result than the PB calculations. For fields larger than the dimensions of the inhomogeneities the MC calculations show excellent agreement (within 3%/1 mm) with the experimental data. MC calculations in the anthropomorphic phantom show good agreement with measurements for conformal beam plans and reasonable agreement for dynamic conformal arc and IMRT plans. For six head-and-neck and 15 lung patients a comparison of the MC plan with the PB plan was performed. Our results demonstrate that MC is able to accurately predict the dose in the presence of inhomogeneities typical for head-and-neck and thorax regions with reasonable calculation times (5-20 min). Lateral electron transport was well reproduced in MC calculations. We are planning to implement MC calculations for head-and-neck and lung cancer patients.
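
    Agreement criteria such as "3%/1 mm" are commonly evaluated with a gamma index. A minimal 1D Python sketch of such a check on toy profiles (the profiles and normalization below are illustrative, not the paper's data or exact procedure):

    ```python
    # 1D gamma-index check with 3%/1 mm criteria on synthetic profiles.
    import numpy as np

    def gamma_1d(ref, evl, x, dose_tol=0.03, dist_tol=1.0):
        """Per-point gamma of profile evl against ref on the grid x (mm)."""
        gam = np.empty_like(ref)
        for i, (xi, di) in enumerate(zip(x, ref)):
            dd = (evl - di) / (dose_tol * ref.max())   # global dose criterion
            dx = (x - xi) / dist_tol                   # distance criterion
            gam[i] = np.sqrt(dx**2 + dd**2).min()
        return gam

    x = np.linspace(-50, 50, 201)                      # 0.5 mm sampling
    ref = np.exp(-(x / 30.0) ** 2)                     # toy reference profile
    evl = 1.01 * np.exp(-((x - 0.4) / 30.0) ** 2)      # shifted, rescaled copy
    g = gamma_1d(ref, evl, x)
    print(f"gamma pass rate: {(g <= 1).mean():.1%}")   # ~100% for this example
    ```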

  15. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    CERN Document Server

    Locke, C

    2008-01-01

    The Monte Carlo (MC) method provides the most accurate dose calculations to date in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into the treatment planning quality assurance process. This process involves MC dose calculations for the treatment plans produced clinically. To perform these calculations, a number of treatment plan parameters specifying the radiation beam and patient geometries need to be transferred to MC codes such as BEAMnrc and DOSXYZnrc. Extracting these parameters from DICOM files is not a trivial task, one that has previously been performed mostly using Matlab-based software. This paper describes the DICOM tags that contain the information required for MC modeling of conformal and IMRT plans, and reports the development of an in-house DICOM interface through a library (named Vega) of platform-independent, object-oriented C++ code. The Vega library is small and succinct, offering just the fundamental functions for reading/modifying/writing DICOM files in a ...

  16. Modular Verification of Interactive Systems with an Application to Biology

    Directory of Open Access Journals (Sweden)

    P. Milazzo

    2011-01-01

    We propose sync-programs, an automata-based formalism for the description of biological systems, and a modular verification technique for this formalism that allows properties expressed in the universal fragment of CTL to be verified on suitably chosen fragments of models rather than on whole models. As an application, we show the modelling of the lac operon regulation process and the modular verification of some of its properties. Verification is performed using the NuSMV model checker, and we show that by applying our modular verification technique we can verify properties in shorter times than those necessary to verify the same properties on the whole model.

  17. Probabilistic verification of partially observable dynamical systems

    OpenAIRE

    Gyori, Benjamin M.; Paulin, Daniel; Palaniappan, Sucheendra K.

    2014-01-01

    The construction and formal verification of dynamical models are important in engineering, biology and other disciplines. We focus on non-linear models containing a set of parameters governing their dynamics. The values of these parameters are often unknown and not directly observable through measurements, which are themselves noisy. When treating parameters as random variables, one can constrain their distribution by conditioning on observations and thereby constructing a posterior probability ...

  18. Finger-print based human verification system

    OpenAIRE

    Klopčič, Uroš

    2009-01-01

    The diploma thesis presents an algorithm for verification based on fingerprints. In order to achieve a simple and modular design, the algorithm is divided into a number of steps. As input, the algorithm takes greyscale fingerprint images. First, segmentation is performed, separating the background from the area which represents the fingerprint. This is followed by the calculation of the orientation field of the fingerprint using the gradient method, and by local frequency estimation. Both val...

  19. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems: how domain-specific, formal languages, techniques and tools can be combined and used for an efficient development and verification of new fail-safe systems. The expected result is a methodology for using domain-specific, formal languages, techniques and tools for more efficient development and verification of robust software for railway control systems. Project background and state of the art: over the next 10 years all Danish railway signalling systems are going to be completely replaced with modern, computer-based railway control systems based on the European standard ERTMS/ETCS [3, 4] by the Danish Signaling Programme [1]. The purpose of these systems is to control the railway traffic such that unsafe situations, like train collisions...

  20. Verification of a two-layer inverse Monte Carlo absorption model using multiple source-detector separation diffuse reflectance spectroscopy.

    Science.gov (United States)

    Sharma, Manu; Hennessy, Ricky; Markey, Mia K; Tunnell, James W

    2013-12-01

    A two-layer Monte Carlo lookup table-based inverse model is validated with two-layered phantoms across physiologically relevant optical property ranges. Reflectance data for source-detector separations of 370 μm and 740 μm were collected from these two-layered phantoms, and the top layer thickness, reduced scattering coefficient, and top and bottom layer absorption coefficients were extracted using the inverse model and compared to the known values. The results of the phantom verification show that this method is able to accurately extract top layer thickness and scattering when the top layer thickness ranges from 0 to 550 μm. In this range, top layer thicknesses were measured with an average error of 10% and the reduced scattering coefficient was measured with an average error of 15%. The accuracy of the top and bottom layer absorption coefficient measurements was found to be highly dependent on top layer thickness, which agrees with physical expectation; within appropriate thickness ranges, however, the error for the absorption properties varies from 12% to 25%. PMID:24466475
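
    The lookup-table inversion itself can be illustrated with a nearest-neighbour search over a precomputed grid. In the Python sketch below, a toy analytic expression stands in for the paper's Monte Carlo forward model; the grid ranges mirror the abstract, but the model and its coefficients are invented for illustration:

    ```python
    # Nearest-neighbour inversion against a lookup table of forward-model
    # reflectances at the two source-detector separations (370 and 740 um).
    import itertools
    import numpy as np

    def toy_forward(thick_um, mus_r, mua_top, mua_bot, sds_um):
        """Hypothetical two-layer reflectance: top-layer attenuation times a
        bottom-layer term that matters more at the longer separation."""
        top = np.exp(-3.0 * mua_top * thick_um / 1000.0)
        bot = np.exp(-3.0 * mua_bot * max(sds_um - thick_um, 0.0) / 1000.0)
        return mus_r / (mus_r + 1.0) * top * bot

    grid = list(itertools.product(
        np.linspace(0, 550, 12),      # top layer thickness (um)
        np.linspace(0.5, 3.0, 6),     # reduced scattering (1/mm)
        np.linspace(0.01, 0.5, 8),    # top-layer absorption (1/mm)
        np.linspace(0.01, 0.5, 8)))   # bottom-layer absorption (1/mm)
    table = np.array([[toy_forward(*g, 370.0), toy_forward(*g, 740.0)]
                      for g in grid])

    def invert(measured):
        """Return the grid point whose predicted reflectance pair is closest."""
        return grid[np.argmin(np.sum((table - measured) ** 2, axis=1))]

    truth = (250.0, 1.5, 0.15, 0.08)  # lies on the grid, so inversion is exact
    measured = [toy_forward(*truth, 370.0), toy_forward(*truth, 740.0)]
    print(invert(measured))
    ```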

  1. Verification of Transformer Restricted Earth Fault Protection by using the Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    KRSTIVOJEVIC, J. P.

    2015-08-01

    The results of a comprehensive investigation of the influence of current transformer (CT) saturation on restricted earth fault (REF) protection during power transformer magnetization inrush are presented. Since the inrush current during switch-on of an unloaded power transformer is stochastic, its values are obtained by: (i) laboratory measurements and (ii) calculations based on input data obtained by Monte Carlo (MC) simulation. To make a detailed assessment of the current transformer performance, the uncertain input data for the CT model were obtained by applying the MC method. In this way, different levels of remanent flux in the CT core are taken into consideration. Using the generated CT secondary currents, the algorithm for REF protection based on phase comparison in the time domain is tested. On the basis of the obtained results, a method of adjusting the triggering threshold in order to ensure safe operation during transients, and thereby improve the algorithm's security, has been proposed. The obtained results indicate that power transformer REF protection would be enhanced by using the proposed adjustment of the triggering threshold in the algorithm based on phase comparison in the time domain.

  2. Verification of Embedded Memory Systems using Efficient Memory Modeling

    CERN Document Server

    Ganai, Malay K; Ashar, Pranav

    2011-01-01

    We describe verification techniques for embedded memory systems using efficient memory modeling (EMM), without explicitly modeling each memory bit. We extend our previously proposed approach of EMM in Bounded Model Checking (BMC) for a single read/write port single memory system, to more commonly occurring systems with multiple memories, having multiple read and write ports. More importantly, we augment such EMM to providing correctness proofs, in addition to finding real bugs as before. The novelties of our verification approach are in a) combining EMM with proof-based abstraction that preserves the correctness of a property up to a certain analysis depth of SAT-based BMC, and b) modeling arbitrary initial memory state precisely and thereby, providing inductive proofs using SAT-based BMC for embedded memory systems. Similar to the previous approach, we construct a verification model by eliminating memory arrays, but retaining the memory interface signals with their control logic and adding constraints on tho...

  3. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details such as name, designation and division. All transactions are recorded in the Simputer with time and date for future reference. This system finds extensive applications in mobile identity verification in nuclear and other industries. (author)

  4. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    Science.gov (United States)

    Fragoso, Margarida; Wen, Ning; Kumar, Sanath; Liu, Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J.

    2010-08-01

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC

  5. Verification tests for a solar-heating system

    Science.gov (United States)

    1980-01-01

    Report describes a method of verification for solar space heating and hot-water systems using similarity comparison, mathematical analysis, inspections, and tests. Systems, subsystems, and components were tested for performance, durability, safety, and other factors. Tables and graphs complement the test materials.

  6. Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...

  7. Total skin electron therapy treatment verification: Monte Carlo simulation and beam characteristics of large non-standard electron fields

    Energy Technology Data Exchange (ETDEWEB)

    Pavon, Ester Carrasco [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Sanchez-Doblado, Francisco [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Leal, Antonio [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Capote, Roberto [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Lagares, Juan Ignacio [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Perucha, Maria [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Arrans, Rafael [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain)

    2003-09-07

    Total skin electron therapy (TSET) is a complex technique which requires non-standard measurements and dosimetric procedures. This paper investigates an essential first step towards TSET Monte Carlo (MC) verification. The non-standard 6 MeV 40 x 40 cm² electron beam at a source-to-surface distance (SSD) of 100 cm, as well as its horizontal projection behind a polymethylmethacrylate (PMMA) screen to SSD = 380 cm, were evaluated. The EGS4 OMEGA-BEAM code package, running on a home-made 47-PC Linux cluster, was used for the MC simulations. Percentage depth-dose curves and profiles were calculated and measured experimentally for the 40 x 40 cm² field at both SSD = 100 cm and the patient surface SSD = 380 cm. The output factor (OF) between the reference 40 x 40 cm² open field and its horizontal projection as the TSET beam at SSD = 380 cm was also measured for comparison with the MC results. The accuracy of the simulated beam was validated by the good agreement, to within 2%, between the measured relative dose distributions, including the beam characteristic parameters (R50, R80, R100, Rp, E0), and the MC-calculated results. The energy spectrum, fluence and angular distribution at different stages of the beam (at SSD = 100 cm, at SSD = 364.2 cm, behind the PMMA beam spoiler screen and at the treatment surface SSD = 380 cm) were derived from MC simulations. Results showed a final decrease in mean energy of almost 56% from the exit window to the treatment surface. A broader angular distribution (the FWHM of the angular distribution increased from 13° at SSD = 100 cm to more than 30° at the treatment surface) was fully attributable to the PMMA beam spoiler screen. OF calculations and measurements agreed to less than 1%. The effects of changing the electron energy cut-off from 0.7 MeV to 0.521 MeV and of air density fluctuations in the bunker, which could affect the MC results, were shown to have a negligible impact on the beam fluence distributions. Results

  8. System verification and validation: a fundamental systems engineering task

    Science.gov (United States)

    Ansorge, Wolfgang R.

    2004-09-01

    Systems Engineering (SE) is the discipline in a project management team which transfers the user's operational needs and justifications for an Extremely Large Telescope (ELT) (or any other telescope) into a set of validated required system performance characteristics. SE subsequently transfers these validated required system performance characteristics into a validated system configuration, and eventually into the assembled, integrated telescope system with verified performance characteristics, provided with "objective evidence that the particular requirements for the specified intended use are fulfilled". The latter is the ISO Standard 8402 definition of "validation". This presentation describes the verification and validation processes of an ELT project and outlines the key role Systems Engineering plays in these processes throughout all project phases. If these processes are implemented correctly into the project execution, are started at the proper time, namely at the very beginning of the project, and all capabilities of experienced system engineers are used, the project costs and the life-cycle costs of the telescope system can be reduced by between 25 and 50%. The intention of this article is to motivate and encourage project managers of astronomical telescopes and scientific instruments to involve the entire spectrum of Systems Engineering capabilities, performed by trained and experienced SYSTEM engineers, for the benefit of the project, by explaining the importance of Systems Engineering in the AIV and validation processes.

  9. Diffusion Monte Carlo: Exponentially inefficient for large systems?

    CERN Document Server

    Nemec, Norbert

    2009-01-01

    The computational cost of a Monte Carlo algorithm can only be meaningfully discussed when taking into account the magnitude of the resulting statistical error. Aiming for a fixed error per particle, we study the scaling behavior of the diffusion Monte Carlo method for large quantum systems. We identify the correlation within the population of walkers as the dominant scaling factor for large systems. While this factor is negligible for small and medium sized systems that are typically studied, it ultimately shows exponential scaling beyond system sizes that can be estimated straightforwardly for each specific system.

  10. SU-E-T-384: Experimental Verification of a Monte Carlo Linear Accelerator Model Using a Radiochromic Film Stack Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    McCaw, T; Culberson, W; DeWerd, L [University of Wisconsin Medical Radiation Research Center, Madison, WI (United States)

    2014-06-01

    Purpose: To experimentally verify a Monte Carlo (MC) linear accelerator model for the simulation of intensity-modulated radiation therapy (IMRT) treatments of moving targets. Methods: A Varian Clinac™ 21EX linear accelerator was modeled using the EGSnrc user code BEAMnrc. The mean energy, radial-intensity distribution, and divergence of the electron beam incident on the bremsstrahlung target were adjusted to achieve agreement between simulated and measured percentage-depth-dose and transverse field profiles for a 6 MV beam. A seven-field step-and-shoot IMRT lung procedure was prepared using Varian Eclipse™ treatment planning software. The plan was delivered using a Clinac™ 21EX linear accelerator and measured with a Gafchromic™ EBT2 film stack dosimeter (FSD) in two separate static geometries: within a cylindrical water-equivalent-plastic phantom and within an anthropomorphic chest phantom. Two measurements were completed in each setup. The dose distribution for each geometry was simulated using the EGSnrc user code DOSXYZnrc. MC geometries of the treatment couch, cylindrical phantom, and chest phantom were developed by thresholding CT data sets using MATLAB™. The FSD was modeled as water. The measured and simulated dose distributions were normalized to the median dose within the FSD. Results: Using an electron beam with a mean energy of 6.05 MeV, a Gaussian radial-intensity distribution with a full width at half maximum of 1.5 mm, and a divergence of 0°, the measured and simulated dose profiles agree within 1.75% and 1 mm. Measured and simulated dose distributions within both the cylindrical and chest phantoms agree within 3% over 94% of the FSD volume. The overall uncertainty in the FSD measurements is 3.1% (k=1). Conclusion: MC simulations agree with FSD measurements within measurement uncertainty, thereby verifying the accuracy of the linear accelerator model for the simulation of IMRT treatments of static geometries. The experimental verification

  11. Standard guide for acoustic emission system performance verification

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 System performance verification methods launch stress waves into the examination article on which the sensor is mounted. The resulting stress wave travels in the examination article and is detected by the sensor(s) in a manner similar to acoustic emission. 1.2 This guide describes methods which can be used to verify the response of an Acoustic Emission system including sensors, couplant, sensor mounting devices, cables and system electronic components. 1.3 Acoustic emission system performance characteristics, which may be evaluated using this document, include some waveform parameters, and source location accuracy. 1.4 Performance verification is usually conducted prior to beginning the examination. 1.5 Performance verification can be conducted during the examination if there is any suspicion that the system performance may have changed. 1.6 Performance verification may be conducted after the examination has been completed. 1.7 The values stated in SI units are to be regarded as standard. No other u...

  12. FORMAL VERIFICATION OF REAL TIME DISTRIBUTED SYSTEMS USING B METHOD

    Directory of Open Access Journals (Sweden)

    AYAMAN M. WAHBA,

    2011-04-01

    Throughout recent years, the complexity and size of digital systems have increased dramatically, and as a result the design flow phases have changed considerably. Simulation used to be the most common procedure for assuring the correctness of a system under design, but it cannot exhaustively examine all execution scenarios of the system. A different approach, which validates a system by formally reasoning about its behavior, is formal verification, where the system implementation is checked against the requirements or the properties to be satisfied. The most common paradigms are based on theorem proving, model checking and language containment. This paper presents an application of the B method to the formalization and verification of a simplified flight control system, as an example of a system consisting of a number of distributed computing devices that are interconnected through digital communication channels.

  13. Verification and Validation of Model-Based Autonomous Systems

    Science.gov (United States)

    Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2001-01-01

    This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

  14. Formal Verification of the Danish Railway Interlocking Systems

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    In this paper, we present a method for formal verification of the new Danish railway interlocking systems. We made a generic and reconfigurable model of the behaviors and high-level safety properties of non-collision and non-derailment. This model accommodates sequential release – a new feature in...... railway networks of industrial size....

  15. Morse Monte Carlo Radiation Transport Code System

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    1975-02-01

    The report contains sections describing the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  16. Applications of quantum Monte Carlo methods in condensed systems

    CERN Document Server

    Kolorenc, Jindrich

    2010-01-01

    The quantum Monte Carlo methods represent a powerful and broadly applicable computational tool for finding very accurate solutions of the stationary Schroedinger equation for atoms, molecules, solids and a variety of model systems. The algorithms are intrinsically parallel and are able to take full advantage of the present-day high-performance computing systems. This review article concentrates on the fixed-node/fixed-phase diffusion Monte Carlo method with emphasis on its applications to electronic structure of solids and other extended many-particle systems.

  17. ECG based biometrics verification system using LabVIEW

    OpenAIRE

    Sunil Kumar Singla; Ankit Sharma

    2010-01-01

    Biometric based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, finger print, palm print or hand geometry, face, iris etc.) are used to verify the authenticity of a person. These parameters are good to be used as biometric parameters but do not provide the guarantee that the person is present and alive. As voice can be copied, finger print ca...

  18. Efficiency of Monte Carlo sampling in chaotic systems.

    Science.gov (United States)

    Leitão, Jorge C; Lopes, J M Viana Parente; Altmann, Eduardo G

    2014-11-01

    In this paper we investigate how the complexity of chaotic phase spaces affects the efficiency of importance-sampling Monte Carlo simulations. We focus on flat-histogram simulations of the distribution of the finite-time Lyapunov exponent in a simple chaotic system and obtain analytically that the computational effort (i) scales polynomially with the finite time, a tremendous improvement over the exponential scaling obtained in uniform-sampling simulations, and (ii) exhibits suboptimal polynomial scaling, a phenomenon known as critical slowing down. We show that critical slowing down appears because of the limited possibilities to issue a local proposal in the Monte Carlo procedure when it is applied to chaotic systems. These results show how generic properties of chaotic systems limit the efficiency of Monte Carlo simulations.
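
    A minimal Wang-Landau-style sketch of flat-histogram sampling of the finite-time Lyapunov exponent, here for the logistic map; the map, bin range, proposal width and annealing schedule are illustrative choices, not the system or parameters studied in the paper:

    ```python
    # Flat-histogram (Wang-Landau-style) sampling of the finite-time Lyapunov
    # exponent (FTLE) of the logistic map x -> 4x(1-x).
    import math, random

    def ftle(x0, t=20):
        x, s = x0, 0.0
        for _ in range(t):
            s += math.log(abs(4.0 - 8.0 * x) + 1e-300)  # log |f'(x)|
            x = 4.0 * x * (1.0 - x)
        return s / t

    bins, lo, hi = 30, -1.0, math.log(2.0) + 1.0
    bin_of = lambda e: min(bins - 1, max(0, int((e - lo) / (hi - lo) * bins)))

    log_w = [0.0] * bins               # running estimate of the log visit weight
    f = 1.0                            # Wang-Landau modification factor
    x = random.random()
    e = ftle(x)
    for sweep in range(100_000):
        y = min(1.0 - 1e-9, max(1e-9, x + random.gauss(0.0, 0.05)))
        e_new = ftle(y)
        # Flat-histogram rule: accept with min(1, w[old] / w[new]).
        if math.log(random.random() + 1e-300) < log_w[bin_of(e)] - log_w[bin_of(e_new)]:
            x, e = y, e_new
        log_w[bin_of(e)] += f          # penalize the bin just visited
        if sweep % 25_000 == 24_999:
            f *= 0.5                   # anneal the modification factor
    print([round(w) for w in log_w])   # roughly the log-density over FTLE bins
    ```

    The penalization of already-visited bins is what pushes the walker toward rare FTLE values; the local Gaussian proposal is the ingredient whose limitations the abstract identifies as the cause of critical slowing down.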

  19. Meaningful timescales from Monte Carlo simulations of molecular systems

    CERN Document Server

    Costa, Liborio I

    2016-01-01

    A new Markov Chain Monte Carlo method for simulating the dynamics of molecular systems with atomistic detail is introduced. In contrast to traditional Kinetic Monte Carlo approaches, where the state of the system is associated with minima in the energy landscape, in the proposed method, the state of the system is associated with the set of paths traveled by the atoms and the transition probabilities for an atom to be displaced are proportional to the corresponding velocities. In this way, the number of possible state-to-state transitions is reduced to a discrete set, and a direct link between the Monte Carlo time step and true physical time is naturally established. The resulting rejection-free algorithm is validated against event-driven molecular dynamics: the equilibrium and non-equilibrium dynamics of hard disks converge to the exact results with decreasing displacement size.

  20. Conducting Verification and Validation of Multi- Agent Systems

    Directory of Open Access Journals (Sweden)

    Nedhal Al Saiyd

    2012-10-01

    Full Text Available Verification and Validation (V&V) is a series of technical and managerial activities performed by the system tester, not the system developer, in order to improve system quality and reliability and to assure that the product satisfies the users' operational needs. Verification is the assurance that the products of a particular development phase are consistent with the requirements of that phase and the preceding phase(s), while validation is the assurance that the final product meets system requirements. An outside agency can be used to perform V&V, which is indicated as Independent V&V (IV&V); alternatively, V&V can be performed by a group within the organization but not the developer, referred to as Internal V&V. Use of V&V often accompanies testing, can improve quality assurance, and can reduce risk. This paper puts forward guidelines for performing V&V of Multi-Agent Systems (MAS).

  1. Towards Verification of Constituent Systems through Automated Proof

    DEFF Research Database (Denmark)

    Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R

    2014-01-01

    This paper explores verification of constituent systems within the context of the Symphony tool platform for Systems of Systems (SoS). Our SoS modelling language, CML, supports various contractual specification elements, such as state invariants and operation preconditions, which can be used to specify contractual obligations on the constituent systems of a SoS. To support verification of these obligations we have developed a proof obligation generator and theorem prover plugin for Symphony. The latter uses the Isabelle/HOL theorem prover to automatically discharge the proof obligations arising from a CML model. Our hope is that the resulting proofs can then be used to formally verify the conformance of each constituent system, which in turn would result in a dependable SoS.

  2. Practical mask inspection system with printability and pattern priority verification

    Science.gov (United States)

    Tsuchiya, Hideo; Ozaki, Fumio; Takahara, Kenichi; Inoue, Takafumi; Kikuiri, Nobutaka

    2011-05-01

    Through four years of study in the Association of Super-Advanced Electronics Technologies (ASET) on reducing mask manufacturing Turn Around Time (TAT) and cost, we have been able to establish a technology to improve the efficiency of the review process by applying a printability verification function that utilizes computational lithography simulations to analyze defects detected by a high-resolution mask inspection system. With the advent of Source-Mask Optimization (SMO) and other technologies that extend the life of existing optical lithography, it is becoming extremely difficult to judge a defect only by the shape of a mask pattern while avoiding pseudo-defects. Thus, printability verification is indispensable for filtering out nuisance defects from high-resolution mask inspection results. When using computational lithography simulations to verify printability with high precision, the image captured by the inspection system must be prepared with extensive care. However, for practical applications, this preparation process needs to be simplified. In addition, utilizing Mask Data Rank (MDR) to vary the defect detection sensitivity according to the patterns is also useful for simultaneously inspecting minute patterns and avoiding pseudo-defects. Combining these two technologies, we believe practical mask inspection for next generation lithography is achievable. We have been improving the estimation accuracy of the printability verification function through discussion with several customers and evaluation of their masks. In this report, we describe the progress of these practical mask verification functions developed through customers' evaluations.

  3. Pixel Based Off-line Signature Verification System

    Directory of Open Access Journals (Sweden)

    Anik Barua

    2015-01-01

    Full Text Available The verification of handwritten signatures is one of the oldest and most popular authentication methods around the world. As technology has improved, the ways of comparing and analyzing signatures have become more and more sophisticated. Since the early seventies, people have been exploring how computers can fully take over the task of signature verification and have tried different methods; however, none of them is fully satisfactory, and most are time consuming. Our proposed pixel based offline signature verification system is therefore one of the fastest and easiest ways we have found to authenticate any handwritten signature. For signature acquisition, we used a scanner. We then divided the signature image into a 2D array and calculated the hexadecimal RGB value of each pixel. After that, we calculated the total percentage of matching: if the percentage of matching is more than 90%, the signature is considered valid; otherwise it is invalid. We experimented on more than 35 signatures and the results of our experiments are quite impressive. We made the whole system web based so that a signature can be verified from anywhere. The average execution time for signature verification is only 0.00003545 second.
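
    The matching step described above can be sketched in a few lines. The fragment below compares two scanned signature images pixel by pixel and accepts when more than 90% of the RGB values coincide; the file names, the fixed resize step, and the use of the Pillow library are assumptions for illustration, not details of the original web-based system.

from PIL import Image   # Pillow, assumed available

def percent_match(path_a, path_b, size=(200, 80)):
    """Percentage of pixel positions whose RGB triples are identical."""
    a = Image.open(path_a).convert('RGB').resize(size)
    b = Image.open(path_b).convert('RGB').resize(size)
    pairs = list(zip(a.getdata(), b.getdata()))
    same = sum(1 for p, q in pairs if p == q)
    return 100.0 * same / len(pairs)

def verify(reference, candidate, threshold=90.0):
    return percent_match(reference, candidate) > threshold

# verify('reference_signature.png', 'submitted_signature.png')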

  4. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  5. Verification of heterogeneous multi-agent system using MCMAS

    Science.gov (United States)

    Choi, Jiyoung; Kim, Seungkeun; Tsourdos, Antonios

    2015-03-01

    The focus of the paper is how to model the autonomous behaviours of heterogeneous multi-agent systems such that it can be verified that they will always operate within predefined mission requirements and constraints. This is done by using formal methods, with an abstraction of the behaviour modelling and model checking for their verification. Three case studies are presented to verify the decision-making behaviours of a heterogeneous multi-agent system using a convoy mission scenario. The multi-agent system in the case studies has been extended by gradually increasing the number of agents and the function complexity. For automatic verification, the model checker for multi-agent systems (MCMAS) is adopted due to its novel capability to accommodate multi-agent systems, and it successfully verifies the targeted behaviours of the team-level autonomous systems. The verification results retrospectively helped improve the design of the decision-making algorithms by considering additional agents and behaviours during three steps of scenario modification. Consequently, the last scenario deals with a system composed of a ground control system, two unmanned aerial vehicles, and four unmanned ground vehicles with fault-tolerant and communication relay capabilities.

  6. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  7. Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor, Rev. 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Hoon; Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun

    2005-12-15

    Nuclear design for the SMART reactor is performed by using the transport lattice code HELIOS and the core analysis code MASTER. The HELIOS code, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior and is used to generate few-group constants. The MASTER code is a nodal diffusion code developed by KAERI and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system through comparison of the calculated results with measured ones. Therefore, the uncertainties of the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation involves the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code was itself verified against the Kurchatov critical experiments, which are similar to the SMART reactor, and the HELIOS/MASTER code package has been verified by Monte Carlo calculations for the SMART research reactor.

  8. Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun

    2005-07-15

    Nuclear design for the SMART reactor is performed by using the transport lattice code HELIOS and the core analysis code MASTER. The HELIOS code, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior and is used to generate few-group constants. The MASTER code is a nodal diffusion code developed by KAERI and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system through comparison of the calculated results with measured ones. Therefore, the uncertainties of the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation involves the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code was itself verified against the Kurchatov critical experiments, which are similar to the SMART reactor, and the HELIOS/MASTER code package has been verified by Monte Carlo calculations for the SMART research reactor.

  9. Measurability and Safety Verification for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    Fränzle, Martin; Hahn, Ernst Moritz; Hermanns, Holger;

    2011-01-01

    Dealing with the interplay of randomness and continuous time is important for the formal verification of many real systems. Considering both facets is especially important for wireless sensor networks, distributed control applications, and many other systems of growing importance. An important tr......, we enhance tool support to work effectively on such general models. Experimental evidence is provided demonstrating the applicability of our approach on three case studies, tackled using a prototypical implementation....

  10. Advanced NSTS propulsion system verification study

    Science.gov (United States)

    Wood, Charles

    1989-01-01

    The merits of propulsion system development testing are discussed. The existing data base of technical reports and specialists is utilized in this investigation. The study encompassed a review of all available test reports of propulsion system development testing for the Saturn stages, the Titan stages, and the Space Shuttle main propulsion system. The knowledge on propulsion system development and system testing available from specialists and managers was also 'tapped' for inclusion.

  11. Monitoring and Commissioning Verification Algorithms for CHP Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brambley, Michael R.; Katipamula, Srinivas; Jiang, Wei

    2008-03-31

    This document provides the algorithms for CHP system performance monitoring and commissioning verification (CxV). It starts by presenting system-level and component-level performance metrics, followed by descriptions of algorithms for performance monitoring and commissioning verification using the metrics presented earlier. Verification of commissioning is accomplished essentially by comparing actual measured performance to benchmarks for performance provided by the system integrator and/or component manufacturers; the results of these comparisons are then automatically interpreted to provide conclusions regarding whether the CHP system and its components have been properly commissioned, and, where problems are found, guidance is provided for corrections. A discussion of uncertainty handling is then provided, followed by a description of how simulation models can be used to generate data for testing the algorithms. A model is described for simulating a CHP system consisting of a micro-turbine, an exhaust-gas heat recovery unit that produces hot water, an absorption chiller and a cooling tower. The process of using this model to generate data for testing the algorithms for a selected set of faults is described. The next section applies the algorithms to CHP laboratory and field data to illustrate their use. The report then concludes with a discussion of the need for laboratory testing of the algorithms on physical CHP systems and identification of the recommended next steps.
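
    The benchmark-comparison step at the heart of CxV can be illustrated with a minimal sketch. The metric names, benchmark values, and tolerance below are invented; a real implementation would take them from the system integrator and component manufacturers as the report describes.

# Compare measured CHP performance metrics against supplied benchmarks
# and flag any component outside an allowed tolerance band.

BENCHMARKS = {
    'turbine_efficiency': 0.28,
    'heat_recovery_eff':  0.80,
    'chiller_cop':        0.70,
}
TOLERANCE = 0.05   # allowed fractional deviation from benchmark

def verify_commissioning(measured):
    findings = {}
    for metric, expected in BENCHMARKS.items():
        deviation = (measured[metric] - expected) / expected
        findings[metric] = ('OK' if abs(deviation) <= TOLERANCE else
                            f'FAULT: {measured[metric]:.3f} vs {expected:.3f}')
    return findings

print(verify_commissioning({'turbine_efficiency': 0.27,
                            'heat_recovery_eff': 0.71,
                            'chiller_cop': 0.70}))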

  12. Orion GN&C Fault Management System Verification: Scope And Methodology

    Science.gov (United States)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.

  13. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour...

  14. Simulation of Cone Beam CT System Based on Monte Carlo Method

    CERN Document Server

    Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing

    2014-01-01

    Adaptive Radiation Therapy (ART) was developed based on Image-guided Radiation Therapy (IGRT) and is the trend of photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established based on a Monte Carlo program and validated against measurement. The BEAMnrc program was adopted to model the kV x-ray tube. Both ISOURCE-13 and ISOURCE-24 were chosen to simulate the path of the beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles under 1 cm of water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed to better than 1% within a depth of 10 cm. More than 85% of the points on the calculated lateral dose profiles were within 2%. The validated CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.
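
    A sketch of the comparison reported above: the fraction of points at which a calculated percentage-depth-dose curve agrees with measurement within a stated tolerance. The sample curves are invented; real data would come from the DOSXYZnrc calculation and water-phantom measurements.

def fraction_within(calc, meas, tol_percent=2.0):
    """Share of points where the curves differ by at most tol_percent
    (differences expressed in % of the maximum dose)."""
    ok = sum(1 for c, m in zip(calc, meas) if abs(c - m) <= tol_percent)
    return 100.0 * ok / len(meas)

measured   = [100.0, 98.5, 95.0, 90.2, 85.1, 80.3]
calculated = [ 99.2, 98.9, 94.1, 90.8, 84.6, 79.9]
print(f"{fraction_within(calculated, measured):.0f}% of points within 2%")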

  15. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussion, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed systems and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  16. Moving the LHCb Monte Carlo Production System to the GRID

    Institute of Scientific and Technical Information of China (English)

    E. van Herwijnen; P. Mato; et al.

    2001-01-01

    The fundamental elements of the LHCb Monte Carlo production system are described, covering security, job submission, execution, data handling and bookkeeping. An analysis is given of the main requirements for GRID facilities, together with some discussion of how the GRID can enhance this system. A summary is given of the first experiences in moving the system to a GRID environment. The first planning for interfacing the LHCb OO framework to GRID services is outlined.

  17. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  18. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  19. Dynamic Verification of a Large Discrete System

    OpenAIRE

    Gunnarsson, Johan; Germundsson, Roger

    1996-01-01

    Symbolic algebraic analysis techniques are applied to the landing gear subsystem in the Swedish fighter aircraft, JAS 39 Gripen. Our methods are based on polynomials over finite fields (with Boolean algebra and propositional logic as special cases). Polynomials are used to represent the basic dynamic equations for the processes (controller and plant) as well as static properties of these. Temporal algebra (or temporal logic) is used to represent specifications of system behaviour. These speci...

  20. On the Symbolic Verification of Timed Systems

    DEFF Research Database (Denmark)

    Moeller, Jesper; Lichtenberg, Jacob; Andersen, Henrik Reif;

    1999-01-01

    This paper describes how to analyze a timed system symbolically. That is, given a symbolic representation of a set of (timed) states (as an expression), we describe how to determine an expression that represents the set of states that can be reached either by firing a discrete transition or by advancing time. These operations are used to determine the set of reachable states symbolically. We also show how to symbolically determine the set of states that can reach a given set of states (i.e., a backwards step), thus making it possible to verify TCTL-formulae symbolically. The analysis is fully symbolic in the sense that both the discrete and the continuous parts of the state space are represented symbolically. Furthermore, both the synchronous and asynchronous concurrent composition of timed systems can be performed symbolically. The symbolic representations are given as formulae expressed...

  1. Airworthiness Compliance Verification Method Based on Simulation of Complex System

    Institute of Scientific and Technical Information of China (English)

    XU Haojun; LIU Dongliang; XUE Yuan; ZHOU Li; MIN Guilong

    2012-01-01

    A study is conducted on a new airworthiness compliance verification method based on pilot-aircraft-environment complex system simulation. Verification scenarios are established by the "block diagram" method based on airworthiness criteria. A pilot-aircraft-environment complex model is set up, and a virtual flight testing method based on connecting MATLAB/Simulink with FlightGear is proposed. Special research is conducted on the modeling of pilot manipulation stochastic parameters and of manipulation in critical situations. Unfavorable flight factors of a certain scenario are analyzed, and reliability modeling of important systems is researched. A distribution function for small-probability events and the theory of risk probability measurement are studied. A nonlinear function is used to depict the relationship between the cumulative probability and the extremum of the critical parameter. A synthetic evaluation model is set up; a modified genetic algorithm (MGA) is applied to ascertain the distribution parameters in the model, and a more reasonable result is obtained. A clause about vehicle control functions (VCFs) verification in MIL-HDBK-516B is selected as an example to validate the practicability of the method.

  2. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies, and the various methods used, such as similarity, analysis, inspection, and test, that are applicable to satisfying the verification requirements.

  3. Verification of Mixed-Signal Systems with Affine Arithmetic Assertions

    Directory of Open Access Journals (Sweden)

    Carna Radojicic

    2013-01-01

    Full Text Available Embedded systems include an increasing share of analog/mixed-signal components that are tightly interwoven with the functionality of digital HW/SW systems. A challenge for verification is that even small deviations in analog components can lead to significant changes in system properties. In this paper we propose the combination of range-based, semi-symbolic simulation with assertion checking. We show that this approach combines the advantages, but also some of the limitations, of multi-run simulations with those of formal techniques. The efficiency of the proposed method is demonstrated by several examples.
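
    To give the flavor of range-based, semi-symbolic simulation, here is a tiny affine-arithmetic form: each value is a center plus a sum of noise terms over symbols in [-1, 1], so ranges propagate through arithmetic and an assertion can be checked over all deviations at once. This is a generic illustration of affine arithmetic, not the authors' tool; the operations and the example assertion are invented.

import itertools

_fresh = itertools.count()   # generator of fresh noise-symbol ids

class Affine:
    """Value of the form c + sum(t[k] * e_k) with each e_k in [-1, 1]."""
    def __init__(self, center, terms=None):
        self.c = float(center)
        self.t = dict(terms or {})

    @classmethod
    def interval(cls, lo, hi):
        mid, rad = (lo + hi) / 2.0, (hi - lo) / 2.0
        return cls(mid, {next(_fresh): rad})

    def __add__(self, o):
        t = dict(self.t)
        for k, v in o.t.items():
            t[k] = t.get(k, 0.0) + v
        return Affine(self.c + o.c, t)

    def __mul__(self, o):
        # keep the linear part; bound the quadratic rest by a new symbol
        t = {k: v * o.c for k, v in self.t.items()}
        for k, v in o.t.items():
            t[k] = t.get(k, 0.0) + v * self.c
        err = sum(map(abs, self.t.values())) * sum(map(abs, o.t.values()))
        t[next(_fresh)] = err
        return Affine(self.c * o.c, t)

    def range(self):
        r = sum(abs(v) for v in self.t.values())
        return (self.c - r, self.c + r)

# assertion: does gain * signal + offset stay inside [-1.2, 1.2]?
gain, sig = Affine.interval(0.9, 1.1), Affine.interval(-1.0, 1.0)
out = gain * sig + Affine.interval(-0.05, 0.05)
lo, hi = out.range()
print(lo, hi, 'assertion holds' if -1.2 <= lo and hi <= 1.2 else 'violated')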

  4. LHC Beam Loss Monitoring System Verification Applications

    CERN Document Server

    Dehning, B; Zamantzas, C; Jackson, S

    2011-01-01

    The LHC Beam Loss Monitoring (BLM) system is one of the most complex instrumentation systems deployed in the LHC. In addition to protecting the collider, the system also needs to provide a means of diagnosing machine faults and deliver a feedback of losses to the control room as well as to several systems for their setup and analysis. It has to transmit and process signals from almost 4’000 monitors, and has nearly 3 million configurable parameters. The system was designed with reliability and availability in mind. The specified operation and the fail-safety standards must be guaranteed for the system to perform its function in preventing superconductive magnet destruction caused by particle flux. Maintaining the expected reliability requires extensive testing and verification. In this paper we report our most recent addit...

  5. Diffusion Monte Carlo calculations of three-body systems

    Institute of Scientific and Technical Information of China (English)

    LÜ Meng-Jiao; REN Zhong-Zhou; LIN Qi-Hu

    2012-01-01

    The application of the diffusion Monte Carlo algorithm to three-body systems is studied. We develop a program and use it to calculate the properties of various three-body systems. Regular Coulomb systems such as atoms, molecules, and ions are investigated. The calculation is then extended to exotic systems where electrons are replaced by muons. Some nuclei with neutron halos are also calculated as three-body systems consisting of a core and two external nucleons. Our results agree well with experiments and others' work.

  6. An Integrated Design and Verification Methodology for Reconfigurable Multimedia Systems

    CERN Document Server

    Borgatti, M; Rossi, U; Lambert, J -L; Moussa, I; Fummi, F; Pravadelli, G

    2011-01-01

    Recently, many multimedia applications are emerging on portable appliances. They require both the flexibility of upgradeable devices (traditionally software based) and a powerful computing engine (typically hardware). In this context, programmable HW and dynamic reconfiguration allow novel approaches to migrating algorithms from SW to HW. Thus, in the frame of the Symbad project, we propose an industrial design flow for reconfigurable SoCs. The goal of Symbad is to develop a system-level design platform for hardware and software SoC systems, including formal and semi-formal verification techniques.

  7. An evaluation of the management system verification pilot at Hanford

    International Nuclear Information System (INIS)

    The Chemical Management System (CMS), currently under development at Hanford, was used as the "test program" for pilot testing the value added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial sector "best practice" for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS, including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCs). A semi-structured interview process was employed by the team of three "verifiers", who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An "MSV Pilot Effectiveness Survey" also was completed by each panel participant immediately following the interview.

  8. Fixed-Node Diffusion Monte Carlo of Lithium Systems

    CERN Document Server

    Rasch, Kevin

    2015-01-01

    We study lithium systems over a range of numbers of atoms, e.g., the atomic anion, the dimer, a metallic cluster, and the body-centered-cubic crystal, by the diffusion Monte Carlo method. The calculations include both core and valence electrons in order to avoid any possible impact of pseudopotentials. The focus of the study is the fixed-node error, and for that purpose we test several orbital sets in order to provide the most accurate nodal hypersurfaces. We compare our results to other high-accuracy calculations wherever available and to experimental results so as to quantify the fixed-node errors. The results for these Li systems show that fixed-node quantum Monte Carlo achieves remarkably accurate total energies and recovers 97-99% of the correlation energy.

  9. Experimental verification of a commercial Monte Carlo-based dose calculation module for high-energy photon beams

    Science.gov (United States)

    Künzler, Thomas; Fotina, Irina; Stock, Markus; Georg, Dietmar

    2009-12-01

    The dosimetric performance of a Monte Carlo algorithm as implemented in a commercial treatment planning system (iPlan, BrainLAB) was investigated. After commissioning and basic beam data tests in homogeneous phantoms, a variety of single regular beams and clinical field arrangements were tested in heterogeneous conditions (conformal therapy, arc therapy and intensity-modulated radiotherapy including simultaneous integrated boosts). More specifically, a cork phantom containing a concave-shaped target was designed to challenge the Monte Carlo algorithm in more complex treatment cases. All test irradiations were performed on an Elekta linac providing 6, 10 and 18 MV photon beams. Absolute and relative dose measurements were performed with ion chambers and near-tissue-equivalent radiochromic films placed within a transverse plane of the cork phantom. For simple fields, a 1D gamma (γ) procedure with a 2% dose difference and a 2 mm distance to agreement (DTA) was applied to depth dose curves, as well as to inplane and crossplane profiles. The average gamma value was 0.21 for all energies of the simple test cases. For depth dose curves in asymmetric beams, gamma results similar to those for symmetric beams were obtained. Simple regular fields showed excellent absolute dosimetric agreement with measured values, with a dose difference of 0.1% ± 0.9% (1 standard deviation) at the dose prescription point. A more detailed analysis at tissue interfaces revealed dose discrepancies of 2.9% for an 18 MV 10 × 10 cm2 field at the first density interface from tissue- to lung-equivalent material. Small fields (2 × 2 cm2) showed their largest discrepancy in the re-build-up at the second interface (from lung- to tissue-equivalent material), with a local dose difference of about 9% and a DTA of 1.1 mm at 18 MV. Conformal field arrangements, arc therapy, as well as IMRT beams and simultaneous integrated boosts were in good agreement with absolute dose measurements in the...
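
    For readers unfamiliar with the γ metric used above, the sketch below evaluates a basic 1D gamma index with a 2%/2 mm criterion: each reference point is scored by the best combined dose-difference and distance-to-agreement over the evaluated curve. The sample profiles are invented Gaussians, not data from the study.

import math

def gamma_1d(pos_ref, dose_ref, pos_eval, dose_eval,
             dose_tol=2.0, dta_mm=2.0):
    """1D gamma index; doses in % of maximum, positions in mm."""
    gammas = []
    for xr, dr in zip(pos_ref, dose_ref):
        g2 = min(((xe - xr) / dta_mm) ** 2 + ((de - dr) / dose_tol) ** 2
                 for xe, de in zip(pos_eval, dose_eval))
        gammas.append(math.sqrt(g2))
    return gammas

x   = [i * 0.5 for i in range(40)]                       # 0 .. 19.5 mm
ref = [100.0 * math.exp(-((xi - 10.0) / 4.0) ** 2) for xi in x]
ev  = [100.0 * math.exp(-((xi - 10.3) / 4.0) ** 2) for xi in x]  # shifted
g = gamma_1d(x, ref, x, ev)
print(f"mean gamma {sum(g) / len(g):.2f}, pass rate "
      f"{100.0 * sum(1 for v in g if v <= 1.0) / len(g):.0f}%")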

  10. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper, a prototype Requirements Tracking and Verification System (RTVS) for a distributed control system was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking, and verification of the software requirements listed in the documentation of the DCS. An analysis of DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  11. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Biometric based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, finger print, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good to be used as biometric parameters but do not provide the guarantee that the person is present and alive. A voice can be copied, a finger print can be picked from a glass onto synthetic skin, and in a face recognition system, due to genetic factors, identical twins or father and son may have the same facial appearance. ECG does not have these problems: it cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometrics verification system, developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW) version 7.1, is discussed. Experiments were conducted on a database, stored in the laboratory, of 20 individuals having 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.

  12. An eclectic quadrant of rule based system verification: work grounded in verification of fuzzy rule bases

    OpenAIRE

    Viaene, Stijn; Wets, G.; Vanthienen, Jan; Dedene, Guido

    1999-01-01

    In this paper, we used a research approach based on grounded theory in order to classify methods proposed in the literature that try to extend the verification of classical rule bases to the case of fuzzy knowledge modeling. Within this area of verification we identify two dual lines of thought, leading to what are termed static and dynamic anomaly detection methods, respectively. The major outcome of the confrontation of both approaches is that their results, most often stated in terms...

  13. Verification of Opacity and Diagnosability for Pushdown Systems

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    Full Text Available In control theory of discrete event systems (DESs), one of the challenging topics is the extension of the theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems). First, we discuss opacity of pushdown systems and prove that it is in general undecidable. In addition, a decidable class is clarified. Next, for diagnosability, we prove that under a certain assumption, which is different from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language). The obtained result is useful for developing control theory of infinite-state DESs.

  14. Exact Verification of Hybrid Systems Based on Bilinear SOS Representation

    CERN Document Server

    Yang, Zhengfeng; Lin, Wang

    2012-01-01

    In this paper, we address the problem of safety verification of nonlinear hybrid systems and stability analysis of nonlinear autonomous systems. A hybrid symbolic-numeric method is presented to compute exact inequality invariants of hybrid systems and exact estimates of regions of attraction of autonomous systems efficiently. Numerical invariants of a hybrid system, or an estimate of a region of attraction, can be obtained by solving a bilinear SOS program via the PENBMI solver or an iterative method; modified Newton refinement and rational vector recovery techniques are then applied to obtain exact polynomial invariants and estimates of regions of attraction with rational coefficients. Experiments on some benchmarks are given to illustrate the efficiency of our algorithm.

  15. Verification of Interdomain Routing System Based on Formal Methods

    Institute of Scientific and Technical Information of China (English)

    ZANG Zhiyuan; LUO Guiming; YIN Chongyuan

    2009-01-01

    In networks, the stable path problem (SPP) usually results in oscillations in interdomain systems and may cause systems to become unstable. With the rapid development of internet technology, the occurrence of SPPs in interdomain systems has quite recently become a significant focus of research. A framework for checking SPPs is presented in this paper, with verification of an interdomain routing system using formal methods and the NuSMV software. Sufficient conditions and necessary conditions for determining SPP occurrence are presented, with proof of the method's effectiveness. Linear temporal logic was used to model an interdomain routing system and its properties were analyzed. An example is included to demonstrate the method's reliability.

  16. The Construction of Verification Models for Embedded Systems

    NARCIS (Netherlands)

    Mader, A.H.; Wupper, H.; Boon, M.

    2007-01-01

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, w...

  17. Verification of Monte-Carlo transport codes FLUKA, GEANT4 and SHIELD for radiation protection purposes at relativistic heavy-ion accelerators

    International Nuclear Information System (INIS)

    The crucial problem for radiation shielding design at heavy-ion accelerator facilities with beam energies up to several GeV/n is the source term problem. Experimental data on double differential neutron yields from thick targets irradiated with high-energy uranium nuclei are lacking. At present, there are not many multipurpose Monte-Carlo codes that can work with primary high-energy uranium nuclei, and these codes use different physical models for the simulation of nucleus-nucleus reactions. Therefore, verification of the codes against available experimental data is very important for selecting the most reliable code for practical tasks. This paper presents comparisons of FLUKA, GEANT4 and SHIELD simulations with experimental data on neutron production for 1 GeV/n 238U beam interactions with a thick Fe target.

  18. FAST CONVERGENT MONTE CARLO RECEIVER FOR OFDM SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Wu Lili; Liao Guisheng; Bao Zheng; Shang Yong

    2005-01-01

    The paper investigates the problem of the design of an optimal Orthogonal Frequency Division Multiplexing (OFDM) receiver against unknown frequency selective fading. A fast convergent Monte Carlo receiver is proposed. In the proposed method, Markov Chain Monte Carlo (MCMC) methods are employed for blind Bayesian detection without channel estimation. Meanwhile, exploiting the characteristics of OFDM systems, two methods are employed to improve the convergence rate and enhance the efficiency of the MCMC algorithms. One is the integration of the posterior distribution function with respect to the associated channel parameters, which is involved in the derivation of the objective distribution function; the other is intra-symbol differential coding for the elimination of the bimodality problem resulting from the presence of unknown fading channels. Moreover, no matrix inversion is needed thanks to the orthogonality property of OFDM modulation, and hence the computational load is significantly reduced. Computer simulation results show the effectiveness of the fast convergent Monte Carlo receiver.

  19. A new method for commissioning Monte Carlo treatment planning systems

    Science.gov (United States)

    Aljarrah, Khaled Mohammed

    2005-11-01

    The Monte Carlo method is an accurate method for solving numerical problems in different fields. It has been used for accurate radiation dose calculation in radiation treatment of cancer. However, the modeling of the individual radiation beam produced by a medical linear accelerator for Monte Carlo dose calculation, i.e., the commissioning of a Monte Carlo treatment planning system, has been the bottleneck for the clinical implementation of Monte Carlo treatment planning. In this study a new method has been developed to determine the parameters of the initial electron beam incident on the target for a clinical linear accelerator. The interaction of the initial electron beam with the accelerator target produces x-rays and secondary charged particles. After successive interactions in the linac head components, the x-ray photons and the secondary charged particles interact with the patient's anatomy and deliver dose to the region of interest. The determination of the initial electron beam parameters is important for estimating the dose delivered to patients. These parameters, such as beam energy and radial intensity distribution, are usually estimated through a trial and error process. In this work an easy and efficient method was developed to determine these parameters. This was accomplished by comparing calculated 3D dose distributions for a grid of assumed beam energies and radii in a water phantom with measured data. Different cost functions were studied to choose the appropriate function for the data comparison. The beam parameters were determined in light of this method. Under the assumption that linacs of the same type have exactly the same geometry and differ only in their initial phase-space parameters, the results of this method can serve as source data for commissioning other machines of the same type.
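
    The grid-search idea reads naturally as code. In the sketch below, simulate_dose() is a toy stand-in for a full Monte Carlo run, and the energy/radius grids, depth samples, and least-squares cost are illustrative choices only; the study itself examined several cost functions.

import math

def simulate_dose(energy_mev, radius_mm, depths):
    """Toy surrogate for a Monte Carlo dose engine: penetration grows
    with energy; output drops slightly with a wider radial profile."""
    return [100.0 * math.exp(-(d / (2.0 * energy_mev)) ** 2)
            * (1.0 - 0.01 * radius_mm) for d in depths]

def cost(calc, meas):
    return sum((c - m) ** 2 for c, m in zip(calc, meas)) / len(meas)

def commission(measured, depths, energies, radii):
    """Return (cost, energy, radius) minimizing the fit to measurement."""
    return min((cost(simulate_dose(e, r, depths), measured), e, r)
               for e in energies for r in radii)

depths = list(range(0, 30, 2))
measured = simulate_dose(6.2, 1.5, depths)        # pretend measured curve
print(commission(measured, depths,
                 energies=[5.5, 6.0, 6.2, 6.5], radii=[1.0, 1.5, 2.0]))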

  20. Standard practice for verification and classification of extensometer systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...

  1. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations.

    Science.gov (United States)

    Koch, Nicholas C; Newhauser, Wayne D

    2010-02-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.
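
    One building block of such an analytical model, the superposition of range-shifted pristine Bragg peaks into a spread-out Bragg peak (SOBP), can be sketched as below. The Gaussian-like peak shape and the weights are invented for illustration; the published algorithm derives its peak data from a validated Monte Carlo nozzle model.

import math

def pristine_peak(depth_mm, range_mm, width_mm=1.5):
    """Toy pristine Bragg peak: a narrow rise near its range, zero beyond."""
    if depth_mm > range_mm + 2.0 * width_mm:
        return 0.0
    return math.exp(-((depth_mm - range_mm) / width_mm) ** 2)

def sobp(depth_mm, ranges_mm, weights):
    """Weighted superposition of range-shifted pristine peaks."""
    return sum(w * pristine_peak(depth_mm, r)
               for r, w in zip(ranges_mm, weights))

ranges  = [20.0, 21.5, 23.0, 24.5, 26.0]   # pristine-peak ranges in mm
weights = [0.25, 0.28, 0.33, 0.45, 1.00]   # heaviest on the deepest peak
for d in range(14, 30, 2):
    print(f"{d:3d} mm   relative dose {sobp(float(d), ranges, weights):5.2f}")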

  2. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  3. Formal Verification for Embedded Systems Design Based on MDE

    Science.gov (United States)

    Do Nascimento, Francisco Assis Moreira; da Silva Oliveira, Marcio Ferreira; Wagner, Flávio Rech

    This work presents a Model Driven Engineering (MDE) approach for the automatic generation of a network of timed automata from the functional specification of an embedded application described using UML class and sequence diagrams. By means of transformations on the UML model of the embedded system, a MOF-based representation of the network of timed automata is automatically obtained, which can be used as input to formal verification tools, such as the Uppaal model checker, in order to validate desired functional and temporal properties of the embedded system specification. Since the network of timed automata is automatically generated, the methodology can be very useful for the designer, making the debugging and formal validation of the system specification easier. The paper describes the defined transformations between models, which generate the network of timed automata as well as the textual input to the Uppaal model checker, and illustrates the use of the methodology with a case study to show the effectiveness of the approach.

  4. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    OpenAIRE

    Tseng, Kuo-Kun; Zeng, Fufu; Ip, W. H.; Wu, C.H.

    2016-01-01

    With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application for overcoming the issues of a centralized database. For a fair and comprehensive evaluation, the experimental...
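
    A minimal sketch of a mean-interval style check: derive R-R intervals from detected R-peak sample indices and accept the user when the candidate's mean interval is close to the enrolled template. The sampling rate, peak indices, and tolerance are invented; the published algorithm and its sport-related handling are more involved.

FS = 360.0   # sampling rate in Hz (assumed)

def mean_rr_seconds(r_peaks):
    """Mean R-R interval in seconds from R-peak sample indices."""
    intervals = [(b - a) / FS for a, b in zip(r_peaks, r_peaks[1:])]
    return sum(intervals) / len(intervals)

def verify(enrolled_peaks, candidate_peaks, tol_s=0.05):
    return abs(mean_rr_seconds(enrolled_peaks)
               - mean_rr_seconds(candidate_peaks)) <= tol_s

enrolled  = [100, 390, 684, 975, 1266]   # made-up R-peak indices
candidate = [ 55, 344, 636, 929, 1221]
print(verify(enrolled, candidate))       # True: mean intervals match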

  5. Experimental verification of a Monte Carlo-based MLC simulation model for IMRT dose calculations in heterogeneous media

    Science.gov (United States)

    Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.

    2008-02-01

    IMRT often requires delivering small fields which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) to design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) to verify our MLC model in heterogeneous geometries that mimic an actual patient geometry for IMRT treatment. The measurements were done using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model was carried out using point doses measured with an A14 slim line (SL) ion chamber inside tissue-equivalent and bone-equivalent materials using the CIRS phantom. Planar doses using lung- and bone-equivalent slabs were measured and compared using EDR films (Kodak, Rochester, NY).

  6. An evaluation of the management system verification pilot at Hanford

    Energy Technology Data Exchange (ETDEWEB)

    BRIGGS, C.R.

    1998-11-12

    The Chemical Management System (CMS), currently under development at Hanford, was used as the "test program" for pilot testing the value added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial sector "best practice" for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS, including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCs). A semi-structured interview process was employed by the team of three "verifiers", who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An "MSV Pilot Effectiveness Survey" also was completed by each panel participant immediately following the interview.

  7. Verification of the modelling of a 6 MV photon beam in a Monte Carlo planning system. Comparison with Collapsed Cone in an inhomogeneous medium; Verificacion del modelado de un haz de fotones de 6 MV en un planificador Monte Carlo. Comparacion con Collapsed Cone en medio no homogeneo

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez Ros, J. C.; Jerez Sainz, M. I.; Lobato Munoz, M.; Jodar Lopez, C. A.; Ruiz Lopez, M. A.; Carrasco rodriguez, J. L.; Pamos Urena, M.

    2013-07-01

    We evaluated the Monte Carlo planning system Monaco v2.0.3 for calculations in low-density inhomogeneous media (lung-equivalent), as a complement to the verification of the beam model in homogeneous medium and prior to the introduction of the SBRT technique. We performed the same tests on Pinnacle v8.0m for the same purpose. We compare the results obtained with the Monte Carlo algorithm of Monaco and the Collapsed Cone algorithm of Pinnacle. (Author)

  8. On Sensor Data Verification for Participatory Sensing Systems

    Directory of Open Access Journals (Sweden)

    Diego Mendez

    2013-03-01

    Full Text Available In this paper we study the problem of sensor data verification in Participatory Sensing (PS) systems, using an air quality/pollution monitoring application as a validation example. Data verification, in the context of PS, consists of the process of detecting and removing spatial outliers to properly reconstruct the variables of interest. We propose, implement, and test a hybrid neighborhood-aware algorithm for outlier detection that considers the uneven spatial density of the users, the number of malicious users, the level of conspiracy, and the lack of accuracy and malfunctioning sensors. The algorithm utilizes the Delaunay triangulation and Gaussian Mixture Models to build neighborhoods based on the spatial and non-spatial attributes of each location. This neighborhood definition allows us to demonstrate that it is not necessary to apply accurate but computationally expensive estimators to the entire dataset: equally accurate but computationally cheaper methods applied to part of the data yield comparable results. Our experimental results show that our hybrid algorithm performs as well as the best estimator while reducing the execution time considerably.
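
    The neighborhood construction described above can be illustrated with a short sketch. The Python fragment below is a minimal illustration under our own simplifying assumptions, not the authors' implementation: it builds Delaunay neighborhoods with SciPy and flags a reading as a spatial outlier when it deviates from the median of its neighbors by more than k robust (MAD-based) deviations; the Gaussian-Mixture treatment of non-spatial attributes is omitted.

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_neighbors(points):
    """Map each point index to the indices it shares a Delaunay edge with."""
    tri = Delaunay(points)
    neighbors = {i: set() for i in range(len(points))}
    for simplex in tri.simplices:
        for i in simplex:
            neighbors[i].update(j for j in simplex if j != i)
    return neighbors

def flag_spatial_outliers(points, readings, k=3.0):
    """Flag readings far (in a MAD-based robust sense) from the median
    of their Delaunay neighborhood."""
    readings = np.asarray(readings, dtype=float)
    outliers = []
    for i, nbrs in delaunay_neighbors(points).items():
        vals = readings[list(nbrs)]
        med = np.median(vals)
        mad = np.median(np.abs(vals - med)) or 1e-9  # guard against zero spread
        if abs(readings[i] - med) / (1.4826 * mad) > k:
            outliers.append(i)
    return outliers
```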

  9. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    A real-time system is proposed which would allow CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate, a specific data handling system and vehicle were assumed: a Tracor Northern TN-11 data handling system including a PDP-11 minicomputer, and a measurement vehicle similar to the Commission's Regulatory Region I van. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack mount the data handling equipment offers a number of benefits such as control of the equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable ''cart'' and use of a telephone link to a large computer center

  10. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  11. Multi-way Monte Carlo Method for Linear Systems

    OpenAIRE

    Wu, Tao; Gleich, David F.

    2016-01-01

    We study the Monte Carlo method for solving a linear system of the form $x = Hx + b$. A sufficient condition for the method to work is $\|H\| < 1$, which greatly limits the usability of this method. We improve this condition by proposing a new multi-way Markov random walk, which is a generalization of the standard Markov random walk. Under our new framework we prove that the necessary and sufficient condition for our method to work is the spectral radius $\rho(H^{+}) < 1$, which is a weake...
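
    For readers unfamiliar with the baseline that the paper generalizes, the sketch below implements the standard single-walk Monte Carlo estimator for $x = Hx + b$: each walk scores the Neumann series $\sum_k (H^k b)_i$ and therefore relies on $\|H\| < 1$. The multi-way walk of the paper is not reproduced here, and the walk count and truncation length are illustrative choices.

```python
import numpy as np

def mc_linear_solve(H, b, n_walks=20_000, max_steps=200, rng=None):
    """Estimate the solution of x = H x + b by averaging random-walk
    scores of the Neumann series sum_k H^k b (requires ||H|| < 1)."""
    rng = rng or np.random.default_rng()
    H, b = np.asarray(H, float), np.asarray(b, float)
    n = len(b)
    x = np.zeros(n)
    for i in range(n):
        total = 0.0
        for _ in range(n_walks):
            state, weight, score = i, 1.0, b[i]
            for _ in range(max_steps):       # truncate the series
                row = np.abs(H[state])
                s = row.sum()
                if s == 0.0:                 # absorbing row: walk ends
                    break
                nxt = rng.choice(n, p=row / s)
                weight *= H[state, nxt] / (row[nxt] / s)
                state = nxt
                score += weight * b[state]
            total += score
        x[i] = total / n_walks
    return x
```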

  12. Computer aided production planning - SWZ system of order verification

    Science.gov (United States)

    Krenczyk, D.; Skolud, B.

    2015-11-01

    SWZ (System of Order Verification) is a computer implementation of a methodology that supports fast decision making on the acceptability of a production order. Rather than determining the best possible solution, it finds an admissible solution that can be obtained in an acceptable time (a feasible solution) and that satisfies the existing constraints. The methodology uses constraint propagation techniques and reduces verification to testing a sequence of arbitrarily selected conditions; fulfilment of all the conditions (their conjunction) guarantees that the production order can be executed (a sketch of this acceptance logic is given below). The paper presents examples of the application of the SWZ system covering the planning and control stages. The obtained results allow the determination of an acceptable production flow in the system, i.e., the manufacturing system parameters that ensure execution of orders on time under the resource constraints. SWZ also generates dispatching rules as a sequence of processing operations for each production resource, performed periodically during the production flow in the system. Furthermore, an example of the integration of SWZ with a simulation system is shown: SWZ has been enhanced with a module generating files containing script code of the system model in the internal language of the simulation and visualization system.
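
    The acceptance logic referenced above can be rendered in a few lines of Python; this is a hypothetical illustration of the conjunction test, and the concrete conditions (capacity and due-date checks) are placeholders, not taken from the SWZ system.

```python
from typing import Callable, Dict, Iterable

Condition = Callable[[Dict], bool]

def order_admissible(order: Dict, conditions: Iterable[Condition]) -> bool:
    """An order is accepted iff every condition holds; all() stops at the
    first violated condition, mirroring the sequential test in SWZ."""
    return all(cond(order) for cond in conditions)

# Hypothetical placeholder conditions for illustration only.
conditions = [
    lambda o: o["workload_h"] <= o["capacity_h"],   # resource capacity
    lambda o: o["lead_time_d"] <= o["due_in_d"],    # due-date feasibility
]
order = {"workload_h": 40, "capacity_h": 60, "lead_time_d": 5, "due_in_d": 7}
print(order_admissible(order, conditions))  # True -> order can be accepted
```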

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT; ULTRASONIC AQUEOUS CLEANING SYSTEMS, SMART SONIC CORPORATION, SMART SONIC

    Science.gov (United States)

    This report is a product of the U.S. EPA's Environmental Technology Verification (ETV) Program and is focused on the Smart Sonic Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...

  14. Interacting multiagent systems kinetic equations and Monte Carlo methods

    CERN Document Server

    Pareschi, Lorenzo

    2014-01-01

    The description of emerging collective phenomena and self-organization in systems composed of large numbers of individuals has gained increasing interest from various research communities in biology, ecology, robotics and control theory, as well as sociology and economics. Applied mathematics is concerned with the construction, analysis and interpretation of mathematical models that can shed light on significant problems of the natural sciences as well as our daily lives. To this set of problems belongs the description of the collective behaviours of complex systems composed by a large enough number of individuals. Examples of such systems are interacting agents in a financial market, potential voters during political elections, or groups of animals with a tendency to flock or herd. Among other possible approaches, this book provides a step-by-step introduction to the mathematical modelling based on a mesoscopic description and the construction of efficient simulation algorithms by Monte Carlo methods. The ar...

  15. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    Science.gov (United States)

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  16. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, A.; Larsen, K.G.; Møller, M.H.;

    2012-01-01

    The approach is demonstrated on a case study of a leader election protocol: modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge...

  17. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was already completed in March of 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March of 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or an industrialized complex. This paper describes advanced design features adopted in the SMART design and also introduces the design and engineering verification program. In the beginning stage of the SMART development, top-level requirements for safety and economics were imposed on the SMART design features. To meet the requirements, highly advanced design features enhancing the safety, reliability, performance, and operability are introduced in the SMART design. The SMART consists of proven KOFA (Korea Optimized Fuel Assembly), helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. In order to enhance safety characteristics, innovative design features adopted in the SMART system are low core power density, a large negative Moderator Temperature Coefficient (MTC), high natural circulation capability, and an integral arrangement to eliminate large-break loss of coolant accidents, etc. The progression of emergency situations into accidents is prevented with a number of advanced engineered safety features such as a passive residual heat removal system, a passive emergency core cooling system, a safeguard vessel, and passive containment over-pressure protection. The preliminary

  18. A GIS support system for declaration and verification

    International Nuclear Information System (INIS)

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declaration and verification activities. The information dealt with is that requested by both the INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform, and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. The main characteristics of SIT are: capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; selected access to open source information via the Internet; capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; capability to perform aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; capability to easily add and run external models, e.g. material data accounting, completeness checks, air dispersion models, material flow graph generation, and to describe results in graphical form; and capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, in order to allow open source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of

  19. IRIS safety system and equipment design verification test plan

    International Nuclear Information System (INIS)

    The International Reactor Innovative and Secure (IRIS) is an advanced, integral, light-water cooled reactor of medium generating capacity (335 MWe), geared at near-term deployment (2012-2015). IRIS is an innovative design that features an integral reactor vessel that contains all the reactor coolant system components, including the steam generators, coolant pumps, pressurizer and heaters, and control rod drive mechanisms, in addition to the typical core, internals, control rods and neutron reflector. Other IRIS innovations include a small, high design pressure, spherical steel containment, and a simplified passive safety system concept and equipment features that derive from its unique 'safety-by-design'TM philosophy. The IRIS 'safety-by-design'TM approach not only improves safety, but it also reduces the overall cost by allowing a significant reduction and simplification in safety systems. Moreover, IRIS improved safety supports licensing the power plant without the need for off-site emergency response planning, an objective which is part of the pre-application with the NRC and is also being pursued in collaboration with the IAEA. The IRIS innovative integral reactor coolant system design, as well as its innovative 'safety-by-design'TM approach features, has resulted in the need for new safety analyses and new equipment design and qualification in order to successfully license the plant. Therefore, the IRIS design team has developed a test plan that will provide the necessary data for safety analyses verification as well as the demonstration of equipment manufacturing feasibility and operation. This paper will present the 'IRIS Safety System and Equipment Design Verification Test Plan', which develops and confirms the operation of all the IRIS unique features, and includes component manufacturing feasibility tests, component separate effects tests, component qualification tests, and integral effects tests. These tests will also provide the data necessary to

  20. Gas-Liquid Supersonic Cleaning and Cleaning Verification Spray System

    Science.gov (United States)

    Parrish, Lewis M.

    2009-01-01

    NASA Kennedy Space Center (KSC) recently entered into a nonexclusive license agreement with Applied Cryogenic Solutions (ACS), Inc. (Galveston, TX) to commercialize its Gas-Liquid Supersonic Cleaning and Cleaning Verification Spray System technology. This technology, developed by KSC, is a critical component of processes being developed and commercialized by ACS to replace current mechanical and chemical cleaning and descaling methods used by numerous industries. Pilot trials on heat exchanger tubing components have shown that the ACS technology provides for: superior cleaning in a much shorter period of time; lower energy and labor requirements for cleaning and descaling operations; significant reductions in waste volumes by not using water, acidic or basic solutions, organic solvents, or nonvolatile solid abrasives as components in the cleaning process; and improved energy efficiency in post-cleaning heat exchanger operations. The ACS process consists of a spray head containing supersonic converging/diverging nozzles; a source of liquid gas; a novel, proprietary pumping system that permits pumping liquid nitrogen, liquid air, or supercritical carbon dioxide to pressures in the range of 20,000 to 60,000 psi; and various hoses, fittings, valves, and gauges. The size and number of nozzles can be varied so the system can be built in configurations ranging from small hand-held spray heads to large multinozzle cleaners. The system also can be used to verify if a part has been adequately cleaned.

  1. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    Science.gov (United States)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  2. Algorithm Verification for A TLD Personal Dosimetry System

    International Nuclear Information System (INIS)

    Dose algorithms are used in thermoluminescence personnel dosimetry for the interpretation of the dosimeter response in terms of equivalent dose. In the present study an automated Harshaw 6600 reader was rigorously tested prior to use, with its dose calculation algorithm evaluated according to the standard established by the US Department of Energy Laboratory Accreditation Program (DOELAP). A manual Harshaw 4500 reader was used along with the ICRU slab phantom and the RANDO phantom in experimentally determining the photon personal doses in terms of deep dose, Hp(10), shallow dose, Hp(0.07), and eye lens dose, Hp(3). In addition, a free Monte Carlo simulation code (VMC-dc) was used to simulate the RANDO phantom irradiation process. The accuracy of the automated system lies well within DOELAP tolerance limits in all test categories

  3. Automated hardware-software system for LED's verification and certification

    Science.gov (United States)

    Chertov, Aleksandr N.; Gorbunova, Elena V.; Peretyagin, Vladimir S.; Vakulenko, Anatolii D.

    2012-10-01

    Scientific and technological progress of recent years in the production of light-emitting diodes (LEDs) has led to the expansion of their areas of application, from the simplest systems to high-precision lighting devices used in various fields of human activity. Despite this progress, it is currently very difficult to choose a particular brand of LED for the realization of concrete devices designed for high-precision spatial and color measurements of various objects. Many measurement instruments exist worldwide for determining the various parameters of LEDs, but none of them is capable of comprehensively estimating the spatial, spectral, and color parameters of LEDs with the necessary accuracy and speed. This problem can be solved by using an automated hardware-software system for LED verification and certification, developed by specialists of the OEDS chair of National Research University ITMO in Russia. The paper presents the theoretical aspects of the analysis of LED spatial, spectral and color parameters using the mentioned automated hardware-software system. The article also presents the results of spatial, spectral, and color parameter measurements of some LED brands.

  4. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    Energy Technology Data Exchange (ETDEWEB)

    Merillat, P.D.

    1979-03-01

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared to a central computer driven system are detailed.

  5. Verification of Information Flow in Agent-Based Systems

    Science.gov (United States)

    Sabri, Khair Eddin; Khedri, Ridha; Jaskolka, Jason

    Analyzing information flow is beneficial for ensuring the satisfiability of security policies during the exchange of information between the agents of a system. In the literature, models such as Bell-LaPadula model and the Chinese Wall model are proposed to capture and govern the exchange of information among agents. Also, we find several verification techniques for analyzing information flow within programs or multi-agent systems. However, these models and techniques assume the atomicity of the exchanged information, which means that the information cannot be decomposed or combined with other pieces of information. Also, the policies of their models prohibit any transfer of information from a high level agent to a low level agent. In this paper, we propose a technique that relaxes these assumptions. Indeed, the proposed technique allows classifying information into frames and articulating finer granularity policies that involve information, its elements, or its frames. Also, it allows for information manipulation through several operations such as focusing and combining information. Relaxing the atomicity of information assumption permits an analysis that takes into account the ability of an agent to link elements of information in order to evolve its knowledge.

  6. Verification of component mode techniques for flexible multibody systems

    Science.gov (United States)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted in the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing the 'component mode synthesis' techniques. Multibody Modeling, Verification and Control Laboratory (MMVC) plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data was to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  7. Verification of a Monte-Carlo planetary surface radiation environment model using gamma-ray data from Lunar Prospector and 2001 Mars Odyssey

    Energy Technology Data Exchange (ETDEWEB)

    Skidmore, M.S., E-mail: mss16@star.le.ac.u [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom); Ambrosi, R.M. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom)

    2010-01-01

    Characterising a planetary radiation environment is important to: (1) assess the habitability of a planetary body for indigenous life; (2) assess the risks associated with manned exploration missions to a planetary body and (3) predict/interpret the results that remote sensing instrumentation may obtain from a planetary body (e.g. interpret the gamma-ray emissions from a planetary surface produced by radioactive decay or via the interaction of galactic cosmic rays to obtain meaningful estimates of the concentration of certain elements on the surface of a planet). The University of Leicester is developing instrumentation for geophysical applications that include gamma-ray spectroscopy, gamma-ray densitometry and radiometric dating. This paper describes the verification of a Monte-Carlo planetary radiation environment model developed using the MCNPX code. The model is designed to model the radiation environments of Mars and the Moon, but is applicable to other planetary bodies, and will be used to predict the performance of the instrumentation being developed at Leicester. This study demonstrates that the modelled gamma-ray data is in good agreement with gamma-ray data obtained by the gamma-ray spectrometers on 2001 Mars Odyssey and Lunar Prospector, and can be used to accurately model geophysical instrumentation for planetary science applications.

  8. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. In order to ease the flow-through from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  9. Monte Carlo Code System Development for Liquid Metal Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Shim, Hyung Jin; Han, Beom Seok; Park, Ho Jin; Park, Dong Gyu [Seoul National University, Seoul (Korea, Republic of)

    2007-03-15

    We have implemented the composition cell class and the use cell in MCCARD for hierarchical input processing. For the input of the KALIMER-600 core, which consists of 336 assemblies, the geometric data of 91,056 pin cells are required. Using hierarchical input processing, it was observed that the system geometries are correctly handled with the geometric data of only 611 cells in total: 2 cells for fuel rods, 2 cells for guide holes, 271 translation cells for rods, and 336 translation cells for assemblies. We have developed Monte Carlo decay-chain models based on the decay-chain model of the REBUS code for liquid metal reactor analysis. Using the developed decay-chain models, depletion analysis calculations have been performed for the homogeneous and heterogeneous models of KALIMER-600. The k-effective for the depletion analysis agrees well with that of the REBUS code, and the developed decay-chain models show more efficient performance in time and memory compared with the existing decay-chain model. A chi-square criterion has been developed to diagnose the temperature convergence for the MC T/H feedback calculations. From the application results to the KALIMER pin and fuel assembly problems, it is observed that the new criterion works well. We have applied the high-efficiency variance reduction technique of splitting/Russian roulette to estimate the PPPF of the KALIMER core at BOC. The PPPF of the KALIMER core at BOC is 1.235({+-}0.008). The developed technique shows a four times faster calculation compared with the existing calculation.

  10. Monte Carlo Alpha Iteration Algorithm for a Subcritical System Analysis

    Directory of Open Access Journals (Sweden)

    Hyung Jin Shim

    2015-01-01

    Full Text Available The α-k iteration method, which searches for the fundamental-mode alpha-eigenvalue via iterative updates of the fission source distribution, has been successfully used for Monte Carlo (MC) alpha-static calculations of supercritical systems. However, the α-k iteration method for deep subcritical system analysis suffers from a gigantic number of neutron generations or a huge neutron weight, which leads to an abnormal termination of the MC calculations. In order to stably estimate the prompt neutron decay constant (α) of prompt subcritical systems regardless of subcriticality, we propose a new MC alpha-static calculation method named the α-iteration algorithm. The new method is derived by directly applying the power method to the α-mode eigenvalue equation, and its calculation stability is achieved by controlling the number of time source neutrons, which are generated in proportion to α divided by neutron speed in MC neutron transport simulations. The effectiveness of the α-iteration algorithm is demonstrated for two-group homogeneous problems with varying subcriticality by comparisons with analytic solutions. The applicability of the proposed method is evaluated for an experimental benchmark of the thorium-loaded accelerator-driven system.
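
    The paper's central move is to apply the power method directly to the α-mode eigenvalue equation. As a deterministic analogue, the sketch below runs a plain power iteration on a matrix M standing in for a discretized α-mode operator; the Monte Carlo source-size control described in the abstract is not modeled.

```python
import numpy as np

def power_iteration(M, tol=1e-10, max_iter=10_000, rng=None):
    """Dominant eigenpair of M by power iteration; the Rayleigh quotient
    gives a signed eigenvalue estimate at every step."""
    rng = rng or np.random.default_rng()
    v = rng.standard_normal(M.shape[0])
    v /= np.linalg.norm(v)
    lam = v @ M @ v
    for _ in range(max_iter):
        w = M @ v
        v = w / np.linalg.norm(w)
        lam_new = v @ M @ v
        if abs(lam_new - lam) <= tol * max(1.0, abs(lam_new)):
            return lam_new, v
        lam = lam_new
    return lam, v
```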

  11. VERIFICATION OF THE FOOD SAFETY MANAGEMENT SYSTEM IN DEEP FROZEN FOOD PRODUCTION PLANT

    Directory of Open Access Journals (Sweden)

    Peter Zajác

    2010-07-01

    Full Text Available In work is presented verification of food safety management system of deep frozen food. Main emphasis is on creating set of verification questions within articles of standard STN EN ISO 22000:2006 and on searching of effectiveness in food safety management system. Information were acquired from scientific literature sources and they pointed out importance of implementation and upkeep of effective food safety management system. doi:10.5219/28

  12. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
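
    For context, the combinational heart of any TMR scheme is the 2-of-3 majority voter: verification must confirm that each protected signal drives three redundant copies feeding such a voter. A generic bitwise sketch (not the authors' tool) is:

```python
def majority(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority vote: any single corrupted input is outvoted."""
    return (a & b) | (a & c) | (b & c)

# One faulty replica (the third input) is masked by the two good copies.
assert majority(0b1010, 0b1010, 0b0101) == 0b1010
```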

  13. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...

  14. AVNG SYSTEM SOFTWARE - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    International Nuclear Information System (INIS)

    This report describes the software development for the plutonium attribute verification system--AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated

  15. AVNG System Software-Attribute Verification System with Information Barriers for Mass Isotopic Measurements

    International Nuclear Information System (INIS)

    This report describes the software development for the plutonium attribute verification system - AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated.

  16. A domain specific language and methodology for control systems GUI specification, verification and prototyping

    OpenAIRE

    risoldi, matteo; Buchs, Didier

    2007-01-01

    A work-in-progress domain-specific language and methodology for modeling complex control systems GUIs is presented. MDA techniques are applied for language design and verification, simulation and prototyping.

  17. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...

  18. RRB's SVES Input File - Post Entitlement State Verification and Exchange System (PSSVES)

    Data.gov (United States)

    Social Security Administration — Several PSSVES request files are transmitted to SSA each year for processing in the State Verification and Exchange System (SVES). This is a first step in obtaining...

  19. Analytical Methods for Verification and Validation of Adaptive Systems in Safety-Critical Aerospace Applications Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A major challenge of the use of adaptive systems in safety-critical applications is the software life-cycle: requirement engineering through verification and...

  20. Embedded systems handbook embedded systems design and verification

    CERN Document Server

    Zurawski, Richard

    2009-01-01

    Considered a standard industry resource, the Embedded Systems Handbook provided researchers and technicians with the authoritative information needed to launch a wealth of diverse applications, including those in automotive electronics, industrial automated systems, and building automation and control. Now a new resource is required to report on current developments and provide a technical reference for those looking to move the field forward yet again. Divided into two volumes to accommodate this growth, the Embedded Systems Handbook, Second Edition presents a comprehensive view on this area

  1. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.

  2. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    Full Text Available With the development of biometric verification, we proposed a new algorithm and personal mobile sensor card system for ECG verification. The proposed new mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application that overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle the heart rate changes caused by sports activity. To our knowledge, this is the first work to address the issue of sports in ECG verification.

  3. Efficiencies of dynamic Monte Carlo algorithms for off-lattice particle systems with a single impurity

    KAUST Repository

    Novotny, M.A.

    2010-02-01

    The efficiency of dynamic Monte Carlo algorithms for off-lattice systems composed of particles is studied for the case of a single impurity particle. The theoretical efficiencies of the rejection-free method and of the Monte Carlo with Absorbing Markov Chains method are given. Simulation results are presented to confirm the theoretical efficiencies. © 2010.
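
    The rejection-free method analyzed in the paper selects the next event directly from the catalogue of rates instead of proposing and rejecting moves. A minimal sketch of one such step, in the standard n-fold-way form and under our own simplifying assumptions, is:

```python
import numpy as np

def rejection_free_step(rates, rng=None):
    """One rejection-free (n-fold way) Monte Carlo step: pick an event with
    probability proportional to its rate, and advance the clock by an
    exponential waiting time governed by the total rate."""
    rng = rng or np.random.default_rng()
    rates = np.asarray(rates, dtype=float)
    total = rates.sum()
    event = rng.choice(len(rates), p=rates / total)
    dt = rng.exponential(1.0 / total)
    return event, dt
```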

  4. Formulation and Application of Quantum Monte Carlo Method to Fractional Quantum Hall Systems

    OpenAIRE

    Suzuki, Sei; Nakajima, Tatsuya

    2003-01-01

    Quantum Monte Carlo method is applied to fractional quantum Hall systems. The use of the linear programming method enables us to avoid the negative-sign problem in the Quantum Monte Carlo calculations. The formulation of this method and the technique for avoiding the sign problem are described. Some numerical results on static physical quantities are also reported.

  5. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  6. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    International Nuclear Information System (INIS)

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  7. Verification and validation of rulebased systems for Hubble Space Telescope ground support

    Science.gov (United States)

    Vick, Shon; Lindenmayer, Kelly

    1988-01-01

    As rulebase systems become more widely used in operational environments, the focus is on the problems and concerns of maintaining expert systems. In the conventional software model, the verification and validation of a system have two separate and distinct meanings. To validate a system means to demonstrate that the system does what is advertised. The verification process refers to investigating the actual code to identify inconsistencies and redundancies within the logic path. In current literature regarding maintaining rulebased systems, little distinction is made between these two terms. In fact, often the two terms are used interchangeably. Verification and validation of rulebased systems are discussed as separate but equally important aspects of the maintenance phase. Also described are some of the tools and methods that were developed at the Space Telescope Science Institute to aid in the maintenance of the rulebased system.

  8. A BrachyPhantom for verification of dose calculation of HDR brachytherapy planning system

    Energy Technology Data Exchange (ETDEWEB)

    Austerlitz, C. [Clinica Diana Campos, Recife, PE 52020-030 (Brazil); Campos, C. A. T. [Pontifícia Universidade Católica do Rio de Janeiro, RJ 22451-900 (Brazil)

    2013-11-15

    Purpose: To develop a calibration phantom for {sup 192}Ir high dose rate (HDR) brachytherapy units that renders possible the direct measurement of absorbed dose to water and verification of the treatment planning system. Methods: The phantom, herein designated BrachyPhantom, consists of an 8-cm-high Solid Water™ cylinder, 14 cm in diameter, with a cavity in its axis that allows the positioning of an A1SL ionization chamber with its reference measuring point at the midheight of the cylinder's axis. Inside the BrachyPhantom, at a 3-cm radial distance from the chamber's reference measuring point, there is a circular channel connected to a cylindrical guide cavity that allows the insertion of a 6-French flexible plastic catheter from the BrachyPhantom surface. The PENELOPE Monte Carlo code was used to calculate a factor, P{sub sw}{sup lw}, to correct the reading of the ionization chamber to a full-scatter condition in liquid water. The verification of the dose calculation of an HDR brachytherapy treatment planning system was performed by inserting a catheter with a dummy source in the phantom channel and scanning it with a CT. The CT scan was then transferred to the HDR computer program, in which a multiple treatment plan was programmed to deliver a total dose of 150 cGy to the ionization chamber. The instrument reading was then converted to absorbed dose to water using the N{sub gas} formalism and the P{sub sw}{sup lw} factor. Likewise, the absorbed dose to water was calculated using the source strength, S{sub k}, values provided by 15 institutions visited in this work. Results: A value of 1.020 (0.09%, k = 2) was found for P{sub sw}{sup lw}. The expanded uncertainty in the absorbed dose assessed with the BrachyPhantom was found to be 2.12% (k = 1). For an associated S{sub k} of 27.8 cGy m{sup 2} h{sup −1}, the total irradiation time to deliver 150 cGy to the ionization chamber point of reference was 161.0 s. The deviation between the absorbed doses to water assessed with

  9. Highly Efficient Monte-Carlo for Estimating the Unavailability of Markov Dynamic Systems

    Institute of Scientific and Technical Information of China (English)

    XIAO Gang; DENG Li; ZHANG Ben-Ai; ZHU Jian-Shi

    2004-01-01

    Monte Carlo simulation has become an important tool for estimating the reliability and availability of dynamic systems, since conventional numerical methods are no longer efficient when the size of the system to solve is large. However, evaluating by simulation the probability of occurrence of very rare events means playing a very large number of histories of the system, which leads to unacceptable computing time. Highly efficient Monte Carlo methods should therefore be worked out. In this paper, based on the integral equation describing state transitions of a Markov dynamic system, a uniform Monte Carlo method for estimating unavailability is presented. Using the free-flight estimator, direct statistical estimation Monte Carlo is achieved. Using both the free-flight estimator and a biased sampling probability space, weighted statistical estimation Monte Carlo is also achieved. Five Monte Carlo schemes, including crude simulation, analog simulation, statistical estimation based on crude and analog simulation, and weighted statistical estimation, are used for calculating the unavailability of a repairable Con/3/30:F system. Their efficiencies are compared with each other. The results show that the weighted statistical estimation Monte Carlo has the smallest variance and the highest efficiency in very rare event simulation.
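
    As a baseline for the five schemes compared above, crude (analog) simulation is easy to sketch for a single repairable component with exponential failure and repair times; the rates below are illustrative assumptions, and the weighted estimators of the paper are not reproduced.

```python
import numpy as np

def crude_mc_unavailability(lam, mu, t_mission, n_hist=100_000, rng=None):
    """Crude Monte Carlo estimate of the point unavailability at t_mission
    for one repairable component (failure rate lam, repair rate mu)."""
    rng = rng or np.random.default_rng()
    down = 0
    for _ in range(n_hist):
        t, up = 0.0, True
        while True:
            t += rng.exponential(1.0 / (lam if up else mu))
            if t > t_mission:
                break
            up = not up                # failure or repair event
        down += not up                 # count histories ending in the down state
    return down / n_hist
```

    For this two-state case the analytic point unavailability, (lam/(lam+mu))(1 - exp(-(lam+mu)t)), makes the estimator easy to validate; the variance problem for rare events appears as soon as lam/mu becomes very small.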

  10. Simulating Strongly Correlated Electron Systems with Hybrid Monte Carlo

    Institute of Scientific and Technical Information of China (English)

    LIU Chuan

    2000-01-01

    Using the path integral representation, the Hubbard and the periodic Anderson model on a D-dimensional cubic lattice are transformed into field theories of fermions in D + 1 dimensions. At half-filling these theories possess a positive-definite, real, symmetric fermion matrix and can be simulated using the hybrid Monte Carlo method.
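
    The hybrid Monte Carlo method named here alternates momentum refreshment, a leapfrog trajectory, and a Metropolis test on the integration error. The sketch below is a generic single update for an arbitrary differentiable potential U acting on a NumPy array x, not the fermion-matrix simulation of the paper:

```python
import numpy as np

def hmc_step(x, U, grad_U, eps=0.05, n_leap=20, rng=None):
    """One hybrid Monte Carlo update: leapfrog integration of Hamiltonian
    dynamics, then accept/reject on the energy error dH."""
    rng = rng or np.random.default_rng()
    p0 = rng.standard_normal(x.shape)
    x_new = x.copy()
    p = p0 - 0.5 * eps * grad_U(x_new)        # initial half kick
    for step in range(n_leap):
        x_new = x_new + eps * p               # full drift
        if step < n_leap - 1:
            p = p - eps * grad_U(x_new)       # full kick
    p = p - 0.5 * eps * grad_U(x_new)         # final half kick
    dH = U(x_new) - U(x) + 0.5 * (p @ p - p0 @ p0)
    return x_new if rng.random() < np.exp(-dH) else x
```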

  11. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, a tool for editing, modeling, and simulating Colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, it is noticed that it can successfully check most of the anomalies that can occur in a knowledge base

  12. Use of metaknowledge in the verification of knowledge-based systems

    Science.gov (United States)

    Morell, Larry J.

    1989-01-01

    Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.

  13. Quasi-Monte Carlo methods for lattice systems: a first look

    CERN Document Server

    Jansen, K; Nube, A; Griewank, A; Müller-Preussker, M

    2013-01-01

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like 1/Sqrt(N), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to 1/N. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
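
    The error-scaling claim, roughly 1/Sqrt(N) for pseudo-random sampling versus nearly 1/N for low-discrepancy sequences on smooth integrands, can be tried directly with SciPy's quasi-Monte Carlo module; the test integrand below is our own choice and is unrelated to the lattice actions studied in the paper.

```python
import numpy as np
from scipy.stats import qmc

d, n = 4, 2**12
f = lambda u: np.exp(-np.sum(u * u, axis=1))    # smooth integrand on [0,1]^d

rng = np.random.default_rng(0)
mc_est = f(rng.random((n, d))).mean()           # plain Monte Carlo mean

sobol = qmc.Sobol(d=d, scramble=True, seed=0)   # scrambled Sobol points
qmc_est = f(sobol.random(n)).mean()             # quasi-Monte Carlo mean
print(f"MC: {mc_est:.6f}  QMC: {qmc_est:.6f}")
```

    Repeating this over many seeds and several values of N shows the spread of the Sobol estimate shrinking much faster than the 1/Sqrt(N) Monte Carlo rate for smooth integrands.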

  14. Quasi-Monte Carlo methods for lattice systems. A first look

    International Nuclear Information System (INIS)

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like 1/Sqrt(N), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to 1/N. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.

  15. Modeling and Verification of Distributed Generation and Voltage Regulation Equipment for Unbalanced Distribution Power Systems; Annual Subcontract Report, June 2007

    Energy Technology Data Exchange (ETDEWEB)

    Davis, M. W.; Broadwater, R.; Hambrick, J.

    2007-07-01

    This report summarizes the development of models for distributed generation and distribution circuit voltage regulation equipment for unbalanced power systems and their verification through actual field measurements.

  16. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  17. Subtle Monte Carlo Updates in Dense Molecular Systems

    DEFF Research Database (Denmark)

    Bottaro, Sandro; Boomsma, Wouter; Johansson, Kristoffer E.;

    2012-01-01

    Although Markov chain Monte Carlo (MC) simulation is a potentially powerful approach for exploring conformational space, it has been unable to compete with molecular dynamics (MD) in the analysis of high-density structural states, such as the native state of globular proteins. Here, we introduce ... as correlations in a multivariate Gaussian distribution. We demonstrate that our method reproduces structural variation in proteins with greater efficiency than current state-of-the-art Monte Carlo methods and has real-time simulation performance on par with molecular dynamics simulations. The presented results suggest our method as a valuable tool in the study of molecules in atomic detail, offering a potential alternative to molecular dynamics for probing long time-scale conformational transitions.

  18. The inverse method parametric verification of real-time embedded systems

    CERN Document Server

    André, Etienne

    2013-01-01

    This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata.The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solv

  19. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    Science.gov (United States)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuitry solutions based on field-programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing the most probable errors and, unlike mutants in high-level programming languages, the corresponding test case is effectively derived based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
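
    The essence of mutation testing is easy to show in miniature. In the toy Python sketch below (an illustration, not the authors' HDL flow), a seeded AND-to-OR fault in a reference combinational function is "killed" when some test vector distinguishes the mutant from the reference:

```python
def ref(a: int, b: int, c: int) -> int:
    return (a & b) | c

def mutant(a: int, b: int, c: int) -> int:
    return (a | b) | c   # seeded fault: AND replaced by OR

# Exhaustive test vectors for the 3-input function.
tests = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
killed = any(ref(*t) != mutant(*t) for t in tests)
print("mutant killed:", killed)  # True: e.g. (1, 0, 0) exposes the fault
```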

  20. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    Science.gov (United States)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  1. Cell-veto Monte Carlo algorithm for long-range systems

    Science.gov (United States)

    Kapfer, Sebastian C.; Krauth, Werner

    2016-09-01

    We present a rigorous efficient event-chain Monte Carlo algorithm for long-range interacting particle systems. Using a cell-veto scheme within the factorized Metropolis algorithm, we compute each single-particle move with a fixed number of operations. For slowly decaying potentials such as Coulomb interactions, screening line charges allow us to take into account periodic boundary conditions. We discuss the performance of the cell-veto Monte Carlo algorithm for general inverse-power-law potentials, and illustrate how it provides a new outlook on one of the prominent bottlenecks in large-scale atomistic Monte Carlo simulations.
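
    The factorized Metropolis filter underlying the cell-veto scheme accepts a move only when every pairwise energy term accepts it independently (a consensus rule), which is what lets a single faraway term veto a move cheaply. A minimal sketch, with the inverse temperature beta as an assumed parameter:

```python
import numpy as np

def factorized_metropolis_accept(delta_es, beta, rng=None):
    """Consensus rule: accept iff each pairwise energy change dE is
    independently accepted with probability min(1, exp(-beta * dE))."""
    rng = rng or np.random.default_rng()
    return all(rng.random() < min(1.0, np.exp(-beta * de)) for de in delta_es)
```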

  2. Compositional Design and Verification of a Multi-Agent System for One-to-Many Negotiation

    NARCIS (Netherlands)

    Brazier, F.M.T.; Cornelissen, F.; Gustavsson, R.; Jonker, C.M.; Lindeberg, O.; Polak, B.; Treur, J.

    1998-01-01

    A compositional verification method for multi-agent systems is presented and applied to a multi-agent system for one-to-many negotiation in the domain of load balancing of electricity use. Advantages of the method are that the complexity of the

  3. Compositional verification of a multi-agent system for one-to-many negotiation

    NARCIS (Netherlands)

    Brazier, F.M.T.; Cornelissen, F.J.; Gustavsson, R.; Jonker, C.M.; Lindeberg, O.; Polak, B.; Treur, J.

    2004-01-01

    Verification of multi-agent systems hardly occurs in design practice. One of the difficulties is that required properties for a multi-agent system usually refer to multi-agent behaviour which has nontrivial dynamics. To constrain these multi-agent behavioural dynamics, often a form of organisational

  4. 24 CFR 5.233 - Mandated use of HUD's Enterprise Income Verification (EIV) System.

    Science.gov (United States)

    2010-04-01

    ...) Project-based Voucher program under 24 CFR part 983; (v) Project-based Section 8 programs under 24 CFR... noncompliance. Failure to use the EIV system in its entirety may result in the imposition of sanctions and/or... Income Verification (EIV) System. 5.233 Section 5.233 Housing and Urban Development Office of...

  5. Proceedings of the 7th International Workshop on Verification of Infinite-State Systems (INFINITY'05)

    DEFF Research Database (Denmark)

    2005-01-01

    The aim of the workshop is, to provide a forum for researchers interested in the development of mathematical techniques for the analysis and verification of systems with infinitely many states. Topics: Techniques for modeling and analysis of infinite-state systems; Equivalence-checking and model-...

  6. Methods for identification and verification using vacuum XRF system

    Science.gov (United States)

    Schramm, Fred (Inventor); Kaiser, Bruce (Inventor)

    2005-01-01

    Apparatus and methods in which one or more elemental taggants that are intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and provide detection by a non-line-of-sight method to establish the origin of objects, as well as their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower atomic number elements in the field with a portable instrument.

  7. Safety verification of a fault tolerant reconfigurable autonomous goal-based robotic control system

    OpenAIRE

    Braman, Julia M. B.; Murray, Richard M.; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbo...

  8. Progress of the AVNG System - Attribute Verification System with Information Barriers for Mass Isotopics Measurements

    International Nuclear Information System (INIS)

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency @ 1332 keV 50%) and digital gamma-ray spectrometer DSPECPLUS. The neutron multiplicity counter is a three-ring counter with 164 ³He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  9. PROGRESS OF THE AVNG SYSTEM - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    International Nuclear Information System (INIS)

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency @ 1332 keV 50%) and digital gamma-ray spectrometer DSPECPLUS. The neutron multiplicity counter is a three-ring counter with 164 ³He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs

  10. Clinical implementation of enhanced dynamic wedges into the Pinnacle treatment planning system: Monte Carlo validation and patient-specific QA

    Science.gov (United States)

    Ahmad, Munir; Deng, Jun; Lund, Molly W.; Chen, Zhe; Kimmett, James; Moran, Meena S.; Nath, Ravinder

    2009-01-01

    The goal of this work is to present a systematic Monte Carlo validation study on the clinical implementation of the enhanced dynamic wedges (EDWs) into the Pinnacle3 (Philips Medical Systems, Fitchburg, WI) treatment planning system (TPS) and QA procedures for patient plan verification treated with EDWs. Modeling of EDW beams in the Pinnacle3 TPS, which employs a collapsed-cone convolution superposition (CCCS) dose model, was based on a combination of measured open-beam data and the 'Golden Segmented Treatment Table' (GSTT) provided by Varian for each photon beam energy. To validate EDW models, dose profiles of 6 and 10 MV photon beams from a Clinac 2100 C/D were measured in virtual water at depths from near-surface to 30 cm for a wide range of field sizes and wedge angles using the Profiler 2 (Sun Nuclear Corporation, Melbourne, FL) diode array system. The EDW output factors (EDWOFs) for square fields from 4 to 20 cm wide were measured in virtual water using a small-volume Farmer-type ionization chamber placed at a depth of 10 cm on the central axis. Furthermore, the 6 and 10 MV photon beams emerging from the treatment head of Clinac 2100 C/D were fully modeled and the central-axis depth doses, the off-axis dose profiles and the output factors in water for open and dynamically wedged fields were calculated using the Monte Carlo (MC) package EGS4. Our results have shown that (1) both the central-axis depth doses and the off-axis dose profiles of various EDWs computed with the CCCS dose model and MC simulations showed good agreement with the measurements to within 2%/2 mm; (2) measured EDWOFs used for monitor-unit calculation in Pinnacle3 TPS agreed well with the CCCS and MC predictions within 2%; (3) all the EDW fields satisfied our validation criteria of 1% relative dose difference and 2 mm distance-to-agreement (DTA) with 99-100% passing rate in routine patient treatment plan verification using MapCheck 2D diode array.
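
    The 1%/2 mm criterion cited above is a gamma-index test combining relative dose difference with distance-to-agreement (DTA). A minimal 1D sketch of a global gamma computation, with toy Gaussian profiles standing in for measured and calculated wedge data (criteria and profiles are illustrative assumptions):

```python
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dd=0.02, dta=2.0):
    """Global 1D gamma index: dd = dose criterion (fraction of max dose),
    dta = distance-to-agreement criterion in mm."""
    d_max = dose_ref.max()
    gam = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist_term = ((x - xi) / dta) ** 2
        dose_term = ((dose_eval - di) / (dd * d_max)) ** 2
        gam[i] = np.sqrt((dist_term + dose_term).min())
    return gam

# toy profiles: calculated profile shifted 0.5 mm against the "measurement"
x = np.linspace(-50.0, 50.0, 401)                   # positions in mm
measured   = np.exp(-(x / 20.0) ** 2)
calculated = np.exp(-((x - 0.5) / 20.0) ** 2)

g = gamma_1d(x, measured, calculated)
print(f"gamma passing rate (gamma <= 1): {100.0 * (g <= 1.0).mean():.1f}%")
```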

  11. A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2009-01-01

    elaborate safety mechanisms in order to keep the risk at the same low level that has been established for European railways until today. The challenge is further increased by the demand for shorter time-to-market periods and higher competition among suppliers of the railway domain; both factors resulting in...... a demand for a higher degree of automation for the development verification, validation and test phases of projects, without impairing the thoroughness of safety-related quality measures and certification activities. Motivated by these considerations, this presentation describes an approach for...... automated construction and verification of railway control systems....

  12. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of existing problems of reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools and the structure of its realization are described; they are based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time and over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  13. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 2. Assessment of MCNP Statistical Analysis of keff Eigenvalue Convergence with an Analytical Criticality Verification Test Set

    International Nuclear Information System (INIS)

    Monte Carlo simulations of nuclear criticality eigenvalue problems are often performed by general purpose radiation transport codes such as MCNP. MCNP performs detailed statistical analysis of the criticality calculation and provides feedback to the user with warning messages, tables, and graphs. The purpose of the analysis is to provide the user with sufficient information to assess spatial convergence of the eigenfunction and thus the validity of the criticality calculation. As a test of this statistical analysis package in MCNP, analytic criticality verification benchmark problems have been used for the first time to assess the performance of the criticality convergence tests in MCNP. The MCNP statistical analysis capability has been recently assessed using the 75 multigroup criticality verification analytic problem test set. MCNP was verified with these problems at the 10^-4 to 10^-5 statistical error level using 40 000 histories per cycle and 2000 active cycles. In all cases, the final boxed combined keff answer was given with the standard deviation and three confidence intervals that contained the analytic keff. To test the effectiveness of the statistical analysis checks in identifying poor eigenfunction convergence, ten problems from the test set were deliberately run incorrectly using 1000 histories per cycle, 200 active cycles, and 10 inactive cycles. Six problems with large dominance ratios were chosen from the test set because they do not achieve the normal spatial mode in the beginning of the calculation. To further stress the convergence tests, these problems were also started with an initial fission source point 1 cm from the boundary, thus increasing the likelihood of a poorly converged initial fission source distribution. The final combined keff confidence intervals for these deliberately ill-posed problems did not include the analytic keff value. In no case did a bad confidence interval go undetected. Warning messages were given signaling that the
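
    A simplified sketch of the combined-keff statistic at the heart of such checks: active-cycle estimates are averaged and normal-theory confidence intervals are formed. MCNP's actual analysis adds normality and trend tests, batching, and more; the cycle data below are synthetic:

```python
import numpy as np

def combined_keff(cycle_keff, n_inactive):
    """Mean, standard deviation of the mean, and normal-theory confidence
    intervals from active-cycle keff estimates (a simplification of the
    statistics MCNP actually reports)."""
    active = np.asarray(cycle_keff[n_inactive:], dtype=float)
    mean = active.mean()
    sdom = active.std(ddof=1) / np.sqrt(active.size)
    cis = {p: (mean - z * sdom, mean + z * sdom)
           for p, z in ((68, 1.000), (95, 1.960), (99, 2.576))}
    return mean, sdom, cis

# synthetic well-converged cycle data around k = 1.0
rng = np.random.default_rng(1)
cycles = rng.normal(1.000, 0.003, size=2000 + 10)
mean, sdom, cis = combined_keff(cycles, n_inactive=10)
print(f"keff = {mean:.5f} +/- {sdom:.5f}")
for p, (lo, hi) in cis.items():
    print(f"  {p}% CI: [{lo:.5f}, {hi:.5f}]")
```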

  14. A Formal Approach for the Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian

    2011-01-01

    This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed ...... are verified for the SystemC model by means of bounded model checking. (2) The object code is verified to be I/O behaviourally equivalent to the SystemC model from which it was compiled....

  15. Towards the Formal Verification of a Distributed Real-Time Automotive System

    Science.gov (United States)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, formally and pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  16. A method of knowledge base verification for nuclear power plant expert systems using extended Petri Nets

    International Nuclear Information System (INIS)

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. The verification phase of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods themselves. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is extended to chained errors, unlike previous studies that assume error incidence to be limited to rule pairs only. In addition, we consider certainty factors in the checking, because most knowledge bases have certainty factors. 8 refs., 2 figs., 4 tabs. (author)
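
    The chained-error idea can be illustrated without the Petri net machinery: view each rule as a node, draw an edge when one rule's conclusion feeds another rule's premises, and search for cycles that pairwise checks would miss. A hedged Python sketch; the rule base is invented, and COKEP's extended Petri net formalism and certainty-factor handling are not reproduced:

```python
# name: (set of premises, conclusion)
RULES = {
    "r1": ({"high_pressure"}, "open_valve"),
    "r2": ({"open_valve"}, "low_pressure"),
    "r3": ({"low_pressure"}, "high_pressure"),   # closes the chain r1->r2->r3
    "r4": ({"low_pressure"}, "stop_pump"),
}

def circular_chains(rules):
    """DFS over the rule dependency graph; a back edge marks a chained
    (circular) inference path."""
    graph = {r: [s for s in rules if rules[r][1] in rules[s][0]]
             for r in rules}
    WHITE, GRAY, BLACK = 0, 1, 2
    colour = dict.fromkeys(rules, WHITE)
    stack, found = [], []

    def dfs(node):
        colour[node] = GRAY
        stack.append(node)
        for succ in graph[node]:
            if colour[succ] == GRAY:                 # back edge -> cycle
                found.append(stack[stack.index(succ):] + [succ])
            elif colour[succ] == WHITE:
                dfs(succ)
        stack.pop()
        colour[node] = BLACK

    for r in rules:
        if colour[r] == WHITE:
            dfs(r)
    return found

print(circular_chains(RULES))        # e.g. [['r1', 'r2', 'r3', 'r1']]
```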

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER MANAGEMENT STORMFILTER® TREATMENT SYSTEM USING PERLITE MEDIA

    Science.gov (United States)

    Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media was conducted on a 0.7 acre drainage basin near downtown Griffin, Georgia. The system consists of an inlet bay, flow spreader, cartridge bay, overflow baffle, and outlet bay, housed in...

  18. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    NARCIS (Netherlands)

    Joseph, S.; Herold, M.; Sunderlin, W.D.; Verchot, L.V.

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 R

  19. Potential applications of neural networks to verification and validation of complex systems

    International Nuclear Information System (INIS)

    This paper presents a conceptual methodology for the verification and validation of complex, integrated human-machine systems and, in this context, introduces a related potential application of an artificial-intelligence-based information processing paradigm known as artificial neural networks. (author). 6 refs., 1 fig

  20. 78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems

    Science.gov (United States)

    2013-01-25

    ... foreign country of origin and comply with all other provisions of the Act and regulations that apply to U... contingency plans in the country for containing and mitigating the effects of food safety emergencies; the... verifications of the regulatory systems of countries that export meat, poultry, or processed egg products to...

  1. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    Energy Technology Data Exchange (ETDEWEB)

    ERMI, A.M.

    2000-09-05

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the "Computer Software Quality Assurance Requirements". The purpose of this document is to report on the results of the software qualification.

  2. Verification of COMDES-II Systems Using UPPAAL with Model Transformation

    DEFF Research Database (Denmark)

    Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof;

    2008-01-01

    in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...... a set of desired requirements with the preservation of system original operation semantics....

  3. A new distribution scheme of decryption keys used in optical verification system with multiple-wavelength information

    Institute of Scientific and Technical Information of China (English)

    Niu Chun-Hui; Zhang Yan; Gu Ben-Yuan

    2005-01-01

    A new distribution scheme for the decryption keys used in optical verification systems is proposed. The encryption procedure is implemented digitally with the use of an iterative algorithm in a computer. Three target images corresponding to three wavelengths are encoded into three sets of phase-only masks (POMs) by a special distribution method. These three sets of POMs are assigned to three authorized users as their personal identification. A lensless optical system is used as the verification system. In the verification procedure, any two of the three authorized users can pass the verification cooperatively, but no single user can do so alone. Numerical simulation shows that the proposed distribution scheme of decryption keys not only improves the security level of the verification system, but also brings convenience and flexibility for the authorized users.
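
    A closed-form toy version of the key-splitting idea for a single wavelength, assuming the spectrum amplitude is a fixed, public element of the verification system; the paper's actual scheme is iterative and multi-wavelength, so everything below is an illustrative simplification:

```python
import numpy as np

rng = np.random.default_rng(5)
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0                     # toy target intensity image

spectrum = np.fft.fft2(np.sqrt(target))
phi1 = rng.uniform(0, 2 * np.pi, target.shape)          # user A's phase key
phi2 = np.angle(spectrum) - phi1                        # user B's phase key

def verify(p1, p2, amplitude=np.abs(spectrum)):
    """Lensless (Fourier) reconstruction from two phase-only keys; the
    amplitude is treated as public system information in this toy version."""
    field = amplitude * np.exp(1j * (p1 + p2))
    return np.abs(np.fft.ifft2(field)) ** 2

good = verify(phi1, phi2)                               # both keys combined
bad  = verify(phi1, rng.uniform(0, 2 * np.pi, target.shape))  # one key only

print("correlation with target (both keys):",
      round(np.corrcoef(good.ravel(), target.ravel())[0, 1], 3))
print("correlation with target (one key):  ",
      round(np.corrcoef(bad.ravel(), target.ravel())[0, 1], 3))
```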

  4. Application of Photon Transport Monte Carlo Module with GPU-based Parallel System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je [Sejong University, Seoul (Korea, Republic of); Shon, Heejeong [Golden Eng. Co. LTD, Seoul (Korea, Republic of); Lee, Donghak [CoCo Link Inc., Seoul (Korea, Republic of)

    2015-05-15

    In general, it takes a lot of computing time to get reliable results from Monte Carlo simulations, especially in deep-penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. Simultaneously, advanced computing hardware such as GPU (Graphics Processing Unit)-based parallel machines is used to obtain better performance from the Monte Carlo simulation. A GPU is much easier to access and to manage than a CPU cluster system, and has also become less expensive due to advances in computer technology; many engineering areas have therefore adopted GPU-based massively parallel computation techniques. In this work, a GPU-based photon transport Monte Carlo module was developed. It provides an almost 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of the Monte Carlo module, which requires quick and accurate simulations.
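
    The embarrassingly parallel character of such photon histories is easy to sketch. Below, NumPy vectorization stands in for a CUDA kernel, and the cross-section and slab thickness are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def slab_transmission(mu=0.2, thickness=30.0, n_photons=1_000_000):
    """Analog MC estimate of uncollided transmission through a slab:
    sample exponential free paths for all photons at once (the array
    vectorization plays the role of the GPU's thread-level parallelism).
    Units: mu in 1/cm, thickness in cm."""
    first_collision_depth = rng.exponential(scale=1.0 / mu, size=n_photons)
    return (first_collision_depth > thickness).mean()

estimate = slab_transmission()
print(f"MC: {estimate:.2e}  vs  analytic exp(-mu*t): {np.exp(-0.2 * 30.0):.2e}")
```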

  5. Application of Photon Transport Monte Carlo Module with GPU-based Parallel System

    International Nuclear Information System (INIS)

    In general, it takes a lot of computing time to get reliable results from Monte Carlo simulations, especially in deep-penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. Simultaneously, advanced computing hardware such as GPU (Graphics Processing Unit)-based parallel machines is used to obtain better performance from the Monte Carlo simulation. A GPU is much easier to access and to manage than a CPU cluster system, and has also become less expensive due to advances in computer technology; many engineering areas have therefore adopted GPU-based massively parallel computation techniques. In this work, a GPU-based photon transport Monte Carlo module was developed. It provides an almost 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of the Monte Carlo module, which requires quick and accurate simulations

  6. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and modeling flexibility with respect to changes in system conditions, this paper proposes a compositional Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: all dead markings are correct, there are no dead transitions, and the net is fair. Further analysis results have been obtained using a combined formal and simulation-based verification approach. Timed models of the open transmission system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also capture relevant timed and stochastic factors, such as time delays and packet loss, which may influence the time needed to establish a safety connection. The time for establishment of a safety connection in the normal state is verified formally, and the time for establishment under different packet-loss probabilities is then simulated. The verification shows that the time for establishment of a safety connection of the safety communication protocol satisfies the safety requirements.

  7. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z; Thomas, A; Newton, J; Ibbott, G; Deasy, J; Oldham, M, E-mail: Zhiheng.wang@duke.ed

    2010-11-01

    Achieving adequate verification and quality assurance (QA) for radiosurgery treatment of trigeminal neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of the Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological Physics Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with an isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  8. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    Science.gov (United States)

    Wang, Z.; Thomas, A.; Newton, J.; Ibbott, G.; Deasy, J.; Oldham, M.

    2010-11-01

    Achieving adequate verification and quality assurance (QA) for radiosurgery treatment of trigeminal neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of the Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological Physics Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with an isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  9. Dose Verification of Stereotactic Radiosurgery Treatment for Trigeminal Neuralgia with Presage 3D Dosimetry System

    International Nuclear Information System (INIS)

    Achieving adequate verification and quality assurance (QA) for radiosurgery treatment of trigeminal neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of the Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological Physics Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with an isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.

  10. Advances in SVM-Based System Using GMM Super Vectors for Text-Independent Speaker Verification

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jian; DONG Yuan; ZHAO Xianyu; YANG Hao; LU Liang; WANG Haila

    2008-01-01

    For text-independent speaker verification, the Gaussian mixture model (GMM) using a universal background model strategy and the GMM using support vector machines are the two most commonly used methodologies. Recently, a new SVM-based speaker verification method using GMM super vectors has been proposed. This paper describes the construction of a new speaker verification system and investigates the use of nuisance attribute projection and test normalization to further enhance performance. Experiments were conducted on the core test of the 2006 NIST speaker recognition evaluation corpus. The experimental results indicate that an SVM-based speaker verification system using GMM super vectors can achieve appealing performance. With the use of nuisance attribute projection and test normalization, the system performance can be significantly improved, with improvements in the equal error rate from 7.78% to 4.92% and detection cost function from 0.0376 to 0.0251.
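
    A minimal sketch of the GMM supervector idea: adapt a universal background model (UBM) to each speaker's features and stack the component means into one vector classified by a linear SVM. Real systems use MAP adaptation rather than the refitting shortcut below, plus nuisance attribute projection and score normalization; the toy data and scikit-learn pipeline are assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def supervector(features, ubm):
    """MAP-free shortcut: refit the UBM means on one speaker's features
    and stack them into a single GMM supervector."""
    gmm = GaussianMixture(n_components=ubm.n_components,
                          covariance_type="diag", means_init=ubm.means_,
                          max_iter=5, random_state=0)
    gmm.fit(features)
    return gmm.means_.ravel()

# UBM trained on pooled "background" features (toy 2-D data)
ubm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
ubm.fit(rng.normal(size=(2000, 2)))

# toy "speakers": feature clouds with speaker-specific offsets
X, y = [], []
for label, offset in [(0, -0.5), (1, +0.5)]:
    for _ in range(20):
        feats = rng.normal(loc=offset, size=(200, 2))
        X.append(supervector(feats, ubm))
        y.append(label)

clf = LinearSVC().fit(np.array(X), y)
print("training accuracy:", clf.score(np.array(X), y))
```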

  11. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)

    2012-05-15

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75 µm wide microbeams spaced by 200-400 µm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, the CT image voxel grid (a few cubic millimeters per voxel) was decoupled from the dose bin grid, which has micrometer dimensions in the direction transverse to the microbeams. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at

  12. Design and verification of Guidance, Navigation and Control systems for space applications

    OpenAIRE

    Stesina, Fabrizio

    2014-01-01

    In the last decades, systems have strongly increased their complexity in terms of number of functions that can be performed and quantity of relationships between functions and hardware as well as interactions of elements and disciplines concurring to the definition of the system. The growing complexity remarks the importance of defining methods and tools that improve the design, verification and validation of the system process: effectiveness and costs reduction without loss of confidence in ...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT; UV DISINFECTION FOR REUSE APPLICATION, AQUIONICS, INC. BERSONINLINE 4250 UV SYSTEM

    Science.gov (United States)

    Verification testing of the Aquionics, Inc. bersonInLine® 4250 UV System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills Wastewater Treatment Plant test site in Parsippany, New Jersey. Two full-scale reactors were mounted in series. T...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: UV DISINFECTION FOR REUSE APPLICATIONS, ONDEO DEGREMONT, INC., AQUARAY® 40 HO VLS DISINFECTION SYSTEM

    Science.gov (United States)

    Verification testing of the Ondeo Degremont, Inc. Aquaray® 40 HO VLS Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Three reactor modules were m...

  15. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    The purpose of this Corrective Action Plan is to demonstrate the ORP's planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II Verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I Verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, committed to be complete by the end of FY 2000

  16. Simulation-based design process for the verification of ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, Romain, E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, FI-33101 Tampere (Finland); Määttä, Timo; Siuko, Mikko [VTT Technical Research Centre of Finland, P.O. Box 1300, FI-33101 Tampere (Finland); Mattila, Jouni [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland)

    2014-10-15

    Highlights: •Verification and validation process for ITER remote handling systems. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around a simulation lifecycle management system. •Verification and validation roadmap for the digital modelling phase. -- Abstract: The work behind this paper takes place in the EFDA European Goal Oriented Training programme on Remote Handling (RH), "GOT-RH". The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V&V) of ITER RH system requirements using digital mock-ups (DMUs). The purpose of this project is to study and develop an efficient approach to using DMUs in the V&V process of ITER RH system design within a Systems Engineering (SE) framework. For complex engineering systems such as the ITER facilities, manufacturing a full-scale prototype entails a substantial rise in cost. In the V&V process for ITER RH equipment, physical tests are required to ensure the compliance of the system with the required operation. It is therefore essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of current trends in using digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and V&V processes, in order to increase their cost efficiency and reliability.

  17. Monte Carlo-derived TLD cross-calibration factors for treatment verification and measurement of skin dose in accelerated partial breast irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Garnica-Garza, H M [Centro de Investigacion y de Estudios Avanzados del Instituto Politecnico Nacional Unidad Monterrey, VIa del Conocimiento 201 Parque de Investigacion e Innovacion Tecnologica, Apodaca NL C.P. 66600 (Mexico)], E-mail: hgarnica@cinvestav.mx

    2009-03-21

    Monte Carlo simulation was employed to calculate the response of TLD-100 chips under irradiation conditions such as those found during accelerated partial breast irradiation with the MammoSite radiation therapy system. The absorbed dose versus radius in the last 0.5 cm of the treated volume was also calculated, employing a resolution of 20 µm, and a function that fits the observed data was determined. Several clinically relevant irradiation conditions were simulated for different combinations of balloon size, balloon-to-surface distance and contents of the contrast solution used to fill the balloon. The thermoluminescent dosemeter (TLD) cross-calibration factors were derived assuming that the calibration of the dosemeters was carried out using a Cobalt-60 beam, and in such a way that they provide a set of parameters that reproduce the function that describes the behavior of the absorbed dose versus radius curve. Such factors may also prove to be useful for those standardized laboratories that provide postal dosimetry services.
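
    The dose-versus-radius fit can be sketched with a standard least-squares routine; the modified inverse-square functional form and all numbers below are illustrative assumptions, not the function reported in the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def dose_model(r, A, mu):
    """Assumed form: inverse-square falloff with exponential attenuation."""
    return A * np.exp(-mu * r) / r**2

# radii spanning the last 0.5 cm of a hypothetical treated volume (cm)
r = np.linspace(2.0, 2.5, 26)
rng = np.random.default_rng(3)
d_true = dose_model(r, A=340.0, mu=0.11)
d_meas = d_true * rng.normal(1.0, 0.01, r.size)   # 1% "measurement" noise

popt, pcov = curve_fit(dose_model, r, d_meas, p0=(300.0, 0.1))
print("fitted A, mu:", popt)
```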

  18. Odd-particle systems in the shell model Monte Carlo: circumventing a sign problem

    CERN Document Server

    Mukherjee, Abhishek

    2012-01-01

    We introduce a novel method within the shell model Monte Carlo approach to calculate the ground-state energy of a finite-size system with an odd number of particles by using the asymptotic behavior of the imaginary-time single-particle Green's functions. The method circumvents the sign problem that originates from the projection on an odd number of particles and has hampered direct application of the shell model Monte Carlo method to odd-particle systems. We apply this method to calculate pairing gaps of nuclei in the iron region. Our results are in good agreement with experimental pairing gaps.
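
    The extraction step can be sketched as follows: for large imaginary time, G(tau) decays like exp(-ΔE·tau), where ΔE is the energy difference between the (A±1)- and A-particle ground states, so the gap follows from the log-slope. Synthetic data stand in for shell model Monte Carlo output; all numbers are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
dE_true = 1.3                                    # MeV, assumed energy gap
tau = np.linspace(0.0, 2.0, 41)                  # imaginary time, MeV^-1
G = np.exp(-dE_true * tau) * rng.normal(1.0, 0.01, tau.size)  # noisy G(tau)

# linear fit of log G(tau) over the asymptotic (large-tau) window
window = tau > 0.8
slope, _ = np.polyfit(tau[window], np.log(G[window]), 1)
print(f"estimated gap: {-slope:.3f} MeV (true value {dE_true})")
```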

  19. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre;

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one…
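
    In an untimed simplification, the observer-composition idea reduces to exploring the synchronous product of system and observer and checking reachability of the observer's error state. A hedged Python sketch; the LSC-to-TA translation, clocks, and invariants handled by UPPAAL are not modeled, and the transition systems below are invented:

```python
from collections import deque

SYSTEM = {                      # state -> {action: next_state}
    "idle": {"req": "busy"},
    "busy": {"grant": "done", "req": "busy"},
    "done": {"reset": "idle"},
}
OBSERVER = {                    # monitors "every req is eventually granted"
    "wait":    {"req": "pending"},
    "pending": {"grant": "wait", "reset": "error"},  # reset before grant
}

def error_reachable(system, observer, start=("idle", "wait")):
    """BFS over the synchronous product; True if the observer can reach
    its 'error' state (i.e. the scenario requirement can be violated)."""
    seen, queue = {start}, deque([start])
    while queue:
        s, o = queue.popleft()
        for action, s2 in system[s].items():
            o2 = observer.get(o, {}).get(action, o)  # observer self-loops
            if o2 == "error":
                return True
            if (s2, o2) not in seen:
                seen.add((s2, o2))
                queue.append((s2, o2))
    return False

print("scenario violated:", error_reachable(SYSTEM, OBSERVER))
```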

  20. Quasi-Monte Carlo methods for lattice systems. A first look

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Leovey, H.; Griewank, A. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Mathematik; Nube, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Mueller-Preussker, M. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2013-02-15

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^(-1/2), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to N^(-1). We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
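
    The contrast in error scaling is easy to reproduce on a smooth toy integrand, with a scrambled Sobol rule standing in for the quasi-Monte Carlo point sets used on the lattice; the integrand below is an assumption, not the paper's path integral:

```python
import numpy as np
from scipy.stats import qmc
from scipy.special import erf

f = lambda x: np.exp(-np.sum(x**2, axis=1))       # smooth 4-D test integrand
exact = (np.sqrt(np.pi) / 2 * erf(1.0)) ** 4      # its integral on [0,1]^4

rng = np.random.default_rng(0)
for m in (10, 14, 18):                            # N = 2**m sample points
    n = 2 ** m
    mc_err = abs(f(rng.random((n, 4))).mean() - exact)
    sobol = qmc.Sobol(d=4, scramble=True, seed=0).random(n)
    qmc_err = abs(f(sobol).mean() - exact)
    print(f"N=2^{m}: MC error {mc_err:.2e}, QMC error {qmc_err:.2e}")
```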

  1. Measurement and verification of load shifting interventions for a fridge plant system in South Africa

    OpenAIRE

    Gouws, Rupert

    2013-01-01

    In this paper, the author presents the measurement and verification methodology used to quantify the impacts of load shifting measures that are implemented on large industrial fridge plant systems in South Africa. A summary on the operation of fridge plant systems and the data typically available for baseline development is provided. The author discusses issues surrounding baseline development and service level adjustments for the following two scenarios: 1) the electrical data is available f...

  2. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  3. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  4. Advanced control and instrumentation systems in nuclear power plants. Design, verification and validation

    International Nuclear Information System (INIS)

    The Technical Committee Meeting on design, verification and validation of advanced control and instrumentation systems in nuclear power plants was held in Espoo, Finland, on 20-23 June 1994. The meeting was organized by the International Atomic Energy Agency's (IAEA) International Working Groups (IWG) on Nuclear Power Plant Control and Instrumentation (NPPCI) and on Advanced Technologies for Water Cooled Reactors (ATWR). VTT Automation, together with Imatran Voima Oy and Teollisuuden Voima Oy, was responsible for the practical arrangements of the meeting. In total, 96 participants from 21 countries and the Agency took part in the meeting, and 34 full papers and 8 posters were presented. The following topics were covered in the papers: (1) experience with advanced and digital systems, (2) safety and reliability analysis, (3) advanced digital systems under development and implementation, (4) verification and validation methods and practices, (5) future development trends. (orig.)

  5. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  6. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of "Do work safely." The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented

  7. Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems

    NARCIS (Netherlands)

    Esmaeil Zadeh Soudjani, S.

    2014-01-01

    Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality, m

  8. Renewable Energy Certificate (REC) Tracking Systems: Costs & Verification Issues (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, J.

    2013-10-01

    This document provides information on REC tracking systems: how they are used in the voluntary REC market, a comparison of REC system fees, and information regarding how they treat environmental attributes.

  9. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system to...

  10. Risk Analysis of Tilapia Recirculating Aquaculture Systems: A Monte Carlo Simulation Approach

    OpenAIRE

    Kodra, Bledar

    2007-01-01

    The purpose of this study is to modify an existing static analytical model developed for re-circulating aquaculture systems through the incorporation of risk considerations, in order to evaluate the economic viability of the system. In addition, the objective of this analysis is to provide a well-documented, risk-based analytical system so that individuals (investors/lenders) c...
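
    The approach can be sketched as propagating input distributions through a net-present-value model and reading risk measures off the output distribution; every price, cost, and distribution below is a hypothetical placeholder, not a figure from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
N, years, rate = 100_000, 10, 0.08        # trials, horizon, discount rate
capital = 450_000.0                        # initial investment (USD, assumed)

# uncertain inputs, sampled per trial and per year (all hypothetical)
price  = rng.triangular(1.8, 2.2, 2.8, size=(N, years))    # USD per kg
yield_ = rng.normal(90_000, 8_000, size=(N, years))        # kg per year
opcost = rng.normal(120_000, 15_000, size=(N, years))      # USD per year

cashflow = price * yield_ - opcost
discount = (1 + rate) ** -np.arange(1, years + 1)
npv = cashflow @ discount - capital        # one NPV per Monte Carlo trial

print(f"mean NPV: ${npv.mean():,.0f}")
print(f"P(NPV < 0): {(npv < 0).mean():.1%}")
```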

  11. Monte Carlo study of a high-sensitivity gamma-ray detection system

    International Nuclear Information System (INIS)

    The authors use Monte Carlo calculations to study a new design for a high-sensitivity gamma-ray detection system. The system uses an array of high-purity germanium detectors operating with an event-mode data acquisition system. The calculations show that the proposed design could produce a factor of 10 increase in the sensitivity of these measurements compared to currently employed systems

  12. Systematic study of finite-size effects in quantum Monte Carlo calculations of real metallic systems

    Energy Technology Data Exchange (ETDEWEB)

    Azadi, Sam, E-mail: s.azadi@imperial.ac.uk; Foulkes, W. M. C. [Department of Physics, Imperial College London, Exhibition Road, London SW7 2AZ (United Kingdom)

    2015-09-14

    We present a systematic and comprehensive study of finite-size effects in diffusion quantum Monte Carlo calculations of metals. Several previously introduced schemes for correcting finite-size errors are compared for accuracy and efficiency, and practical improvements are introduced. In particular, we test a simple but efficient method of finite-size correction based on an accurate combination of twist averaging and density functional theory. Our diffusion quantum Monte Carlo results for lithium and aluminum, as examples of metallic systems, demonstrate excellent agreement between all of the approaches considered.

  13. Unmanned Aerial Systems in the Process of Juridical Verification of Cadastral Border

    Science.gov (United States)

    Rijsdijk, M.; van Hinsbergh, W. H. M.; Witteveen, W.; ten Buuren, G. H. M.; Schakelaar, G. A.; Poppinga, G.; van Persie, M.; Ladiges, R.

    2013-08-01

    Quite often in the verification of cadastral borders, the owners of the parcels involved are not able to attend at the appointed moment in time. New appointments have to be made in order to complete the verification process, and as a result costs and throughput times often grow beyond what is considered acceptable. To improve the efficiency of the verification process, an experiment was set up that refrains from the conventional terrestrial methods for border verification. The central research question was formulated as "How useful are Unmanned Aerial Systems in the juridical verification process of cadastral borders of ownership at het Kadaster in the Netherlands?". For the experiment, operational evaluations were executed at two different locations. The first operational evaluation took place at the Pyramid of Austerlitz, a flat area with a 30 m high pyramid built by troops of Napoleon, with low civilian attendance. Two subsequent evaluations were situated in a small neighbourhood in the city of Nunspeet, where the cadastral situation had recently changed as a result of twenty newly built houses. Initially, a mini-UAS of the KLPD was used to collect photo datasets with less than 1 cm spatial resolution. At a later stage, the commercial service provider Orbit Gis was hired. During the experiment, four different software packages were used for processing the photo datasets into accurate geo-referenced ortho-mosaics. This article describes the experiments in more detail. Attention is paid to the mini-UAS platforms (AscTec Falcon 8, Microdrone MD-4), the cameras used, the photo collection plan, the usage of ground control markers and the calibration of the cameras. Furthermore, the results and experiences with the different SFM software packages used (Visual SFM/Bundler, PhotoScan, PhotoModeler and the Orbit software) are shared.

  14. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    Energy Technology Data Exchange (ETDEWEB)

    Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); University of California, San Diego, La Jolla, CA (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States); Tian, Z; Gu, X; Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform an MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resultant web app is powerful, easy to use, and able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectory log file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a “delivered dose” calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  15. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    International Nuclear Information System (INIS)

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform an MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resultant web app is powerful, easy to use, and able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectory log file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA

  16. Construction and Verification of a Simple Smooth Chaotic System

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This article introduces a new chaotic system of three-dimensional quadratic autonomous ordinary differential equations, which can display different attractors with two unstable equilibrium points and four unstable equilibrium points, respectively. Dynamical properties of this system are then studied. Furthermore, by applying the undetermined coefficient method, a heteroclinic orbit of Šil'nikov type is found in this system, and the convergence of the series expansion of this heteroclinic orbit is proved in this article. Šil'nikov's theorem guarantees that this system has Smale horseshoes and horseshoe chaos.
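
    The abstract does not reproduce the system's equations, so as an illustration of how such three-dimensional quadratic autonomous systems are typically explored numerically, the sketch below integrates the classic Lorenz system (a stand-in of our choosing, not the paper's system) with scipy and inspects the attractor after transients have decayed.

```python
# Integrate a 3D quadratic autonomous ODE system (Lorenz, as a stand-in).
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

sol = solve_ivp(lorenz, (0.0, 50.0), [1.0, 1.0, 1.0], dense_output=True, rtol=1e-9)
# Sample the trajectory after transients to inspect the attractor
t = np.linspace(25.0, 50.0, 5000)
x, y, z = sol.sol(t)
print(f"attractor bounding box: x in [{x.min():.1f}, {x.max():.1f}], "
      f"z in [{z.min():.1f}, {z.max():.1f}]")
```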

  17. Technical and programmatic constraints in dynamic verification of satellite mechanical systems

    Science.gov (United States)

    Stavrinidis, C.; Klein, M.; Brunner, O.; Newerla, A.

    1996-01-01

    The development and verification of satellite systems cover various programmatic options. In the mechanical systems area, spacecraft test verification options include static, shaker vibration, modal survey, thermoelastic, acoustic, impact and other environmental tests. Development and verification tests influence the provision of satellite hardware, e.g. the structural model, engineering model, flight model, postflight, etc., which needs to be adopted by projects. In particular, adequate understanding of the satellite dynamic characteristics is essential for flight acceptance by launcher authorities. In general, a satellite shaker vibration test is requested by launcher authorities for expendable launchers. For the latter, the launcher/satellite interface is well defined at the launcher clampband/separation device, and the interface is considered conveniently as a single point at the centre of the clampband. Recently the need has been identified to refine the interface idealization in launcher/satellite coupled loads dynamic analysis, particularly in cases where concentrated satellite loads are introduced at the interface, e.g. platform support struts. In the case of shuttle payloads, which are attached directly to the shuttle, shaker vibration at a single interface is not meaningful. Shuttle launcher authorities require identification of the satellite dynamic characteristics, e.g. by modal survey, and structural verification can be demonstrated by analysis, testing or a combination of analysis and testing. In the case of large satellite systems, which cannot be tested due to the limitations of the vibration shaker test facilities, a similar approach can be adopted for expendable launchers. In such an approach the dynamic characteristics of the satellite system will be identified by the modal survey test, and detailed satellite verification/qualification will be accomplished by analysis supported by subsystem- and component-level tests. Mechanical strength verification

  18. Active Learning of Markov Decision Processes for System Verification

    DEFF Research Database (Denmark)

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning...

  19. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2016-01-01

    safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety...

  20. Sampling and verification methods for the uncertainty analysis of NDA and NDE waste characterization systems

    International Nuclear Information System (INIS)

    Use of nondestructive assay (NDA) and evaluation (NDE) systems in critical waste characterization requires a realistic assessment of the uncertainty in the measurements. The stated uncertainty must include potential effects of a variety of complicating external factors on the expected bias and precision. These factors include material heterogeneity (matrix effects), fluctuating background levels, and other variable operating conditions. Uncertainty figures from application of error propagation methods to data from controlled laboratory experiments using standard test materials can grossly underestimate the expected error. This paper reviews the standard error propagation method of uncertainty analysis, discusses some of its limitations, and presents an alternative approach based on sampling and verification. Examples of application of sampling and verification methods to measurement systems at INEL are described
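
    The contrast drawn above between error propagation and sampling can be made concrete with a small numerical experiment; the toy assay model and the numbers below are illustrative assumptions of ours, not INEL data. With a large relative uncertainty on a divisor, the first-order variance formula misses the skew that a sampling approach reveals.

```python
# First-order error propagation vs. a sampling (Monte Carlo) estimate
# for a nonlinear measurement model (sketch with invented numbers).
import numpy as np

rng = np.random.default_rng(42)

def assay(counts, efficiency):
    return counts / efficiency           # activity ~ counts / efficiency

counts, sd_counts = 1.0e4, 1.0e2         # measured counts and std. dev.
eff, sd_eff = 0.20, 0.04                 # efficiency with a large (20%) uncertainty

# Linear propagation: var(f) ~ (df/dc)^2 var(c) + (df/de)^2 var(e)
lin_sd = np.sqrt((1 / eff) ** 2 * sd_counts**2 + (counts / eff**2) ** 2 * sd_eff**2)

# Sampling approach: draw plausible inputs, look at the output spread
c = rng.normal(counts, sd_counts, 100_000)
e = rng.normal(eff, sd_eff, 100_000)
mc = assay(c, e)
skew = float(((mc - mc.mean()) ** 3).mean() / mc.std() ** 3)
print(f"linear propagation sd: {lin_sd:.0f}")
print(f"Monte Carlo sd:        {mc.std():.0f}  (skew from nonlinearity: {skew:.2f})")
```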

  1. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and evaluation of uncertainty were performed for the application of the code system in nuclear reactor core analysis and design. The verification was performed via various benchmark comparisons for static and transient core conditions, and core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.

  2. PROGRESS OF THE AVNG SYSTEM - ATTRIBUTE VERIFICATION SYSTEM WITH INFORMATION BARRIERS FOR MASS AND ISOTOPICS MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Budnikov, D; Bulatov, M; Jarikhine, I; Lebedev, B; Livke, A; Modenov, A; Morkin, A; Razinkov, S; Safronov, S; Tsaregorodtsev, D; Vlokh, A; Yakovleva, S; Elmont, T; Langner, D; MacArthur, D; Mayo, D; Smith, M; Luke, S J

    2005-05-27

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency at 1332 keV: 50%) and digital gamma-ray spectrometer DSPECPlus. The neutron multiplicity counter is a three ring counter with 164 ³He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  3. Progress of the AVNG System - Attribute Verification System with Information Barriers for Mass and Isotopics Measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Budnikov, D. (Dmitry); Bulatov, M. (Mikhail); Jarikhine, I. (Igor); Lebedev, B. (Boris); Livke, A. (Alexander); Modenov, A.; Morkin, A. (Anton); Razinkov, S. (Sergei); Tsaregorodtsev, D. (Dmitry); Vlokh, A. (Andrey); Yakovleva, S. (Svetlana); Elmont, T. H. (Timothy H.); Langner, D. C. (Diana C.); MacArthur, D. W. (Duncan W.); Mayo, D. R. (Douglas R.); Smith, M. K. (Morag K.); Luke, S. J. (S. John)

    2005-01-01

    An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and gamma-spectrometry system based on a high purity germanium gamma detector (nominal relative efficiency at 1332 keV: 50%) and digital gamma-ray spectrometer DSPECPlus. The neutron multiplicity counter is a three ring counter with 164 ³He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.

  4. VERIFICATION OF TORSIONAL OSCILLATING MECHANICAL SYSTEM DYNAMIC CALCULATION RESULTS

    Directory of Open Access Journals (Sweden)

    Peter KAŠŠAY

    2014-09-01

    At our department we deal with the optimization and tuning of torsional oscillating mechanical systems. When solving these problems we often use the results of dynamic calculation. The goal of this article is to compare values obtained computationally and experimentally. For this purpose, a mechanical system built in our laboratory was used. At first, a classical HARDY-type flexible coupling was installed in the system; then we used a pneumatic flexible shaft coupling developed by us. The main difference of these couplings from conventional flexible couplings is that they can change their dynamic properties during operation, by changing the pressure of the gaseous medium in their flexible elements.

  5. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    OpenAIRE

    Joseph, S.; Herold, M; Sunderlin, W.D.; L. V. Verchot

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three ca...

  6. Quantum Mechanics and locality in the K0 K-bar0 system: experimental verification possibilities

    International Nuclear Information System (INIS)

    It is shown that elementary Quantum Mechanics, applied to the K0 K-bar0 system, predicts peculiar long-range EPR correlations. Possible experimental verifications are discussed, and a concrete experiment with antiproton annihilations at rest is proposed. A pedestrian approach to local models shows that K0 K-bar0 experimentation could provide arguments for the local realism versus quantum theory controversy. (author). 17 refs., 23 figs

  7. Revisiting the security of speaker verification systems against imposture using synthetic speech

    OpenAIRE

    De Leon, P.L.; Apsingekar, V. R.; Pucher, M.; Yamagishi, J

    2010-01-01

    In this paper, we investigate imposture using synthetic speech. Although this problem was first examined over a decade ago, dramatic improvements in both speaker verification (SV) and speech synthesis have renewed interest in this problem. We use an HMM-based speech synthesizer which creates synthetic speech for a targeted speaker through adaptation of a background model. We use two SV systems: a standard GMM-UBM-based system and a newer SVM-based one. Our results show when the syst...

  8. A Verification and Validation Tool for Diagnostic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced diagnostic systems have the potential to improve safety, increase availability, and reduce maintenance costs in aerospace vehicles and a variety of other...

  9. ASSIMILATION SYSTEM AT DHMZ: DEVELOPMENT AND FIRST VERIFICATION RESULTS

    OpenAIRE

    Stanešić, Antonio

    2011-01-01

    Abstract: In this paper, a description of the setup of a local assimilation system for a limited-area model, ALADIN (Aire Limitée Adaptation Dynamique développement InterNational), is given, with a comprehensive description of the assimilation techniques used. The assimilation system at DHMZ (Meteorological and Hydrological Service of Croatia) consisted of two parts: the surface assimilation, which was used to change the state of the model's land-surface variables, and the upper air assimilatio...

  10. Verification of uranium 238 quantity calculated using waste assay systems

    International Nuclear Information System (INIS)

    The amount of 238U in uranium-contaminated waste drums generated in the decommissioning of nuclear facilities is evaluated from γ-ray measurements. We used the γ-ray measurement system made by CANBERRA (Qualitative and Quantitative (Q2) Low Level Waste Assay Systems) and measured the waste drums. This system assumes a uniform distribution of uranium, but homogeneity cannot be checked in real waste drums. The authors developed a new analysis technique which calculates the amount of uranium by correcting for the influence of an uneven distribution of the uranium. Using the new analysis technique, the error which influences the quantitative value of 238U has been evaluated. (author)

  11. Refinement and Verification of Real-Time Systems

    CERN Document Server

    Kolano, Paul Z; Kemmerer, Richard A; Mandrioli, Dino

    2010-01-01

    This paper discusses highly general mechanisms for specifying the refinement of a real-time system as a collection of lower level parallel components that preserve the timing and functional requirements of the upper level specification. These mechanisms are discussed in the context of ASTRAL, which is a formal specification language for real-time systems. Refinement is accomplished by mapping all of the elements of an upper level specification into lower level elements that may be split among several parallel components. In addition, actions that can occur in the upper level are mapped to actions of components operating at the lower level. This allows several types of implementation strategies to be specified in a natural way, while the price for generality (in terms of complexity) is paid only when necessary. The refinement mechanisms are first illustrated using a simple digital circuit; then, through a highly complex phone system; finally, design guidelines gleaned from these specifications are presented.

  12. Verification of Continuous Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2011-01-01

    This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The abstraction is intended ..., which is generated utilizing sub-level sets of Lyapunov functions, as they are positive invariant sets. It is shown that this partition generates sound and complete abstractions. Furthermore, the complete abstractions can be composed of multiple timed automata, allowing parallelization ...

  13. The CMS Monte Carlo Production System: Development and Design

    Energy Technology Data Exchange (ETDEWEB)

    Evans, D. [Fermi National Accelerator Laboratory, Batavia, IL (United States)], E-mail: evansde@fnal.gov; Fanfani, A. [Universita degli Studi di Bologna and INFN Sezione di Bologna, Bologna (Italy); Kavka, C. [INFN Sezione di Trieste, Trieste (Italy); Lingen, F. van [California Institute of Technology, Pasadena, CA (United States); Eulisse, G. [Northeastern University, Boston, MA (United States); Bacchi, W.; Codispoti, G. [Universita degli Studi di Bologna and INFN Sezione di Bologna, Bologna (Italy); Mason, D. [Fermi National Accelerator Laboratory, Batavia, IL (United States); De Filippis, N. [INFN Sezione di Bari, Bari (Italy); Hernandez, J.M. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas, Madrid (Spain); Elmer, P. [Princeton University, Princeton, NJ (United States)

    2008-03-15

    The CMS production system has undergone a major architectural upgrade from its predecessor, with the goal of reducing the operational manpower needed and preparing for the large scale production required by the CMS physics plan. The new production system is a tiered architecture that facilitates robust and distributed production request processing and takes advantage of the multiple Grid and farm resources available to the CMS experiment.

  14. The CMS Monte Carlo Production System Development and Design

    CERN Document Server

    Evans, D; Kavka, C; Van Lingen, F; Eulisse, G; Bacchi, W; Codispoti, G; Mason, D; De Filippis, N; Hernandez, J M; Elmer, P

    2008-01-01

    The CMS production system has undergone a major architectural upgrade from its predecessor, with the goal of reducing the operational manpower needed and preparing for the large scale production required by the CMS physics plan. The new production system is a tiered architecture that facilitates robust and distributed production request processing and takes advantage of the multiple Grid and farm resources available to the CMS experiment.

  15. Verification of Gamma Knife Extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates >90% for a tolerance criterion of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.
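
    The calibration step described here maps film response to dose. A minimal sketch of such a fit is shown below, assuming the commonly used dose = a·netOD + b·netOD^n parameterization; the sample net optical density values are invented for illustration, not the paper's measurements.

```python
# Fit a film calibration curve and apply it to a scanned netOD map (sketch).
import numpy as np
from scipy.optimize import curve_fit

def cal_curve(net_od, a, b, n):
    return a * net_od + b * net_od**n

dose_gy = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])       # delivered doses
net_od = np.array([0.0, 0.08, 0.15, 0.27, 0.37, 0.46, 0.53, 0.60])   # hypothetical netOD

popt, _ = curve_fit(cal_curve, net_od, dose_gy, p0=[10.0, 30.0, 2.5])
a, b, n = popt
print(f"fit: dose = {a:.2f}*netOD + {b:.2f}*netOD^{n:.2f}")

# Convert a scanned film's netOD map to dose with the fitted curve
film_od = np.full((5, 5), 0.30)
film_dose = cal_curve(film_od, *popt)
print(f"central dose estimate: {film_dose[2, 2]:.2f} Gy")
```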

  16. Structural Dynamics Verification of Rotorcraft Comprehensive Analysis System (RCAS)

    Energy Technology Data Exchange (ETDEWEB)

    Bir, G. S.

    2005-02-01

    The Rotorcraft Comprehensive Analysis System (RCAS) was acquired and evaluated as part of an ongoing effort by the U.S Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to provide state-of-the-art wind turbine modeling and analysis technology for Government and industry. RCAS is an interdisciplinary tool offering aeroelastic modeling and analysis options not supported by current codes. RCAS was developed during a 4-year joint effort among the U.S. Army's Aeroflightdynamics Directorate, Advanced Rotorcraft Technology Inc., and the helicopter industry. The code draws heavily from its predecessor 2GCHAS (Second Generation Comprehensive Helicopter Analysis System), which required an additional 14 years to develop. Though developed for the rotorcraft industry, its general-purpose features allow it to model or analyze a general dynamic system. Its key feature is a specialized finite element that can model spinning flexible parts. The code, therefore, appears particularly suited for wind turbines whose dynamics is dominated by massive flexible spinning rotors. In addition to the simulation capability of the existing codes, RCAS [1-3] offers a range of unique capabilities, including aeroelastic stability analysis, trim, state-space modeling, operating modes, modal reduction, multi-blade coordinate transformation, periodic-system-specific analysis, choice of aerodynamic models, and a controls design/implementation graphical interface.

  17. Design, development and verification of the HIFI Alignment Camera System

    NARCIS (Netherlands)

    Boslooper, E.C.; Zwan, B.A. van der; Kruizinga, B.; Lansbergen, R.

    2005-01-01

    This paper presents the TNO share of the development of the HIFI Alignment Camera System (HACS), covering the opto-mechanical and thermal design. The HACS is an Optical Ground Support Equipment (OGSE) item that was specifically developed to verify proper alignment of the different modules of the HIFI instrument.

  18. First verification of generic fidelity recovery in a dynamical system

    CERN Document Server

    Pineda, C; Schäfer, R; Seligman, T H; Pineda, Carlos; Prosen, Tomaz; Schaefer, Rudi; Seligman, Thomas H.

    2006-01-01

    We study the time evolution of fidelity in a dynamical many body system, namely a kicked Ising model, modified to allow for a time reversal invariance breaking. We find good agreement with the random matrix predictions in the realm of strong perturbations. In particular for the time-reversal symmetry breaking case the predicted revival at Heisenberg time is clearly seen.

  19. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    In this article we develop a model for applications running on multiprocessor platforms. An application is modelled by task graphs and a multiprocessor system is modelled by a number of processing elements, each capable of executing tasks according to a given scheduling discipline. We present a d...... could verify a smart-phone application consisting of 103 tasks executing on 4 processing elements....

  20. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that suggest traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct.
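
    As a toy illustration of the static checks such tools automate (not the Petri-net formalism of COKEP itself), the sketch below flags redundant, contradictory, and dead rules in a small rule base; the rules and fact names are invented for the example.

```python
# Toy static checks on a rule base: redundancy, contradiction, dead rules.
rules = [
    ({"pressure_high", "temp_high"}, "open_relief_valve"),
    ({"temp_high", "pressure_high"}, "open_relief_valve"),   # redundant duplicate
    ({"pressure_high"}, "alarm_on"),
    ({"pressure_high"}, "not alarm_on"),                     # direct contradiction
    ({"flow_low"}, "check_pump"),                            # premise never derivable
]
observable = {"pressure_high", "temp_high"}

# Redundancy: identical premise set and conclusion
seen = {}
for i, (prem, concl) in enumerate(rules):
    key = (frozenset(prem), concl)
    if key in seen:
        print(f"redundant: rule {i} duplicates rule {seen[key]}")
    else:
        seen[key] = i

# Contradiction: same premises, negated conclusions
for i, (prem, concl) in enumerate(rules):
    for j in range(i + 1, len(rules)):
        p2, c2 = rules[j]
        if p2 == prem and c2 == f"not {concl}":
            print(f"contradiction: rules {i} and {j} disagree on '{concl}'")

# Incompleteness: fixed-point reachability from observable facts
derived = set(observable)
changed = True
while changed:
    changed = False
    for prem, concl in rules:
        if prem <= derived and concl not in derived:
            derived.add(concl)
            changed = True
for i, (prem, concl) in enumerate(rules):
    if not prem <= derived:
        print(f"dead rule {i}: premises {sorted(prem - derived)} never derivable")
```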

  1. Nondestructive verification and assay systems for spent fuels

    International Nuclear Information System (INIS)

    This is an interim report of a study concerning the potential application of nondestructive measurements on irradiated light-water-reactor (LWR) fuels at spent-fuel storage facilities. It describes nondestructive measurement techniques and instruments that can provide useful data for more effective in-plant nuclear materials management, better safeguards and criticality safety, and more efficient storage of spent LWR fuel. In particular, several nondestructive measurement devices are already available so that utilities can implement new fuel-management and storage technologies for better use of existing spent-fuel storage capacity. The design of an engineered prototype in-plant spent-fuel measurement system is approx. 80% complete. This system would support improved spent-fuel storage and also efficient fissile recovery if spent-fuel reprocessing becomes a reality

  2. Simulated coal gas MCFC power plant system verification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-30

    The objective of the main project is to identify the current developmental status of MCFC systems and address those technical issues that need to be resolved to move the technology from its current status to the demonstration stage in the shortest possible time. The specific objectives are separated into five major tasks as follows: Stack research; Power plant development; Test facilities development; Manufacturing facilities development; and Commercialization. This Final Report discusses the M-C Power Corporation effort, which is part of a general program for the development of commercial MCFC systems. This final report covers the entire subject of the Unocal 250-cell stack. Certain project activities have been funded by organizations other than DOE and are included in this report to provide a comprehensive overview of the work accomplished.

  3. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    based on the Stålmarck algorithm. While some requirements are easily proved, others are virtually impossible to manage due to a very large potential state space. We present what has been done in order to get, at least, an idea of whether or not such difficult requirements are fulfilled, and we ... express thoughts on what is needed in order to be able to successfully verify large real-life systems...

  4. Integrated testing and verification system for research flight software design document

    Science.gov (United States)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  5. The SAMS: Smartphone Addiction Management System and verification.

    Science.gov (United States)

    Lee, Heyoung; Ahn, Heejune; Choi, Samwook; Choi, Wanbok

    2014-01-01

    While the popularity of smartphones has given enormous convenience to our lives, their pathological use has created a new mental health concern among the community. Hence, intensive research is being conducted on the etiology and treatment of the condition. However, the traditional clinical approach based on surveys and interviews has serious limitations: health professionals cannot perform continual assessment and intervention for the affected group, and the subjectivity of assessment is questionable. To cope with these limitations, a comprehensive ICT (Information and Communications Technology) system called SAMS (Smartphone Addiction Management System) was developed for objective assessment and intervention. The SAMS system consists of an Android smartphone application and a web application server. The SAMS client monitors the user's application usage together with GPS location and Internet access location, and transmits the data to the SAMS server. The SAMS server stores the usage data and performs key statistical data analysis and usage intervention according to the clinicians' decisions. To verify the reliability and efficacy of the developed system, a comparison study with survey-based screening using the K-SAS (Korean Smartphone Addiction Scale), as well as self-field trials, was performed. The comparison study used usage data from 14 users, 19- to 50-year-old adults who left at least 1 week of usage logs and completed the survey questionnaires. The field trial fully verified the accuracy of the time, location, and Internet access information in the usage measurement and the reliability of the system operation over more than 2 weeks. The comparison study showed that daily use count has a strong correlation with K-SAS scores, whereas daily use times do not strongly correlate for potentially addicted users. The correlation coefficients of count and times with total K-SAS score are CC = 0.62 and CC = 0.07, respectively, and the t-test analysis for the
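
    The reported analysis boils down to Pearson correlations between usage metrics and questionnaire scores. A minimal sketch with invented placeholder data is shown below; scipy's pearsonr also returns the p-value that a significance test such as the one mentioned would use.

```python
# Pearson correlation of usage metrics against questionnaire scores (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
ksas = rng.integers(15, 60, size=14).astype(float)    # hypothetical K-SAS totals, 14 users
daily_count = 2.0 * ksas + rng.normal(0, 15, 14)      # counts loosely tracking the score
daily_time = rng.normal(180, 60, 14)                  # minutes/day, unrelated to the score

for name, x in [("daily use count", daily_count), ("daily use time", daily_time)]:
    r, p = stats.pearsonr(x, ksas)
    print(f"{name}: CC = {r:.2f}, p = {p:.3f}")
```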

  6. Dosimetric verification of a commercial inverse treatment planning system

    Science.gov (United States)

    Xing, Lei; Curran, Bruce; Hill, Robert; Holmes, Tim; Ma, Lijun; Forster, Kenneth M.; Boyer, Arthur L.

    1999-02-01

    A commercial three-dimensional (3D) inverse treatment planning system, Corvus (Nomos Corporation, Sewickley, PA), was recently made available. This paper reports our preliminary results and experience with commissioning this system for clinical implementation. This system uses a simulated annealing inverse planning algorithm to calculate intensity-modulated fields. The intensity-modulated fields are divided into beam profiles that can be delivered by means of a sequence of leaf settings by a multileaf collimator (MLC). The treatments are delivered using a computer-controlled MLC. To test the dose calculation algorithm used by the Corvus software, the dose distributions for single rectangularly shaped fields were compared with water phantom scan data. The dose distributions predicted to be delivered by multiple fields were measured using an ion chamber that could be positioned in a rotatable cylindrical water phantom. Integrated charge collected by the ion chamber was used to check the absolute dose of single- and multifield intensity-modulated treatments at various spatial points. The measured and predicted doses were found to agree to within 4% at all measurement points. Another set of measurements used a cubic polystyrene phantom with radiographic film to record the radiation dose distribution. The films were calibrated and scanned to yield two-dimensional isodose distributions. Finally, a beam imaging system (BIS) was used to measure the intensity-modulated x-ray beam patterns in the beam's-eye view. The BIS-measured images were then compared with a theoretical calculation based on the MLC leaf sequence files to verify that the treatment would be executed accurately and without machine faults. Excellent correlation was found for all cases. Treatment plans generated using intensity-modulated beams appear to be suitable for treatment of

  7. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314, Tank Farm Restoration and Safe Operations

    International Nuclear Information System (INIS)

    This Requirements Verification Report (RVR) for the Project W-314 "AN Farm to 200E Waste Transfer System" package provides documented verification of design compliance with all the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate.

  8. Monte Carlo analysis of the accelerator-driven system at Kyoto University Research Reactor Institute

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won Kyeong; Lee, Deok Jung [Nuclear Engineering Division, Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Hyun Chul [VHTR Technology Development Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Pyeon, Cheol Ho [Nuclear Engineering Science Division, Kyoto University Research Reactor Institute, Osaka (Japan); Shin, Ho Cheol [Core and Fuel Analysis Group, Korea Hydro and Nuclear Power Central Research Institute, Daejeon (Korea, Republic of)

    2016-04-15

    An accelerator-driven system consists of a subcritical reactor and a controllable external neutron source. The reactor in an accelerator-driven system can sustain fission reactions in a subcritical state using an external neutron source, which is an intrinsic safety feature of the system. The system can provide efficient transmutations of nuclear wastes such as minor actinides and long-lived fission products and generate electricity. Recently at Kyoto University Research Reactor Institute (KURRI; Kyoto, Japan), a series of reactor physics experiments was conducted with the Kyoto University Critical Assembly and a Cockcroft-Walton type accelerator, which generates the external neutron source by deuterium-tritium reactions. In this paper, neutronic analyses of a series of experiments have been re-estimated by using the latest Monte Carlo code and nuclear data libraries. This feasibility study is presented through the comparison of Monte Carlo simulation results with measurements.
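
    The subcritical-multiplication arithmetic behind such systems is compact enough to show directly: in steady state the neutron population amplifies the external source by roughly M = 1/(1 − k), so a subcritical core cannot run away and the chain reaction dies out when the accelerator source is switched off. The k values below are illustrative, not the KURRI configuration.

```python
# Source multiplication M = 1 / (1 - k) for a subcritical core (illustrative).
for k in (0.90, 0.95, 0.97, 0.99):
    print(f"k = {k:.2f}: source multiplication M = {1.0 / (1.0 - k):6.1f}")
```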

  9. HERMES - a Monte Carlo program system for beam-materials interaction studies

    International Nuclear Information System (INIS)

    HERMES (High Energy Radiation Monte Carlo Elaborate System) is a system of Monte Carlo computer codes that treat the different physics to be considered in the computer simulation of radiation transport and interaction problems. The HERMES collection of physics programs permits the simulation of secondary-particle histories induced by primary particles of any energy, up to the regime of high-energy physics and down to thermal energies, e.g. for neutrons. The particles considered by the programs of the HERMES system are p, n, π+, π-, π0, μ±, e+, e-, γ, and light ions up to A=10. The programs of the HERMES system have been taken over as original codes as far as possible. To satisfy the needs of some applications, extensions and changes became necessary. The interfacing technique via HERMES submission files also needed some additional programming. All changes made to the original codes are documented. (orig./DG)

  10. Systems guide to MCNP (Monte Carlo Neutron and Photon Transport Code)

    International Nuclear Information System (INIS)

    The subject of this report is the implementation of the Los Alamos National Laboratory Monte Carlo Neutron and Photon Transport Code - Version 3 (MCNP) on the different types of computer systems, especially the IBM MVS system. The report supplements the documentation of the RSIC computer code package CCC-200/MCNP. Details of the procedure to follow in executing MCNP on the IBM computers, either in batch mode or interactive mode, are provided

  11. Exact Safety Verification of Hybrid Systems Using Sums-Of-Squares Representation

    CERN Document Server

    Lin, Wang; Yang, Zhengfeng; Zeng, Zhenbing

    2011-01-01

    In this paper we discuss how to generate inductive invariants for safety verification of hybrid systems. A hybrid symbolic-numeric method is presented to compute inequality inductive invariants of the given systems. A numerical invariant of the given system can be obtained by solving a parameterized polynomial optimization problem via sum-of-squares (SOS) relaxation. A method based on Gauss-Newton refinement and rational vector recovery is then deployed to obtain invariants with rational coefficients that exactly satisfy the invariant conditions. Several examples are given to illustrate our algorithm.
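
    The SOS relaxation step can be illustrated on a toy univariate problem: certifying p(x) = x⁴ − 4x³ + 7x² − 6x + 9 ≥ 0 by finding a positive semidefinite Gram matrix Q with p(x) = zᵀQz for z = (1, x, x²). The sketch below uses cvxpy (our choice of solver interface, assumed available, not named in the paper) and illustrates only the relaxation, not the full invariant-generation algorithm.

```python
# Certify nonnegativity of a univariate quartic via an SOS (Gram matrix) SDP.
import cvxpy as cp
import numpy as np

coeffs = {0: 9.0, 1: -6.0, 2: 7.0, 3: -4.0, 4: 1.0}   # p(x) coefficients by degree
Q = cp.Variable((3, 3), symmetric=True)               # Gram matrix for z = (1, x, x^2)

constraints = [Q >> 0]
for d, c in coeffs.items():
    # the coefficient of x^d collects all Q[i, j] with i + j = d
    terms = [Q[i, d - i] for i in range(3) if 0 <= d - i <= 2]
    constraints.append(sum(terms) == c)

prob = cp.Problem(cp.Minimize(0), constraints)        # feasibility problem
prob.solve()
print("SOS certificate found" if prob.status == "optimal" else "not SOS")
print(np.round(Q.value, 3))
```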

  12. MONTE CARLO METHOD AND APPLICATION IN @RISK SIMULATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Gabriela Ižaríková

    2015-12-01

    The article is an example of using @Risk, simulation software designed for use in Microsoft Excel spreadsheets; it demonstrates the possibility of its usage in order to show a universal method of solving problems. Simulation means experimenting with computer models based on a real production process in order to optimize the production processes or the system. A simulation model allows performing a number of experiments, analysing them, evaluating, optimizing and afterwards applying the results to the real system. A simulation model, in general, represents the modelled system by using mathematical formulations and logical relations. In the model it is possible to distinguish controlled inputs (for instance, investment costs) and random inputs (for instance, demand), which are transformed by the model into outputs (for instance, the mean value of profit). In a simulation experiment, the controlled inputs are chosen at the beginning and the random (stochastic) inputs are generated randomly. Simulations belong to the quantitative tools which can be used as support for decision making.
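
    A minimal Python rendering of the kind of experiment the article runs in @Risk: controlled inputs held fixed, a random input sampled, and the output distribution summarized. The demand model and figures below are invented for illustration.

```python
# Monte Carlo simulation of profit with fixed and random inputs (sketch).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
investment = 50_000.0                                    # controlled input
unit_margin = 12.0                                       # controlled input
demand = rng.triangular(2_000, 5_000, 9_000, size=n)     # random input (units sold)
profit = unit_margin * demand - investment

print(f"mean profit: {profit.mean():,.0f}")
print(f"P(loss):     {(profit < 0).mean():.1%}")
print(f"5th-95th percentile: {np.percentile(profit, 5):,.0f} .. {np.percentile(profit, 95):,.0f}")
```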

  13. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
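
    The cumulative signal check can be sketched in a few lines: running sums of predicted and measured frames are compared after every frame, and delivery is flagged once the relative discrepancy exceeds a tolerance. The synthetic frame data, the injected 25% output error, and the 5% threshold below are illustrative assumptions of ours, not the paper's parameters.

```python
# Cumulative-signal delivery check over a stream of EPID frames (sketch).
import numpy as np

rng = np.random.default_rng(3)
predicted = rng.random((100, 64, 64))      # 100 predicted transit frames
measured = predicted * 1.01                # near-perfect delivery...
measured[60:] *= 1.25                      # ...with a 25% output error from frame 60

cum_pred = np.zeros_like(predicted[0])
cum_meas = np.zeros_like(predicted[0])
for f, (p, m) in enumerate(zip(predicted, measured)):
    cum_pred += p
    cum_meas += m
    rel = abs(cum_meas.sum() - cum_pred.sum()) / cum_pred.sum()
    if rel > 0.05:
        print(f"gross error flagged at frame {f} (cumulative deviation {rel:.1%})")
        break
```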

  14. Robust control design verification using the modular modeling system

    International Nuclear Information System (INIS)

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant; i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple-input multiple-output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem

  15. Robust control design verification using the modular modeling system

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant; i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple-input multiple-output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem.

  16. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    International Nuclear Information System (INIS)

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  17. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  18. Monte Carlo simulation of pulse pile-up effect in gamma spectrum of a PGNAA system

    Science.gov (United States)

    Mowlavi, Ali Asghar; Hadizadeh Yazdi, Mohammad Hadi

    2011-12-01

    We have applied a pile-up Monte Carlo simulation code to the gamma spectrum of a prompt gamma neutron activation analysis (PGNAA) system. The code has been run in nonparalyzable mode for a specific geometry of a PGNAA system with a ²⁴¹Am-⁹Be source and a NaI(Tl) detector to obtain the distortion due to “pile-up” in the pulse-height gamma spectrum. The results show that the main background in the nitrogen region of interest (ROI) is due to two-pulse pile-up. We have also evaluated the variation of count rate and total photon sampling over the Monte Carlo spectra. At high count rates, not only the nitrogen ROI but also the carbon ROI and the hydrogen peak are strongly disturbed. Comparison between the results of the simulations and the experimental spectra has shown good agreement. The code could be used for other source setups and different gamma detection systems.
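
    A nonparalyzable pile-up model of the kind described can be sketched directly: events arrive as a Poisson stream and any event falling within the resolving time of a trigger is summed into the same recorded pulse. The rate, resolving time, and toy line spectrum below are illustrative, not the paper's PGNAA setup.

```python
# Nonparalyzable pile-up Monte Carlo over a Poisson event stream (sketch).
import numpy as np

rng = np.random.default_rng(5)
rate = 5.0e4                 # mean event rate (counts/s), so rate * tau = 0.1
tau = 2.0e-6                 # resolving time of the pulse processor (s)
n_events = 200_000

arrival = np.cumsum(rng.exponential(1.0 / rate, n_events))
energy = rng.choice([0.511, 1.17, 1.33, 2.22, 4.44, 10.8], n_events)  # toy lines (MeV)

recorded, multiplicity = [], []
i = 0
while i < n_events:
    e_sum, t0, k = energy[i], arrival[i], 1
    i += 1
    # nonparalyzable mode: events within tau of the trigger are summed,
    # and the window is not extended by the piled-up events
    while i < n_events and arrival[i] - t0 < tau:
        e_sum += energy[i]
        k += 1
        i += 1
    recorded.append(e_sum)
    multiplicity.append(k)

multiplicity = np.asarray(multiplicity)
print(f"input events: {n_events}, recorded pulses: {multiplicity.size}")
print(f"piled-up fraction: {(multiplicity > 1).mean():.1%}")   # ~ 1 - exp(-rate*tau)
```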

  19. Monte Carlo simulation of pulse pile-up effect in gamma spectrum of a PGNAA system

    Energy Technology Data Exchange (ETDEWEB)

    Mowlavi, Ali Asghar, E-mail: amowlavi@sttu.ac.ir [Physics Department, School of Sciences, Sabzevar Tarbiat Moallem University, Sabzevar (Iran, Islamic Republic of); TRIL, ICTP, Trieste (Italy); Hadizadeh Yazdi, Mohammad Hadi [Physics Department, School of Sciences, Ferdowsi University of Mashhad, Mashhad (Iran, Islamic Republic of)

    2011-12-21

    We have applied a pile-up Monte Carlo simulation code to the gamma spectrum of a prompt gamma neutron activation analysis (PGNAA) system. The code has been run in nonparalyzable mode for a specific geometry of a PGNAA system with a ²⁴¹Am-⁹Be source and a NaI(Tl) detector to obtain the distortion due to 'pile-up' in the pulse-height gamma spectrum. The results show that the main background in the nitrogen region of interest (ROI) is due to two-pulse pile-up. We have also evaluated the variation of count rate and total photon sampling over the Monte Carlo spectra. At high count rates, not only the nitrogen ROI but also the carbon ROI and the hydrogen peak are strongly disturbed. Comparison between the results of the simulations and the experimental spectra has shown good agreement. The code could be used for other source setups and different gamma detection systems.

  20. Monte Carlo simulation of pulse pile-up effect in gamma spectrum of a PGNAA system

    International Nuclear Information System (INIS)

    We have applied a pile-up Monte Carlo simulation code to the gamma spectrum of a prompt gamma neutron activation analysis (PGNAA) system. The code has been run in nonparalyzable mode for a specific geometry of a PGNAA system with a ²⁴¹Am-⁹Be source and a NaI(Tl) detector to obtain the distortion due to “pile-up” in the pulse-height gamma spectrum. The results show that the main background in the nitrogen region of interest (ROI) is due to two-pulse pile-up. We have also evaluated the variation of count rate and total photon sampling over the Monte Carlo spectra. At high count rates, not only the nitrogen ROI but also the carbon ROI and the hydrogen peak are strongly disturbed. Comparison between the results of the simulations and the experimental spectra has shown good agreement. The code could be used for other source setups and different gamma detection systems.

  1. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of logical primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
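
    The pattern-to-formula idea can be sketched as a lookup of temporal-logic templates keyed by workflow pattern, with activity names substituted in; the pattern names and templates below are our own illustration, not the paper's exact BPMN pattern definitions.

```python
# Generate temporal-logic formulas from workflow-pattern templates (sketch).
TEMPLATES = {
    "sequence": "G({a} -> F({b}))",          # activity a is eventually followed by b
    "exclusive_choice": "G(!({a} & {b}))",   # branches a and b are never both active
}

def formula(pattern, **acts):
    return TEMPLATES[pattern].format(**acts)

spec = [
    formula("sequence", a="receive_order", b="check_stock"),
    formula("exclusive_choice", a="approve", b="reject"),
]
print(" & ".join(spec))   # the conjunction forms the logical specification
```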

  2. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost for electronic systems. Many studies have shown that up to 70% of the design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences - both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on Assertion Based Verification Methodology for HDL designs, taking as case studies the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)

  3. Verification of a probabilistic flood forecasting system for an Alpine Region of northern Italy

    Science.gov (United States)

    Laiolo, P.; Gabellani, S.; Rebora, N.; Rudari, R.; Ferraris, L.; Ratto, S.; Stevenin, H.

    2012-04-01

    Probabilistic hydrometeorological forecasting chains are increasingly becoming an operational tool used by civil protection centres for issuing flood alerts. One of the most important requests of decision makers is to have reliable systems; for this reason, an accurate verification of their predictive performance becomes essential. The aim of this work is to validate a probabilistic flood forecasting system, Flood-PROOFS. The system has worked in real time since 2008 in an Alpine region of northern Italy, Valle d'Aosta. It is used by the regional Civil Protection service to issue warnings and by the local water company to protect its facilities. Flood-PROOFS uses as input the Quantitative Precipitation Forecast (QPF) derived from the Italian limited-area meteorological model (COSMO-I7) and forecasts issued by regional expert meteorologists. Furthermore, the system manages and uses both real-time meteorological and satellite data and real-time data on the maneuvers performed by the water company on dams and river devices. The main outputs produced by the computational chain are deterministic and probabilistic discharge forecasts in different cross sections of the considered river network. The validation of the flood prediction system has been conducted over a 25-month period considering different statistical methods such as the Brier score, rank histograms and verification scores. The results highlight good performance of the system as a support for issuing warnings, but there is a lack of statistics, especially for large discharge events.
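
    Of the verification measures mentioned, the Brier score is the simplest to state: the mean squared difference between forecast probabilities and binary outcomes. A minimal sketch with invented numbers follows, including a skill score against a climatological baseline.

```python
# Brier score and Brier skill score for probabilistic flood alerts (sketch).
import numpy as np

p_forecast = np.array([0.9, 0.7, 0.2, 0.1, 0.05, 0.6, 0.3, 0.8])  # P(discharge > threshold)
observed = np.array([1, 1, 0, 0, 0, 1, 0, 1])                     # did the event occur?

brier = np.mean((p_forecast - observed) ** 2)
# Skill relative to climatology (always forecasting the observed base rate)
base = observed.mean()
brier_clim = np.mean((base - observed) ** 2)
bss = 1.0 - brier / brier_clim
print(f"Brier score: {brier:.3f}, Brier skill score vs climatology: {bss:.2f}")
```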

  4. Verification of operational weather forecasts from the POSEIDON system across the Eastern Mediterranean

    Directory of Open Access Journals (Sweden)

    A. Papadopoulos

    2009-07-01

    The POSEIDON weather forecasting system became operational at the Hellenic Centre for Marine Research (HCMR) in October 1999. The system, with its nesting capability, provided 72-h forecasts in two different model domains, i.e. 25- and 10-km grid spacing. The lower-resolution domain covered an extended area that included most of Europe, the Mediterranean Sea and N. Africa, while the higher-resolution domain focused on the Eastern Mediterranean. A major upgrade of the system was recently implemented in the framework of the POSEIDON-II project (2005–2008). The aim was to enhance the forecasting skill of the system through improved model parameterization schemes and advanced numerical techniques for assimilating available observations to produce high-resolution analysis fields. The configuration of the new system is applied on a horizontal resolution of 1/20°×1/20° (~5 km) covering the Mediterranean basin, the Black Sea and part of the North Atlantic, providing up to 5-day forecasts. This paper reviews and compares the current and the previous weather forecasting systems at HCMR, presenting quantitative verification statistics from the pre-operational period (from mid-November 2007 to October 2008). The statistics are based on verification against surface observations from the World Meteorological Organization (WMO) network across the Eastern Mediterranean region. The results indicate that the use of the new system can significantly improve the weather forecasts.

  5. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and of the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for the evaluation of the measurement system and the determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  6. Research on database realization technology of seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Developing verification technology has become the most important means of ensuring that the CTBT is faithfully implemented. Seismic analysis based on a seismic information system (SIS) plays an important role in this field. Built on GIS, the SIS is powerful in spatial analysis, topological analysis and visualization. However, the performance of the whole system ultimately depends on the SIS database. Based on the ArcSDE Geodatabase data model, seamless integrated management of spatial and attribute data has been realized in the ORACLE RDBMS while most functions of ORACLE are retained. (authors)

  7. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Any practical software development effort must remain focused on verification and validation of user requirements, and knowledge-based system development is no different in this regard. Most expert systems being produced in industry today are in reality hybrid software systems: in addition to the components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, they incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia and spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must therefore integrate suitable methodologies from all of these fields. This paper provides a broad overview of the practical requirements for such methodologies and for the concomitant groupware tools that would assist in this enterprise. These methodologies and groupware tools would facilitate the teamwork needed to validate and verify all components of a hybrid system by emphasizing cooperative recording of requirements and negotiated resolution of any conflicts, grounded in a solid understanding of the semantics of the system.

  8. Verification of Monte Carlo transport codes against measured small angle p-, d-, and t-emission in carbon fragmentation at 600 MeV/nucleon

    Energy Technology Data Exchange (ETDEWEB)

    Abramov, B. M. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Alekseev, P. N. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Borodin, Yu. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Bulychjov, S. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Dukhovskoy, I. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Krutenkova, A. P. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Martemianov, M. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Matsyuk, M. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Turdakina, E. N. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Khanov, A. I. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Mashnik, Stepan Georgievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-03

    Momentum spectra of hydrogen isotopes have been measured at 3.5° from ¹²C fragmentation on a Be target. The spectra cover both the fragmentation-maximum region and the cumulative region, and the differential cross sections span five orders of magnitude. The data are compared with the predictions of four Monte Carlo codes: QMD, LAQGSM, BC, and INCL++. There are large differences between the data and the predictions of some models in the high-momentum region. The INCL++ code gives the best, almost perfect, description of the data.

  9. Evaluation of the material assignment method used by a Monte Carlo treatment planning system.

    Science.gov (United States)

    Isambert, A; Brualla, L; Lefkopoulos, D

    2009-12-01

    An evaluation of the conversion process from Hounsfield units (HU) to material composition in computerised tomography (CT) images, as employed by the Monte Carlo based treatment planning system ISOgray (DOSIsoft), is presented. A boundary HU value for the material conversion between "air" and "lung" was determined from a study of 22 patients, and the dosimetric consequence of the new boundary was quantitatively evaluated for a lung patient plan.
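
    For illustration, a HU-to-material assignment with an explicit air/lung boundary can be as simple as the lookup below; the boundary values are placeholders, not the thresholds determined in the study.

```python
import numpy as np

# Illustrative HU-to-material assignment with an explicit air/lung
# boundary. The interval limits below are placeholders only.
HU_BOUNDARIES = [(-1000, -950, 'air'),
                 (-950,  -700, 'lung'),
                 (-700,   125, 'soft tissue'),
                 (125,   3000, 'bone')]

def assign_material(hu):
    for lo, hi, name in HU_BOUNDARIES:
        if lo <= hu < hi:
            return name
    return 'undefined'

ct_values = np.array([-980, -800, 40, 400])
print([assign_material(v) for v in ct_values])
# -> ['air', 'lung', 'soft tissue', 'bone']
```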

  10. Verification of Relational Data-Centric Dynamic Systems with External Services

    CERN Document Server

    Hariri, Babak Bagheri; De Giacomo, Giuseppe; Deutsch, Alin; Montali, Marco

    2012-01-01

    Data-centric dynamic systems are systems where both the process controlling the dynamics and the manipulation of data are equally central. In this paper we study verification of (first-order) mu-calculus variants over relational data-centric dynamic systems, where data are represented by a full-fledged relational database, and the process is described in terms of atomic actions that evolve the database. The execution of such actions may involve calls to external services, providing fresh data inserted into the system. As a result, such systems are typically infinite-state. We show that verification is undecidable in general, and we isolate notable cases where decidability is achieved. Specifically, we start by considering service calls that return values deterministically (depending only on the passed parameters). We show that in a mu-calculus variant that preserves knowledge of objects appearing along a run, we get decidability under the assumption that the fresh data introduced along a run are bounded, though they...

  11. Assembly, integration, verification, and validation in extremely large telescope projects: a core systems engineering task

    Science.gov (United States)

    Ansorge, Wolfgang R.

    2000-08-01

    This presentation describes the verification and validation processes of an Extremely Large Telescope project and outlines the key role systems engineering plays in these processes throughout all project phases. If these processes are implemented correctly in the project execution, are started at the proper time, namely at the very beginning of the project, and if all capabilities of experienced systems engineers are used, the project costs and the life-cycle costs of the telescope system can be reduced by 25% to 50%. The intention of this article is, by explaining the importance of systems engineering in the AIV and validation processes, to motivate and encourage project managers of astronomical telescopes and scientific instruments to involve the entire spectrum of systems engineering capabilities, performed by trained and experienced systems engineers, for the benefit of the project.

  12. System Design and In-orbit Verification of the HJ-1-C SAR Satellite

    Directory of Open Access Journals (Sweden)

    Zhang Run-ning

    2014-06-01

    Full Text Available HJ-1-C is the SAR satellite of the Chinese Environment and Natural Disaster Monitoring constellation and works together with the optical satellites HJ-1-A/B to monitor the environment and natural disasters. In this paper, the system design and characteristics of this first Chinese civil SAR satellite are described, and the interface between the SAR payload and the platform is studied. The data transmission capability and the attitude, power, and temperature control that support SAR imaging are reviewed. Finally, the corresponding in-orbit verification results are presented.

  13. Specification of Selected Performance Monitoring and Commissioning Verification Algorithms for CHP Systems

    Energy Technology Data Exchange (ETDEWEB)

    Brambley, Michael R.; Katipamula, Srinivas

    2006-10-06

    Pacific Northwest National Laboratory (PNNL) is assisting the U.S. Department of Energy (DOE) Distributed Energy (DE) Program by developing advanced control algorithms that would lead to development of tools to enhance performance and reliability, and reduce emissions of distributed energy technologies, including combined heat and power technologies. This report documents phase 2 of the program, providing a detailed functional specification for algorithms for performance monitoring and commissioning verification, scheduled for development in FY 2006. The report identifies the systems for which algorithms will be developed, the specific functions of each algorithm, metrics which the algorithms will output, and inputs required by each algorithm.

  14. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment of an active distribution system at low and high levels of distributed generation (DG) penetration using the Monte Carlo simulation method. The problem is formulated as a two-case program: a simulation of low penetration and a simulation of high penetration. The load-shedding strategy and the simulation process are described in detail for each FMEA step. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
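
    As a rough sketch of the approach, the following non-sequential Monte Carlo fragment samples component states of a small feeder and estimates the load that DG cannot pick up. All rates, loads and the DG capacity are invented, and the paper's actual FMEA-based load-shedding logic is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 200_000                      # sampled system states

# Three feeder sections: forced-outage probability and downstream load (MW).
unavailability = np.array([0.001, 0.002, 0.0015])
section_load = np.array([2.0, 1.5, 1.0])
dg_capacity = 1.2                # MW of DG available to islanded load

failed = rng.random((N, 3)) < unavailability        # sampled section states
lost = (failed * section_load).sum(axis=1)          # interrupted load per state
curtailed = np.maximum(lost - dg_capacity, 0.0)     # load DG cannot pick up

print("P(any interruption):", failed.any(axis=1).mean())
print("expected load not served (MW):", curtailed.mean())
```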

  15. Comparing Subspace Methods for Closed Loop Subspace System Identification by Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    David Di Ruscio

    2009-10-01

    Full Text Available A novel and promising bootstrap subspace system identification algorithm for both open- and closed-loop systems is presented. An outline of the SSARX algorithm by Jansson (2003) is given, and a modified SSARX algorithm is presented. Some methods that are consistent for closed-loop subspace system identification in the literature are discussed and compared with a recently published subspace algorithm that works for both open- and closed-loop data, i.e., the DSR_e algorithm, as well as with the bootstrap method. Experimental comparisons are performed by Monte Carlo simulations.

  16. Improving the efficiency of Monte Carlo simulations of systems that undergo temperature-driven phase transitions

    Science.gov (United States)

    Velazquez, L.; Castro-Palacio, J. C.

    2013-07-01

    Recently, Velazquez and Curilef proposed a methodology to extend Monte Carlo algorithms based on the canonical ensemble, which aims to overcome the slow sampling problems associated with temperature-driven discontinuous phase transitions. We show in this work that Monte Carlo algorithms extended with this methodology also exhibit a remarkable efficiency near a critical point. Our study is performed for the particular case of a two-dimensional four-state Potts model on a square lattice with periodic boundary conditions. This analysis reveals that the extended version of Metropolis importance sampling is more efficient than the usual Swendsen-Wang and Wolff cluster algorithms. These results demonstrate the effectiveness of this methodology in improving the efficiency of MC simulations of systems that undergo any type of temperature-driven phase transition.
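
    For orientation, here is a small, standard Metropolis sampler for the two-dimensional four-state Potts model used as the test case. It implements only plain canonical Metropolis, not the extended-ensemble methodology the study evaluates, and all parameters are illustrative.

```python
import numpy as np

def metropolis_potts(L=16, q=4, beta=1.0, sweeps=500, seed=3):
    """Plain canonical Metropolis for the 2-D q-state Potts model with
    periodic boundaries (E = -sum over bonds of delta(s_i, s_j), J = 1)."""
    rng = np.random.default_rng(seed)
    spins = rng.integers(0, q, (L, L))
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            proposed = rng.integers(0, q)
            nbrs = (spins[(i + 1) % L, j], spins[(i - 1) % L, j],
                    spins[i, (j + 1) % L], spins[i, (j - 1) % L])
            # Energy change: bonds satisfied by the old state minus
            # bonds satisfied by the proposed state.
            dE = sum(s == spins[i, j] for s in nbrs) - sum(s == proposed for s in nbrs)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] = proposed
    return spins

# beta = 1.0 is below the q=4 critical coupling ln(1 + sqrt(4)) ~ 1.099,
# so the chain should remain in the disordered phase.
spins = metropolis_potts()
print("largest colour fraction:", np.bincount(spins.ravel()).max() / spins.size)
```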

  17. Monte Carlo simulations of star clusters - II. Tidally limited, multi-mass systems with stellar evolution

    CERN Document Server

    Giersz, M

    2000-01-01

    A revision of Stodółkiewicz's Monte Carlo code is used to simulate the evolution of large star clusters. A survey of the evolution of N-body systems influenced by the tidal field of a parent galaxy and by stellar evolution is presented. The results are in good agreement with theoretical expectations and with the results of other methods (Fokker-Planck, Monte Carlo and N-body). The initial rapid mass loss, due to the stellar evolution of the most massive stars, causes expansion of the whole cluster and eventually leads to the disruption of less bound systems (W_0 = 3). Models with larger W_0 survive this phase of evolution and then undergo core collapse and subsequent post-collapse expansion, like isolated models. The expansion phase is eventually reversed when the tidal limitation becomes important. The results presented are the first major step towards simulating the evolution of real globular clusters by means of the Monte Carlo method.

  18. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling systems. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is based on the results of the project “Verification and validation (V and V) of ITER RH systems using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design within a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype, and overviews the current trends in the use of virtual prototyping in industry during the early design phase. Based on a survey of best industrial practices, this paper proposes ways to improve the V and V process for ITER RH systems using DMUs.

  19. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    International Nuclear Information System (INIS)

    Highlights: • Verification and validation process for ITER remote handling systems. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is based on the results of the project “Verification and validation (V and V) of ITER RH systems using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V and V of the ITER RH system design within a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype, and overviews the current trends in the use of virtual prototyping in industry during the early design phase. Based on a survey of best industrial practices, this paper proposes ways to improve the V and V process for ITER RH systems using DMUs.

  20. RIACS Workshop on the Verification and Validation of Autonomous and Adaptive Systems

    Science.gov (United States)

    Pecheur, Charles; Visser, Willem; Simmons, Reid

    2001-01-01

    The long-term future of space exploration at NASA is dependent on the full exploitation of autonomous and adaptive systems: careful monitoring of missions from earth, as is the norm now, will be infeasible due to the sheer number of proposed missions and the communication lag for deep-space missions. Mission managers are however worried about the reliability of these more intelligent systems. The main focus of the workshop was to address these worries and hence we invited NASA engineers working on autonomous and adaptive systems and researchers interested in the verification and validation (V&V) of software systems. The dual purpose of the meeting was to: (1) make NASA engineers aware of the V&V techniques they could be using; and (2) make the V&V community aware of the complexity of the systems NASA is developing.

  1. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    International Nuclear Information System (INIS)

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single-processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared-responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above has a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes the results of efforts to identify, describe, contrast, and compare these issues.

  2. Research and verification of Monte Carlo burnup calculations based on the Chebyshev rational approximation method

    Institute of Scientific and Technical Information of China (English)

    范文玎; 孙光耀; 张彬航; 陈锐; 郝丽娟

    2016-01-01

    Burnup calculation plays an important role in reactor design and analysis. Compared with traditional point-burnup algorithms, the Chebyshev rational approximation method (CRAM) offers high calculation speed and high accuracy. Based on the Super Monte Carlo Simulation Program for Nuclear and Radiation Process (SuperMC), a preliminary study and verification of Monte Carlo burnup calculations was carried out using CRAM as the point-burnup solver, together with an energy search method based on a bucket-sort algorithm adopted in place of a union energy grid to reduce the memory overhead while preserving calculation efficiency. The burnup calculation method was verified with a fuel-rod burnup problem and with the IAEA-ADS (International Atomic Energy Agency - Accelerator Driven Systems) international benchmark; the simulation results were basically consistent with Serpent and with other participants' results, respectively. For the IAEA-ADS benchmark, the bucket-sort energy search reduced storage by about 95% compared with the union energy grid method.
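
    To make the bucket-sort idea concrete, here is a minimal Python sketch under our own assumptions rather than SuperMC's actual implementation: each nuclide keeps its own ascending energy grid, and a small bucket index (uniform in log energy) maps an incident energy to a narrow slice of that grid, which is then searched locally. The grid and bucket count are illustrative.

```python
import numpy as np

class BucketGrid:
    """Bucket-indexed lookup into a per-nuclide energy grid."""
    def __init__(self, energies, n_buckets=64):
        self.e = np.asarray(energies, dtype=float)   # ascending energy grid
        self.log_lo = np.log(self.e[0])
        self.width = (np.log(self.e[-1]) - self.log_lo) / n_buckets
        edges = np.exp(self.log_lo + self.width * np.arange(n_buckets + 1))
        # Lowest possible containing-interval index for each bucket edge.
        self.start = np.maximum(np.searchsorted(self.e, edges, side='right') - 1, 0)

    def find(self, energy):
        """Return i such that e[i] <= energy < e[i+1]."""
        b = int((np.log(energy) - self.log_lo) / self.width)
        b = min(max(b, 0), len(self.start) - 2)
        i0, i1 = self.start[b], min(self.start[b + 1] + 2, len(self.e))
        i = i0 + np.searchsorted(self.e[i0:i1], energy, side='right') - 1
        return min(max(i, 0), len(self.e) - 2)

grid = BucketGrid(np.logspace(-5, 7, 10000))      # 1e-5 eV .. 1e7 eV
i = grid.find(0.0253)                             # thermal point
print(i, grid.e[i] <= 0.0253 < grid.e[i + 1])     # -> index, True
```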

  3. Verification of Monte Carlo Calculations by Means of Neutron and Gamma Fluence Spectra Measurements behind and inside of Iron-Water Configurations

    International Nuclear Information System (INIS)

    Neutron and gamma spectra were measured behind and inside modules consisting of variable iron and water slabs installed in radial beams of the zero-power training and research reactors AKR of the Technical University Dresden and ZLFR of the University of Applied Sciences Zittau/Goerlitz. The NE-213 scintillation spectrometer used allowed the measurement of gamma and neutron fluence spectra in the energy regions 0.3-10 MeV for photons and 1.0-20 MeV for neutrons. The paper describes the experiments and presents important results of the measurements. They are compared with the results of Monte Carlo transport calculations made with the codes MCNP and TRAMO on an absolute scale of fluences.

  4. Development and preliminary verification of the PWR on-line core monitoring software system. SOPHORA

    International Nuclear Information System (INIS)

    This paper presents an introduction to the development and preliminary verification of a new on-line core monitoring software system (CMSS), named SOPHORA, for the fixed in-core detector (FID) system of a PWR. Developed at China General Nuclear Power Corporation (CGN), SOPHORA integrates CGN's advanced PWR core simulator COCO and the thermal-hydraulic sub-channel code LINDEN to manage the real-time core calculation and analysis. Currents measured by the FID are re-evaluated and used as the basis to reconstruct the 3-D core power distribution. The key parameters, such as the peak local power margin and the minimum DNBR margin, are obtained by comparison with operating limits. Pseudo FID signals generated from movable in-core detector (MID) data are used to verify the SOPHORA system. Comparison between the predicted power peak and the corresponding MID in-core flux-map results shows that the SOPHORA results are reasonable and satisfactory. Further verification and validation of SOPHORA is under way and will be reported later. (author)

  5. The gyroscope testbed: A verification of the gravity probe B suspension system

    Science.gov (United States)

    Brumley, Robert Willard

    The verification of precision control systems for use in space-based applications can be extremely challenging. Often, the presence of the 1-g field substantively changes the control problem, making it impossible to test directly on the Earth. This work discusses a new approach to the testing and verification of the gyroscope suspension system for the Gravity Probe B (GP-B) experimental test of General Relativity. The verification approach involves the creation of a new testbed that has the same input-output characteristics and dynamics as a GP-B gyroscope. It consists of real physical hardware that moves like a real gyroscope, allowing the suspension system's performance to be measured directly without the need to break any internal connections or bypass internal subsystems. The user is free to define any set of disturbances, from a 1-g ground levitation to a 10^-8 g science mission. The testbed has two main subsystems. The mechanical subsystem comprises six parallel-plate capacitors whose spacing is controlled by precision actuators. These actuators are the physical interface to the suspension system and create the electrode-rotor capacitances present in a real gyroscope. The closed-loop positioning noise of the system is approximately 10 pm/√Hz, enabling the commanding of position variations a fraction of the size of a single silicon atom. The control subsystem has a DSP-based high-speed nonlinear controller that forces the actuators to follow the dynamics of a gyroscope. The device has been shown to faithfully represent a gyroscope in 1-g levitation, and a robustness analysis has been performed to prove that it correctly tests the stability of the on-orbit system. The testbed is then used to measure suspension system performance directly in a variety of on-orbit scenarios. Gyroscope levitation in 10^-8 g conditions is demonstrated. The robustness of gyroscope levitation to transient disturbances such as micrometeorite impacts on the space vehicle and transitions

  6. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption, where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in the database, and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot fully decrypt the encrypted templates of the users in the database. In addition, the security of the THRIVE system is enhanced by a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form but needs to prove her identity by using biometrics. The system can be used with any biometric modality where the feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC with quad-core 3.2 GHz CPUs and a 10 Mbit/s up/down link. Consequently, the proposed system can be used efficiently in real-life applications.
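
    The final matching rule is easy to state in the plain domain; the sketch below shows the Hamming-distance threshold test on unencrypted templates. The homomorphic protocol itself, which is the point of THRIVE, is not reproduced here, and the template size and threshold are illustrative.

```python
import numpy as np

def verify(enrolled_bits, query_bits, threshold):
    """Accept the query when its Hamming distance to the enrolled
    template is below the threshold."""
    distance = int(np.sum(enrolled_bits != query_bits))
    return distance < threshold

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 256)              # 256-bit binary template
query = enrolled.copy()
flips = rng.choice(256, size=20, replace=False)
query[flips] ^= 1                               # noisy re-acquisition
print(verify(enrolled, query, threshold=32))    # -> True (20 < 32)
```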

  7. Formal Modeling and Verification of Context-Aware Systems using Event-B

    Directory of Open Access Journals (Sweden)

    Hong Anh Le

    2014-12-01

    Full Text Available Context awareness is a computing paradigm that makes applications responsive and adaptive to their environment. Formal modeling and verification of context-aware systems are challenging issues in their development, as such systems are complex and uncertain. In this paper, we propose an approach that uses the formal method Event-B to model and verify such systems. First, we specify a context-aware system's components, such as context data entities, context rules and context relations, with Event-B notions. In the next step, we use the Rodin platform to verify the system's desired properties, such as context constraint preservation. The approach benefits from the natural representation of context-awareness concepts in Event-B and from the proof obligations generated by the refinement mechanism to ensure the correctness of systems. We illustrate the use of our approach on a scenario of an Adaptive Cruise Control system.

  8. IMRT verification with a camera-based electronic portal imaging system

    International Nuclear Information System (INIS)

    An evaluation of the capabilities of a commercially available camera-based electronic portal imaging system for intensity-modulated radiotherapy verification is presented. Two modifications to the system are demonstrated: a novel method that tags each acquired image with the delivered dose measured by the linac monitor chamber, and a reduction of the optical cross-talk in the imager. A detailed performance assessment is presented, including measurements of the optical decay characteristics of the system. The overall geometric accuracy of the system is determined to be ±2.0 mm, with a dosimetric accuracy of ±1.25 MU. Finally, a clinical breast IMRT treatment, delivered by dynamic multileaf collimation, is successfully verified both by tracking the position of each leaf during beam delivery and by recording the integrated intensity observed over the entire beam. (author)

  9. Dose perturbation in the presence of metallic implants: treatment planning system versus Monte Carlo simulations

    Science.gov (United States)

    Wieslander, Elinore; Knöös, Tommy

    2003-10-01

    An increasing number of patients receiving radiation therapy have metallic implants such as hip prostheses. Therefore, beams are normally set up to avoid irradiation through the implant; however, this cannot always be accomplished. In such situations, knowledge of the accuracy of the used treatment planning system (TPS) is required. Two algorithms, the pencil beam (PB) and the collapsed cone (CC), are implemented in the studied TPS. Comparisons are made with Monte Carlo simulations for 6 and 18 MV. The studied materials are steel, CoCrMo, Orthinox® (a stainless steel alloy and registered trademark of Stryker Corporation), TiAlV and Ti. Monte Carlo simulated depth dose curves and dose profiles are compared to CC and PB calculated data. The CC algorithm shows overall a better agreement with Monte Carlo than the PB algorithm. Thus, it is recommended to use the CC algorithm to get the most accurate dose calculation both for the planning target volume and for tissues adjacent to the implants when beams are set up to pass through implants.

  10. River Protection Project Integrated safety management system phase II verification report, volumes I and II - 8/19/99

    Energy Technology Data Exchange (ETDEWEB)

    SHOOP, D.S.

    1999-09-10

    The Department of Energy policy (DOE P 450.4) is that safety is integrated into all aspects of the management and operations of its facilities. In simple and straightforward terms, the Department will "Do work safely." The purpose of this River Protection Project (RPP) Integrated Safety Management System (ISMS) Phase II Verification was to determine whether ISMS programs and processes are implemented within RPP to accomplish the goal of "Do work safely." The goal of an implemented ISMS is to have a single integrated system that includes Environment, Safety, and Health (ES&H) requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the RPP life cycle. The ISMS comprises (1) the described functions, components, processes, and interfaces (system map or blueprint) and (2) the personnel who execute their assigned roles and responsibilities to manage and control the ISMS. Therefore, this review evaluated both the "paper" and "people" aspects of the ISMS to ensure that the system is implemented within RPP. The Richland Operations Office (RL) conducted an ISMS Phase I Verification of the TWRS from September 28 to October 9, 1998. The resulting verification report recommended that TWRS-RL and the contractor proceed with Phase II of ISMS verification, given that the concerns identified in the Phase I verification review are incorporated into the Phase II implementation plan.

  11. Proceedings 7th International Workshop on Automated Specification and Verification of Web Systems

    CERN Document Server

    Kovacs, Laura; Tiezzi, Francesco

    2011-01-01

    This volume contains the final and revised versions of the papers presented at the 7th International Workshop on Automated Specification and Verification of Web Systems (WWV 2011). The workshop was held in Reykjavik, Iceland, on June 9, 2011, as part of DisCoTec 2011. The aim of the WWV workshop series is to provide an interdisciplinary forum to facilitate cross-fertilization and the advancement of hybrid methods that exploit concepts and tools drawn from rule-based programming, software engineering, formal methods and Web-oriented research. Nowadays, indeed, many companies and institutions have turned their Web sites into interactive, fully automated Web-based applications for, e.g., e-business, e-learning, e-government, and e-health. The increased complexity and the explosive growth of Web systems have made their design and implementation a challenging task. Systematic, formal approaches to their specification and verification make it possible to address the problems of this specific domain by means o...

  12. Verification of a Real Time Weather Forecasting System in Southern Italy

    Directory of Open Access Journals (Sweden)

    Luca Tiriolo

    2015-01-01

    Full Text Available This paper shows the performance of an operational forecasting system, based on the regional atmospheric modeling system (RAMS), at 3 km horizontal resolution over southern Italy. The model is initialized from the 12 UTC operational analysis/forecasting cycle of the European Centre for Medium-range Weather Forecasts (ECMWF). The forecast is issued for the following three days. The performance is evaluated over a whole year for the surface parameters temperature, relative humidity, wind speed and direction, and precipitation. The verification has been performed against SYNOP stations over southern Italy; a dense non-GTS network over Calabria is used for precipitation. Results show that the RMSE is about 2-3 K for temperature, 12-16% for relative humidity, 2.0-2.8 m/s for wind speed, and 55-75° for wind direction, the performance varying with the season and with the forecast time. The error increases between the first and third forecast days. The verification of the rainfall forecast shows that the model underestimates the area of the precipitation. Model output statistics (MOS) are applied to all parameters but precipitation. Results show that the MOS reduces the RMSE by 0-30%, depending on the forecast time, the season and the meteorological parameter.

  13. Reverse Monte Carlo ray-tracing for radiative heat transfer in combustion systems

    Science.gov (United States)

    Sun, Xiaojing

    Radiative heat transfer is a dominant heat transfer phenomenon in high temperature systems. With the rapid development of massive supercomputers, the Monte Carlo ray tracing (MCRT) method is starting to see applications in combustion systems. This research investigates whether Monte Carlo ray tracing can offer more accurate and more efficient calculations than the discrete ordinates method (DOM). The Monte Carlo ray tracing method is a statistical method that traces the histories of bundles of rays, and it solves the radiative transfer problem with almost no approximation. It can handle non-isotropic scattering and non-gray gas mixtures with relative ease compared to conventional methods such as DOM and the spherical harmonics method. There are two schemes in the Monte Carlo ray tracing method: forward and backward/reverse. Case studies and the governing equations demonstrate the advantages of the reverse Monte Carlo ray tracing (RMCRT) method. RMCRT can be easily implemented for domain-decomposition parallelism. In this dissertation, different efficiency-improvement techniques for RMCRT are introduced and implemented: the random number generator, stratified sampling, the ray-surface intersection calculation, Russian roulette, and importance sampling. There are two major modules in solving the radiative heat transfer problems: the RMCRT RTE solver and the optical property models. RMCRT is first fully verified in gray, scattering, absorbing and emitting media with black/nonblack, diffuse/nondiffuse bounded-surface problems. A sensitivity analysis is carried out with regard to the number of rays, the mesh resolution of the computational domain, the optical thickness of the media and the effects of the variance reduction techniques (stratified sampling, Russian roulette). Results are compared with either analytical solutions or benchmark results. The efficiency (the product of error and computation time) of RMCRT has been compared to that of DOM, and the results suggest great potential for RMCRT's application
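
    As a toy illustration of the backward scheme, the sketch below estimates the intensity arriving at a wall through a one-dimensional gray, absorbing-emitting, isothermal slab by tracing rays backwards from the detector. The setup and values are our own illustrative assumptions, far simpler than the combustion cases of the dissertation.

```python
import numpy as np

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def backward_mc(kappa, L, T_gas, n_rays=200_000, seed=1):
    """Backward MC: sample the emission distance along the reversed ray
    from the attenuation pdf kappa*exp(-kappa*s). Rays absorbed inside
    the slab score the gas emission; rays that escape (s > L) see a
    cold black wall and score zero."""
    rng = np.random.default_rng(seed)
    s = rng.exponential(1.0 / kappa, n_rays)   # sampled emission distance
    ib_gas = SIGMA * T_gas**4 / np.pi          # blackbody intensity of the gas
    return ib_gas * np.mean(s <= L)

kappa, L, T = 0.5, 2.0, 1000.0                 # m^-1, m, K
exact = (SIGMA * T**4 / np.pi) * (1 - np.exp(-kappa * L))
print(backward_mc(kappa, L, T), exact)         # MC estimate vs analytic
```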

  14. A Multitier System for the Verification, Visualization and Management of CHIMERA

    International Nuclear Information System (INIS)

    CHIMERA is a multi-dimensional radiation hydrodynamics code designed to study core-collapse supernovae. The code is made up of three essentially independent parts: a hydrodynamics module, a nuclear burning module, and a neutrino transport solver combined within an operator-split approach. Given CHIMERA's complexity and pace of ongoing development, a new support system, Bellerophon, has been designed and implemented to perform automated verification, visualization and management tasks while integrating with other workflow systems utilized by CHIMERA's development group. In order to achieve these goals, a multitier approach has been adopted. By integrating supercomputing platforms, visualization clusters, a dedicated web server and a client-side desktop application, this system attempts to provide an encapsulated, end-to-end solution to these needs.

  15. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  16. CARMEN: a Monte Carlo planning system based on linear programming from direct apertures

    Energy Technology Data Exchange (ETDEWEB)

    Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.

    2013-07-01

    The use of Monte Carlo (MC) simulation has been shown to improve the accuracy of dose calculation compared with the analytical algorithms installed in commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, CARMEN, is based on full MC simulation of both the beam transport through the accelerator head and through the patient, and is designed for efficient operation in terms of estimation accuracy and required computation time. (Author)

  17. Verification test for radiation reduction effect and material integrity on PWR primary system by zinc injection

    International Nuclear Information System (INIS)

    Zinc injection is known to be an effective method for reducing the radiation source in the primary water system of a PWR. There is a need to verify the effect of Zn injection on radiation source reduction and on the materials integrity of the PWR primary circuit. In order to confirm the effectiveness of Zn injection, a seven-year verification test, a national program sponsored by the Ministry of Economy, Trade and Industry (METI), was started in 1995 and will be finished by the end of March 2002. The program consists of an irradiation test and a material integrity test. The irradiation test, an in-pile test managed by AEAT plc (UK), was performed using the LVR-15 reactor of NRI Rez in the Czech Republic. Furthermore, an out-of-pile test using a film adding unit was performed at the Takasago Engineering Laboratory of NUPEC to obtain data supplementing the in-pile test. The material integrity test comprises constant-load tests, constant-strain tests and corrosion tests performed simultaneously using a large-scale loop and slow strain rate testing (SSRT) at the Takasago Engineering Laboratory of NUPEC. In this paper, the present results of the zinc injection verification test are discussed. (authors)

  18. An independent system for real-time dynamic multileaf collimation trajectory verification using EPID

    International Nuclear Information System (INIS)

    A new tool has been developed to verify the trajectories of the dynamic multileaf collimators (MLCs) used in advanced radiotherapy techniques, using only image frames measured by the electronic portal imaging device (EPID). The prescribed leaf positions are resampled to a higher resolution in a pre-processing stage to improve the verification precision. Measured MLC positions are extracted from the EPID frames using a template matching method. A cosine similarity metric is then applied to synchronise measured and planned leaf positions for comparison, with three additional comparison functions incorporated to ensure robust synchronisation. MLC leaf trajectory error detection was simulated for both intensity-modulated radiation therapy (IMRT) (prostate) and volumetric-modulated arc therapy (VMAT) (head-and-neck) deliveries with anthropomorphic phantoms in the beam. The overall accuracy of MLC positions automatically extracted from EPID image frames was approximately 0.5 mm. The MLC leaf trajectory verification system can detect leaf position errors during IMRT and VMAT with a tolerance of 3.5 mm within 1 s. (paper)
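
    A bare-bones version of the cosine-similarity synchronisation step might look as follows; the plan shape, leaf trajectories and noise level are invented, and the three additional comparison functions used for robustness are omitted.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def synchronise(measured_frame, planned_trajectory):
    """Return the index of the planned sample most similar to the frame."""
    scores = [cosine_similarity(measured_frame, p) for p in planned_trajectory]
    return int(np.argmax(scores))

# Illustrative plan: 40 leaves starting at different positions and
# sweeping at different speeds over 100 resampled control points.
t = np.arange(100.0)
plan = np.linspace(-20.0, -5.0, 40) + np.outer(t, np.linspace(0.2, 0.6, 40))

rng = np.random.default_rng(2)
frame = plan[37] + rng.normal(0.0, 0.3, 40)     # noisy EPID measurement
k = synchronise(frame, plan)
print(k, np.max(np.abs(frame - plan[k])))       # k should be ~37
```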

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, C. LEE COOK DIVISION, DOVER CORPORATION, STATIC PAC (TM) SYSTEM, PHASE II REPORT

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Static Pac System, Phase II, natural gas reciprocating compressor rod packing manufactured by the C. Lee Cook Division, Dover Corporation. The Static Pac System is designed to seal th...

  20. Algorithm and application of Monte Carlo simulation for multi-dispersive copolymerization system

    Institute of Scientific and Technical Information of China (English)

    凌君; 沈之荃; 陈万里

    2002-01-01

    A Monte Carlo algorithm has been established for multi-dispersive copolymerization systems, based on experimental data on copolymer molecular weight and dispersion obtained via GPC measurement. The program simulates the insertion of every monomer unit and records the structure and microscopic sequence of every chain of various lengths. It has been applied successfully to the ring-opening copolymerization of 2,2-dimethyltrimethylene carbonate (DTC) with ε-caprolactone (ε-CL). The simulation coincides with the experimental results and provides microscopic data such as triad fractions and homopolymer segment lengths, which are difficult to obtain experimentally. The algorithm also provides a uniform framework for studying copolymerizations with other, more complicated mechanisms.
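
    A minimal terminal-model version of such a chain-growth simulation is sketched below: each insertion probability depends on the current chain-end unit and assumed reactivity ratios, and triad fractions are collected afterwards. The ratios and feed composition are illustrative, not the DTC/ε-CL values from the paper.

```python
import random
from collections import Counter

def grow_chain(length, fA, rA, rB, rng):
    """Terminal-model chain growth at constant feed composition fA."""
    chain = ['A' if rng.random() < fA else 'B']
    fB = 1.0 - fA
    while len(chain) < length:
        if chain[-1] == 'A':
            p_add_A = rA * fA / (rA * fA + fB)
        else:
            p_add_A = fA / (fA + rB * fB)
        chain.append('A' if rng.random() < p_add_A else 'B')
    return ''.join(chain)

rng = random.Random(42)
chains = [grow_chain(200, fA=0.5, rA=0.8, rB=1.5, rng=rng) for _ in range(500)]

triads = Counter()
for c in chains:
    for i in range(len(c) - 2):
        triads[c[i:i + 3]] += 1
total = sum(triads.values())
for t in sorted(triads):
    print(t, round(triads[t] / total, 3))   # triad fractions
```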

  1. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Distributed embedded systems (DESs are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.

  2. A physical zero-knowledge object-comparison system for nuclear warhead verification

    Science.gov (United States)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco

    2016-09-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  3. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    CERN Document Server

    Ölveczky, Peter Csaba; 10.4204/EPTCS.36.8

    2010-01-01

    Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.

  4. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    Science.gov (United States)

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477

  5. A physical zero-knowledge object-comparison system for nuclear warhead verification

    Science.gov (United States)

    Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477

  6. A physical zero-knowledge object-comparison system for nuclear warhead verification.

    Science.gov (United States)

    Philippe, Sébastien; Goldston, Robert J; Glaser, Alexander; d'Errico, Francesco

    2016-09-20

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.

  7. Verification of the CFD simulation system SAUNA for complex aircraft configurations

    Science.gov (United States)

    Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.

    1994-04-01

    This paper is concerned with the verification for complex aircraft configurations of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.

  8. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    OpenAIRE

    Tuo Ming Fu; Zhou Xing She; Guo Zheng Xin; Shan Li Jun

    2016-01-01

    The safety of a cyber-physical system (CPS) depends on its behavior, and it is a key property for CPSs applied in critical fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described in the extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed into an HP based on this definition. The safety of the CPS is verified by inputting the...

  9. Validation of MTF measurement for CBCT system using Monte Carlo simulations

    Science.gov (United States)

    Hao, Ting; Gao, Feng; Zhao, Huijuan; Zhou, Zhongxing

    2016-03-01

    To evaluate the spatial resolution performance of a cone beam computed tomography (CBCT) system, an accurate measurement of the modulation transfer function (MTF) is required. This accuracy depends on the MTF measurement method and on the CBCT reconstruction algorithm. In this work, the accuracy of the MTF measurement of a CBCT system using a wire phantom is validated by Monte Carlo simulation. The Monte Carlo simulation software BEAMnrc/EGSnrc was employed to model the X-ray beams and radiation transport. Tungsten wires were simulated with different diameters and radial distances from the axis of rotation. We adopted the filtered back projection technique to reconstruct images from a 360° acquisition. The MTFs for four reconstruction kernels were measured from the corresponding reconstructed wire images; the Ram-Lak kernel yielded a higher MTF than the cosine, Hamming and Hann kernels. The results demonstrated that the MTF degrades radially from the axis of rotation. This study suggests that an increase in the MTF of a CBCT system is possible by optimizing the scanning settings and reconstruction parameters.
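
    The wire-based MTF evaluation reduces to a standard line-spread-function calculation; the sketch below illustrates it with a synthetic Gaussian profile standing in for a reconstructed wire image. The profile width and pixel size are invented.

```python
import numpy as np

pixel_mm = 0.2
x = (np.arange(256) - 128) * pixel_mm
lsf = np.exp(-x**2 / (2 * 0.4**2))        # synthetic LSF, sigma = 0.4 mm

# MTF = magnitude of the Fourier transform of the LSF,
# normalised to unity at zero frequency.
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freqs = np.fft.rfftfreq(x.size, d=pixel_mm)   # cycles / mm

# Report the MTF at a few spatial frequencies.
for f_target in (0.5, 1.0, 2.0):
    i = np.argmin(np.abs(freqs - f_target))
    print(f"MTF at {freqs[i]:.2f} lp/mm: {mtf[i]:.3f}")
```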

  10. Blind receiver for OFDM systems via sequential Monte Carlo in factor graphs

    Institute of Scientific and Technical Information of China (English)

    CHEN Rong; ZHANG Hai-bin; XU You-yun; LIU Xin-zhao

    2007-01-01

    Estimation and detection algorithms for orthogonal frequency division multiplexing (OFDM) systems can be developed based on sum-product algorithms, which operate by message passing in factor graphs. In this paper, we apply sampling (Monte Carlo) methods to factor graphs, so that the integrals in the sum-product algorithm can be approximated by sums, which reduces complexity. A blind receiver for OFDM systems can be derived via sequential Monte Carlo (SMC) in factor graphs; the previous SMC blind receiver can be regarded as a special case of the sum-product algorithm with sampling methods. The previous SMC blind receiver for OFDM systems needs to generate samples of the channel vector, assuming the channel has an a priori Gaussian distribution. In the newly built blind receiver, we generate samples of virtual pilots instead of the channel vector, since the channel vector can easily be computed from the virtual pilots. As the size of the virtual-pilot space is much smaller than that of the channel-vector space, only a small number of samples is necessary, and blind detection becomes much simpler. Furthermore, only one pilot tone is needed to resolve phase ambiguity, and differential encoding is no longer used. Finally, the results of computer simulations demonstrate that the proposed receiver performs well while providing a significant complexity reduction.

  11. A Quantum Monte Carlo Study of mono(benzene)TM and bis(benzene)TM Systems

    CERN Document Server

    Bennett, M Chandler; Mitas, Lubos

    2016-01-01

    We present a study of mono(benzene)TM and bis(benzene)TM systems, where TM={Mo,W}. We calculate the binding energies by quantum Monte Carlo (QMC) approaches and compare the results with other methods and available experiments. The orbitals for the determinantal part of each trial wave function were generated from several types of DFT in order to optimize for fixed-node errors. We estimate and compare the size of the fixed-node errors for both the Mo and W systems with regard to the electron density and degree of localization in these systems. For the W systems we provide benchmarking results of the binding energies, given that experimental data is not available.

  12. Performance and economic risk evaluation of dispersed solar thermal power systems by Monte Carlo simulation

    Science.gov (United States)

    Manvi, R.; Fujita, T.

    1978-01-01

    A preliminary comparative evaluation of dispersed solar thermal power plants utilizing advanced technologies available in the 1985-2000 time frame is under way at JPL. The solar power plants considered, of 50 kWe to 10 MWe size, are equipped with two-axis-tracking parabolic-dish concentrator systems operating at temperatures in excess of 1000°F. The energy conversion schemes under consideration include advanced steam, open- and closed-cycle gas turbines, Stirling, and combined cycles. The energy storage options include advanced batteries, liquid metal, and chemical storage. This paper outlines a simple methodology for a probabilistic assessment of such systems. Sources of uncertainty in the development of advanced systems are identified, and a Monte Carlo computer simulation is exercised to analyze the tradeoff between the risk of failure and the potential for large gains. Frequency distributions of energy cost for several alternatives are presented.
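
    A minimal sketch of this kind of Monte Carlo risk assessment follows: uncertain cost and performance inputs are sampled, and the resulting distribution of energy cost is examined. The distributions, bounds and fixed charge rate below are invented placeholders, not the JPL study's inputs.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Uncertain inputs (distributions and bounds are illustrative).
capital_per_kw = rng.triangular(1500, 2500, 5000, N)   # $/kWe installed
capacity_factor = rng.uniform(0.20, 0.35, N)           # fraction of the year
fixed_charge_rate = 0.18                               # capital recovery, 1/yr

# Levelized energy cost: annualized capital divided by annual output.
kwh_per_kw_year = 8760.0 * capacity_factor
energy_cost = capital_per_kw * fixed_charge_rate / kwh_per_kw_year   # $/kWh

print("mean $/kWh:", round(energy_cost.mean(), 3))
print("90th percentile $/kWh:", round(np.percentile(energy_cost, 90), 3))
```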

  13. Principle of Line Configuration and Monte-Carlo Simulation for Shared Multi-Channel System

    Institute of Scientific and Technical Information of China (English)

    MIAO Changyun; DAI Jufeng; BAI Zhihui

    2005-01-01

    Based on the steady-state solution of a finite-state birth-and-death process, the principle of line configuration for a shared multi-channel system is analyzed. Equations for the call congestion ratio and the channel utilization ratio are deduced, and a visualized data analysis is presented. The analysis indicates that, when calculated with the proposed equations, the overestimation of the call congestion ratio and channel utilization ratio can be rectified, and the cost of channels in a small system can thereby be reduced by 20%. Line configuration methods are provided with MATLAB programming. In order to show the dynamic running of the system generally and intuitively, and to analyze and improve it, the system is simulated using an M/M/n/n/m queuing model and the Monte Carlo method. The simulation also validates the correctness of the theoretical analysis and of the optimized configuration method.
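
    A minimal sketch of the birth-and-death calculation behind such a line-configuration analysis: the steady-state probabilities of the M/M/n/n/m (Engset-type) chain are built recursively, from which call congestion and channel utilization follow. Written in Python rather than the paper's MATLAB; all traffic values are invented.

    ```python
    import numpy as np

    def engset(m, n, lam, mu):
        """Steady state of the M/M/n/n/m birth-death chain:
        m sources, n channels, per-idle-source call rate lam, service rate mu."""
        p = np.zeros(n + 1)
        p[0] = 1.0
        for k in range(n):
            p[k + 1] = p[k] * (m - k) * lam / ((k + 1) * mu)  # detailed balance
        p /= p.sum()
        # Call congestion: fraction of *offered* calls finding all channels busy.
        offered = (m - np.arange(n + 1)) * lam * p
        call_cong = offered[n] / offered.sum()
        utilization = np.arange(n + 1) @ p / n                # mean busy fraction
        return call_cong, utilization

    cc, util = engset(m=30, n=8, lam=0.1, mu=1.0)
    print(f"call congestion = {cc:.4f}, channel utilization = {util:.3f}")
    ```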

  14. A remote verification system for international safeguards: Status of the RECOVER programme in the IAEA

    International Nuclear Information System (INIS)

    The paper describes the current status of the Remote Continual Verification (RECOVER) programme being conducted in the Agency. The programme, which started in 1979 with support from the U.S. Arms Control and Disarmament Agency, has as participants several other Member States besides the U.S.A., namely Australia, Bulgaria, Canada, the Federal Republic of Germany, Japan and the United Kingdom; its aim is to demonstrate a new system designed to improve the efficiency and effectiveness of international safeguards. The paper briefly explains the RECOVER concept and the functions of the various components of the system, such as the Monitoring Unit (MU), the On-Site Multiplexer (OSM) and the Resident Verification Unit (RVU). Results obtained since the installation of the RVU at IAEA Headquarters at the beginning of 1980 are outlined. The feasibility of secure transmission of data via public telephone lines was demonstrated. In the initial design these data were to be status data of containment/surveillance devices such as cameras, seals and CCTV; a later modification permitted the transmission of alphanumerical text, such as certain inspection forms, via RECOVER. As testing and demonstration of the RECOVER system progressed, it was felt necessary to evaluate the costs and benefits of implementing RECOVER for IAEA safeguards; a draft report prepared by the Brookhaven National Laboratory on this subject was the main topic of discussion at the last meeting of RECOVER participants, held in June 1982. The paper concludes by identifying certain tasks that need to be successfully completed before RECOVER is ready for routine use by the Agency. (author)

  15. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314 Tank Farm Restoration and Safe Operations

    Energy Technology Data Exchange (ETDEWEB)

    MCGREW, D.L.

    1999-09-28

    This Requirements Verification Report (RVR) for the Project W-314 ''AN Farm to 200E Waste Transfer System'' package provides documented verification of design compliance with all applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate.

  16. Clinical evaluation of fast electron Monte Carlo dose calculation algorithms for treatment planning systems validation with experimental measurements and EGSnrc Monte Carlo simulation

    OpenAIRE

    Edimo, Paul

    2012-01-01

    The present study is focused on the clinical validation of two electron Monte Carlo (eMC) based treatment planning systems (TPS), Oncentra MasterPlan TPS (OMTPS) and XiO eMC. We present a new approach to the commissioning process, based on (a) homogeneous water phantom validation, (b) heterogeneous phantom validation with film measurements and (c) full MC validation. As a first step, MC models of electron beams (4, 8, 12 and 18 MeV) from an Elekta SL25 medical linear accelerator were built...

  17. SU-E-T-24: A Simple Correction-Based Method for Independent Monitor Unit (MU) Verification in Monte Carlo (MC) Lung SBRT Plans

    International Nuclear Information System (INIS)

    Purpose: Lung SBRT delivers hypo-fractionated dose in small non-IMRT fields with tissue-heterogeneity-corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MUs obtained from the iPlan XVMC algorithm against a spreadsheet-based hand calculation using the most common simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions for PTV V100%=95% were studied. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1 to 106.5 cc (average 48.6 cc). MC SBRT plans were generated using a combination of non-coplanar conformal arcs/beams with the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX equipped with micro-MLCs and a 6 MV-SRS (1000 MU/min) beam. These plans were re-computed using the heterogeneity-corrected Pencil-Beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): Individual Correction. For the independent second check, MC MUs were verified using a TMR-based hand calculation, from which an average ICF was obtained: Average Correction; the TMR-based hand calculation systematically underestimated MC MUs by ~5%. Also, the first 10 MC plans were verified with an ion-chamber measurement using a homogeneous phantom. Results: For both beams/arcs, the mean PB-hete dose was systematically overestimated by 5.5±2.6% and the mean hand-calculated MUs were systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC to within -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After the average 5% correction, hand-calculated MUs matched XVMC to within 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. Only a small dependence on tumor volume (TV)/field size (FS) was observed. Ion-chamber measurements were within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to XVMC.

  18. SU-E-T-24: A Simple Correction-Based Method for Independent Monitor Unit (MU) Verification in Monte Carlo (MC) Lung SBRT Plans

    Energy Technology Data Exchange (ETDEWEB)

    Pokhrel, D; Badkul, R; Jiang, H; Estes, C; Kumar, P; Wang, F [UniversityKansas Medical Center, Kansas City, KS (United States)

    2014-06-01

    Purpose: Lung SBRT delivers hypo-fractionated dose in small non-IMRT fields with tissue-heterogeneity-corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MUs obtained from the iPlan XVMC algorithm against a spreadsheet-based hand calculation using the most common simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions for PTV V100%=95% were studied. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1 to 106.5 cc (average 48.6 cc). MC SBRT plans were generated using a combination of non-coplanar conformal arcs/beams with the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX equipped with micro-MLCs and a 6 MV-SRS (1000 MU/min) beam. These plans were re-computed using the heterogeneity-corrected Pencil-Beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): Individual Correction. For the independent second check, MC MUs were verified using a TMR-based hand calculation, from which an average ICF was obtained: Average Correction; the TMR-based hand calculation systematically underestimated MC MUs by ~5%. Also, the first 10 MC plans were verified with an ion-chamber measurement using a homogeneous phantom. Results: For both beams/arcs, the mean PB-hete dose was systematically overestimated by 5.5±2.6% and the mean hand-calculated MUs were systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC to within -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After the average 5% correction, hand-calculated MUs matched XVMC to within 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. Only a small dependence on tumor volume (TV)/field size (FS) was observed. Ion-chamber measurements were within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to XVMC.
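
    The correction scheme in this abstract reduces to a ratio of doses: per-beam ICFs are the PB-hete/MC dose ratios, and a hand-calculated MU can be corrected either beam by beam or with the ~5% average factor. The sketch below illustrates the arithmetic with invented numbers; it is not the authors' spreadsheet.

    ```python
    import numpy as np

    # Hypothetical per-beam doses (cGy) from the two algorithms for one plan;
    # all values are illustrative only.
    d_pb_hete = np.array([105.2, 103.8, 106.9, 104.4])  # Pencil-Beam, hetero-corrected
    d_mc      = np.array([100.1,  99.0, 100.8,  98.7])  # XVMC reference

    # Beam-by-beam inhomogeneity correction factors (Individual Correction).
    icf = d_pb_hete / d_mc
    print("per-beam ICFs:", np.round(icf, 3))

    # A TMR-style hand calculation that tracks PB-hete can be corrected to MC:
    mu_hand = np.array([212.0, 198.0, 220.0, 205.0])    # hand-calculated MUs
    mu_corr_individual = mu_hand * icf                  # per-beam correction
    mu_corr_average    = mu_hand * icf.mean()           # single average ICF (~1.05)
    print("average ICF:", round(float(icf.mean()), 3))
    ```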

  19. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    Science.gov (United States)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to verify these experiments objectively and to clarify the problems of current MEP systems from this shared experience. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spread grew as the forecast time increased, and the ensemble mean improved on the forecast errors of the individual control forecasts in verification against the analysis fields. However, each system exhibited individual characteristics according to its MEP method. Some participants used physical perturbation methods, whose significance was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast, which suggests that careful attention must be paid to physical perturbations.
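
    The verification quantities used in such a study -- ensemble spread, mean error (ME), and the RMSE of the ensemble mean versus the control -- are straightforward to compute on a common grid. The sketch below uses synthetic data to show how the ensemble mean can reduce RMSE while leaving the shared bias untouched; it is a generic illustration, not the B08RDP verification code.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_members, n_points = 11, 500

    # Synthetic stand-ins for an analysis field and forecasts on a common grid.
    truth = rng.normal(0.0, 1.0, n_points)
    control = truth + rng.normal(0.3, 0.8, n_points)           # biased control run
    members = truth + rng.normal(0.3, 0.8, (n_members, n_points))

    ens_mean = members.mean(axis=0)
    spread = members.std(axis=0, ddof=1).mean()                # mean ensemble spread

    me_control = (control - truth).mean()                      # mean error (bias)
    rmse_control = np.sqrt(((control - truth) ** 2).mean())
    rmse_mean = np.sqrt(((ens_mean - truth) ** 2).mean())

    print(f"spread={spread:.2f}  ME(control)={me_control:.2f}")
    print(f"RMSE control={rmse_control:.2f}  RMSE ensemble mean={rmse_mean:.2f}")
    ```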

  20. Monte Carlo calculations on the magnetization profile and domain wall structure in bulk systems and nanoconstrictions

    Energy Technology Data Exchange (ETDEWEB)

    Serena, P. A. [Instituto de Ciencias de Materiales de Madrid, Madrid (Spain); Costa-Kraemer, J. L. [Instituto de Microelectronica de Madrid, Madrid (Spain)

    2001-03-01

    A Monte Carlo algorithm suitable for studying systems described by an anisotropic Heisenberg Hamiltonian is presented. The technique has been tested successfully on 3D and 2D systems, illustrating how magnetic properties depend on dimensionality and coordination number. We find that the magnetic properties of constrictions differ from those of the bulk. In particular, spin fluctuations are considerably larger than those calculated for bulk materials. In addition, domain walls are strongly modified when a constriction is present, with a decrease of the domain-wall width; this decrease is explained in terms of previous theoretical work.
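
    A minimal Metropolis sketch for an anisotropic (easy-axis) Heisenberg model on a 2D lattice, in the spirit of the algorithm described above. The Hamiltonian H = -J Σ S_i·S_j - D Σ (S_i^z)^2, the lattice size, and all parameters are illustrative assumptions, not the authors' setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def metropolis_heisenberg(L=16, J=1.0, D=0.5, T=1.0, sweeps=200):
        """Metropolis sampling of a 2D easy-axis Heisenberg model:
           H = -J sum_<ij> S_i . S_j  -  D sum_i (S_i^z)^2 ."""
        # Random unit spins on an L x L lattice with periodic boundaries.
        S = rng.normal(size=(L, L, 3))
        S /= np.linalg.norm(S, axis=2, keepdims=True)

        def local_field(i, j):
            return J * (S[(i+1) % L, j] + S[(i-1) % L, j] +
                        S[i, (j+1) % L] + S[i, (j-1) % L])

        for _ in range(sweeps):
            for _ in range(L * L):
                i, j = rng.integers(L, size=2)
                new = rng.normal(size=3)
                new /= np.linalg.norm(new)
                h = local_field(i, j)
                # Energy change of replacing spin (i,j) by the trial spin.
                dE = -(new - S[i, j]) @ h - D * (new[2]**2 - S[i, j][2]**2)
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    S[i, j] = new
        return S

    S = metropolis_heisenberg()
    print("mean |m_z| =", abs(S[..., 2].mean()))
    ```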

  1. FOCUS, Neutron Transport System for Complex Geometry Reactor Core and Shielding Problems by Monte-Carlo

    International Nuclear Information System (INIS)

    1 - Description of problem or function: FOCUS enables the calculation of any quantity related to neutron transport in reactor or shielding problems, but was especially designed to calculate differential quantities, such as point values at one or more of the space, energy, direction and time variables of quantities like neutron flux, detector response, reaction rate, etc., or averages of such quantities over a small volume of the phase space. Different types of problems can be treated: systems with a fixed neutron source, which may be a mono-directional source located outside the system, and eigenfunction problems in which the neutron source distribution is given by the (unknown) fundamental-mode eigenfunction distribution. Using Monte Carlo methods, complex 3-dimensional geometries and detailed cross-section information can be treated. Cross-section data are derived from ENDF/B, with anisotropic scattering and discrete or continuous inelastic scattering taken into account. Energy is treated as a continuous variable and time dependence may also be included. 2 - Method of solution: A transformed form of the adjoint Boltzmann equation in integral representation is solved for the space, energy, direction and time variables by Monte Carlo methods. Adjoint particles are defined with properties in some respects contrary to those of neutrons. Adjoint particle histories are constructed from which estimates are obtained of the desired quantity. Adjoint cross sections are defined with which the nuclide and reaction type are selected in a collision. The energy after a collision is selected from adjoint energy distributions calculated, together with the adjoint cross sections, in advance of the actual Monte Carlo calculation. For multiplying systems, successive generations of adjoint particles are obtained; these will die out for subcritical systems with a fixed neutron source and are kept approximately stationary for eigenfunction problems. Completely arbitrary problems can be treated.

  2. A ROBUST GA/KNN BASED HYPOTHESIS VERIFICATION SYSTEM FOR VEHICLE DETECTION

    Directory of Open Access Journals (Sweden)

    Nima Khairdoost

    2015-03-01

    Vehicle detection is an important issue in driver assistance systems and self-guided vehicles, and comprises two stages: hypothesis generation and hypothesis verification. In the first stage, potential vehicles are hypothesized; in the second stage, all hypotheses are verified. The focus of this work is on the second stage. We extract Pyramid Histogram of Oriented Gradients (PHOG) features from a traffic image as candidate feature vectors for vehicle detection. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are applied to these PHOG feature vectors in parallel, as dimension-reduction and feature-selection tools. After feature fusion, we use a Genetic Algorithm (GA) and a cosine-similarity-based K Nearest Neighbor (KNN) classifier to improve the performance and generalization of the features (a minimal sketch of this pipeline follows). Our tests show good classification accuracy: more than 97% of realistic on-road vehicle images are classified correctly.
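
    A condensed sketch of such a hypothesis-verification pipeline using scikit-learn, with random arrays standing in for PHOG features and the GA-based feature weighting omitted; everything here is illustrative, not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split

    # Stand-ins for PHOG feature vectors: X (n_samples x n_features), y in {0,1}.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 168))
    y = rng.integers(0, 2, 400)
    X[y == 1] += 0.5                      # make the toy classes separable

    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

    # Dimension reduction / feature selection applied in parallel, then fused.
    pca = PCA(n_components=30).fit(Xtr)
    lda = LinearDiscriminantAnalysis(n_components=1).fit(Xtr, ytr)
    fuse = lambda A: np.hstack([pca.transform(A), lda.transform(A)])

    # Cosine-similarity KNN on the fused features (GA weighting omitted here).
    knn = KNeighborsClassifier(n_neighbors=5, metric="cosine").fit(fuse(Xtr), ytr)
    print("accuracy:", knn.score(fuse(Xte), yte))
    ```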

  3. Tank waste remediation system FSAR hazard identification/facility configuration verification report

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, D.P., Westinghouse Hanford

    1996-05-01

    This document provides the results of the Tank Waste Remediation System Final Safety Analysis Report (TWRS FSAR) hazard identification/facility configuration activities undertaken during the period from March 7, 1996 to May 31, 1996. The purpose of this activity was to provide an independent overview of the TWRS facility-specific hazards and configurations that were used in support of the TWRS FSAR hazard and accident analysis development. It was based on a review of existing published documentation and field inspections. The objective of the verification effort was to provide a 'snapshot' in time of the existing TWRS facility hazards and configurations, to be used in support of hazard and accident analysis activities.

  4. A physical zero-knowledge object comparison system for nuclear warhead verification

    CERN Document Server

    Philippe, Sébastien; Glaser, Alexander; d'Errico, Francesco

    2016-01-01

    Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. In addition to the use of such a technique as part of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information, we provide definitive evidence that measuring sensitive data is not required to perform comparisons of physical properties.

  5. Parallel and optimized genetic Elman network for 252Cf source-driven verification system

    Institute of Scientific and Technical Information of China (English)

    冯鹏; 魏彪; 金晶

    2015-01-01

    The 252Cf source-driven verification system (SDVS) can recognize the enrichment of fissile material from the enrichment-sensitive autocorrelation functions of a detector signal in 252Cf source-driven noise-analysis (SDNA) measurements. We propose a parallel and optimized genetic Elman network (POGEN) to identify the enrichment of 235U based on the physical properties of the measured autocorrelation functions. Theoretical analysis and experimental results indicate that, for fissile materials of 4 different enrichments, the POGEN-based algorithm obtains identification results with higher recognition accuracy than the integrated autocorrelation function (IAF) method, owing to higher information utilization, a more efficient network architecture, and optimized parameters.

  6. SU-E-T-600: Patient Specific IMRT Verification Using a Phosphor-Screen Based Geometric QA System: A Preliminary Study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M; Hu, E; Yi, B [Univ. of Maryland School Of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: Raven QA (JPLC, MD) is a unified and comprehensive quality assurance system for TG-142 QA that uses a phosphor screen, a mirror system and a camera. The aim of this work is to test whether the device can be used for IMRT QA dosimetry. Methods: A lung IMRT case was used to deliver dose to the Raven QA. The accuracy of the dose distribution in a 5 cm slab phantom computed with the Eclipse planning system (Varian) had been confirmed both by Monte Carlo simulation and by MapCheck (Sun Nuclear) measurement. Geometric distortion and variation of the spatial dose response are corrected after background subtraction. A pin-hole grid plate was designed and used to determine the light scatter in the Raven QA box and the spatial dose response. An optic scatter model was not applied in this preliminary study. Dose was normalized to the response of the 10×10 field, and the TMR at 5 cm depth was taken into account. Results: Setting up the device for IMRT QA takes less than 5 minutes, as with other commercially available devices. It shows excellent dose linearity and dose-rate independence, within 1%. The background signal, however, changes with field size, which is believed to be due to inaccurate correction of optic scatter. The absolute gamma (5%, 5 mm) passing rate was higher than 95% (a sketch of such a gamma computation follows). Conclusion: This study shows that the Raven QA can be used for patient-specific IMRT verification. Part of this study is supported by the Maryland Industrial Partnership Grant.
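
    A brute-force global 2D gamma computation of the kind reported here is compact to write down. The sketch below assumes equal grid spacing in both directions and wraps at the image edges via np.roll, which is acceptable for a demonstration but not for clinical use.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, meas, spacing_mm, dd=0.05, dta_mm=5.0):
        """Brute-force global 2D gamma analysis (dd = fraction of max ref dose,
        dta in mm). Returns the fraction of points with gamma <= 1."""
        dose_tol = dd * ref.max()
        search = int(np.ceil(2 * dta_mm / spacing_mm))   # limit the search window
        gamma = np.full(ref.shape, np.inf)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                # np.roll wraps at edges; fine for a sketch, not for the clinic.
                shifted = np.roll(np.roll(meas, dy, axis=0), dx, axis=1)
                dist2 = ((dy**2 + dx**2) * spacing_mm**2) / dta_mm**2
                dd2 = ((shifted - ref) / dose_tol) ** 2
                gamma = np.minimum(gamma, np.sqrt(dist2 + dd2))
        return (gamma <= 1.0).mean()

    ref = np.random.default_rng(3).random((50, 50)) + 1.0
    meas = ref * 1.02                                    # 2% scaled "measurement"
    print(f"gamma (5%/5mm) pass rate: {gamma_pass_rate(ref, meas, 2.0):.3f}")
    ```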

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT-A AND A ENVIRONMENTAL SEALS, INC., SEAL ASSIST SYSTEM (SAS) PHASE II REPORT

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of Seal Assist System (SAS) for natural gas reciprocating compressor rod packing manufactured by A&A Environmental Seals, Inc. The SAS uses a secondary containment gland to collect natural g...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: RESIDENTIAL ELECTRIC POWER GENERATION USING THE PLUG POWER SU1 FUEL CELL SYSTEM

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Plug Power SU1 Fuel Cell System manufactured by Plug Power. The SU1 is a proton exchange membrane fuel cell that requires hydrogen (H2) as fuel. H2 is generally not available, so the ...

  9. 40 CFR 1065.309 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers...

    Science.gov (United States)

    2010-07-01

    ... sampling system at the same time. If you use any analog or real-time digital filters during emission testing, you must operate those filters in the same manner during this verification. (2) Equipment setup... same time. In designing your experimental setup, avoid pressure pulsations due to stopping the...

  10. Video-based cargo fire verification system with fuzzy inference engine for commercial aircraft

    Science.gov (United States)

    Sadok, Mokhtar; Zakrzewski, Radek; Zeliff, Bob

    2005-02-01

    Conventional smoke detection systems currently installed onboard aircraft are often subject to high rates of false alarms. Under current procedures, whenever an alarm is issued the pilot is obliged to release fire extinguishers and to divert to the nearest airport. Aircraft diversions are costly and, in some situations, dangerous. A reliable detection system that minimizes the false-alarm rate and allows continuous monitoring of cargo compartments is therefore highly desirable. A video-based system has recently been developed by Goodrich Corporation to address this problem. The Cargo Fire Verification System (CFVS) is a multi-camera system designed to provide live streaming video to the cockpit crew and to perform hotspot, fire, and smoke detection in aircraft cargo bays. In addition to video frames, the CFVS uses other sensor readings to discriminate between genuine events, such as fire or smoke, and nuisance alarms, such as fog or dust. A Mamdani-type fuzzy inference engine provides approximate reasoning for decision making (a toy sketch follows). In one implementation, Gaussian membership functions for frame-intensity-based features, relative humidity, and temperature are constructed from experimental data to form the inference engine. The CFVS performed better than conventional aircraft smoke detectors in all standardized tests.
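
    A toy version of such a fuzzy decision stage: Gaussian memberships fuzzify the intensity, humidity and temperature readings, rule strengths are combined with min, and a weighted average of the rule consequents gives a fire score. Membership parameters and rules are invented for illustration, and the weighted-average defuzzification is a simplification of a full Mamdani engine.

    ```python
    import numpy as np

    gauss = lambda x, mu, sig: np.exp(-0.5 * ((x - mu) / sig) ** 2)

    def fire_likelihood(intensity, humidity, temperature):
        """Toy fuzzy inference with Gaussian memberships (parameters invented)."""
        # Fuzzify the crisp inputs.
        bright = gauss(intensity, 200.0, 40.0)     # frame intensity "high"
        dry    = gauss(humidity, 10.0, 15.0)       # relative humidity "low"
        hot    = gauss(temperature, 70.0, 20.0)    # temperature "high"
        humid  = gauss(humidity, 90.0, 20.0)       # relative humidity "high"

        # Rule firing strengths (min as the AND operator).
        fire_rule     = min(bright, dry, hot)      # bright AND dry AND hot -> fire
        nuisance_rule = min(bright, humid)         # bright AND humid -> fog/dust

        # Defuzzify: weighted average of consequents (fire=1, nuisance=0).
        total = fire_rule + nuisance_rule + 1e-9
        return fire_rule / total

    print(f"fire score: {fire_likelihood(210, 8, 75):.2f}")    # likely fire
    print(f"fog  score: {fire_likelihood(210, 95, 25):.2f}")   # likely nuisance
    ```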

  11. Automated Verification of Memory Consistencies of DSM System on Unified Framework

    Directory of Open Access Journals (Sweden)

    Pankaj Kumar; Durgesh Kumar

    2012-12-01

    The consistency model of a DSM system specifies the ordering constraints on concurrent memory accesses by multiple processors, and hence has a fundamental impact on a DSM system's programming convenience and implementation efficiency. We propose a structural model for the automated verification of the memory consistencies of DSM systems. DSM allows processes to assume a globally shared virtual memory even though they execute on nodes that do not physically share memory. The DSM software provides the abstraction of a globally shared memory, in which each processor can access any data item without the programmer having to worry about where the data is or how to obtain its value. In contrast, in the native programming model on networks of workstations (message passing), the programmer must decide when a processor needs to communicate, with whom to communicate, and what data to send. On a DSM system the programmer can focus on algorithmic development rather than on managing partitioned data sets and communicating values. The programming interfaces to DSM systems may differ in a variety of respects. The memory model refers to how updates to distributed shared memory are reflected to the processes in the system. The most intuitive model of distributed shared memory is that a read should always return the last value written; unfortunately, the notion of the last value written is not well defined in a distributed system.

  12. Monte Carlo filters for identification of nonlinear structural dynamical systems

    Indian Academy of Sciences (India)

    C S Manohar; D Roy

    2006-08-01

    The problem of identification of parameters of nonlinear structures using dynamic state estimation techniques is considered. The process equations are derived based on principles of mechanics and are augmented by mathematical models that relate a set of noisy observations to state variables of the system. The set of structural parameters to be identified is declared as an additional set of state variables. Both the process equation and the measurement equations are taken to be nonlinear in the state variables and contaminated by additive and (or) multiplicative Gaussian white noise processes. The problem of determining the posterior probability density function of the state variables conditioned on all available information is considered. The utility of three recursive Monte Carlo simulation-based filters, namely, a probability density function-based Monte Carlo filter, a Bayesian bootstrap filter and a filter based on sequential importance sampling, to solve this problem is explored. The state equations are discretized using certain variations of stochastic Taylor expansions enabling the incorporation of a class of non-smooth functions within the process equations. Illustrative examples on identification of the nonlinear stiffness parameter of a Duffing oscillator and the friction parameter in a Coulomb oscillator are presented.
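
    Of the three filters, the Bayesian bootstrap filter is the easiest to sketch: the unknown parameter is appended to the state vector, particles are propagated through the discretized dynamics, weighted by the measurement likelihood, and resampled. The toy below identifies the cubic stiffness of a Duffing oscillator with a crude Euler discretization and invented noise levels; it is only a sketch of the approach, not the paper's formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    dt, T = 0.01, 2000
    c, k, alpha_true = 0.2, 1.0, 3.0       # damping, linear & cubic stiffness

    # Simulate the "true" Duffing oscillator and noisy displacement measurements.
    x = np.zeros((T, 2))
    f = np.sin(0.5 * np.arange(T) * dt)    # harmonic forcing
    for t in range(T - 1):
        xdd = f[t] - c*x[t,1] - k*x[t,0] - alpha_true*x[t,0]**3
        x[t+1] = x[t] + dt * np.array([x[t,1], xdd])
    y = x[:, 0] + rng.normal(0, 0.02, T)   # measurement noise, sd 0.02

    # Bootstrap filter: state = (x, xdot, alpha), alpha as a random-walk state.
    N = 2000
    p = np.zeros((N, 3)); p[:, 2] = rng.uniform(0.0, 6.0, N)   # alpha prior
    for t in range(T):
        w = np.exp(-0.5 * ((y[t] - p[:, 0]) / 0.02) ** 2)      # likelihood weights
        w /= w.sum()
        p = p[rng.choice(N, N, p=w)]                           # resample
        xdd = f[t] - c*p[:,1] - k*p[:,0] - p[:,2]*p[:,0]**3
        p[:, 0] += dt * p[:, 1]
        p[:, 1] += dt * xdd
        p[:, 2] += rng.normal(0, 0.005, N)                     # parameter jitter

    print(f"estimated alpha = {p[:, 2].mean():.2f} (true {alpha_true})")
    ```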

  13. Observation and Confirmation of Six Strong Lensing Systems in The Dark Energy Survey Science Verification Data

    Energy Technology Data Exchange (ETDEWEB)

    Nord, B.; et al.

    2015-12-09

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey (DES) data. Through visual inspection of data from the Science Verification (SV) season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-Object Spectrograph (GMOS) at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph (IMACS) at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: Three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 were either not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy cluster-scale lenses. The lensed sources range in redshift z ~ 0.80-3.2, and in i-band surface brightness i_{SB} ~ 23-25 mag/sq.-arcsec. (2" aperture). For each of the six systems, we estimate the Einstein radius and the enclosed mass, which have ranges ~ 5.0 - 8.6" and ~ 7.5 x 10^{12} - 6.4 x 10^{13} solar masses, respectively.

  14. Architecture and critical technologies of seismic information system in CTBT verification

    Institute of Scientific and Technical Information of China (English)

    ZHENG Xue-feng; SHEN Jun-yi; JIN Ping; ZHENG Jiang-ling; SUN Peng; ZHANG Hui-min; WANG Tong-dong

    2006-01-01

    Seismic monitoring is one of the most important approaches for ground-based nuclear explosion monitoring. In order to improve the monitoring capability for low-magnitude seismic events, a seismic information system was developed using geographic information system and database technologies. This paper describes the design and critical technologies of the Seismic Information System in CTBT Verification, developed on the ArcGIS and ORACLE platforms. It combines the database storage framework, an application programming interface, and graphic application software so that users can meet their monitoring objectives. By combining the ArcSDE Geodatabase, the ORACLE RDBMS, and ArcObjects COM development techniques, multi-source data have been seamlessly integrated while preserving most functions of ORACLE, such as consistency, concurrent access, and security mechanisms. For easy access to the information system we developed two different mechanisms: the first is a menu-driven internal system that runs on NT platforms; the second is LAN-based and easily accessible from any web browser.

  15. Monte Carlo simulation of the neutron guide system for the SNS engineering diffractometer

    Energy Technology Data Exchange (ETDEWEB)

    Wang, X.L.; Lee, W.T. [Spallation Neutron Source, Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2001-03-01

    VULCAN, the SNS engineering diffractometer, is designed to tackle a broad range of engineering problems, from residual stress distribution in components to materials response under loading. In VULCAN, neutrons are delivered to the sample position via a series of straight and curved neutron guides. An interchangeable guide-collimator system is planned in the incident beam path, allowing the instrument to be optimally configured for individual experiments with different intensity-resolution requirements. To achieve maximum data rate and large d-spacing coverage, detectors are employed continuously from 60deg to 150deg in the horizontal scattering plane and -30deg to 30deg in the vertical plane. To enable simultaneous small angle scattering measurements for characterization of the microstructure, the instrument is also equipped with a position sensitive area detector. Monte Carlo simulation indicates that the proposed neutron guide system is able to deliver the desired intensity and resolution. (author)

  16. 3-D Monte Carlo analyses of shielding system in tokamak fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gallina, M.; Petrizzi, L.; Rado, V.

    1990-09-01

    Within the framework of the ITER (International Tokamak Experimental Reactor) design program, 3D neutronics calculations were carried out to assess the shielding system performance in the basic machine configuration by means of the Monte Carlo Neutron Photon (MCNP) code (3-B version). The main issue concerns the estimation of the nuclear heat and radiation loads on the toroidal-field superconducting coils. 'Self-generated weight windows' (w.w.) and source-biasing techniques were used to treat the deep penetration through the bulk shield and the streaming through the system gaps and openings. The main results are reported together with a discussion of the computing methods, especially the variance reduction techniques adopted.

  17. 3-D Monte Carlo analyses of the shielding system in a tokamak fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gallina, M.; Petrizzi, L.; Rado, V. (ENEA, Frascati (Italy). Centro Ricerche Energia)

    1990-01-01

    As part of the ITER (International Tokamak Experimental Reactor) design program, 3D neutronics calculations have been carried out to assess the shielding system performance in the basic machine configuration by means of the Monte Carlo Neutron Photon (MCNP) transport code (3-B version). The main issue is the estimation of the nuclear heat and radiation loads on the toroidal field superconducting coils. ''Self generated weight windows'' and source biasing technique have been used to treat deep penetration through the bulk shield and streaming through the system gaps and openings. The main results are reported together with a discussion of the computing methods, especially of the variance reduction techniques adopted. (author).

  18. Comparative study among simulations of an internal monitoring system using different Monte Carlo codes

    International Nuclear Information System (INIS)

    Computational Monte Carlo (MC) codes have been used for the simulation of nuclear installations, mainly for the internal monitoring of workers with Whole Body Counters (WBC). The main goal of this project was the modeling and simulation of the counting efficiency (CE) of a WBC system using three different MC codes: MCNPX, EGSnrc and VMC in-vivo. The simulations were performed by three different groups of analysts. The results show differences between the three codes, as well as between results obtained with the same code when modeled by different analysts. Moreover, all results were compared to experimental results obtained in the laboratory for validation and final comparison. In conclusion, it was possible to detect the influence on the results when the system is modeled by different analysts using the same MC code, and to determine which MC code best matched the experimental data. (author)

  19. COLLI-PTB, Neutron Fluence Spectra for 3-D Collimator System by Monte-Carlo

    International Nuclear Information System (INIS)

    1 - Description of program or function: For optimizing collimator systems (shieldings) for fast neutrons with energies between 10 keV and 20 MeV. Only elastic and inelastic neutron scattering processes are involved. An isotropic angular distribution for inelastic scattering in the center-of-mass system is assumed. 2 - Method of solution: The Monte Carlo method with importance sampling, splitting and Russian roulette is used (a sketch of the splitting/roulette step follows); neutron attenuation and scattering kinematics are taken into account. 3 - Restrictions on the complexity of the problem: Energy range from 10 keV to 20 MeV. Any bin width is possible for the output spectra, which are confined to 40 equidistant channels.
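
    The splitting/Russian-roulette step mentioned above can be captured in a few lines: particles whose importance-weighted weight is large are split into several lower-weight copies, and light particles survive a roulette game with their weight restored, keeping the estimator unbiased. The thresholds below are illustrative, not COLLI-PTB's.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    def roulette_or_split(weight, importance, w_ref=1.0):
        """Weight-window style variance reduction: split heavy particles in
        important regions, play Russian roulette on light ones."""
        w = weight * importance
        if w > 2.0 * w_ref:                  # split into n copies, total weight w
            n = int(w / w_ref)
            return [w / n] * n
        if w < 0.5 * w_ref:                  # Russian roulette
            if rng.random() < w / w_ref:
                return [w_ref]               # survivor carries the reference weight
            return []                        # killed; expected weight preserved
        return [w]

    print(roulette_or_split(3.0, 1.0))   # -> split into 3 particles of weight 1
    print(roulette_or_split(0.2, 1.0))   # -> [] (killed) or [1.0] (survives)
    ```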

  20. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography.

    Science.gov (United States)

    Saha, Krishnendu; Straus, Kenneth J; Chen, Yu; Glick, Stephen J

    2014-08-28

    To maximize sensitivity, it is desirable that ring positron emission tomography (PET) systems dedicated to imaging the breast have a small bore. Unfortunately, due to parallax error, this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing an accurate system matrix and incorporating it into iterative reconstruction is presented, in an effort to reduce the spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo simulation software was utilized to accurately model the system matrix for a breast PET system. To increase the count statistics in the system-matrix computation and to reduce the storage space, only a subset of matrix elements was calculated, with the rest estimated from the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at a 45% reduced noise level and a 1.5- to 3-fold improvement in resolution when compared to MLEM reconstruction using a simple line-integral model (an MLEM sketch follows). The GATE-based system-matrix reconstruction technique promises to improve resolution and noise performance and to reduce image distortion at the FOV periphery compared to line-integral-based system-matrix reconstruction.
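
    Once a system matrix is available -- however it was computed -- iterative reconstruction such as MLEM applies it directly. The sketch below runs MLEM with a random nonnegative matrix standing in for the GATE-derived, block-circulant one; the symmetry-based storage trick itself is omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_lor, n_vox = 256, 64

    # Stand-in system matrix (n_lor x n_vox); in the paper it comes from a GATE
    # simulation and is block-circulant in a polar voxel basis (not modeled here).
    A = rng.random((n_lor, n_vox)) ** 4      # nonnegative, sparse-ish weights
    x_true = 50 * rng.random(n_vox)          # "activity" per voxel
    y = rng.poisson(A @ x_true)              # simulated coincidence counts

    # MLEM update: x <- x * [A^T (y / A x)] / (A^T 1)
    x = np.ones(n_vox)
    sens = A.sum(axis=0)                     # sensitivity image, A^T 1
    for _ in range(50):
        ratio = y / np.maximum(A @ x, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)

    print("correlation with truth:", round(float(np.corrcoef(x, x_true)[0, 1]), 3))
    ```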

  1. Observation and Confirmation of Six Strong Lensing Systems in The Dark Energy Survey Science Verification Data

    CERN Document Server

    Nord, B; Lin, H; Diehl, H T; Helsby, J; Kuropatkin, N; Amara, A; Collett, T; Allam, S; Caminha, G; De Bom, C; Desai, S; Dúmet-Montoya, H; Pereira, M Elidaiana da S; Finley, D A; Flaugher, B; Furlanetto, C; Gaitsch, H; Gill, M; Merritt, K W; More, A; Tucker, D; Rykoff, E S; Rozo, E; Abdalla, F B; Agnello, A; Auger, M; Brunner, R J; Kind, M Carrasco; Castander, F J; Cunha, C E; da Costa, L N; Foley, R; Gerdes, D W; Glazebrook, K; Gschwend, J; Hartley, W; Kessler, R; Lagattuta, D; Lewis, G; Maia, M A G; Makler, M; Menanteau, F; Niernberg, A; Scolnic, D; Vieira, J D; Gramillano, R; Abbott, T M C; Banerji, M; Benoit-Lévy, A; Brooks, D; Burke, D L; Capozzi, D; Rosell, A Carnero; Carretero, J; D'Andrea, C B; Dietrich, J P; Doel, P; Evrard, A E; Frieman, J; Gaztanaga, E; Gruen, D; Honscheid, K; James, D J; Kuehn, K; Li, T S; Lima, M; Marshall, J L; Martini, P; Melchior, P; Miquel, R; Neilsen, E; Nichol, R C; Ogando, R; Plazas, A A; Romer, A K; Sako, M; Sanchez, E; Scarpine, V; Schubnell, M; Sevilla-Noarbe, I; Smith, R C; Soares-Santos, M; Sobreira, F; Suchyta, E; Swanson, M E C; Tarle, G; Thaler, J; Walker, A R; Wester, W; Zhang, Y

    2015-01-01

    We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey (DES) data. Through visual inspection of data from the Science Verification (SV) season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-Object Spectrograph (GMOS) at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph (IMACS) at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: Three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 were either not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy cluster-scale lenses. The lensed sources range in redshift z ~ 0.80-3.2...

  2. Portable system for periodical verification of area monitors for neutrons; Sistema portatil para verificacao periodica de monitores de area para neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu, E-mail: rluciane@ird.gov.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Energia Nuclear; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W., E-mail: karla@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI). Lab. de Neutrons

    2009-07-01

    The Neutron Laboratory is developing a project for the construction of a portable test system for verifying the working condition of neutron area monitors. This device will allow users to check, at the installations where the instruments are used, that their calibration is still valid, avoiding the use of equipment with an inadequate response to the neutron beam.

  3. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    International Nuclear Information System (INIS)

    This report describes a technical transfer under the title ''A formal design verification and validation on the human factors of a computerized information system in nuclear power plants''. Human factors requirements for information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying the satisfaction of those requirements. A formalized implementation plan is established for the human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), is developed on a web environment to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluations. DIMS-Web has shown its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  4. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  5. ESTERR-PRO: A Setup Verification Software System Using Electronic Portal Imaging

    Directory of Open Access Journals (Sweden)

    Pantelis A. Asvestas

    2007-01-01

    The purpose of this paper is to present and evaluate the performance of a new software-based registration system for patient setup verification during radiotherapy using electronic portal images. Setup errors can be estimated with the proposed system by either of two registration methods: (a) the portal image of the current fraction of the treatment is registered directly with the reference image (digitally reconstructed radiograph (DRR) or simulator image) using a modified manual technique; (b) the portal image of the current fraction is registered with the portal image of the first fraction (reference portal image) by applying a nearly automated technique based on self-organizing maps, the reference portal having already been registered with a DRR or simulator image. The proposed system was tested on phantom data and on data from six patients. The root mean square error (RMSE) of the setup estimates was 0.8±0.3 (mean value ± standard deviation) for the phantom data and 0.3±0.3 for the patient data, respectively, for the two methodologies. Furthermore, statistical analysis by means of the Wilcoxon nonparametric signed test showed that the results obtained by the two methods did not differ significantly (P value > 0.05).

  6. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  7. Kinetic Monte Carlo simulation of dopant-defect systems under submicrosecond laser thermal processes

    Energy Technology Data Exchange (ETDEWEB)

    Fisicaro, G.; Pelaz, Lourdes; Lopez, P.; Italia, M.; Huet, K.; Venturini, J.; La Magna, A. [CNR IMM, Z.I. VIII Strada 5, I-95121 Catania (Italy); Department of Electronics, University of Valladolid, 47011 Valladolid (Spain); Excico, 13-21 Quai des Gresillons, 92230 Gennevilliers (France)]

    2012-11-06

    An innovative kinetic Monte Carlo (KMC) code has been developed that governs the post-implant kinetics of the defect system in the extremely far-from-equilibrium conditions caused by laser irradiation close to the liquid-solid interface. It considers defect diffusion, annihilation and clustering. Consistent with the stochastic formalism, the code properly implements the rapidly varying local event rates related to the evolution of the thermal field T(r,t) (a sketch of one way to sample events with a time-varying rate follows). This feature of our numerical method represents an important advance with respect to current state-of-the-art KMC codes. The reduction of the implantation damage and its reorganization into defect aggregates are studied as a function of the process conditions. Phosphorus activation efficiency, experimentally determined under similar conditions, has been related to the emerging damage scenario.
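
    One standard way to honor fast-varying event rates in a stochastic simulation is thinning: propose event times with a bounding constant rate and accept them with probability rate(t)/rate_max. The sketch below applies this to a single Arrhenius-type hop rate driven by an invented laser temperature pulse; it is a generic illustration, not the authors' code.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    def next_event_time(t, rate_fn, rate_max, t_end):
        """Sample the next event of a time-inhomogeneous Poisson process by
        thinning: propose with the bounding rate, accept with rate(t)/rate_max."""
        while t < t_end:
            t += rng.exponential(1.0 / rate_max)          # candidate event time
            if rng.random() < rate_fn(t) / rate_max:      # accept with true rate
                return t
        return None

    # Illustrative Arrhenius-like defect-hop rate driven by a thermal pulse.
    kB, Ea, nu = 8.617e-5, 1.0, 1e13                      # eV/K, eV, 1/s
    T_of_t = lambda t: 300.0 + 1200.0 * np.exp(-((t - 5e-7) / 2e-7) ** 2)
    rate = lambda t: nu * np.exp(-Ea / (kB * T_of_t(t)))

    t, events = 0.0, 0
    # rate(5e-7) is the peak (and hence bounding) rate for this pulse.
    while (t := next_event_time(t, rate, rate(5e-7), 2e-6)) is not None:
        events += 1
    print("events during the pulse:", events)
    ```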

  8. Monte Carlo Studies for the Calibration System of the GERDA Experiment

    CERN Document Server

    Baudis, Laura; Froborg, Francis; Tarka, Michal

    2013-01-01

    The GERmanium Detector Array, GERDA, searches for neutrinoless double beta decay in Ge-76 using bare high-purity germanium detectors submerged in liquid argon. For the calibration of these detectors, gamma-emitting sources have to be lowered from their parking position on top of the cryostat over more than five meters down to the germanium crystals. With the help of Monte Carlo simulations, the relevant parameters of the calibration system were determined. It was found that three Th-228 sources with an activity of 20 kBq each, at two different vertical positions, will be necessary to reach sufficient statistics in all detectors in less than four hours of calibration time. These sources will contribute to the background of the experiment with a total of (1.07 ± 0.04(stat) +0.13/-0.19(sys)) × 10^{-4} cts/(keV kg yr) when shielded from below with 6 cm of tantalum in the parking position.

  9. Validation and simulation of a regulated survey system through Monte Carlo techniques

    Directory of Open Access Journals (Sweden)

    Asier Lacasta Soto

    2015-07-01

    Channel flow covers long distances and exhibits variable temporal behaviour. It is usually regulated by hydraulic elements such as lateral gates to provide a correct water supply. The dynamics of this kind of flow are governed by a system of partial differential equations named the shallow water model, which has to be complemented with a simplified formulation for the gates. The full set of equations forms a non-linear system that can only be solved numerically. Here, an explicit upwind numerical scheme in finite volumes, able to handle all flow regimes, is used. The formulation of the hydraulic structures (lateral gates) introduces parameters with some uncertainty. Hence, these parameters are calibrated with a Monte Carlo algorithm, obtaining the coefficients associated with each gate (a sketch of such a calibration follows). They are then checked using real cases provided by the monitoring equipment of the Pina de Ebro channel, located in Zaragoza.
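
    A minimal Monte Carlo calibration in the spirit described above: sample candidate discharge coefficients for a gate, score each against observed discharges through a standard orifice-type gate equation, and keep the best. The gate model, dimensions and observations are all invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical gate model: Q = Cd * b * a * sqrt(2 g dh). We calibrate Cd
    # against observed discharges by Monte Carlo sampling.
    g, b = 9.81, 2.5                       # gravity (m/s^2), gate width (m)
    openings = np.array([0.3, 0.5, 0.8])   # gate opening a (m)
    heads = np.array([1.2, 1.1, 0.9])      # head difference dh (m)
    q_obs = np.array([2.45, 3.90, 5.60])   # observed discharges (m^3/s)

    def rmse(cd):
        q_model = cd * b * openings * np.sqrt(2 * g * heads)
        return np.sqrt(((q_model - q_obs) ** 2).mean())

    # Monte Carlo search: sample candidate coefficients, keep the best.
    cands = rng.uniform(0.4, 0.9, 20_000)
    errors = np.array([rmse(c) for c in cands])
    best = cands[errors.argmin()]
    print(f"calibrated Cd = {best:.3f}, RMSE = {errors.min():.3f} m^3/s")
    ```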

  10. Monte Carlo simulation of glandular dose in a dedicated breast CT system

    Institute of Scientific and Technical Information of China (English)

    TANG Xiao; WEI Long; ZHAO Wei; WANG Yan-Fang; SHU Hang; SUN Cui-Li; WEI Cun-Feng; CAO Da-Quan; QUE Jie-Min; SHI Rong-Jian

    2012-01-01

    A dedicated breast CT system (DBCT) is a new method for breast cancer detection proposed in recent years. In this paper, the glandular dose in a DBCT is simulated using the Monte Carlo method. The phantom shape is a half ellipsoid, and a series of phantoms with different sizes, shapes and compositions was constructed. In order to optimize the spectra, monoenergetic X-ray beams of 5-80 keV were used in the simulation. The dose distribution in a breast phantom was studied: a higher-energy beam generated a more uniform distribution, and the outer parts received more dose than the inner parts. For polyenergetic spectra, four spectra filtered by Al filters of different thicknesses were simulated, and the polyenergetic glandular dose was calculated as a spectrum-weighted combination of the monoenergetic doses.
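
    The spectrum-weighted combination amounts to a dot product between the normalized spectrum and the monoenergetic dose curve, as in this sketch (both curves below are toy stand-ins, not the simulated data):

    ```python
    import numpy as np

    # Monoenergetic glandular dose coefficients D(E) and a polyenergetic
    # spectrum w(E); both arrays here are illustrative stand-ins.
    energies = np.arange(5, 81, 5)                        # keV grid, 5-80 keV
    d_mono = 1e-3 * energies ** 0.8                       # toy dose-per-fluence curve
    spectrum = np.exp(-0.5 * ((energies - 45) / 12)**2)   # toy filtered spectrum

    weights = spectrum / spectrum.sum()                   # normalize the spectrum
    d_poly = weights @ d_mono                             # spectrum-weighted dose
    print(f"polyenergetic glandular dose (arb. units): {d_poly:.4f}")
    ```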

  11. Algorithm and application of Monte Carlo simulation for multi-dispersive copolymerization system

    Institute of Scientific and Technical Information of China (English)

    凌君; 沈之荃; 陈万里

    2002-01-01

    A Monte Carlo algorithm has been established for multi-dispersive copolymerization systems, based on experimental data of copolymer molecular weight and dispersion obtained via GPC measurement. The program simulates the insertion of every monomer unit and records the structure and microscopic sequence of every chain of any length. It has been applied successfully to the ring-opening copolymerization of 2,2-dimethyltrimethylene carbonate (DTC) with δ-caprolactone (δ-CL). The simulation agrees with the experimental results and provides microscopic data on triad fractions, homopolymer segment lengths, etc., which are difficult to obtain by experiment. The algorithm also presents a uniform framework for copolymerization studies under other, more complicated mechanisms.
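
    A chain-growth Monte Carlo of this kind can be sketched with the terminal model: the probability that a chain ending in monomer i adds monomer i again follows from the reactivity ratios and the feed composition, and triad fractions are read off the generated sequences. The reactivity ratios and feed below are invented, and the terminal model is an assumption; the paper's mechanism may differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    def grow_chain(r1, r2, f1, length=500):
        """Terminal-model Monte Carlo growth of one copolymer chain.
        r1, r2: reactivity ratios; f1: mole fraction of monomer 1 in the feed."""
        f2 = 1.0 - f1
        # Probability that a chain ending in monomer i adds monomer i again.
        p11 = r1 * f1 / (r1 * f1 + f2)
        p22 = r2 * f2 / (r2 * f2 + f1)
        chain = [1 if rng.random() < f1 else 2]
        for _ in range(length - 1):
            stay = p11 if chain[-1] == 1 else p22
            chain.append(chain[-1] if rng.random() < stay else 3 - chain[-1])
        return np.array(chain)

    # Triad fractions, e.g. the fraction of 1-1-1 sequences along the chain.
    chain = grow_chain(r1=0.8, r2=1.3, f1=0.5)
    triads = np.lib.stride_tricks.sliding_window_view(chain, 3)
    print("fraction of 111 triads:", np.mean((triads == 1).all(axis=1)))
    ```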

  12. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio

    In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperation of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent rework caused by insufficient task-cooperation performance, it is necessary to verify task cooperation patterns at an early software development stage. However, this is very difficult at an early stage, when the task program codes are not yet complete. Therefore, we propose a verification method using skeleton task program codes and a real-time kernel with a function for recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronic control system.

  13. Monte Carlo Techniques for the Comprehensive Modeling of Isotopic Inventories in Future Nuclear Systems and Fuel Cycles. Final Report

    International Nuclear Information System (INIS)

    The development of Monte Carlo techniques for isotopic inventory analysis has been explored in order to facilitate the modeling of systems with flowing streams of material through varying neutron irradiation environments. This represents a novel application of Monte Carlo methods to a field that has traditionally relied on deterministic solutions to systems of first-order differential equations. The Monte Carlo techniques were based largely on the known modeling techniques of Monte Carlo radiation transport, but with important differences, particularly in the area of variance reduction and efficiency measurement. The software that was developed to implement and test these methods now provides a basis for validating approximate modeling techniques that are available to deterministic methodologies. The Monte Carlo methods have been shown to be effective in reproducing the solutions of simple problems that are possible using both stochastic and deterministic methods. The Monte Carlo methods are also effective for tracking flows of materials through complex systems including the ability to model removal of individual elements or isotopes in the system. Computational performance is best for flows that have characteristic times that are large fractions of the system lifetime. As the characteristic times become short, leading to thousands or millions of passes through the system, the computational performance drops significantly. Further research is underway to determine modeling techniques to improve performance within this range of problems. This report describes the technical development of Monte Carlo techniques for isotopic inventory analysis. The primary motivation for this solution methodology is the ability to model systems of flowing material being exposed to varying and stochastically varying radiation environments. The methodology was developed in three stages: analog methods which model each atom with true reaction probabilities (Section 2), non-analog methods

  14. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefit and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.

  15. Spacecraft-level verification of the Van Allen Probes' RF communication system

    Science.gov (United States)

    Crowne, M. J.; Srinivasan, D.; Royster, D.; Weaver, G.; Matlin, D.; Mosavi, N.

    This paper presents the verification process, lessons learned, and selected test results of the radio frequency (RF) communication system of the Van Allen Probes, formerly known as the Radiation Belt Storm Probes (RBSP). The Van Allen Probes mission is investigating the doughnut-shaped regions of space known as the Van Allen radiation belts, where the Sun interacts with charged particles trapped in Earth's magnetic field. Understanding this dynamic area that surrounds our planet is important to improving our ability to design spacecraft and missions for reliability and astronaut safety. The Van Allen Probes mission features two nearly identical spacecraft designed, built, and operated by the Johns Hopkins University Applied Physics Laboratory (JHU/APL) for the National Aeronautics and Space Administration (NASA). The RF communication system features the JHU/APL Frontier Radio. The Frontier Radio is a software-defined radio (SDR) designed for spaceborne communications, navigation, radio science, and sensor applications. This mission marks the first spaceflight use of the Frontier Radio. RF ground support equipment (RF GSE) was developed using a ground station receiver similar to what will be used in flight; its capabilities provided insight into RF system performance that previously was not available until compatibility testing with the ground segments. The Van Allen Probes underwent EMC, acoustic, vibration, and thermal vacuum testing at the environmental test facilities at APL. During this time the RF communication system was rigorously tested to ensure optimal performance, including system-level testing down to threshold power levels. Compatibility tests were performed with the JHU/APL Satellite Communication Facility (SCF), the Universal Space Network (USN), and the Tracking and Data Relay Satellite System (TDRSS). Successful completion of this program as described in this paper validated the design of the system and demonstrated that it will be able to me

  16. An Effective Verification and Validation Strategy for Safety-Critical Embedded Systems

    Directory of Open Access Journals (Sweden)

    Manju Nanda

    2013-04-01

    Full Text Available This paper presents the best practices to carry out the verification and validation (V&V) of a safety-critical embedded system that is part of a larger system-of-systems. The paper discusses the effectiveness of this strategy with respect to the performance and schedule requirements of a project. The best practices employed for the V&V are a modification of the conventional V&V approach. The proposed approach is iterative and introduces new testing methodologies apart from the conventional ones, an effective way of implementing the phases of the V&V, and a means of analyzing the V&V results. The new testing methodologies include random and non-real-time testing in addition to the conventional static and dynamic tests. The process phases are logically carried out in parallel, and credit for the results of the different phases is taken to ensure that the embedded system that goes for field testing is bug free. The paper also demonstrates the iterative qualities of the process, where the iterations successively find faults in the embedded system while executing the process within a stipulated time frame, thus maintaining the required reliability of the system. This approach is implemented in the most critical of applications, aerospace, where the safety of the system cannot be compromised. The approach used a fixed number of iterations, set to 4 in this application, with each iteration adding to the reliability and safety of the embedded system. Data collected and results observed are compared with a conventional approach for the same application, and it is demonstrated that the proposed strategy reduces the time taken by 50% compared to a conventional process attaining the same reliability in the stipulated time.

  17. Quality control of the treatment planning systems dose calculations in external radiation therapy using the Penelope Monte Carlo code; Controle qualite des systemes de planification dosimetrique des traitements en radiotherapie externe au moyen du code Monte-Carlo Penelope

    Energy Technology Data Exchange (ETDEWEB)

    Blazy-Aubignac, L

    2007-09-15

    Treatment planning systems (T.P.S.) occupy a key position in a radiotherapy department: they compute the projected dose distribution and the treatment duration. Traditionally, quality control of the calculated dose distributions relies on comparisons with dose distributions measured on the treatment machine. This thesis proposes to replace these dosimetric measurements with reference dose calculations obtained with the PENELOPE Monte Carlo code. Monte Carlo simulations offer a broad choice of test configurations and make it possible to perform quality control of the dosimetric aspects of a T.P.S. without monopolizing the treatment machines. This quality control, based on Monte Carlo simulations, has been tested on a clinical T.P.S. and has made it possible to simplify the T.P.S. quality procedures. This more thorough, more precise, and simpler-to-implement quality control could be generalized to every radiotherapy centre. (N.C.)

  18. System and Component Software Specification, Run-time Verification and Automatic Test Generation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: PAINT OVERSPRAY ARRESTOR, ATI OSM 200 SYSTEM

    Science.gov (United States)

    Paint overspray arrestors (POAs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the particle filtration efficiency as a function of size for particles smaller than...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ECR TECHNOLOGIES, INC., EARTHLINKED GROUND-SOURCE HEAT PUMP WATER HEATING SYSTEM

    Science.gov (United States)

    EPA has created the Environmental Technology Verification program to provide high quality, peer reviewed data on technology performance. This data is expected to accelerate the acceptance and use of improved environmental protection technologies. The Greenhouse Gas Technology C...

  1. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and safety is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described by the extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is verified by inputting the HP to KeYmaera. The advantage of the approach is that it models the CPS intuitively and verifies its safety rigorously while avoiding state-space explosion.

  2. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    International Nuclear Information System (INIS)

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.

  3. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    Science.gov (United States)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, with the configurable parameters fixed at clinical standards and homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different modelling of the lateral spread and were found to be related to the field-to-spot-size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  4. A Comparison of Advanced Monte Carlo Methods for Open Systems: CFCMC vs CBMC

    NARCIS (Netherlands)

    A. Torres-Knoop; S.P. Balaji; T.J.H. Vlugt; D. Dubbeldam

    2014-01-01

    Two state-of-the-art simulation methods for computing adsorption properties in porous materials like zeolites and metal-organic frameworks are compared: the configurational bias Monte Carlo (CBMC) method and the recently proposed continuous fractional component Monte Carlo (CFCMC) method. We show th

  5. An automatic dose verification system for adaptive radiotherapy for helical tomotherapy

    Science.gov (United States)

    Mo, Xiaohu; Chen, Mingli; Parnell, Donald; Olivera, Gustavo; Galmarini, Daniel; Lu, Weiguo

    2014-03-01

    verification system that quantifies treatment doses, and provides necessary information for adaptive planning without impeding clinical workflows.

  6. Dosimetric verification of IMAT delivery with a conventional EPID system and a commercial portal dose image prediction tool

    International Nuclear Information System (INIS)

    Purpose: The electronic portal imaging device (EPID) is a system for checking the patient setup; as a result of its integration with the linear accelerator and software customized for dosimetry, it is increasingly used for verification of the delivery of fixed-field intensity-modulated radiation therapy (IMRT). In order to extend such an approach to intensity-modulated arc therapy (IMAT), the combined use of an EPID system and a portal dose image prediction (PDIP) tool has been investigated. Methods: The dosimetric behavior of an EPID system, mechanically reinforced to maintain its positional stability during the accelerator gantry rotation, has been studied to assess its ability to measure portal dose distributions for IMAT treatment beams. In addition, the PDIP tool of a commercial treatment planning system, commonly used for static IMRT dosimetry, has been validated for simulating the PDIs of IMAT treatment fields. The method has been applied to the delivery verification of 23 treatment fields that were measured in their dual mode of IMRT and IMAT modalities. Results: The EPID system has proved to be appropriate for measuring the PDIs of IMAT fields; additionally the PDIP tool was able to simulate these accurately. The results are quite similar to those obtained for static IMRT treatment verification, although it was necessary to investigate the dependence of the EPID signal and of the accelerator monitor chamber response on variable dose rate. Conclusions: Our initial tests indicate that the EPID system, together with the PDIP tool, is a suitable device for the verification of IMAT plan delivery; however, additional tests are necessary to confirm these results.

  7. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    Science.gov (United States)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  8. Verification of Internal Dose Calculations.

    Science.gov (United States)

    Aissi, Abdelmadjid

    The MIRD internal dose calculations have been in use for more than 15 years, but their accuracy has always been questionable. There have been attempts to verify these calculations; however, these attempts had various shortcomings which kept the question of verification of the MIRD data still unanswered. The purpose of this research was to develop techniques and methods to verify the MIRD calculations in a more systematic and scientific manner. The research consisted of improving a volumetric dosimeter, developing molding techniques, and adapting the Monte Carlo computer code ALGAM to the experimental conditions and vice versa. The organic dosimetric system contained TLD-100 powder and could be shaped to represent human organs. The dosimeter possessed excellent characteristics for the measurement of internal absorbed doses, even in the case of the lungs. The molding techniques are inexpensive and were used in the fabrication of dosimetric and radioactive source organs. The adaptation of the computer program provided useful theoretical data with which the experimental measurements were compared. The experimental data and the theoretical calculations were compared for 6 source organ-7 target organ configurations. The results of the comparison indicated the existence of an agreement between measured and calculated absorbed doses, when taking into consideration the average uncertainty (16%) of the measurements, and the average coefficient of variation (10%) of the Monte Carlo calculations. However, analysis of the data gave also an indication that the Monte Carlo method might overestimate the internal absorbed doses. Even if the overestimate exists, at least it could be said that the use of the MIRD method in internal dosimetry was shown to lead to no unnecessary exposure to radiation that could be caused by underestimating the absorbed dose. The experimental and the theoretical data were also used to test the validity of the Reciprocity Theorem for heterogeneous

  9. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    International Nuclear Information System (INIS)

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the ''value added'' benefit of direct public involvement; (3) to evaluate the ''value added'' benefit of direct worker involvement; (4) to evaluate the ''value added'' benefit of the panel-to-panel review approach; and, (5) to evaluate the utility of the review's methodology/adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved the conduct of two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five ''reviewers'' who directed open-ended questions to the panels which focused on: (1) evidence of management commitment, accountability, and involvement; and, (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of a panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation, and, (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the institutionalization of worker involvement; (5) encouragement of self-reporting and self

  10. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    Energy Technology Data Exchange (ETDEWEB)

    BRIGGS, C.R.

    2000-02-09

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the ''value added'' benefit of direct public involvement; (3) to evaluate the ''value added'' benefit of direct worker involvement; (4) to evaluate the ''value added'' benefit of the panel-to-panel review approach; and, (5) to evaluate the utility of the review's methodology/adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved the conduct of two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five ''reviewers'' who directed open-ended questions to the panels which focused on: (1) evidence of management commitment, accountability, and involvement; and, (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of a panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation, and, (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the

  11. SEMI-BLIND CHANNEL ESTIMATION OF MULTIPLE-INPUT/MULTIPLE-OUTPUT SYSTEMS BASED ON MARKOV CHAIN MONTE CARLO METHODS

    Institute of Scientific and Technical Information of China (English)

    Jiang Wei; Xiang Haige

    2004-01-01

    This paper addresses the issues of channel estimation in a Multiple-Input/Multiple-Output (MIMO) system. A Markov Chain Monte Carlo (MCMC) method is employed to jointly estimate the Channel State Information (CSI) and the transmitted signals. The derived algorithms work well even under low Signal-to-Noise Ratio (SNR) conditions. Simulation results are presented to demonstrate their effectiveness.
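
    A minimal sketch of the idea, stripped down from MIMO to a single complex channel tap with BPSK signalling: a Gibbs sampler alternates between drawing the channel given the current symbols and the symbols given the current channel, with a few pilot symbols anchoring the phase (the semi-blind element). All model parameters are illustrative assumptions, not those of the paper.

```python
import numpy as np

# Gibbs-sampling sketch of semi-blind joint channel/symbol estimation,
# reduced to one complex channel tap and BPSK (the paper treats full MIMO).
rng = np.random.default_rng(0)
N, N_PILOT, SIGMA2 = 100, 4, 0.5          # frame length, pilots, noise variance
h_true = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)
s_true = rng.choice([-1.0, 1.0], size=N)
s_true[:N_PILOT] = 1.0                     # known pilot symbols
noise = np.sqrt(SIGMA2 / 2) * (rng.normal(size=N) + 1j * rng.normal(size=N))
y = h_true * s_true + noise

s = s_true.copy()
s[N_PILOT:] = rng.choice([-1.0, 1.0], size=N - N_PILOT)   # random data init
h_draws = []
for sweep in range(200):
    # 1) h | s, y : Gaussian posterior, assuming a CN(0, 1) prior on h.
    prec = N / SIGMA2 + 1.0
    mean = (np.sum(np.conj(s) * y) / SIGMA2) / prec
    h = mean + np.sqrt(1.0 / (2.0 * prec)) * (rng.normal() + 1j * rng.normal())
    # 2) s_t | h, y for each data symbol: a two-point conditional distribution.
    for t in range(N_PILOT, N):
        logp = [-np.abs(y[t] - h * b) ** 2 / SIGMA2 for b in (-1.0, 1.0)]
        p_plus = 1.0 / (1.0 + np.exp(logp[0] - logp[1]))
        s[t] = 1.0 if rng.random() < p_plus else -1.0
    if sweep >= 100:                       # discard burn-in sweeps
        h_draws.append(h)

print("true h:", h_true, " posterior mean:", np.mean(h_draws))
```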

  12. SAN CARLOS APACHE PAPERS.

    Science.gov (United States)

    ROESSEL, ROBERT A., JR.

    THE FIRST SECTION OF THIS BOOK COVERS THE HISTORICAL AND CULTURAL BACKGROUND OF THE SAN CARLOS APACHE INDIANS, AS WELL AS AN HISTORICAL SKETCH OF THE DEVELOPMENT OF THEIR FORMAL EDUCATIONAL SYSTEM. THE SECOND SECTION IS DEVOTED TO THE PROBLEMS OF TEACHERS OF THE INDIAN CHILDREN IN GLOBE AND SAN CARLOS, ARIZONA. IT IS DIVIDED INTO THREE PARTS--(1)…

  13. Production of transgenic strawberries by temporary immersion bioreactor system and verification by TAIL-PCR

    Directory of Open Access Journals (Sweden)

    Kärenlampi Sirpa O

    2007-02-01

    Full Text Available Abstract Background Strawberry (Fragaria × ananassa) is an economically important soft fruit crop with a polyploid genome which complicates the breeding of new cultivars. For certain traits, genetic engineering offers a potential alternative to traditional breeding. However, many strawberry varieties are quite recalcitrant to Agrobacterium-mediated transformation, and a method allowing easy handling of large amounts of starting material is needed. Also the genotyping of putative transformants is challenging, since the isolation of DNA for Southern analysis is difficult due to the high amount of phenolic compounds and polysaccharides that complicate efficient extraction of digestible DNA. There is thus a need for a screening method that is sensitive and unambiguous in identifying the different transformation events. Results Hygromycin-resistant strawberries were developed in temporary immersion bioreactors by Agrobacterium-mediated gene transfer. Putative transformants were screened by TAIL-PCR to verify T-DNA integration and to distinguish between the individual transformation events. Several different types of border sequence arrangements were detected. Conclusion This study demonstrates that the temporary immersion bioreactor system is well suited, as a labour-efficient technique, to the regeneration of transgenic strawberry plants. The small amount of DNA required by TAIL-PCR is easily recovered even from a small transformant, which allows rapid verification of T-DNA integration and detection of separate gene transfer events. These techniques combined clearly facilitate the generation of transgenic strawberries but should be applicable to other plants as well.

  14. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity for the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which can delay the development schedule. To reduce the cost and the human resources required, and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).
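
    The tool described above was built in VBA on top of Excel; the sketch below expresses the same non-regression idea in Python under assumed file-naming and CSV-format conventions: run the updated code on a fixed input set and compare every numeric output against a stored baseline within a tolerance.

```python
import csv
import pathlib

# Non-regression sketch: compare new outputs against stored baselines.
# The directory layout, file format, and "value" column are illustrative
# assumptions; the program described in the record was implemented in VBA.

BASELINE_DIR = pathlib.Path("baseline")   # outputs from the reference code version
NEW_DIR = pathlib.Path("new")             # outputs from the updated code version
REL_TOL = 1e-6                            # allowed relative deviation

def load_column(path, column="value"):
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

failures = 0
for base_file in sorted(BASELINE_DIR.glob("*.csv")):
    ref = load_column(base_file)
    new = load_column(NEW_DIR / base_file.name)
    for i, (r, n) in enumerate(zip(ref, new)):
        scale = max(abs(r), 1e-30)        # guard against division by zero
        if abs(n - r) / scale > REL_TOL:
            failures += 1
            print(f"REGRESSION {base_file.name} row {i}: {r} -> {n}")
if failures == 0:
    print("non-regression test passed: no deviations beyond tolerance")
```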

  15. Enabling the usage of UML in the verification of railway systems: The DAM-rail approach

    International Nuclear Information System (INIS)

    The need for integration of model-based verification into industrial processes has produced several attempts to define Model-Driven solutions implementing a unifying approach to system development. A recent trend is to implement tool chains supporting the developer both in the design phase and in V&V activities. In this Model-Driven context, specific domains require proper modelling approaches, especially with regard to RAM (Reliability, Availability, Maintainability) analysis and the fulfillment of international standards. This paper specifically addresses the definition of a Model-Driven approach for the evaluation of RAM attributes in railway applications in order to automatically generate formal models. To this aim we extend the MARTE-DAM UML profile with concepts related to maintenance aspects and service degradation, and show that the MARTE-DAM framework can be successfully specialized for the railway domain. Model transformations are then defined to generate Repairable Fault Tree and Bayesian Network models from MARTE-DAM specifications. The whole process is applied to the railway domain in two different availability studies.

  16. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    International Nuclear Information System (INIS)

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed. (letter)

  17. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    Science.gov (United States)

    Joseph, Shijo; Herold, Martin; Sunderlin, William D.; Verchot, Louis V.

    2013-09-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed.

  18. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  19. Improving Power System Risk Evaluation Method Using Monte Carlo Simulation and Gaussian Mixture Method

    Directory of Open Access Journals (Sweden)

    GHAREHPETIAN, G. B.

    2009-06-01

    Full Text Available The analysis of the risk of partial and total blackouts plays a crucial role in determining safe limits in power system design, operation, and upgrades. Due to the huge cost of blackouts, it is very important to improve risk assessment methods. In this paper, Monte Carlo simulation (MCS) is used to analyze the risk, and the Gaussian Mixture Method (GMM) is used to estimate the probability density function (PDF) of the load curtailment, in order to improve the power system risk assessment method. In this improved method, the PDF and a suggested index are used to analyze the risk of loss of load. The effect of considering the number of generating units of each power plant in the risk analysis is also studied. The improved risk assessment method has been applied to the IEEE 118-bus system and the network of the Khorasan Regional Electric Company (KREC), and the PDF of the load curtailment has been determined for both systems. The effect of various network loadings, transmission unavailability, transmission capacity, and generation unavailability conditions on blackout risk has also been investigated.
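
    A minimal sketch of the two-stage idea on a toy system: Monte Carlo sampling of independent generator outages produces load-curtailment samples, and a Gaussian mixture is then fitted to estimate their PDF. The unit capacities, forced-outage rates, and load level are hypothetical, and scikit-learn's GaussianMixture stands in for whichever GMM fitting procedure the authors used.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Monte Carlo + GMM sketch for the PDF of load curtailment on a toy system.
# Unit capacities (MW), forced-outage rates, and the load are illustrative only.
rng = np.random.default_rng(1)
capacity = np.array([400, 400, 300, 200, 200, 100], dtype=float)
outage_rate = np.array([0.08, 0.08, 0.06, 0.05, 0.05, 0.04])
LOAD = 1200.0
N = 50_000

up = rng.random((N, capacity.size)) > outage_rate       # unit availability states
available = (up * capacity).sum(axis=1)
curtailment = np.maximum(LOAD - available, 0.0)         # MW of load shed

# Fit the mixture to the strictly positive curtailment samples; the
# probability mass at exactly zero is reported separately.
pos = curtailment[curtailment > 0].reshape(-1, 1)
gmm = GaussianMixture(n_components=3, random_state=0).fit(pos)

print(f"P(curtailment > 0) = {len(pos) / N:.4f}")
print("mixture means (MW):", gmm.means_.ravel().round(1))
print("mixture weights  :", gmm.weights_.round(3))
```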

  20. Fault-specific verification (FSV) - An alternative VV&T strategy for high reliability nuclear software systems

    International Nuclear Information System (INIS)

    The author puts forth an argument that digital instrumentation and control systems can be safely applied in the nuclear industry, but that this will require changes to the way software for such systems is developed and tested. He argues for a fault-specific verification procedure to be applied to software development. This plan includes: enumerating and classifying all software faults at all levels of the product over the whole development process; while collecting these data, developing and validating different methods for software verification, validation and testing, and applying them against all the detected faults; forcing all of this development toward an automated product for performing this testing; and continuing to develop, expand, test, and share these testing methods across a wide array of software products.

  1. Remaining Sites Verification Package for the 1607-F4 Sanitary Sewer System, Waste Site Reclassification Form 2004-131

    Energy Technology Data Exchange (ETDEWEB)

    L. M. Dittmer

    2007-12-03

    The 1607-F4 waste site is the former location of the sanitary sewer system that serviced the former 115-F Gas Recirculation Building. The system included a septic tank, drain field, and associated pipeline that were in use from 1944 to 1965. The 1607-F4 waste site received unknown amounts of sanitary sewage from the 115-F Gas Recirculation Building and may have contained hazardous and radioactive contamination. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  2. COMPLEXITY OF ANALYSIS & VERIFICATION PROBLEMS FOR COMMUNICATING AUTOMATA & DISCRETE DYNAMICAL SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    H. B. HUNT; D. J. ROSENKRANTZ; ET AL

    2001-03-01

    We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (i) reductions of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (ii) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (iii) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly-specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for

  3. Complexity of analysis and verification problems for communicating automata and discrete dynamical systems.

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, H. B. (Harry B.); Rosenkrantz, D. J. (Daniel J.); Barrett, C. L. (Christopher L.); Marathe, M. V. (Madhav V.); Ravi, S. S. (Sekharipuram S.)

    2001-01-01

    We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (1) reductions of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (2) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (3) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for

  4. Verification of AEGIS/SCOPE2, a next-generation in-core fuel management system

    International Nuclear Information System (INIS)

    AEGIS/SCOPE2 is a next-generation code system for in-core fuel management of PWRs; AEGIS is a 2-D lattice code which treats heterogeneous geometry based on the MOC, while SCOPE2 is a highly efficient parallel code which performs multi-group nodal-transport calculations in 3-D pin-by-pin geometry. Cross sections for SCOPE2 calculations are provided by AEGIS. In this paper, preliminary results on the numerical performance of the AEGIS/SCOPE2 system are presented. In assembly calculations, predictions by AEGIS were compared with reference results from MVP, a continuous-energy Monte Carlo code, for k∞ and fission rate distributions within an assembly. Good agreement between the codes was observed. A preliminary result of a burnup calculation is also presented, with comparisons of k∞ between AEGIS and MVP-burn, a burnup code coupled with MVP. AEGIS predicted k∞ within ±0.2 %Δk/k of the reference throughout burnup up to 60 GWd/t. An initial core of a commercial PWR at HZP was analyzed with AEGIS/SCOPE2 using nuclear data libraries including ENDF/B-VI rev. 8, ENDF/B-VII beta 0 and JENDL-3.3. In this preliminary study, the criticality was slightly underestimated; however, the assembly-wise power distribution was predicted with good accuracy. (authors)
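
    For reference, a %Δk/k figure of this kind is conventionally the reactivity difference between the two multiplication factors; a common definition (an assumption here, since the paper's exact convention is not quoted in the abstract) is:

```latex
% Reactivity difference between two multiplication factors, in %\Delta k/k.
\Delta\rho \;=\; \frac{k_{\mathrm{AEGIS}} - k_{\mathrm{MVP}}}
                      {k_{\mathrm{AEGIS}}\, k_{\mathrm{MVP}}} \times 100
\quad \left[\,\%\Delta k/k\,\right]
```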

  5. HEXANN-EVALU - a Monte Carlo program system for pressure vessel neutron irradiation calculation

    International Nuclear Information System (INIS)

    The Monte Carlo program HEXANN and the evaluation program EVALU are intended to calculate Monte Carlo estimates of reaction rates and currents in segments of concentric angular regions around a hexagonal reactor-core region. The report describes the theoretical basis, structure and operation of the programs. Input data preparation guides and a sample problem are also included. Theoretical considerations as well as numerical experimental results suggest to the user a nearly optimal way of making use of the Monte Carlo efficiency-increasing options included in the program.

  6. Monte Carlo simulations of morphological transitions in PbTe/CdTe immiscible material systems

    Science.gov (United States)

    Mińkowski, Marcin; Załuska-Kotur, Magdalena A.; Turski, Łukasz A.; Karczewski, Grzegorz

    2016-09-01

    The crystal growth of the immiscible PbTe/CdTe multilayer system is analyzed as an example of a self-organizing process. The immiscibility of the constituents leads to the observed morphological transformations, such as the anisotropy-driven formation of quantum dots and nanowires, and to phase separation at the highest temperatures. The proposed model incorporates bulk and surface diffusion together with anisotropic mobility of the material components. We analyze its properties by kinetic Monte Carlo simulations and show that it is able to reproduce all of the structures observed experimentally during PbTe/CdTe growth. We show that all of the dynamical processes studied play an important role in the creation of zero-, one-, two-, and, finally, three-dimensional structures. The shape of the grown structures is different for relatively thick multilayers, where bulk diffusion cooperates with the anisotropic mobility, than for annealed structures, for which only isotropic bulk diffusion governs the process; it is different again for thin multilayers, where surface diffusion is the most decisive factor. We compare our results with the experimentally grown systems and show that the proposed model explains the diversity of the observed structures.
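
    The kinetic Monte Carlo machinery behind such simulations can be sketched compactly: build a catalogue of possible moves with their rates, select one move with probability proportional to its rate, apply it, and advance the clock by an exponential waiting time. The sketch below does this for two species swapping on a 1-D ring, with a species-dependent attempt frequency as a crude anisotropy proxy; the lattice, energies, and rates are illustrative assumptions, not the paper's 3-D model.

```python
import math
import random

# Minimal kinetic Monte Carlo sketch: two species exchanging on a 1-D ring.
# All parameters are illustrative; the paper's model is a full 3-D multilayer.
L, STEPS, KT = 100, 20_000, 0.05
RATE0 = {"A": 1.0, "B": 0.2}            # species-dependent attempt frequencies
BOND = 0.1                              # like-neighbour bond energy (immiscibility)

lattice = ["A"] * (L // 2) + ["B"] * (L - L // 2)
random.shuffle(lattice)
t = 0.0

def swap_rate(i, j):
    """Rate for the atom at site i to exchange with an unlike neighbour at j."""
    if lattice[i] == lattice[j]:
        return 0.0                      # exchanging identical atoms does nothing
    # Barrier grows with the number of like neighbours the moving atom leaves.
    like = sum(lattice[(i + d) % L] == lattice[i] for d in (-1, 1))
    return RATE0[lattice[i]] * math.exp(-BOND * like / KT)

for _ in range(STEPS):
    events = [(i, (i + 1) % L) for i in range(L)]
    rates = [swap_rate(i, j) for i, j in events]
    total = sum(rates)
    if total == 0.0:
        break                           # guard against a frozen configuration
    # BKL selection: pick an event with probability proportional to its rate.
    r, acc = random.random() * total, 0.0
    for (i, j), rate in zip(events, rates):
        acc += rate
        if acc >= r:
            lattice[i], lattice[j] = lattice[j], lattice[i]
            break
    t += -math.log(random.random()) / total   # exponential time increment

# Like-neighbour fraction as a crude measure of phase separation.
segregation = sum(lattice[i] == lattice[(i + 1) % L] for i in range(L)) / L
print(f"simulated time {t:.2f}, like-neighbour fraction {segregation:.2f}")
```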

  7. System for verification in situ of current transformers in high voltage substations; Sistema para verificacao in situ de transformadores de corrente em substacoes de alta tensao

    Energy Technology Data Exchange (ETDEWEB)

    Mendonca, Pedro Henrique; Costa, Marcelo M. da; Dahlke, Diogo B.; Ikeda, Minoru [LACTEC - Instituto de Tecnologia para o Desenvolvimento, Curitiba, PR (Brazil)], Emails: pedro.henrique@lactec.org.br, arinos@lactec.org.br, diogo@lactec.org.br, minoru@lactec.org.br, Celso.melo@copel.com; Carvalho, Joao Claudio D. de [ELETRONORTE, Belem, PR (Brazil)], E-mail: marcelo.melo@eln.gov.br; Teixeira Junior, Jose Arinos [ELETROSUL, Florianopolis, SC (Brazil)], E-mail: jclaudio@eletrosul.gov.br; Melo, Celso F. [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)], E-mail: Celso.melo@copel.com

    2009-07-01

    This work presents an alternative approach for performing the calibration of conventional current transformers in the field, using a verification system composed of an optical current transformer as the reference standard, suitable for installation on extra-high-voltage busbars.

  8. Clinical commissioning of an in vivo range verification system for prostate cancer treatment with anterior and anterior oblique proton beams.

    Science.gov (United States)

    Hoesl, M; Deepak, S; Moteabbed, M; Jassens, G; Orban, J; Park, Y K; Parodi, K; Bentefour, E H; Lu, H M

    2016-04-21

    The purpose of this work is the clinical commissioning of a recently developed in vivo range verification system (IRVS) for the treatment of prostate cancer with anterior and anterior oblique proton beams. The IRVS is designed to perform a complete workflow for pre-treatment range verification and adjustment. It contains specifically designed dosimetry and electronic hardware and specific software for workflow control, with database connections to the treatment and imaging systems. An essential part of the IRVS is an array of Si-diode detectors, designed to be mounted on the endorectal water balloon routinely used for prostate immobilization. The diodes measure dose rate as a function of time, from which the water-equivalent path length (WEPL) and the dose received are extracted. The former is used for pre-treatment beam range verification and correction, if necessary, while the latter monitors the dose delivered to the patient's rectum during treatment and serves as an additional verification. The entire IRVS workflow was tested for anterior and 30-degree-inclined proton beams in both solid-water and anthropomorphic pelvic phantoms, with the measured WEPL and rectal doses compared to the treatment plan. Gafchromic films were also used to measure the rectal dose and compared to the IRVS results. The WEPL measurement accuracy was on the order of 1 mm, and after beam range correction the doses received by the rectal wall were within 1.6% and 0.4% of treatment planning, respectively, for the anterior and anterior oblique fields. We believe the implementation of the IRVS will make the treatment of the prostate with anterior proton beams more accurate and reliable. PMID:27002470

  9. COG10, Multiparticle Monte Carlo Code System for Shielding and Criticality Use

    International Nuclear Information System (INIS)

    1 - Description of program or function: COG is a modern, full-featured Monte Carlo radiation transport code which provides accurate answers to complex shielding, criticality, and activation problems. COG was written to be state-of-the-art and free of physics approximations and compromises found in earlier codes. COG is fully 3-D, uses point-wise cross sections and exact angular scattering, and allows a full range of biasing options to speed up solutions for deep penetration problems. Additionally, a criticality option is available for computing Keff for assemblies of fissile materials. ENDL or ENDFB cross section libraries may be used. COG home page: http://www-phys.llnl.gov/N_Div/COG/. Cross section libraries are included in the package. COG can use either the LLNL ENDL-90 cross section set or the ENDFB/VI set. Analytic surfaces are used to describe geometric boundaries. Parts (volumes) are described by a method of Constructive Solid Geometry. Surface types include surfaces of up to fourth order, and pseudo-surfaces such as boxes, finite cylinders, and figures of revolution. Repeated assemblies need be defined only once. Parts are visualized in cross-section and perspective picture views. Source and random-walk biasing techniques may be selected to improve solution statistics. These include source angular biasing, importance weighting, particle splitting and Russian roulette, path-length stretching, point detectors, scattered direction biasing, and forced collisions. Criticality - For a fissioning system, COG will compute Keff by transporting batches of neutrons through the system. Activation - COG can compute gamma-ray doses due to neutron-activated materials, starting with just a neutron source. Coupled Problems - COG can solve coupled problems involving neutrons, photons, and electrons. 2 - Methods: COG uses Monte Carlo methods to solve the Boltzmann transport equation for particles traveling through arbitrary 3-dimensional geometries. Neutrons, photons
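
    The criticality option's Keff computation amounts to power iteration over fission generations: transport a batch of source neutrons, bank the fission sites they produce, and take the generation-to-generation ratio as the Keff estimate. A toy one-group, bare-slab sketch of the idea follows; the cross sections and slab width are hypothetical values, not COG data.

```python
import math
import random

# Toy one-group power iteration for k-effective in a bare 1-D slab.
# Macroscopic cross sections (1/cm) and slab width are hypothetical values.
SIGMA_T, SIGMA_A, SIGMA_F, NU = 1.0, 0.4, 0.18, 2.5
WIDTH, BATCH, GENERATIONS, SKIP = 12.0, 5000, 60, 10

sites = [random.uniform(0.0, WIDTH) for _ in range(BATCH)]
k_history = []
for _ in range(GENERATIONS):
    fission_sites = []
    for x in sites:
        while True:
            mu = random.uniform(-1.0, 1.0)              # isotropic direction cosine
            x += mu * (-math.log(random.random()) / SIGMA_T)
            if x < 0.0 or x > WIDTH:
                break                                   # leaked out of the slab
            if random.random() < SIGMA_A / SIGMA_T:     # collision is an absorption
                if random.random() < SIGMA_F / SIGMA_A: # ... and a fission
                    n = int(NU) + (random.random() < NU - int(NU))
                    fission_sites.extend([x] * n)
                break
            # otherwise the neutron scatters; the loop resamples a new direction
    k_history.append(len(fission_sites) / len(sites))
    # Renormalize the fission bank to a constant batch size for the next generation.
    if fission_sites:
        sites = [random.choice(fission_sites) for _ in range(BATCH)]

tail = k_history[SKIP:]                                 # discard settling generations
mean = sum(tail) / len(tail)
sem = math.sqrt(sum((k - mean) ** 2 for k in tail) / (len(tail) - 1) / len(tail))
print(f"k-eff ~ {mean:.4f} +/- {sem:.4f}")
```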

  10. Verification of an operational ocean circulation-surface wave coupled forecasting system for the China's seas

    Institute of Scientific and Technical Information of China (English)

    WANG Guansuo; ZHAO Chang; XU Jiangling; QIAO Fangli; XIA Changshui

    2016-01-01

    An operational ocean circulation-surface wave coupled forecasting system for the seas off China and adjacent areas (OCFS-C) is developed based on parallelized circulation and wave models. It has been in operation since November 1, 2007. In this paper we comprehensively present the simulation and verification of the system, whose distinguishing feature is that wave-induced mixing is coupled into the circulation model. In particular, with a nesting technique the resolution in the China's seas has been refined to (1/24)° from the (1/2)° resolution of the global model. In addition, daily remote-sensing sea surface temperature (SST) data have been assimilated into the model to generate a hot-restart field for OCFS-C. Inter-comparisons between forecasts and independent observational data are performed to evaluate the effectiveness of OCFS-C in predicting upper-ocean quantities, including SST, mixed layer depth (MLD) and subsurface temperature. In addition to conventional statistical metrics, a non-dimensional skill score (SS) is used to evaluate forecast skill. Observations from buoys and Argo profiles are used for lead-time and real-time validations, which give a large SS value (more than 0.90). Prediction skill for the seasonal variation of SST is also confirmed. Comparisons of subsurface temperatures with Argo profile data indicate that OCFS-C has low skill in predicting subsurface temperatures between 100 m and 150 m. Nevertheless, inter-comparisons of MLD reveal that the MLD from the model is shallower than that from Argo profiles by about 12 m, i.e., OCFS-C is successful and steady in MLD predictions. Validation of 1-d, 2-d and 3-d forecast SST shows that our operational ocean circulation-surface wave coupled forecasting model has reasonable accuracy in the upper ocean.
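
    Non-dimensional skill scores of this kind are commonly the mean-square-error skill relative to a reference forecast such as climatology or persistence; since the paper's exact definition is not quoted in the abstract, the Murphy-type formula below is an assumption, with purely illustrative numbers.

```python
import numpy as np

# Murphy-type MSE skill score: SS = 1 - MSE(forecast) / MSE(reference).
# SS -> 1 for a perfect forecast and 0 for no skill over the reference.
def skill_score(forecast, observed, reference):
    forecast, observed, reference = map(np.asarray, (forecast, observed, reference))
    mse_f = np.mean((forecast - observed) ** 2)
    mse_r = np.mean((reference - observed) ** 2)
    return 1.0 - mse_f / mse_r

# Illustrative numbers only: model SST vs. buoy SST, climatology as reference.
obs = np.array([26.1, 25.8, 25.4, 24.9, 24.3])
model = np.array([26.0, 25.9, 25.2, 25.0, 24.5])
clim = np.full_like(obs, 25.3)
print(f"SS = {skill_score(model, obs, clim):.2f}")
```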

  11. Conceptual detector development and Monte Carlo simulation of a novel 3D breast computed tomography system

    Science.gov (United States)

    Ziegle, Jens; Müller, Bernhard H.; Neumann, Bernd; Hoeschen, Christoph

    2016-03-01

    A new 3D breast computed tomography (CT) system is under development enabling imaging of microcalcifications in a fully uncompressed breast, including posterior chest-wall tissue. The system setup uses a steered electron beam impinging on small tungsten targets surrounding the breast to emit X-rays. A realization of the corresponding detector concept is presented in this work, and it is modeled through Monte Carlo simulations in order to quantify first characteristics of transmission and secondary photons. The modeled system comprises a vertical arrangement of linear detectors held by a case that also hosts the breast. Detectors are separated by gaps to allow the passage of X-rays towards the breast volume. The detectors located directly on the opposite side of the gaps detect incident X-rays. Mechanically moving parts in an imaging system increase the duration of image acquisition and thus can cause motion artifacts. A major advantage of the presented system design is therefore the combination of fixed detectors and a fast steering electron beam, which enables a greatly reduced scan time. Potential motion artifacts are thereby reduced, so that the visualization of small structures such as microcalcifications is improved. The simulation of a single projection shows high attenuation by parts of the detector electronics, causing low count levels at the opposing detectors that would require a flat-field correction, but it also shows a secondary-to-transmission ratio of all counted X-rays of less than 1 percent. Additionally, a single slice with details of various sizes was reconstructed using filtered backprojection. The smallest detail still visible in the reconstructed image has a size of 0.2 mm.
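
    Filtered backprojection of the kind used for the single-slice reconstruction can be sketched with scikit-image's Radon tools; the Shepp-Logan phantom and the angle sampling below are stand-ins for the simulated detector geometry, not the system described above.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Filtered-backprojection sketch; the phantom and angles are illustrative
# stand-ins for the simulated breast-CT projections.
image = rescale(shepp_logan_phantom(), 0.5)           # 200x200 test slice
theta = np.linspace(0.0, 180.0, 180, endpoint=False)  # one projection per degree

sinogram = radon(image, theta=theta)                  # forward projection
reconstruction = iradon(sinogram, theta=theta)        # ramp-filtered backprojection

err = np.sqrt(np.mean((reconstruction - image) ** 2))
print(f"RMS reconstruction error: {err:.4f}")
```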

  12. SU-E-T-35: An Investigation of the Accuracy of Cervical IMRT Dose Distribution Using 2D/3D Ionization Chamber Arrays System and Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Y; Yang, J; Liu, H [Cangzhou People's Hospital, Cangzhou, Hebei (China)]; Liu, D [The Fourth Hospital of Hebei Medical University, Shijiazhuang, Hebei (China)]

    2014-06-01

    Purpose: The purpose of this work is to compare the verification results of three solutions (2D and 3D ionization chamber array measurements and Monte Carlo simulation); the results will help make a clinical decision as to how to perform our cervical IMRT verification. Methods: Seven cervical cases were planned with Pinnacle 8.0m to meet the clinical acceptance criteria. The plans were recalculated in the Matrixx and Delta4 phantoms with the actual plan parameters. The plans were also recalculated by Monte Carlo using the leaf sequences and MUs of the individual plans of every patient, and of the Matrixx and Delta4 phantoms. All plans for the Matrixx and Delta4 phantoms were delivered and measured. The dose distribution of the isocentre slice, the dose profiles, and the gamma maps of every beam were used to evaluate the agreement. Dose-volume histograms were also compared. Results: The dose distribution of the isocentre slice and the dose profiles from the Pinnacle calculation were in agreement with the Monte Carlo simulation and the Matrixx and Delta4 measurements. A 95.2%/91.3% gamma pass ratio was obtained between the Matrixx/Delta4 measurements and the Pinnacle distributions within the 3 mm/3% gamma criteria. A 96.4%/95.6% gamma pass ratio was obtained between the Matrixx/Delta4 measurements and the Monte Carlo simulation within the 2 mm/2% gamma criteria, and an almost 100% gamma pass ratio within the 3 mm/3% gamma criteria. The DVH plots show slight differences between Pinnacle and the Delta4 measurement as well as between Pinnacle and the Monte Carlo simulation, but excellent agreement between the Delta4 measurement and the Monte Carlo simulation. Conclusion: It was shown that the Matrixx/Delta4 and Monte Carlo simulation can be used very efficiently to verify cervical IMRT delivery. In terms of gamma values the pass ratio of the Matrixx was a little higher; however, the Delta4 revealed more problem fields. The primary advantage of the Delta4 is that it measures true 3D dosimetry, while Monte Carlo can simulate dose in patient CT images rather than only in phantoms.
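
    The quoted pass ratios come from the standard gamma-index test: for each reference point, search nearby evaluated points for the minimum combined dose-difference/distance-to-agreement metric, and count the point as passing when that minimum is at most 1. A brute-force 2-D sketch follows; the dose planes and grid spacing are placeholder values.

```python
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd=0.03, dta_mm=3.0, threshold=0.1):
    """Brute-force global 2-D gamma: fraction of reference points with gamma <= 1.

    ref, eval_  : 2-D dose arrays on the same grid
    spacing_mm  : grid spacing in mm
    dd          : dose-difference criterion as a fraction of the max reference dose
    threshold   : ignore points below this fraction of the max reference dose
    """
    norm = dd * ref.max()
    search = int(np.ceil(dta_mm / spacing_mm)) + 1        # search window radius
    ny, nx = ref.shape
    passed = total = 0
    for iy in range(ny):
        for ix in range(nx):
            if ref[iy, ix] < threshold * ref.max():
                continue
            total += 1
            best = np.inf
            for jy in range(max(0, iy - search), min(ny, iy + search + 1)):
                for jx in range(max(0, ix - search), min(nx, ix + search + 1)):
                    dist2 = ((jy - iy) ** 2 + (jx - ix) ** 2) * spacing_mm ** 2
                    dose2 = (eval_[jy, jx] - ref[iy, ix]) ** 2
                    best = min(best, dist2 / dta_mm ** 2 + dose2 / norm ** 2)
            passed += best <= 1.0                          # gamma^2 <= 1
    return passed / total

# Placeholder dose planes: a Gaussian "field" and a slightly shifted copy.
y, x = np.mgrid[0:40, 0:40]
ref = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 120.0)
eval_ = np.exp(-((x - 20.6) ** 2 + (y - 20) ** 2) / 120.0)
print(f"gamma pass rate (3%/3 mm): {gamma_pass_rate(ref, eval_, spacing_mm=1.0):.1%}")
```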

  13. On-the-fly nuclear data processing methods for Monte Carlo simulations of fast spectrum systems

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Jon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-31

    The presentation summarizes work performed over summer 2015 related to Monte Carlo simulations. A flexible probability table interpolation scheme has been implemented and tested with results comparing favorably to the continuous phase-space on-the-fly approach.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE UV DISINFECTION OF SECONDARY EFFLUENTS, SUNTEC, INC. MODEL LPX200 DISINFECTION SYSTEM - 03/09/WQPC-SWP

    Science.gov (United States)

    Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...

  15. Real-time Performance Verification of Core Protection and Monitoring System with Integrated Model for SMART Simulator

    International Nuclear Information System (INIS)

    In keeping with these purposes, a real-time model of the digital core protection and monitoring systems for simulator implementation was developed on the basis of the SCOPS and SCOMS algorithms. In addition, important features of the software models are explained for the application to the SMART simulator, and the real-time performance of the models linked via DLL was examined for various simulation scenarios. In this paper, a real-time performance verification of the core protection and monitoring software for the SMART simulator is performed with the integrated simulator model. Various DLL connection tests were carried out for software algorithm changes. In addition, typical accident scenarios of SMART were simulated with 3KEYMASTER, and the simulated results were compared with those of the DLL-linked core protection and monitoring software. Each calculation result showed good agreement.

  16. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  17. DOE handbook: Integrated safety management systems (ISMS) verification team leader's handbook

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  18. SU-E-I-56: Scan Angle Reduction for a Limited-Angle Intrafraction Verification (LIVE) System

    International Nuclear Information System (INIS)

    Purpose: To develop a novel adaptive reconstruction strategy to further reduce the scanning angle required by the limited-angle intrafraction verification (LIVE) system for intrafraction verification. Methods: LIVE acquires limited-angle MV projections from the exit fluence of the arc treatment beam or during gantry rotation between static beams. Orthogonal limited-angle kV projections are acquired simultaneously to provide additional information. LIVE treats the on-board 4D-CBCT images as a deformation of the prior 4D-CT images and solves for the deformation field based on deformation models and a data fidelity constraint. LIVE reaches a checkpoint after each limited-angle scan and reconstructs the 4D-CBCT for intrafraction verification at that checkpoint. In the adaptive reconstruction strategy, a larger scanning angle of 30° is used for the first checkpoint, and smaller scanning angles of 15° are used for subsequent checkpoints. The on-board images reconstructed at the previous checkpoint are used as the prior images for reconstruction at the current checkpoint. Because the algorithm only needs to reconstruct the small deformation that occurred between adjacent checkpoints, projections from a smaller scan angle provide enough information for the reconstruction. XCAT was used to simulate a tumor motion baseline drift of 2 mm along the sup-inf direction at every subsequent checkpoint, with checkpoints 15° apart. The adaptive reconstruction strategy was used to reconstruct the images at each checkpoint using orthogonal 15° kV and MV projections. Results: LIVE reconstructed the tumor volumes accurately using orthogonal 15° kV-MV projections. Volume percentage differences (VPDs) were within 5% and center-of-mass shifts (COMS) were within 1 mm for reconstruction at all checkpoints. Conclusion: It is feasible to use an adaptive reconstruction strategy to further reduce the scan angle needed by LIVE to allow faster and more frequent intrafraction verification to minimize the
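
    A deliberately reduced toy of the underlying idea: if the deformation between adjacent checkpoints is modeled by a single baseline-drift parameter, then a handful of sparse "limited-angle" samples is enough to recover it from the prior image. This 1D caricature only illustrates the deformation-model-plus-data-fidelity principle, not LIVE's actual reconstruction.

```python
import numpy as np

x = np.linspace(-30.0, 30.0, 601)              # mm grid
prior = np.exp(-x**2 / (2 * 5.0**2))           # prior tumor profile

true_drift = 2.0                               # mm, as in the XCAT test
onboard = np.interp(x - true_drift, x, prior)  # profile after baseline drift

# Limited data: only a sparse, noisy subset of samples is observed.
idx = np.arange(0, x.size, 40)
rng = np.random.default_rng(1)
meas = onboard[idx] + rng.normal(0.0, 0.01, idx.size)

# Deformation model reduced to one drift parameter: scan candidates and
# keep the one with the best data fidelity.
drifts = np.linspace(-5.0, 5.0, 201)
cost = [np.sum((np.interp(x - s, x, prior)[idx] - meas) ** 2) for s in drifts]
print("estimated drift:", drifts[int(np.argmin(cost))], "mm")
```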

  19. Controlling the long-range corrections in atomistic Monte Carlo simulations of two-phase systems.

    Science.gov (United States)

    Goujon, Florent; Ghoufi, Aziz; Malfreyt, Patrice; Tildesley, Dominic J

    2015-10-13

    The calculation of the long-range corrections to the surface tension and to the configurational energy in two-phase systems remains an active area of research; the correction can amount to up to 55% of the calculated value of the surface tension for cutoffs in the range of 2.1-6.4 σ. In this work, we compare the long-range correction methods proposed by Guo and Lu ( J. Chem. Phys. 1997 , 106 , 3688 - 3695 ) and Janeček ( J. Phys. Chem. B 2006 , 110 , 6264 - 6269 ) for the calculation of the surface tension and of the coexisting densities in Monte Carlo simulations of the truncated Lennard-Jones potential and the truncated and shifted Lennard-Jones potential models. These methods require an estimate of the long-range correction at each step of the Monte Carlo simulation. We apply the full version of the Guo and Lu method, which involves the calculation of a double integral containing a series of density differences, and compare these results with the simplified version of the method that is routinely used in two-phase simulations. We conclude that the cutoff dependencies of the surface tension and coexisting densities are identical for the full versions of the Guo and Lu and Janeček methods. We show that it is possible to avoid applying the long-range correction at every step by using the truncated Lennard-Jones potential with a cutoff rc ≥ 5 σ; the long-range correction can then be applied at the end of the simulation. The limiting factor in the accurate calculation of this final correction is an accurate estimate of the coexisting densities. Link-cell simulations performed using a cutoff rc = 5.5 σ require twice as much computing time as those with a more typical cutoff of rc = 3.0 σ. The application of the Janeček correction increases the running time of the simulation by less than 10%, and it can be profitably applied with the shorter cutoff. PMID:26574249
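
    For orientation, the standard homogeneous-fluid tail corrections for a truncated Lennard-Jones potential are simple to compute; their built-in assumption of uniform density is exactly what breaks down at a liquid-vapour interface and motivates the inhomogeneous Guo-Lu and Janeček treatments compared above.

```python
import numpy as np

def lj_tail_corrections(rho, rc, epsilon=1.0, sigma=1.0):
    """Standard tail corrections (energy per particle, pressure) for a
    homogeneous truncated LJ fluid, assuming g(r) = 1 beyond rc and a
    uniform density -- the assumption that fails in two-phase systems."""
    sr3 = (sigma / rc) ** 3
    sr9 = sr3 ** 3
    u_tail = (8.0 / 3.0) * np.pi * rho * epsilon * sigma**3 * (sr9 / 3.0 - sr3)
    p_tail = (16.0 / 3.0) * np.pi * rho**2 * epsilon * sigma**3 * (2.0 * sr9 / 3.0 - sr3)
    return u_tail, p_tail

# Liquid-like density with a short cutoff: the correction is not negligible.
print(lj_tail_corrections(rho=0.8, rc=3.0))
```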

  20. Towards Improving Validation, Verification, Crash Investigations, and Event Reconstruction of Flight-Critical Systems with Self-Forensics

    CERN Document Server

    Mokhov, Serguei A

    2009-01-01

    This paper introduces a novel concept of self-forensics to complement the standard autonomic self-CHOP properties of the self-managed systems, to be specified in the Forensic Lucid language. We argue that self-forensics, with the forensics taken out of the cybercrime domain, is applicable to "self-dissection" for the purpose of verification of autonomous software and hardware systems of flight-critical systems for automated incident and anomaly analysis and event reconstruction by the engineering teams in a variety of incident scenarios during design and testing as well as actual flight data.

  1. Runtime Instrumentation of SystemC/TLM2 Interfaces for Fault Tolerance Requirements Verification in Software Cosimulation

    Directory of Open Access Journals (Sweden)

    Antonio da Silva

    2014-01-01

    This paper presents the design of a SystemC transaction-level modelling (TLM) wrapping library that can be used for the assertion of system properties, protocol compliance checking, or fault injection. The library uses C++ virtual table hooks as a dynamic binary instrumentation technique to inline wrappers in the TLM2 transaction path. This technique can be applied after the elaboration phase and needs neither source code modifications nor recompilation of the top-level SystemC modules. The proposed technique has been successfully applied to the robustness verification of the on-board boot software of the Instrument Control Unit of the Solar Orbiter's Energetic Particle Detector.
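
    The paper's library is C++/SystemC; as a loose Python analogue of the same pattern, a wrapper can be spliced into a method's call path at runtime, with no source modifications or recompilation of the wrapped class, to assert properties or inject faults.

```python
import functools

class Bus:
    def b_transport(self, payload, delay):
        return f"ok:{payload}"

def instrument(obj, name, pre=None, post=None):
    """Wrap a bound method at run time -- loosely analogous to hooking a
    vtable entry after elaboration; the target class is untouched."""
    original = getattr(obj, name)
    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        if pre:
            pre(args, kwargs)      # e.g., a protocol assertion
        result = original(*args, **kwargs)
        if post:
            post(result)           # e.g., a fault-injection point
        return result
    setattr(obj, name, wrapper)

bus = Bus()
instrument(bus, "b_transport",
           pre=lambda a, k: print("assert: payload =", a[0]),
           post=lambda r: print("observed:", r))
print(bus.b_transport("READ@0x100", 10))
```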

  2. REQUIREMENT ANALYSIS, ARCHITECTURAL DESIGN AND FORMAL VERIFICATION OF A MULTI-AGENT BASED UNIVERSITY INFORMATION MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Nadeem AKHTAR

    2014-12-01

    This paper presents an approach based on the analysis, design, and formal verification of a multi-agent based university Information Management System (IMS). The university IMS accesses information, creates reports, and facilitates teachers as well as students. An orchestrator agent manages the coordination between all agents and also manages the database connectivity for the whole system. The proposed IMS is based on the BDI agent architecture, which models the system in terms of beliefs, desires, and intentions. The correctness properties of safety and liveness are specified in first-order predicate logic.
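
    For readers unfamiliar with BDI, a minimal illustrative skeleton of the belief-desire-intention cycle follows; it is not the paper's agent design, and all names are invented.

```python
# Illustrative BDI skeleton: beliefs are facts, desires are goals, and an
# intention is the goal the agent currently commits to.
class Agent:
    def __init__(self, name):
        self.name, self.beliefs, self.desires = name, {}, []

    def perceive(self, facts):
        self.beliefs.update(facts)  # belief revision

    def deliberate(self):
        # Commit to the first desire whose precondition is believed true.
        for goal, precondition, action in self.desires:
            if self.beliefs.get(precondition):
                return goal, action
        return None, None

reporter = Agent("report_agent")
reporter.desires.append(("create_report", "db_connected",
                         lambda: print("report created")))
reporter.perceive({"db_connected": True})
goal, act = reporter.deliberate()
if act:
    act()
```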

  3. Monte Carlo studies and optimization for the calibration system of the GERDA experiment

    Science.gov (United States)

    Baudis, L.; Ferella, A. D.; Froborg, F.; Tarka, M.

    2013-11-01

    The GERmanium Detector Array, GERDA, searches for neutrinoless double β decay in ⁷⁶Ge using bare high-purity germanium detectors submerged in liquid argon. For the calibration of these detectors, γ-emitting sources have to be lowered from their parking position on the top of the cryostat over more than 5 m down to the germanium crystals. With the help of Monte Carlo simulations, the relevant parameters of the calibration system were determined. It was found that three ²²⁸Th sources with an activity of 20 kBq each at two different vertical positions will be necessary to reach sufficient statistics in all detectors in less than 4 h of calibration time. These sources will contribute to the background of the experiment with a total of (1.07 ± 0.04(stat) +0.13/−0.19(sys)) × 10⁻⁴ cts/(keV kg yr) when shielded from below with 6 cm of tantalum in the parking position.

  4. Monte Carlo studies and optimization for the calibration system of the GERDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Baudis, L. [Physics Institute, University of Zurich, Winterthurerstrasse 190, 8057 Zürich (Switzerland)]; Ferella, A.D. [Physics Institute, University of Zurich, Winterthurerstrasse 190, 8057 Zürich (Switzerland); INFN Laboratori Nazionali del Gran Sasso, 67010 Assergi (Italy)]; Froborg, F., E-mail: francis@froborg.de [Physics Institute, University of Zurich, Winterthurerstrasse 190, 8057 Zürich (Switzerland)]; Tarka, M. [Physics Institute, University of Zurich, Winterthurerstrasse 190, 8057 Zürich (Switzerland); Physics Department, University of Illinois, 1110 West Green Street, Urbana, IL 61801 (United States)]

    2013-11-21

    The GERmanium Detector Array, GERDA, searches for neutrinoless double β decay in ⁷⁶Ge using bare high-purity germanium detectors submerged in liquid argon. For the calibration of these detectors, γ-emitting sources have to be lowered from their parking position on the top of the cryostat over more than 5 m down to the germanium crystals. With the help of Monte Carlo simulations, the relevant parameters of the calibration system were determined. It was found that three ²²⁸Th sources with an activity of 20 kBq each at two different vertical positions will be necessary to reach sufficient statistics in all detectors in less than 4 h of calibration time. These sources will contribute to the background of the experiment with a total of (1.07 ± 0.04(stat) +0.13/−0.19(sys)) × 10⁻⁴ cts/(keV kg yr) when shielded from below with 6 cm of tantalum in the parking position.

  5. Dosimetric evaluation of a Monte Carlo IMRT treatment planning system incorporating the MIMiC

    Science.gov (United States)

    Rassiah-Szegedi, P.; Fuss, M.; Sheikh-Bagheri, D.; Szegedi, M.; Stathakis, S.; Lancaster, J.; Papanikolaou, N.; Salter, B.

    2007-12-01

    The high dose per fraction delivered to lung lesions in stereotactic body radiation therapy (SBRT) demands high dose calculation and delivery accuracy. The inhomogeneous density in the thoracic region along with the small fields used typically in intensity-modulated radiation therapy (IMRT) treatments poses a challenge in the accuracy of dose calculation. In this study we dosimetrically evaluated a pre-release version of a Monte Carlo planning system (PEREGRINE 1.6b, NOMOS Corp., Cranberry Township, PA), which incorporates the modeling of serial tomotherapy IMRT treatments with the binary multileaf intensity modulating collimator (MIMiC). The aim of this study is to show the validation process of PEREGRINE 1.6b since it was used as a benchmark to investigate the accuracy of doses calculated by a finite size pencil beam (FSPB) algorithm for lung lesions treated on the SBRT dose regime via serial tomotherapy in our previous study. Doses calculated by PEREGRINE were compared against measurements in homogeneous and inhomogeneous materials carried out on a Varian 600C with a 6 MV photon beam. Phantom studies simulating various sized lesions were also carried out to explain some of the large dose discrepancies seen in the dose calculations with small lesions. Doses calculated by PEREGRINE agreed to within 2% in water and up to 3% for measurements in an inhomogeneous phantom containing lung, bone and unit density tissue.

  6. Quantum Monte Carlo for large chemical systems: implementing efficient strategies for peta scale platforms and beyond

    International Nuclear Information System (INIS)

    Various strategies for efficiently implementing quantum Monte Carlo (QMC) simulations for large chemical systems are presented. These include: (i) the introduction of an efficient algorithm to calculate the computationally expensive Slater matrices. This novel scheme is based on the use of the highly localized character of atomic Gaussian basis functions (not the molecular orbitals, as usually done); (ii) the possibility of keeping the memory footprint minimal; (iii) the important enhancement of single-core performance when efficient optimization tools are used; and (iv) the definition of a universal, dynamic, fault-tolerant, and load-balanced framework adapted to all kinds of computational platforms (massively parallel machines, clusters, or distributed grids). These strategies have been implemented in the QMC=Chem code developed at Toulouse and illustrated with numerical applications on small peptides of increasing sizes (158, 434, 1056, and 1731 electrons). Using 10-80 k computing cores of the Curie machine (GENCI-TGCC-CEA, France), QMC=Chem has been shown to be capable of running at the petascale level, thus demonstrating that for this machine a large part of the peak performance can be achieved. Implementation of large-scale QMC simulations for future exascale platforms with a comparable level of efficiency is expected to be feasible. (authors)
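
    A toy sketch of the locality argument in point (i): because atomic Gaussians decay fast, each electron sees only a handful of nearby atoms, so the electron-AO matrix (from which the Slater matrices follow via the molecular-orbital coefficients) can be assembled sparsely. The geometry, exponents, and cutoff below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n_at, n_el = 200, 50
atoms = rng.uniform(0.0, 30.0, (n_at, 3))  # atom positions (a.u.)
elecs = rng.uniform(0.0, 30.0, (n_el, 3))  # electron positions
alpha, r_cut = 1.0, 6.0                    # exponent and locality cutoff

def ao_matrix_sparse():
    """Electron x AO matrix with one s-type Gaussian per atom, skipping
    atoms beyond r_cut where exp(-alpha*r^2) is negligible."""
    A = np.zeros((n_el, n_at))
    for i, re in enumerate(elecs):
        d2 = np.sum((atoms - re) ** 2, axis=1)
        near = d2 < r_cut**2               # O(1) atoms per electron
        A[i, near] = np.exp(-alpha * d2[near])
    return A

A = ao_matrix_sparse()
print("nonzero AOs per electron:", A.astype(bool).sum(axis=1).mean())
```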

  7. Quantum Monte Carlo for large chemical systems: implementing efficient strategies for petascale platforms and beyond.

    Science.gov (United States)

    Scemama, Anthony; Caffarel, Michel; Oseret, Emmanuel; Jalby, William

    2013-04-30

    Various strategies for efficiently implementing quantum Monte Carlo (QMC) simulations for large chemical systems are presented. These include: (i) the introduction of an efficient algorithm to calculate the computationally expensive Slater matrices. This novel scheme is based on the use of the highly localized character of atomic Gaussian basis functions (not the molecular orbitals, as usually done); (ii) the possibility of keeping the memory footprint minimal; (iii) the important enhancement of single-core performance when efficient optimization tools are used; and (iv) the definition of a universal, dynamic, fault-tolerant, and load-balanced framework adapted to all kinds of computational platforms (massively parallel machines, clusters, or distributed grids). These strategies have been implemented in the QMC=Chem code developed at Toulouse and illustrated with numerical applications on small peptides of increasing sizes (158, 434, 1056, and 1731 electrons). Using 10-80 k computing cores of the Curie machine (GENCI-TGCC-CEA, France), QMC=Chem has been shown to be capable of running at the petascale level, thus demonstrating that for this machine a large part of the peak performance can be achieved. Implementation of large-scale QMC simulations for future exascale platforms with a comparable level of efficiency is expected to be feasible.

  8. Distribution power system reliability assessment using Monte Carlo simulation: optimal maintenance strategy application

    International Nuclear Information System (INIS)

    Today, the electricity sector is confronted with new challenges imposed by the deregulation of the electricity market, the international desire to reduce greenhouse gas emissions, and the development of new technologies, and there is an increasing need to assess the reliability of distribution systems. We can see a migration of methods previously used at the transmission level to the distribution level. In a previous PhD thesis, a method based on sequential Monte Carlo simulation was developed. The first part of this thesis deals with the study of acceleration methods. Two methods were tested, antithetic variates and stratification; the best acceleration was achieved by a combination of these two methods. We then discuss the feasibility of an optimization method based on reliability criteria, the chosen application being the optimization of preventive maintenance strategies. We looked for the optimal number of preventive maintenance actions and the maximum value of the failure rate at which maintenance should be carried out, minimizing the total cost (the cost of preventive maintenance, of corrective maintenance, and of interruptions). Finally, a series of reflections on the future development of a reliability analysis tool is presented. A modular structure of the tool is proposed to facilitate its use, together with the possibility of parallel calculation for better efficiency. (author)
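
    A minimal sketch of one of the two acceleration methods tested, antithetic variates, applied to a toy reliability quantity (the probability that an exponential time-to-failure falls within a time window; the rate and window are invented).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
rate, horizon = 1.0 / 8760.0, 1000.0  # failure rate (1/h), window (h)

# Plain Monte Carlo via inverse-transform sampling.
u = rng.uniform(size=n)
t = -np.log(u) / rate
plain = (t < horizon).astype(float)

# Antithetic variates: pair each U with 1-U and average the pair; the
# induced negative correlation lowers the estimator variance.
u2 = rng.uniform(size=n // 2)
t_a = -np.log(u2) / rate
t_b = -np.log(1.0 - u2) / rate
anti = 0.5 * ((t_a < horizon).astype(float) + (t_b < horizon).astype(float))

print("plain:", plain.mean(), "estimator var:", plain.var() / n)
print("anti :", anti.mean(), "estimator var:", anti.var() / (n // 2))
```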

  9. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)]

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.

  10. Experimental and Monte Carlo study of a fibre-optic hodoscope prototype for high-resolution applications; Estudio experimental y por Monte Carlo de un prototipo de hodoscopio de fibras opticas para aplicaciones de alta resolucion

    Energy Technology Data Exchange (ETDEWEB)

    Granero, D.; Blasco, J. M.; Sanchis, E.; Gonzalez, V.; Martin, J. D.; Ballester, F.; Sanchis, E.

    2013-07-01

    The purpose of this work is to test the response of a system composed of 21 scintillating fibres and its readout electronics, as a proof of validity of the system. To this end, the test system was irradiated with a Sr-90 verification source. In addition, Monte Carlo simulations of the system were performed, and the simulation results were compared with those obtained experimentally. Moreover, an approximation to the behaviour of a hodoscope composed of 100 scintillating fibres, arranged transversally to one another, was carried out for proton therapy by means of different Monte Carlo simulations. (Author)

  11. Defining an Inteligent Information System for Monitoring and Verification of Energy Management in Cities

    International Nuclear Information System (INIS)

    Improving the efficiency of energy consumption (EC) is a central theme of any energy policy. Improved energy efficiency (EE) meets three energy policy goals: security of supply, competitiveness, and protection of the environment. Systematic energy management is a body of knowledge and skills based on an organizational structure that links people with assigned responsibilities, efficiency monitoring procedures, and continuous measurement and improvement of energy efficiency. This body of knowledge must be supported by appropriate ICT for gathering, processing, and disseminating data on EC, EE targets, and information. The Energy Management Information System (EMIS) is a web application for monitoring and analysis of energy and water consumption in public buildings and represents an indispensable tool for systematic energy management. The EMIS software tool connects the processes of gathering data on buildings and their energy consumption, monitoring consumption indicators, setting energy efficiency targets, and reporting energy and water consumption savings. The project Intelligent Information System for Monitoring and Verification of Energy Management in Cities (ISEMIC) will distribute the EMIS software tool in the region (BiH, Slovenia, and Serbia). This project also has the goal of improving a software system for utilizing EC measurements, both from smart meters and traditional measurement devices, with subsequent data processing and analysis to facilitate, upgrade, and eventually replace the energy management system currently used for public buildings in Croatia. ISEMIC will enable the use of smart meters within energy management for the first time in BiH, Slovenia, and Serbia, along with an analytical component that enables intelligent estimation of energy consumption based on multiple criteria. EMIS/ISEMIC will enable: continuous updating and maintenance of a database of information on buildings; continuous entry and monitoring of consumption data for all energy sources and water in buildings; calculation of

  12. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    International Nuclear Information System (INIS)

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested

  13. Case studies of thermal energy storage (TES) systems: Evaluation and verification of system performance. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Akbari, H.; Sezgen, O.

    1992-01-01

    We have developed two case studies to review and analyze the energy performance of thermal energy storage (TES) systems in commercial buildings. Our case studies considered two partial ice storage systems in Northern California. For each case, we compiled historical data on TES design, installation, and operation. This information was further enhanced by data obtained through interviews with the building owners and operators. The performance and historical data of the TES systems and their components were grouped into issues related to design, installation, operation, and maintenance of the systems. Our analysis indicated that (1) almost all problems related to the operation of TES and non-TES systems could be traced back to the design of the system, and (2) the identified problems were not unique to the TES systems. There were as many original problems with "conventional" HVAC systems and components as with TES systems. Judging from the problems related to non-TES components identified in these two case studies, it is reasonable to conclude that conventional systems have as many problems as TES systems, but a failure in a TES system may have a more dramatic impact on thermal comfort and electricity charges. The objective of the designers of the TES systems in the case-study buildings was to design just-the-right-size systems so that both the initial investment and operating costs would be minimized. Given such criteria, a system is typically designed only for normal and steady-state operating conditions, which often precludes due consideration of factors such as maintenance, growth in the needed capacity, ease of operation, and modularity of the systems. Therefore, it is not surprising to find that these systems, at least initially, did not perform to the design intent and expectation, and that they had to go through extended periods of troubleshooting.

  14. Case studies of thermal energy storage (TES) systems: Evaluation and verification of system performance

    Energy Technology Data Exchange (ETDEWEB)

    Akbari, H.; Sezgen, O.

    1992-01-01

    We have developed two case studies to review and analyze the energy performance of thermal energy storage (TES) systems in commercial buildings. Our case studies considered two partial ice storage systems in Northern California. For each case, we compiled historical data on TES design, installation, and operation. This information was further enhanced by data obtained through interviews with the building owners and operators. The performance and historical data of the TES systems and their components were grouped into issues related to design, installation, operation, and maintenance of the systems. Our analysis indicated that (1) almost all problems related to the operation of TES and non-TES systems could be traced back to the design of the system, and (2) the identified problems were not unique to the TES systems. There were as many original problems with "conventional" HVAC systems and components as with TES systems. Judging from the problems related to non-TES components identified in these two case studies, it is reasonable to conclude that conventional systems have as many problems as TES systems, but a failure in a TES system may have a more dramatic impact on thermal comfort and electricity charges. The objective of the designers of the TES systems in the case-study buildings was to design just-the-right-size systems so that both the initial investment and operating costs would be minimized. Given such criteria, a system is typically designed only for normal and steady-state operating conditions, which often precludes due consideration of factors such as maintenance, growth in the needed capacity, ease of operation, and modularity of the systems. Therefore, it is not surprising to find that these systems, at least initially, did not perform to the design intent and expectation, and that they had to go through extended periods of troubleshooting.

  15. A Particle System for Safety Verification of Free Flight in Air Traffic

    NARCIS (Netherlands)

    Blom, H.A.P.; Krystul, J.; Bakker, G.J.

    2006-01-01

    Under free flight, an aircrew has both the freedom to select their trajectory and the responsibility of resolving conflicts with other aircraft. The general belief is that free flight can be made safe under low traffic conditions. Increasing traffic, however, raises safety verification issues. This

  16. Development of an airborne remote sensing system for crop pest management: System integration and verification

    Science.gov (United States)

    Remote sensing along with Global Positioning Systems, Geographic Information Systems, and variable rate technology has been developed, which scientists can implement to help farmers maximize the economic and environmental benefits of crop pest management through precision agriculture. Airborne remo...

  17. Remaining Sites Verification Package for the 1607-D4 Septic System, Waste Site Reclassification Form 2005-036

    Energy Technology Data Exchange (ETDEWEB)

    R. A. Carlson

    2006-02-23

    The 1607-D4 Septic System was a septic tank and tile field that received sanitary sewage from the 115-D/DR Gas Recirculation Facility. This septic system operated from 1944 to 1968. Decommissioning took place in 1985 and 1986 when all above-grade features were demolished and the tank backfilled. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.

  18. Guidelines for the verification and validation of expert system software and conventional software. Volume 1: Project summary. Final report

    International Nuclear Information System (INIS)

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally

  19. Camera selection for real-time in vivo radiation treatment verification systems using Cherenkov imaging

    Energy Technology Data Exchange (ETDEWEB)

    Andreozzi, Jacqueline M., E-mail: Jacqueline.M.Andreozzi.th@dartmouth.edu; Glaser, Adam K. [Thayer School of Engineering, Dartmouth College, Hanover, New Hampshire 03755 (United States); Zhang, Rongxiao [Department of Physics and Astronomy, Dartmouth College, Hanover, New Hampshire 03755 (United States); Jarvis, Lesley A.; Gladstone, David J. [Department of Medicine, Geisel School of Medicine and Norris Cotton Cancer Center, Dartmouth-Hitchcock Medical Center, Lebanon, New Hampshire 03766 (United States); Pogue, Brian W., E-mail: Brian.W.Pogue@dartmouth.edu [Thayer School of Engineering and Department of Physics and Astronomy, Dartmouth College, Hanover, New Hampshire 03755 (United States)]

    2015-02-15

    Purpose: To identify achievable camera performance and hardware needs in a clinical Cherenkov imaging system for real-time, in vivo monitoring of the surface beam profile on patients, as novel visual information, documentation, and possible treatment verification for clinicians. Methods: Complementary metal-oxide-semiconductor (CMOS), charge-coupled device (CCD), intensified charge-coupled device (ICCD), and electron multiplying-intensified charge coupled device (EM-ICCD) cameras were investigated to determine Cherenkov imaging performance in a clinical radiotherapy setting, with one emphasis on the maximum supportable frame rate. Where possible, the image intensifier was synchronized using a pulse signal from the Linac in order to image with room lighting conditions comparable to patient treatment scenarios. A solid water phantom irradiated with a 6 MV photon beam was imaged by the cameras to evaluate the maximum frame rate for adequate Cherenkov detection. Adequate detection was defined as an average electron count in the background-subtracted Cherenkov image region of interest in excess of 0.5% (327 counts) of the 16-bit maximum electron count value. Additionally, an ICCD and an EM-ICCD were each used clinically to image two patients undergoing whole-breast radiotherapy to compare clinical advantages and limitations of each system. Results: Intensifier-coupled cameras were required for imaging Cherenkov emission on the phantom surface with ambient room lighting; standalone CMOS and CCD cameras were not viable. The EM-ICCD was able to collect images from a single Linac pulse delivering less than 0.05 cGy of dose at 30 frames/s (fps) and pixel resolution of 512 × 512, compared to an ICCD which was limited to 4.7 fps at 1024 × 1024 resolution. An intensifier with higher quantum efficiency at the entrance photocathode in the red wavelengths [30% quantum efficiency (QE) vs previous 19%] promises at least 8.6 fps at a resolution of 1024 × 1024 and lower monetary

  20. A GPU-accelerated and Monte Carlo-based intensity modulated proton therapy optimization system

    International Nuclear Information System (INIS)

    Purpose: Conventional spot scanning intensity modulated proton therapy (IMPT) treatment planning systems (TPSs) optimize proton spot weights based on analytical dose calculations. These analytical dose calculations have been shown to have severe limitations in heterogeneous materials. Monte Carlo (MC) methods do not have these limitations; however, MC-based systems have been of limited clinical use due to the large number of beam spots in IMPT and the extremely long calculation time of traditional MC techniques. In this work, the authors present a clinically applicable IMPT TPS that utilizes a very fast MC calculation. Methods: An in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC generated influence map, a modified least-squares optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitation issues for the large dose influence maps that resulted from maintaining the intrinsic CT resolution. The effects of tail cutoff and starting condition were studied and minimized in this work. Results: For relatively large and complex three-field head and neck cases, i.e., >100 000 spots with a target volume of ∼1000 cm3 and multiple surrounding critical structures, the optimization together with the initial MC dose influence map calculation was done in a clinically viable time frame (less than 30 min) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The in-house MC TPS plans were comparable to commercial TPS plans based on DVH comparisons. Conclusions: A MC-based treatment planning system was developed. The treatment planning can be performed in a clinically viable time frame on a hardware system costing around 45 000
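
    A small sketch of the optimization step, with a random stand-in for the MC dose influence map: least squares on voxel doses with nonnegative spot weights, solved by projected gradient descent. The abstract does not spell out the authors' "modified least-squares" details, so this is only the generic form.

```python
import numpy as np

rng = np.random.default_rng(4)
n_vox, n_spots = 400, 60
D = rng.random((n_vox, n_spots)) * 0.1  # stand-in dose influence map
d_target = np.ones(n_vox)               # prescribed voxel doses (Gy)

# Minimize ||D w - d||^2 subject to w >= 0 by projected gradient descent.
w = np.ones(n_spots)
step = 1.0 / np.linalg.norm(D.T @ D, 2)  # safe step from a Lipschitz bound
for _ in range(500):
    grad = D.T @ (D @ w - d_target)
    w = np.maximum(w - step * grad, 0.0)  # project onto the constraint

print("mean voxel dose:", (D @ w).mean())
```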

  1. A GPU-accelerated and Monte Carlo-based intensity modulated proton therapy optimization system

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Jiasen, E-mail: ma.jiasen@mayo.edu; Beltran, Chris; Seum Wan Chan Tseung, Hok; Herman, Michael G. [Department of Radiation Oncology, Division of Medical Physics, Mayo Clinic, 200 First Street Southwest, Rochester, Minnesota 55905 (United States)]

    2014-12-15

    Purpose: Conventional spot scanning intensity modulated proton therapy (IMPT) treatment planning systems (TPSs) optimize proton spot weights based on analytical dose calculations. These analytical dose calculations have been shown to have severe limitations in heterogeneous materials. Monte Carlo (MC) methods do not have these limitations; however, MC-based systems have been of limited clinical use due to the large number of beam spots in IMPT and the extremely long calculation time of traditional MC techniques. In this work, the authors present a clinically applicable IMPT TPS that utilizes a very fast MC calculation. Methods: An in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC generated influence map, a modified least-squares optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitation issues for the large dose influence maps that resulted from maintaining the intrinsic CT resolution. The effects of tail cutoff and starting condition were studied and minimized in this work. Results: For relatively large and complex three-field head and neck cases, i.e., >100 000 spots with a target volume of ∼1000 cm³ and multiple surrounding critical structures, the optimization together with the initial MC dose influence map calculation was done in a clinically viable time frame (less than 30 min) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The in-house MC TPS plans were comparable to commercial TPS plans based on DVH comparisons. Conclusions: A MC-based treatment planning system was developed. The treatment planning can be performed in a clinically viable time frame on a hardware system costing around 45

  2. Photogrammetric calibration of a C-arm X-ray system as a verification tool for orthopaedic navigation systems

    Science.gov (United States)

    Broers, H.; Hemken, H.; Luhmann, T.; Ritschl, P.

    For the total replacement of the knee joint, the precise reconstruction of the mechanical axis is significantly determined by the alignment of the cutting tool with respect to the rotation centre of the femur head. Navigation-supported operation techniques allow the precise three-dimensional location of the hip centre by kinematic analysis. Recent results show that the femur axis can be reconstructed to better than 0.7°. Conventional verification methods, such as the post-operative recording of the complete leg, are therefore not suitable due to their limited system accuracy of about 2°. As the femur head cannot be accessed directly during the operation, an X-ray method has been used to verify alignment. The paper presents a method, and the results achieved, for the calibration of a C-arm system by introducing photogrammetric parameters. Since the method is used during the operation, boundary conditions such as minimally invasive surgical intervention and sterility have been considered for practical application to patients.

  3. Diffusion microscopist simulator - The development and application of a Monte Carlo simulation system for diffusion MRI

    International Nuclear Information System (INIS)

    Diffusion magnetic resonance imaging (dMRI) has made a significant breakthrough in neurological disorders and brain research thanks to its exquisite sensitivity to tissue cyto-architecture. However, as the water diffusion process in neuronal tissues is a complex biophysical phenomenon at the molecular scale, it is difficult to infer tissue microscopic characteristics on the voxel scale from dMRI data. The major methodological contribution of this thesis is the development of an integrated and generic Monte Carlo simulation framework, the 'Diffusion Microscopist Simulator' (DMS), which has the capacity to create 3D biological tissue models of various shapes and properties, as well as to synthesize dMRI data for a large variety of MRI methods, pulse-sequence designs, and parameters. DMS aims at bridging the gap between the elementary diffusion processes occurring at a micrometric scale and the resulting diffusion signal measured at a millimetric scale, providing better insights into the features observed in dMRI, as well as offering ground-truth information for the optimization and validation of dMRI acquisition protocols for different applications. We have verified the performance and validity of DMS through various benchmark experiments and applied it to particular research topics in dMRI. Based on DMS, there are two major application contributions in this thesis. First, we use DMS to investigate the impact of finite diffusion gradient pulse duration (delta) on fibre orientation estimation in dMRI. We propose that the current practice of using long delta, which is enforced by the hardware limitations of clinical MRI scanners, is actually beneficial for mapping fibre orientations, even though it violates the underlying assumption made in q-space theory. Second, we employ DMS to investigate the feasibility of estimating axon radius using a clinical MRI system. The results suggest that the algorithm for mapping the direct microstructures is applicable to dMRI data acquired from
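
    In the spirit of DMS, a minimal Monte Carlo diffusion sketch: free random walkers and a narrow-pulse PGSE signal, checked against the analytic free-diffusion decay exp(-q^2 D Delta). Restricted compartments would add reflecting boundaries; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
D = 2.0e-3                               # diffusivity, mm^2/s
dt, n_steps, n_spins = 1e-4, 500, 20000  # 50 ms total diffusion time
sigma = np.sqrt(2 * D * dt)              # 1D step size per dt

# Free Brownian motion along the gradient direction.
steps = rng.normal(0.0, sigma, (n_spins, n_steps))
displacement = steps.sum(axis=1)

# Narrow-pulse PGSE signal E(q) = <cos(q * displacement)>; for free
# diffusion this matches exp(-q^2 * D * Delta).
Delta = n_steps * dt
for q in (10.0, 30.0, 60.0):             # 1/mm
    print(q, np.cos(q * displacement).mean(), np.exp(-q * q * D * Delta))
```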

  4. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Park, Jae Chang; Cheon, Se Woo; Jung, Kwang Tae; Baek, Seung Min; Han, Seung; Park, Hee Suk; Son, Ki Chang; Kim, Jung Man; Jung, Yung Woo

    1999-11-01

    This report describes a technical transfer under the title of ''A formal design verification and validation on the human factors of a computerized information system in nuclear power plants''. Human factors requirements for the information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying that those requirements are satisfied. A formalized implementation plan is established for the human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), is developed on a web environment to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluation. DIMS-Web has shown its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  5. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    RTS (real-time systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems require high accuracy, safety, and reliability, and accurate graphical modeling and verification of such systems is genuinely challenging. Formal methods have made it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for the modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel-filling machine. We have defined DC (duration calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for the validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of this methodology for the modeling and verification of large-scale real-time systems with reduced verification cost. (author)
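
    For readers unfamiliar with DC notation, the classic gas-burner requirement (a textbook duration-calculus example, not this paper's fuel-filling case study) shows the flavor of such specifications: in every observation interval of at least 60 s, the accumulated leak time must not exceed one twentieth of the interval length.

```latex
\Box \left( \ell \ge 60 \;\Rightarrow\; 20 \int \mathit{Leak} \le \ell \right)
```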

  6. An Overview of System Maintainability Verification (系统维修性验证概述)

    Institute of Scientific and Technical Information of China (English)

    钱潜; 单志伟; 刘福胜

    2015-01-01

    Based on an analysis of the conceptual model of system maintainability verification, this article reviews the state of research at home and abroad, analyzes the current main maintainability verification methods and maintainability sample acquisition methods, highlights the application of virtual simulation technology in maintainability verification, identifies the shortcomings of current research, and emphasizes the value of further study.

  7. Estimating Rare Event Probabilities in Large Scale Stochastic Hybrid Systems by Sequential Monte Carlo Simulation

    NARCIS (Netherlands)

    Blom, H.A.P.; Krystul, J.; Bakker, G.J.

    2006-01-01

    We study the problem of estimating small reachability probabilities for large scale stochastic hybrid processes through Sequential Monte Carlo (SMC) simulation. Recently, [Cerou et al., 2002, 2005] developed an SMC approach for diffusion processes, and referred to the resulting SMC algorithm as an I
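
    A toy multilevel-splitting sketch of the sequential Monte Carlo idea for rare reachability events: factor the small probability through intermediate levels and resample the particles that reach each level. The dynamics, levels, and particle counts are invented, and this simplified sketch ignores several refinements of the IPS algorithm studied by the authors.

```python
import numpy as np

rng = np.random.default_rng(6)

def walk_to(level, x0, n_steps=200):
    """Drifting random walk; return the final state and a crossing flag."""
    x = x0
    for _ in range(n_steps):
        x += rng.normal(-0.05, 0.3)  # slight drift away from the rare set
        if x >= level:
            return x, True
    return x, False

levels, n_particles = [1.0, 2.0, 3.0], 2000
states, prob = [0.0] * n_particles, 1.0
for lv in levels:
    results = [walk_to(lv, s) for s in states]
    hits = [x for x, ok in results if ok]
    if not hits:
        prob = 0.0
        break
    prob *= len(hits) / len(states)
    # Resample survivors to repopulate the next stage (the SMC step).
    states = [hits[i % len(hits)] for i in range(n_particles)]

print("estimated rare-event probability:", prob)
```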

  8. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novic

  9. Study of the validity of a combined potential model using the Hybrid Reverse Monte Carlo method in Fluoride glass system

    Directory of Open Access Journals (Sweden)

    M. Kotbi

    2013-03-01

    The need to choose appropriate interaction models is among the major drawbacks of conventional methods such as Molecular Dynamics (MD) and Monte Carlo (MC) simulations. On the other hand, the so-called Reverse Monte Carlo (RMC) method, based on experimental data, can be applied without any interatomic and/or intermolecular interactions. However, RMC results are accompanied by artificial satellite peaks. To remedy this problem, we use an extension of the RMC algorithm which introduces an energy penalty term into the acceptance criterion; this method is referred to as the Hybrid Reverse Monte Carlo (HRMC) method. The idea of this paper is to test the validity of a combined Coulomb and Lennard-Jones potential model in the fluoride glass system BaMnMF7 (M = Fe, V) using the HRMC method. The results show good agreement between experimental and calculated characteristics, as well as a meaningful improvement in the partial pair distribution functions (PDFs). We suggest that this model should be used in calculating structural properties and in describing the average correlations between components of fluoride glass or similar systems. We also suggest that HRMC could be useful as a tool for testing interaction potential models, as well as for conventional applications.
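
    The HRMC acceptance rule itself is compact: the usual RMC chi-squared term against the experimental data plus an energy penalty from the potential model. A sketch, with the temperature and the move deltas as placeholder values:

```python
import numpy as np

rng = np.random.default_rng(7)

def hrmc_accept(delta_chi2, delta_U, kT=1.0):
    """Accept a trial move with probability min(1, exp(-dchi2/2 - dU/kT)):
    the chi^2 term fits the data, the energy term penalizes unphysical
    configurations (here, from a combined Coulomb + Lennard-Jones model)."""
    log_p = -0.5 * delta_chi2 - delta_U / kT
    return np.log(rng.uniform()) < min(0.0, log_p)

# A move that improves the fit to the data but costs a little energy:
print(hrmc_accept(delta_chi2=-4.0, delta_U=0.5))
```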

  10. Development of a practical fuel management system for PSBR based on advanced three-dimensional Monte Carlo coupled depletion methodology

    Science.gov (United States)

    Tippayakul, Chanatip

    The main objective of this research is to develop a practical fuel management system for the Pennsylvania State University Breazeale research reactor (PSBR) based on several advanced Monte Carlo coupled depletion methodologies. Primarily, this research involved two major activities: model and method developments, and analyses and validations of the developed models and methods. The starting point of this research was the utilization of the earlier developed fuel management tool, TRIGSIM, to create the Monte Carlo model of core loading 51 (end of the core loading). It was found when comparing the normalized power results of the Monte Carlo model to those of the current fuel management system (using HELIOS/ADMARC-H) that they agreed reasonably well (within 2-3% differences on average). Moreover, the reactivity of some fuel elements was calculated by the Monte Carlo model and compared with measured data. It was also found that the fuel element reactivity results of the Monte Carlo model were in good agreement with the measured data. However, the subsequent task of analyzing the conversion from core loading 51 to core loading 52 using TRIGSIM showed quite significant differences in each control rod worth between the Monte Carlo model and the current methodology model. The differences were mainly caused by inconsistent absorber atomic number densities between the two models. Hence, the model of the first operating core (core loading 2) was revised in light of new information about the absorber atomic densities to validate the Monte Carlo model with the measured data. With the revised Monte Carlo model, the results agreed better with the measured data. Although TRIGSIM showed good modeling capabilities, the accuracy of TRIGSIM could be further improved by adopting more advanced algorithms. Therefore, TRIGSIM was planned to be upgraded. The first task of upgrading TRIGSIM involved the improvement of the temperature modeling capability. The new TRIGSIM was

  11. Development of an expert system for success path generation and operator's action guides in NPP: Verification and validation of COSMOS

    International Nuclear Information System (INIS)

    For the support of emergency operation, an expert system named COSMOS (COmputerized Success-path MOnitoring System) is being developed at the Korea Atomic Energy Research Institute (KAERI). COSMOS identifies the status of the critical safety functions (CSFs) and suggests an overall response strategy with a set of success paths which restore the challenged CSFs. The status of a CSF is identified by rule-based reasoning. The overall response strategy is inferred according to the identified CSF status. The success paths are generated from the given structure descriptions of systems and a general generation algorithm. For an efficient man-machine interface, a color graphic display is utilized. COSMOS is being built on a workstation. The major tasks in building an expert system such as COSMOS are the construction of the knowledge base and the inference engine. In COSMOS, the knowledge is derived from the Emergency Operating Procedures (EOPs), and forward chaining is adopted as the inference strategy. While the knowledge base and inference engine are the most common and essential elements of an expert system, they are not the only ones. The evaluation of expert systems can not only lessen the risk of using faulty software, but also enhance the acceptability of the expert systems by both users and regulators. The evaluation of expert systems consists of system verification, validation, and user acceptance testing. Among them, in this report, we have focused our attention on the verification and validation (V&V) of expert systems. We have assessed the general V&V procedures and tried to develop a specific V&V procedure for COSMOS. (Author)

  12. Analytical, experimental, and Monte Carlo system response matrix for pinhole SPECT reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es [Fundación Ramón Domínguez, Medicina Nuclear, CHUS, Spain and Grupo de Imaxe Molecular, IDIS, Santiago de Compostela 15706 (Spain); Pino, Francisco [Unitat de Biofísica, Facultat de Medicina, Universitat de Barcelona, Spain and Servei de Física Mèdica i Protecció Radiològica, Institut Català d'Oncologia, Barcelona 08036 (Spain); Silva-Rodríguez, Jesús [Fundación Ramón Domínguez, Medicina Nuclear, CHUS, Santiago de Compostela 15706 (Spain); Pavía, Javier [Servei de Medicina Nuclear, Hospital Clínic, Barcelona (Spain); Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS) (Spain); CIBER en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona 08036 (Spain); Ros, Domènec [Unitat de Biofísica, Facultat de Medicina, Casanova 143 (Spain); Institut d'Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS) (Spain); CIBER en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona 08036 (Spain); Ruibal, Álvaro [Servicio Medicina Nuclear, CHUS (Spain); Grupo de Imaxe Molecular, Facultade de Medicina (USC), IDIS, Santiago de Compostela 15706 (Spain); Fundación Tejerina, Madrid (Spain); and others

    2014-03-15

    Purpose: To assess the performance of two approaches to the system response matrix (SRM) calculation in pinhole single photon emission computed tomography (SPECT) reconstruction. Methods: Evaluation was performed using experimental data from a low magnification pinhole SPECT system that consisted of a rotating flat detector with a monolithic scintillator crystal. The SRM was computed following two approaches, which were based on Monte Carlo simulations (MC-SRM) and analytical techniques in combination with an experimental characterization (AE-SRM). The spatial response of the system, obtained by using the two approaches, was compared with experimental data. The effect of the MC-SRM and AE-SRM approaches on the reconstructed image was assessed in terms of image contrast, signal-to-noise ratio, image quality, and spatial resolution. To this end, acquisitions were carried out using a hot cylinder phantom (consisting of five fillable rods with diameters of 5, 4, 3, 2, and 1 mm and a uniform cylindrical chamber) and a custom-made Derenzo phantom, with center-to-center distances between adjacent rods of 1.5, 2.0, and 3.0 mm. Results: Good agreement was found for the spatial response of the system between measured data and results derived from MC-SRM and AE-SRM. Only minor differences for point sources at distances smaller than the radius of rotation and large incidence angles were found. Assessment of the effect on the reconstructed image showed a similar contrast for both approaches, with values higher than 0.9 for rod diameters greater than 1 mm and higher than 0.8 for rod diameter of 1 mm. The comparison in terms of image quality showed that all rods in the different sections of a custom-made Derenzo phantom could be distinguished. The spatial resolution (FWHM) was 0.7 mm at iteration 100 using both approaches. The SNR was lower for reconstructed images using MC-SRM than for those reconstructed using AE-SRM, indicating that AE-SRM deals better with the
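
    Whichever way the SRM is obtained, it enters iterative reconstruction the same way. Below is a toy MLEM update with a dense system matrix, sketched under stated assumptions: the dimensions and the random test data are illustrative only and do not model the pinhole system.

        import numpy as np

        # Toy MLEM reconstruction with a dense system response matrix A
        # (n_bins x n_voxels). All data here are illustrative assumptions.
        rng = np.random.default_rng(0)
        n_bins, n_voxels = 200, 100
        A = rng.random((n_bins, n_voxels))       # SRM: P(count in bin i | decay in voxel j)
        x_true = rng.random(n_voxels)
        y = rng.poisson(A @ x_true * 50)         # noisy projection data

        x = np.ones(n_voxels)                    # uniform initial image
        sens = A.sum(axis=0)                     # sensitivity image (column sums)
        for _ in range(100):
            ratio = y / np.maximum(A @ x, 1e-12) # measured / modelled counts
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)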

  13. A comprehensive system for dosimetric commissioning and Monte Carlo validation for the small animal radiation research platform

    International Nuclear Information System (INIS)

    Our group has constructed the small animal radiation research platform (SARRP) for delivering focal, kilovoltage radiation to targets in small animals under robotic control using cone-beam CT guidance. The present work was undertaken to support the SARRP's treatment planning capabilities. We have devised a comprehensive system for characterizing the radiation dosimetry in water for the SARRP and have developed a Monte Carlo dose engine with the intent of reproducing these measured results. We find that the SARRP provides sufficient therapeutic dose rates, ranging from 102 to 228 cGy/min at 1 cm depth, for the available set of high-precision beams ranging from 0.5 to 5 mm in size. In terms of depth-dose, the mean of the absolute percentage differences between the Monte Carlo calculations and measurement is 3.4% over the full range of sampled depths spanning 0.5-7.2 cm for the 3 and 5 mm beams. The measured and computed profiles for these beams agree well overall; of note, good agreement is observed in the profile tails. Especially for the smallest 0.5 and 1 mm beams, including a more realistic description of the effective x-ray source in the Monte Carlo model may be important.

  14. Experimental and Monte Carlo evaluation of Eclipse treatment planning system for effects on dose distribution of the hip prostheses

    Energy Technology Data Exchange (ETDEWEB)

    Çatlı, Serap, E-mail: serapcatli@hotmail.com [Gazi University, Faculty of Sciences, 06500 Teknikokullar, Ankara (Turkey); Tanır, Güneş [Gazi University, Faculty of Sciences, 06500 Teknikokullar, Ankara (Turkey)

    2013-10-01

    The present study aimed to investigate the effects of titanium, titanium alloy, and stainless steel hip prostheses on dose distribution using the Monte Carlo simulation method, as well as the accuracy of the Eclipse treatment planning system (TPS) at 6 and 18 MV photon energies. The pencil beam convolution (PBC) method implemented in the Eclipse TPS was compared to the Monte Carlo method and to ionization chamber measurements. The present findings show that if a high-Z material is used in a prosthesis, large dose changes can occur due to scattering. The variation in dose observed in the present study depended on material type, density, and atomic number, as well as on photon energy; as photon energy increased, backscattering decreased. The dose perturbation effect of hip prostheses was significant and could not be predicted accurately by the PBC method. The findings show that, for accurate dose calculation, a Monte Carlo-based TPS should be used for patients with hip prostheses.

  15. Sensitivity assessment of wide Axial Field of View PET systems via Monte Carlo simulations of NEMA-like measurements

    International Nuclear Information System (INIS)

    The sensitivity characteristics of Positron Emission Tomography (PET) systems with a wide Axial Field of View (AFOV) were studied by Monte Carlo simulations complemented by an approximate analytical model, aiming at full-body human PET systems with an AFOV of the order of 200 cm. Simulations were based on the GEANT4 package and followed closely the NEMA NU-2 1994 norm. The sensitivity, dominated by the solid angle, grows strongly with the AFOV and with the axial acceptance angle, while the scatter fraction is almost independent of the geometry.
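
    As a hedged illustration of the solid-angle argument, the geometric acceptance of an idealized cylindrical scanner for a point source at its centre can be written in closed form (assumptions: continuous detector, no gaps, no attenuation, no acceptance-angle cut):

        import math

        def geometric_fraction(afov_cm, radius_cm):
            """Fraction of 4*pi subtended by an ideal cylindrical detector
            for a point source at the scanner centre."""
            half = afov_cm / 2.0
            return half / math.hypot(half, radius_cm)  # = cos(theta_min)

        # Conventional vs. total-body AFOV, assuming a 40 cm ring radius
        for afov in (20, 100, 200):
            print(afov, round(geometric_fraction(afov, 40.0), 3))

    For these assumed dimensions the fraction grows from roughly 0.24 at a 20 cm AFOV to about 0.93 at 200 cm, consistent with the strong sensitivity growth reported.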

  16. Parallel J-W Monte Carlo Simulations of Thermal Phase Changes in Finite-size Systems

    CERN Document Server

    Radev, R

    2002-01-01

    The thermodynamic properties of TeF6 clusters of 59 molecules that undergo temperature-driven phase transitions have been calculated with a canonical J-walking Monte Carlo technique. A parallel code for the simulations has been developed and optimized on SUN3500 and CRAY-T3E computers. The Lindemann criterion shows that the clusters transform from liquid to solid and then from one solid structure to another in the temperature region 60-130 K.
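
    The Lindemann criterion invoked here is a simple statistic over the fluctuations of interatomic distances; a generic sketch (not the paper's code; the solid/liquid threshold of roughly 0.1 is a common rule of thumb) is:

        import numpy as np

        def lindemann_index(trajectory):
            """Lindemann index for a trajectory of shape (n_frames, n_atoms, 3).
            Small values (~<0.1) indicate solid-like, larger liquid-like behaviour."""
            diff = trajectory[:, :, None, :] - trajectory[:, None, :, :]
            r = np.linalg.norm(diff, axis=-1)    # (n_frames, n, n) pair distances
            mean_r = r.mean(axis=0)
            var_r = r.var(axis=0)
            n = trajectory.shape[1]
            iu = np.triu_indices(n, k=1)         # unique pairs i < j
            return (np.sqrt(var_r[iu]) / mean_r[iu]).mean()

        # Illustrative data: fixed sites with small thermal vibrations
        rng = np.random.default_rng(0)
        sites = rng.random((1, 8, 3)) * 5.0
        traj = sites + 0.05 * rng.normal(size=(200, 8, 3))
        print(lindemann_index(traj))             # small value -> solid-like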

  17. Application of Monte Carlo Method to Phase Separation Dynamics of Complex Systems

    OpenAIRE

    Okabe, Yutaka; Miyajima, Tsukasa; Ito, Toshiro; Kawakatsu, Toshihiro

    1999-01-01

    We report the application of the Monte Carlo simulation to phase separation dynamics. First, we deal with the phase separation under shear flow. The thermal effect on the phase separation is discussed, and the anisotropic growth exponents in the late stage are estimated. Next, we study the effect of surfactants on the three-component solvents. We obtain the mixture of macrophase separation and microphase separation, and investigate the dynamics of both phase separations.
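
    As a generic illustration of the kind of lattice Monte Carlo used for phase-separation dynamics, here is a conserved-order-parameter (Kawasaki-exchange) Metropolis sketch; it is an assumption-laden stand-in, not the authors' model, which additionally treats shear flow and surfactants:

        import numpy as np

        # Kawasaki-exchange Monte Carlo on a 2D lattice: composition is conserved,
        # so domains coarsen by neighbour swaps. Parameters are illustrative.
        rng = np.random.default_rng(0)
        L, beta = 64, 1.0 / 1.5                  # lattice size, inverse temperature
        spins = rng.choice([-1, 1], size=(L, L)) # 50/50 binary mixture

        def local_energy(s, i, j):
            """Ising bond energy of site (i, j) with its four periodic neighbours."""
            return -s[i, j] * (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                               + s[i, (j + 1) % L] + s[i, (j - 1) % L])

        for _ in range(200_000):
            i, j = rng.integers(L, size=2)
            di, dj = (0, 1) if rng.random() < 0.5 else (1, 0)
            i2, j2 = (i + di) % L, (j + dj) % L
            if spins[i, j] == spins[i2, j2]:
                continue                         # swapping equal spins changes nothing
            e_before = local_energy(spins, i, j) + local_energy(spins, i2, j2)
            spins[i, j], spins[i2, j2] = spins[i2, j2], spins[i, j]
            delta = (local_energy(spins, i, j) + local_energy(spins, i2, j2)) - e_before
            if rng.random() >= np.exp(-beta * max(delta, 0.0)):
                spins[i, j], spins[i2, j2] = spins[i2, j2], spins[i, j]  # reject move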

  18. Integrated verification test of Severe Accident Analysis Code SAMPSON in super Simulation 'IMPACT' system

    Energy Technology Data Exchange (ETDEWEB)

    Ujita, Hiroshi; Naitoh, Masanori [Advanced Simulation Systems Department, Nuclear Power Engineering Corporation, Tokyo (Japan); Karasawa, Hidetoshi; Miyagi, Kazumi

    1999-07-01

    The four-year Phase 1 of the IMPACT ('Integrated Modular Plant Analysis and Computing Technology') project has been completed. The verification study of the Severe Accident Analysis Code SAMPSON prototype developed in Phase 1 was conducted in two steps. First, each analysis module was run independently and the analysis results were compared and verified against separate-effect test data, with good results. Second, under the Simulation Supervisory System, up to 11 analysis modules were executed concurrently in the parallel environment (currently, NUPEC uses an IBM-SP2 with 72 processing elements) to demonstrate the code's capability and integrity. The target plant was Surry, a typical PWR, and the initiating event was a 10-inch cold-leg failure. The analysis is divided into two cases: one is an in-vessel retention analysis for when gap cooling is effective (in-vessel scenario test); the other analyzes the phenomena when the event extends ex-vessel owing to Reactor Pressure Vessel failure because gap cooling is not sufficient (ex-vessel scenario test). The system verification test confirmed that the full scope of the scenarios can be analysed and that the phenomena occurring in the scenarios can be simulated reasonably and quantitatively, within the limits of the physical models used. (author)

  19. A quality assurance phantom for IMRT dose verification

    Science.gov (United States)

    Ma, C.-M.; Jiang, S. B.; Pawlicki, T.; Chen, Y.; Li, J. S.; Deng, J.; Boyer, A. L.

    2003-03-01

    This paper investigates a quality assurance (QA) phantom specially designed to verify the accuracy of dose distributions and monitor units (MU) calculated by clinical treatment planning optimization systems and by the Monte Carlo method for intensity-modulated radiotherapy (IMRT). The QA phantom is a PMMA cylinder of 30 cm diameter and 40 cm length with various bone and lung inserts. A procedure (and formalism) has been developed to measure the absolute dose to water in the PMMA phantom. Another cylindrical phantom of the same dimensions, but made of water, was used to confirm the results obtained with the PMMA phantom. The PMMA phantom was irradiated by 4, 6 and 15 MV photon beams and the dose was measured using an ionization chamber and compared to the results calculated by a commercial inverse planning system (CORVUS, NOMOS, Sewickley, PA) and by the Monte Carlo method. The results show that the dose distributions calculated by both CORVUS and Monte Carlo agreed to within 2% of dose maximum with measured results in the uniform PMMA phantom for both open and intensity-modulated fields. Similar agreement was obtained between Monte Carlo calculations and measured results with the bone and lung heterogeneity inside the PMMA phantom while the CORVUS results were 4% different. The QA phantom has been integrated as a routine QA procedure for the patient's IMRT dose verification at Stanford since 1999.

  20. Fully 3D tomographic reconstruction by Monte Carlo simulation of the system matrix in preclinical PET with iodine 124

    International Nuclear Information System (INIS)

    Immuno-PET imaging can be used to assess pharmacokinetics in radioimmunotherapy. When using iodine-124, quantitative PET imaging is limited by physics-based degrading factors within the detection system and the object, such as the long positron range in water and the complex spectrum of gamma photons. The objective of this thesis was to develop a fully 3D tomographic reconstruction method (S(MC)2PET) using Monte Carlo simulations for estimating the system matrix, in the context of preclinical imaging with iodine-124. The Monte Carlo simulation platform GATE was used for this purpose. System matrices of several levels of complexity were calculated, each including at least a model of the PET system response function. Physics processes in the object were either neglected or taken into account, using either a precise or a simplified object description. The impact of modelling refinement and of the statistical variance of the system matrix elements was evaluated on the final reconstructed images. These studies showed that a high level of complexity did not always improve qualitative and quantitative results, owing to the high variance of the associated system matrices. (author)

  1. Multilevel Monte Carlo methods for computing failure probability of porous media flow systems

    Science.gov (United States)

    Fagerlund, F.; Hellman, F.; Målqvist, A.; Niemi, A.

    2016-08-01

    We study improvements of the standard and multilevel Monte Carlo method for point evaluation of the cumulative distribution function (failure probability) applied to porous media two-phase flow simulations with uncertain permeability. To illustrate the methods, we study an injection scenario where we consider the sweep efficiency of the injected phase as the quantity of interest and seek the probability that this quantity of interest is smaller than a critical value. In the sampling procedure, we use computable error bounds on the sweep efficiency functional to identify small subsets of realizations to solve at the highest accuracy, by means of what we call selective refinement. We quantify the performance gains possible by using selective refinement in combination with both the standard and the multilevel Monte Carlo method. We also identify issues in the practical implementation of the methods. We conclude that significant savings in computational cost are possible for failure probability estimation in a realistic setting using the selective refinement technique, in combination with both the standard and the multilevel Monte Carlo method.
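
    A hedged sketch of the plain telescoping estimator for the failure probability P(Q < q_crit) is given below, with a toy stand-in for the two-phase flow solver and without the paper's selective refinement:

        import numpy as np

        rng = np.random.default_rng(0)

        def sample_Q(level, n):
            """Stand-in for the quantity of interest solved at resolution `level`
            and at `level - 1` on the SAME random inputs (toy model: shared
            randomness plus a discretization error that halves per level)."""
            w = rng.normal(size=n)                              # shared randomness
            fine = w + rng.normal(scale=2.0 ** -level, size=n)
            coarse = (w + rng.normal(scale=2.0 ** -(level - 1), size=n)
                      if level > 0 else np.zeros(n))
            return fine, coarse

        def mlmc_failure_prob(q_crit, n_per_level=(40_000, 10_000, 2_500)):
            """Telescoping multilevel estimate of P(Q < q_crit)."""
            est = 0.0
            for level, n in enumerate(n_per_level):
                fine, coarse = sample_Q(level, n)
                if level == 0:
                    est += np.mean(fine < q_crit)   # plain MC on the coarsest level
                else:
                    est += np.mean((fine < q_crit).astype(float)
                                   - (coarse < q_crit).astype(float))
            return est

        print(mlmc_failure_prob(q_crit=-1.0))

    The correction terms use correlated fine/coarse samples, so most of the sampling effort stays on the cheap coarse levels; selective refinement sharpens this further by refining only realizations near the critical value.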

  2. Optimization of Monte Carlo simulations

    OpenAIRE

    Bryskhe, Henrik

    2009-01-01

    This thesis considers several different techniques for optimizing Monte Carlo simulations. The Monte Carlo system used is Penelope, but most of the techniques are applicable to other systems. The two major techniques are the use of the graphics card to do geometry calculations, and raytracing. Using the graphics card provides a very efficient way to perform fast ray-triangle intersections. Raytracing provides an approximation of the Monte Carlo simulation but is much faster to perform. A program was ...

  3. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  4. Integrated Safety Management System Phase 1 and 2 Verification for the Environmental Restoration Contractor Volumes 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    CARTER, R.P.

    2000-04-04

    DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, the public, the environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase 1 and 2 Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety-conscious work culture.

  5. Software verification and validation methodology for advanced digital reactor protection system using diverse dual processors to prevent common mode failure

    International Nuclear Information System (INIS)

    The Advanced Digital Reactor Protection System (ADRPS), with diverse dual processors, is being developed by the National Research Lab of KOPEC. One of the ADRPS goals is to develop a digital Plant Protection System (PPS) free of Common Mode Failure (CMF). To prevent CMF, the principle of diversity is applied to both the hardware design and the software design. For hardware diversity, two different types of CPUs are used for the Bistable Processor and the Local Coincidence Logic Processor. VME-based Single Board Computers (SBCs) are used as the CPU hardware platforms. The QNX Operating System (OS) and the VxWorks OS are used for software diversity. Rigorous Software Verification and Validation (V and V) is also required to prevent CMF. In this paper, the software V and V methodology for the ADRPS is described to enhance the ADRPS software reliability and to assure high quality of the ADRPS software.

  6. An X-ray scatter system for material identification in cluttered objects: A Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Lakshmanan, Manu N. [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Kapadia, Anuj J., E-mail: anuj.kapadia@duke.edu [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Sahbaee, Pooyan [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Dept. of Physics, NC State University, Raleigh, NC (United States); Wolter, Scott D. [Dept. of Physics, Elon University, Elon, NC (United States); Harrawood, Brian P. [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Brady, David [Dept. of Electrical and Computer Engineering, Duke University, Durham, NC (United States); Samei, Ehsan [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Dept. of Electrical and Computer Engineering, Duke University, Durham, NC (United States)

    2014-09-15

    The analysis of X-ray scatter patterns has been demonstrated as an effective method of identifying specific materials in mixed-object environments, for both biological and non-biological applications. Here we describe an X-ray scatter imaging system for material identification in cluttered objects and investigate its performance using a large-scale Monte Carlo simulation study of one thousand objects containing a broad array of materials. The GEANT4 Monte Carlo source code for Rayleigh scatter physics was modified to model coherent scatter diffraction in bulk materials based on experimentally measured form factors for 33 materials. The simulation was then used to model coherent scatter signals from a variety of target and clutter (background) materials in one thousand randomized objects. The resulting scatter images were used to characterize four parameters of the imaging system that affected its ability to identify target materials: (a) the arrangement of materials in the object, (b) the clutter attenuation, (c) the type of target material, and (d) the X-ray tube current. We found that the positioning of target materials within the object did not significantly affect their detectability; however, a strong negative correlation was observed between target detectability and the clutter attenuation of the object. The imaging signal was also found to be relatively invariant to increases in X-ray tube current-time product above 1 mAs for most materials considered in the study. This work is, to our knowledge, the first Monte Carlo study of an X-ray scatter imaging system for material identification across a large population of cluttered objects, and it lays the foundation for large-scale studies of the effectiveness of such systems on complex samples.

  7. Development of an automated testing system for verification and validation of nuclear data

    International Nuclear Information System (INIS)

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems; performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory (LANL), in collaboration with the University of Florida, is developing a methodology to automate the process of nuclear data verification and validation. The International Criticality Safety Benchmark Experiment Project (ICSBEP) provides a set of criticality problems that may be used to evaluate nuclear data. This process tests a number of data libraries using cases from the ICSBEP benchmark set to demonstrate how automation of these tasks may reduce errors and increase efficiency. The process is driven by an integrated set of Python scripts. Material and geometry data may be read from an existing code input file to generate a standardized template, or the template may be generated directly by the user. The user specifies the desired precision and other vital problem parameters. The Python scripts generate input decks for multiple transport codes from these templates, run and monitor individual jobs, and parse the relevant output. This output can then be used to generate reports directly or can be stored in a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. (authors)
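
    The scripted pipeline described above (template in, input decks out, jobs run, relevant output parsed) might look like the following minimal sketch; every file name, the transport_code executable, and the k-eff regular expression are illustrative assumptions, not the actual LANL scripts:

        import re
        import subprocess
        from pathlib import Path

        # Illustrative automation sketch: fill a template, run a transport code,
        # parse the result. Names, executable, and regex are assumptions.
        TEMPLATE = Path("benchmark.tmpl").read_text()   # deck with a {library} slot

        def run_case(library, workdir):
            """Generate an input deck for one data library, run it, return k-eff."""
            workdir.mkdir(parents=True, exist_ok=True)
            deck = workdir / "input"
            deck.write_text(TEMPLATE.format(library=library))
            subprocess.run(["transport_code", "-i", deck.name],
                           cwd=workdir, check=True)
            out = (workdir / "output").read_text()
            match = re.search(r"k-eff\s*=\s*([\d.]+)", out)  # parse relevant output
            if match is None:
                raise RuntimeError(f"no k-eff found for {library}")
            return float(match.group(1))

        results = {lib: run_case(lib, Path("runs") / lib)
                   for lib in ("endfb71", "endfb80")}
        print(results)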

  8. Dosimetric verification of volumetric modulated arc therapy in nasopharyngeal carcinoma using COMPASS 3D patient anatomy based system

    International Nuclear Information System (INIS)

    Objective: To investigate the dosimetric performance of the COMPASS system, a novel 3D quality assurance system, for the verification of nasopharyngeal carcinoma volumetric modulated arc therapy (VMAT) treatment plans. Methods: Eight VMAT treatment plans of nasopharyngeal carcinoma patients were created with MasterPlan, a treatment planning system (TPS), and these treatment plans were then sent to the COMPASS system and to MOSAIQ, a coherent control system, respectively. COMPASS can reconstruct the dose from the actually measured delivery fluence after detector commissioning. The COMPASS-reconstructed dose was compared with the TPS dose using dose-volume-based indices: the dose received by 95% of the target volume (D95%), the mean dose (Dmean) and the γ pass rate for the targets; the dose to 1% of the spinal cord and brain stem volume (D1%); the mean dose of the left and right parotids (Dmean); and the volume of the left and right parotids receiving 30 Gy (V30). Results: The average dose difference for the target volumes was within 1%, the difference in D95% was within 3% for most treatment plans, and the γ pass rate was higher than 95% for all target volumes. The average differences in the D1% values of the spinal cord and brain stem were (4.3 ± 3.0)% and (5.9 ± 2.9)%, respectively, and the average differences in the Dmean values of the spinal cord and brain stem were (5.3 ± 3.0)% and (8.0 ± 3.5)%, respectively. In general, the COMPASS-measured doses were smaller than the TPS-calculated doses for these two organs. The average differences in the Dmean values of the left and right parotids were (6.1 ± 3.1)% and (4.7 ± 4.4)%, respectively, and the average differences in the V30 values of the left and right parotids were (9.4 ± 7.5)% and (9.4 ± 9.9)%, respectively. Conclusions: An ideal tool for VMAT verification, the patient-anatomy-based COMPASS 3D dose verification system can check the dose difference between the actual delivery and the TPS calculation directly for each individual organ, either target volumes
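
    The dose-volume indices compared above are simple functionals of a structure's voxel dose array; a generic sketch of their computation (standard definitions, not the COMPASS implementation; the test data are random) is:

        import numpy as np

        def dvh_indices(dose_gy):
            """Dose-volume indices for one structure's voxel doses,
            assuming equal voxel volumes. Generic definitions only."""
            return {
                "D95%": np.percentile(dose_gy, 5),     # dose covering 95% of volume
                "Dmean": dose_gy.mean(),
                "D1%": np.percentile(dose_gy, 99),     # near-maximum dose
                "V30": (dose_gy >= 30.0).mean() * 100, # % volume receiving >= 30 Gy
            }

        # Illustrative parotid-like dose distribution
        rng = np.random.default_rng(0)
        print(dvh_indices(rng.gamma(shape=4.0, scale=7.0, size=10_000)))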

  9. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident in Shift's ability to provide reference results for CASL benchmarking.

  10. A User's Manual for MASH V1.5 - A Monte Carlo Adjoint Shielding Code System

    Energy Technology Data Exchange (ETDEWEB)

    C. O. Slater; J. M. Barnes; J. O. Johnson; J.D. Drischler

    1998-10-01

    The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. The current version, MASH v1.5, is the successor to the original MASH v1.0 code system initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem.
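
    The coupling step, which folds the coupling-surface fluence with the adjoint dose importance, is in essence a weighted inner product over surface, energy, and angle bins; a schematic sketch (bin counts and array contents are assumptions, not MASH data structures) is:

        import numpy as np

        # Schematic fold of forward fluence with adjoint importance on a coupling
        # surface discretized in (surface patch, energy group, angle) bins.
        rng = np.random.default_rng(0)
        n_patch, n_group, n_angle = 24, 46, 20
        fluence = rng.random((n_patch, n_group, n_angle))     # forward (e.g. DORT)
        importance = rng.random((n_patch, n_group, n_angle))  # adjoint (e.g. MORSE)
        weights = np.full_like(fluence, 1.0)  # patch-area x quadrature weights

        dose_response = np.sum(weights * fluence * importance)
        print(dose_response)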

  11. Development and verification test methods for proof of hardware and software reliability of the novel LOKUS computer system - report on methods and results

    International Nuclear Information System (INIS)

    This report compiles practical experience with the qualification of the computer system LOKUS (local core monitoring system). In the Fast Breeder Reactor SNR-300, LOKUS is used to monitor the fuel element outlet temperatures with the aim of detecting cooling malfunctions before fuel pin damage occurs. The report covers the methods of hardware and software development, verification and testing, which were used to obtain the qualifying examination by independent experts. In chapter 4.5.3 the independent expert (RWTUeV Essen) reports on its own software verification and tests. (orig.)

  12. Experimentally validated Monte Carlo simulation of an XtRa-NaI(Tl) Compton Suppression System response.

    Science.gov (United States)

    Savva, Marilia; Anagnostakis, Marios

    2016-03-01

    In this work the response of an XtRa-NaI(Tl) Compton Suppression System is simulated using the Monte Carlo code PENELOPE. The main program PENMAIN is properly modified in order to couple two energy deposition detectors and simulate the coincidence gating. The modified main program takes into account both the active shielding and the True Coincidence phenomenon. The program is evaluated by comparing simulation results with experimental data for both non-cascade and cascade emitters and concluding that no statistically significant biases are observed. PMID:26656618
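
    The coincidence gating implemented in the modified main program amounts to vetoing histories in which both detectors deposit energy; a list-mode sketch of such anticoincidence logic (the event arrays and the 10 keV guard threshold are illustrative assumptions) is:

        import numpy as np

        # Anticoincidence (Compton suppression) sketch: keep an HPGe event only
        # if the NaI(Tl) guard deposits no energy above threshold in the same
        # history. All event data here are illustrative assumptions.
        rng = np.random.default_rng(0)
        n_events = 100_000
        e_hpge = rng.exponential(300.0, n_events)    # keV deposited in HPGe
        e_guard = rng.exponential(50.0, n_events) * (rng.random(n_events) < 0.4)

        accepted = e_hpge[e_guard < 10.0]            # suppressed spectrum
        print(f"fraction of events kept: {len(accepted) / n_events:.2f}")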

  13. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents the V and V guideline for the planning phase of the NPP safety system, in addition to critical safety items, for example, the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. This technical report covers the scope of the V and V guideline; the guideline framework as part of the acceptance criteria; V and V activities with task entrance and exit criteria; review and audit; testing; QA records of V and V material and configuration management; software verification and validation plan production; and the safety-critical software V and V methodology. (author). 11 refs

  14. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification in circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designers' confidence in the correctness of higher-level behavioral models.

  15. SIM-RIBRAS: A Monte-Carlo simulation package for RIBRAS system

    Energy Technology Data Exchange (ETDEWEB)

    Leistenschneider, E.; Lepine-Szily, A.; Lichtenthaeler, R. [Departamento de Fisica Nuclear, Instituto de Fisica, Universidade de Sao Paulo (Brazil)

    2013-05-06

    SIM-RIBRAS is a ROOT-based Monte Carlo simulation tool designed to help RIBRAS users with experiment planning and with the enhancement and characterization of experimental setups. It is divided into two main programs: CineRIBRAS, addressing beam kinematics, and SolFocus, addressing beam optics. SIM-RIBRAS replaces other methods and programs used in the past, providing more complete and accurate results while requiring much less manual labour. Moreover, the user can easily modify the codes, adapting them to the specific requirements of an experiment.

  16. On-sky performance during verification and commissioning of the Gemini Planet Imager's adaptive optics system

    CERN Document Server

    Poyneer, Lisa A; Macintosh, Bruce; Palmer, David W; Perrin, Marshall D; Sadakuni, Naru; Savransky, Dmitry; Bauman, Brian; Cardwell, Andrew; Chilcote, Jeffrey K; Dillon, Daren; Gavel, Donald; Goodsell, Stephen J; Hartung, Markus; Hibon, Pascale; Rantakyro, Fredrik T; Thomas, Sandrine; Veran, Jean-Pierre

    2014-01-01

    The Gemini Planet Imager instrument's adaptive optics (AO) subsystem was designed specifically to facilitate high-contrast imaging. It features several new technologies, including computationally efficient wavefront reconstruction with the Fourier transform, modal gain optimization every 8 seconds, and the spatially filtered wavefront sensor. It also uses a Linear-Quadratic-Gaussian (LQG) controller (aka Kalman filter) for both pointing and focus. We present on-sky performance results from verification and commissioning runs from December 2013 through May 2014. The efficient reconstruction and modal gain optimization are working as designed. The LQG controllers effectively notch out vibrations. The spatial filter can remove aliases, but we typically use it oversized by about 60% due to stability problems.

  17. Attribute verification systems with information barriers for classified forms of plutonium in the trilateral initiative

    International Nuclear Information System (INIS)

    A team of technical experts from the Russian Federation, the International Atomic Energy Agency (IAEA), and the United States has been working since December 1997 to develop a toolkit of instruments that could be used to verify plutonium-bearing items that have classified characteristics in nuclear weapons states. This suite of instruments is similar in many ways to standard safeguards equipment and includes high-resolution gamma-ray spectrometers, neutron multiplicity counters, gross neutron counters, and gross gamma-ray detectors. In safeguards applications, this equipment is known to be robust and authentication methods are well understood. However, this equipment is very intrusive, and a traditional safeguards application of such equipment for verification of materials with classified characteristics would reveal classified information to the inspector. Several enabling technologies have been or are being developed to facilitate the use of these trusted, but intrusive safeguards technologies. In this paper, these new technologies will be described. (author)

  18. Age and gender-invariant features of handwritten signatures for verification systems

    Science.gov (United States)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

    The handwritten signature is one of the most natural biometrics in the study of human physiological and behavioral patterns. As a behavioral biometric, signatures may differ with their owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments in this research were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal forgery feature idea, whereby a global classifier is able to classify a signature as genuine or as a forgery without actual knowledge of the signature template and its owner. Additionally, dimensionality reduction with the MRMR method is discussed.

  19. Monte-Carlo optimisation of a Compton suppression system for use with a broad-energy HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Britton, R., E-mail: r.britton@surrey.ac.uk [University of Surrey, Guildford GU2 7XH (United Kingdom); AWE, Aldermaston, Reading, Berkshire RG7 4PR (United Kingdom); Burnett, J.L.; Davies, A.V. [AWE, Aldermaston, Reading, Berkshire RG7 4PR (United Kingdom); Regan, P.H. [University of Surrey, Guildford GU2 7XH (United Kingdom)

    2014-10-21

    Monte-Carlo simulations are used to evaluate and optimise multiple components of a Compton Suppression System based upon a Broad-energy HPGe primary detector. Several materials for the secondary crystal are evaluated, including NaI(Tl), BGO and LaBr{sub 3}(Ce). BGO was found to be the most effective across the required energy range, and the size of the proposed veto detector was then optimised to extract the maximum performance for a given volume of material. Suppression factors are calculated for a range of nuclides (both single and cascade emitters), with improvements of a factor of 2 in the Compton suppression factors and a factor of 10 in the continuum reduction when compared to the Compton suppression system currently in use. This equates to a reduction in the continuum by up to a factor of ∼240 for radionuclides such as {sup 60}Co, which is crucial for the detection of low-energy, low-activity γ emitters typically swamped by such a continuum. -- Highlights: •Monte Carlo simulations utilised to design and optimise a Compton suppression system. •NaI(Tl), LaBr{sub 3}(Ce), and BGO materials are evaluated for their effectiveness as a veto. •A photon-tracking routine is developed to identify where photons typically scatter. •A 3-component BGO-based veto is optimised for use with a planar HPGe detector. •Continuum of {sup 60}Co reduced by ∼240 times, a 10-fold improvement on the existing design.

  20. Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    Directory of Open Access Journals (Sweden)

    L. Foresti

    2015-07-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e. the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides, in real time, 20-member ensemble precipitation nowcasts at 1 km and 5 min resolution up to 2 h lead time, using a composite of four C-band radars as input. In the context of the PLURISK project, STEPS forecasts were generated for use as input in sewer-system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for the future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm/h are reliable up to 60-90 min lead time, while those of exceeding 5.0 mm/h are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90% of the forecast errors.
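
    A heavily simplified sketch of one STEPS-like ensemble step, namely deterministic advection plus a stochastic growth/decay perturbation, is given below; uniform motion and white noise are simplifying assumptions, whereas the real system uses Lagrangian extrapolation and scale-dependent, spatially correlated perturbations:

        import numpy as np

        rng = np.random.default_rng(0)
        rain = rng.gamma(2.0, 1.5, size=(128, 128))   # stand-in radar rain-rate field
        n_members, dx, dy = 20, 3, 1                  # ensemble size; advection per step

        def nowcast_step(field, member_rng):
            """One member step: advect the field, then perturb it multiplicatively."""
            advected = np.roll(field, shift=(dy, dx), axis=(0, 1))
            perturbation = member_rng.normal(0.0, 0.25, field.shape)  # growth/decay
            return np.maximum(advected * np.exp(perturbation), 0.0)

        ensemble = [nowcast_step(rain, np.random.default_rng(m))
                    for m in range(n_members)]
        prob_exceed = np.mean([f > 0.5 for f in ensemble], axis=0)  # P(rain > 0.5 mm/h)
        print(prob_exceed.mean())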