WorldWideScience

Sample records for carlo verification system

  1. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to evaluate accurately the various quantities required in the backend field. For verification of the code system, analyses were performed using nuclide compositions measured, within the Actinide Research in a Nuclear Element (ARIANE) program, on fuel rods from fuel assemblies irradiated in a commercial BWR in the Netherlands. The code system developed in this work has been verified through these analyses of MOX and UO2 fuel rods. The system makes it possible to reduce the large margins assumed in present criticality analyses of LWR spent fuel. (J.P.N.)

  2. Monte Carlo systems used for treatment planning and dose verification

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)

    2017-04-15

    General-purpose radiation transport Monte Carlo codes have been used to estimate the absorbed dose distribution in external photon and electron beam radiotherapy patients for several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation times required. To bring radiation transport Monte Carlo codes into routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)

  3. A preliminary study of in-house Monte Carlo simulations: an integrated Monte Carlo verification system.

    Science.gov (United States)

    Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hideki; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki

    2009-10-01

    To develop an infrastructure for an integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. The MCVS provides a graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS uses the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. The phase-space data of a 6-MV photon beam from a Varian Clinac unit were generated and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display radiotherapy treatment plans created by the MC method and by various treatment planning systems, in formats such as RTOG and DICOM-RT. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time improved in line with the increase in the number of central processing units (CPUs), at a computation efficiency of more than 98%. Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.
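
    The quoted computation efficiency has a standard definition: speedup (single-CPU time divided by parallel time) divided by the number of CPUs. A minimal Python sketch of that arithmetic, with hypothetical timings rather than the authors' measurements:

```python
# Parallel speedup and efficiency, as commonly defined:
#   speedup(N)    = T(1) / T(N)
#   efficiency(N) = speedup(N) / N
# The timings below are hypothetical, for illustration only.

def efficiency(t_serial: float, t_parallel: float, n_cpus: int) -> float:
    """Parallel efficiency of a run that used n_cpus processors."""
    return (t_serial / t_parallel) / n_cpus

# Example: a 3600 s single-CPU simulation finishing in 115 s on 32 CPUs.
print(f"{efficiency(3600.0, 115.0, 32):.1%}")  # -> 97.8%
```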

  4. Experimental verification of lung dose with radiochromic film: comparison with Monte Carlo simulations and commercially available treatment planning systems

    Science.gov (United States)

    Paelinck, L.; Reynaert, N.; Thierens, H.; DeNeve, W.; DeWagter, C.

    2005-05-01

    The purpose of this study was to assess the absorbed dose in and around lung tissue by performing radiochromic film measurements, Monte Carlo simulations and calculations with superposition convolution algorithms. We considered a layered polystyrene phantom of 12 × 12 × 12 cm³ containing a central cavity of 6 × 6 × 6 cm³ filled with Gammex RMI lung-equivalent material. Two field configurations were investigated, a small 1 × 10 cm² field and a larger 10 × 10 cm² field. First, we performed Monte Carlo simulations to investigate the influence of radiochromic film itself on the measured dose distribution when the film intersects a lung-equivalent region and is oriented parallel to the central beam axis. To that end, the film and the lung-equivalent materials were modelled in detail, taking into account their specific composition. Next, measurements were performed with the film oriented both parallel and perpendicular to the central beam axis to verify the results of our Monte Carlo simulations. Finally, we digitized the phantom in two commercially available treatment planning systems, Helax-TMS version 6.1A and Pinnacle version 6.2b, and calculated the absorbed dose in the phantom with their incorporated superposition convolution algorithms to compare with the Monte Carlo simulations. Comparing Monte Carlo simulations with measurements reveals that radiochromic film is a reliable dosimeter in and around lung-equivalent regions when the film is positioned perpendicular to the central beam axis. Radiochromic film is also able to predict the absorbed dose accurately when the film is positioned parallel to the central beam axis through the lung-equivalent region. However, attention must be paid when the film is not positioned along the central beam axis, in which case the film gradually attenuates the beam and decreases the dose measured behind the cavity. This underdosage disappears by offsetting the film a few centimetres. We find deviations of about 3.6% between

  5. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  6. Simulation of digital pixel readout chip architectures with the RD53 SystemVerilog-UVM verification environment using Monte Carlo physics data

    International Nuclear Information System (INIS)

    Conti, E.; Marconi, S.; Christiansen, J.; Placidi, P.; Hemperek, T.

    2016-01-01

    The simulation and verification framework developed by the RD53 collaboration is a powerful tool for global architecture optimization and design verification of next generation hybrid pixel readout chips. In this paper the framework is used for studying digital pixel chip architectures at behavioral level. This is carried out by simulating a dedicated, highly parameterized pixel chip description, which makes it possible to investigate different grouping strategies between pixels and different latency buffering and arbitration schemes. The pixel hit information used as simulation input can be either generated internally in the framework or imported from external Monte Carlo detector simulation data. The latter have been provided by both the CMS and ATLAS experiments, featuring HL-LHC operating conditions and the specifications related to the Phase 2 upgrade. Pixel regions and double columns were simulated using such Monte Carlo data as inputs: the performance of different latency buffering architectures was compared and the compliance of different link speeds with the expected column data rate was verified.
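
    The buffering question studied here can be reduced to a small simulation: how often does a pixel-region buffer of finite depth lose a hit during the trigger latency? The Python sketch below illustrates the idea with a crude Bernoulli arrival model in place of real Monte Carlo detector data; all rates and depths are invented, not RD53 parameters.

```python
import random

# Toy Monte Carlo of a shared latency buffer in a pixel region: hits
# arrive randomly and must be stored until the trigger decision.
HIT_PROB_PER_BC = 0.02   # chance of a hit per bunch crossing (BC); invented
LATENCY_BC = 500         # trigger latency in bunch crossings; invented
BUFFER_DEPTH = 16        # hits the region can hold at once; invented
SIM_LENGTH_BC = 2_000    # bunch crossings simulated per trial
N_TRIALS = 2_000

def trial_loses_a_hit() -> bool:
    buffer = []  # arrival time (BC) of each stored hit
    for bc in range(SIM_LENGTH_BC):
        # hits older than the latency have been read out or discarded
        buffer = [t for t in buffer if bc - t < LATENCY_BC]
        if random.random() < HIT_PROB_PER_BC:
            if len(buffer) >= BUFFER_DEPTH:
                return True          # buffer full: the hit is lost
            buffer.append(bc)
    return False

loss_rate = sum(trial_loses_a_hit() for _ in range(N_TRIALS)) / N_TRIALS
print(f"fraction of trials losing at least one hit: {loss_rate:.2%}")
```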

  7. Monte Carlo calculations supporting patient plan verification in proton therapy

    Directory of Open Access Journals (Sweden)

    Thiago Viana Miranda Lima

    2016-03-01

    Patient treatment plan verification consumes a substantial amount of quality assurance (QA) resources; this is especially true for Intensity Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations to support QA has been widely discussed, and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed previously published data (Molinelli et al. 2013) for 9 patient plans in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modelling (treatment planning system (TPS) vs MC), to limitations of the dose delivery system, or to detector mispositioning was originally explored, but other factors, such as the geometric description of the detectors, were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with the results of different MC simulations. We also studied some physical effects introduced by this new approach, for example inter-detector interference and the delta-ray threshold. Simulations accounting for a detailed detector geometry are typically superior (statistical difference: p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed on reducing the current delta-ray threshold (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, we observed that a detailed geometrical description improves the agreement between measurement and MC calculation in some cases, whereas in other cases positioning uncertainty remains the dominant uncertainty. Inter-chamber disturbance was not detected at therapeutic proton energies, and the results from the current delta-ray threshold are
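
    The comparison quoted above rests on significance tests of paired deviations (measurement versus each simulation variant). A minimal sketch of such a test using scipy, with placeholder numbers rather than the CNAO data:

```python
from scipy import stats

# Per-chamber absolute dose deviations (%) of two calculation approaches
# from the same measurements. Values are placeholders for illustration.
dev_detailed_geometry = [1.1, 0.8, 1.4, 0.9, 1.2, 1.0, 0.7, 1.3, 1.1]
dev_simple_geometry   = [2.0, 1.6, 2.4, 1.7, 2.1, 1.9, 1.5, 2.2, 2.0]

# Paired t-test: do the two approaches differ systematically?
t_stat, p_value = stats.ttest_rel(dev_detailed_geometry, dev_simple_geometry)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```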

  8. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and a cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.

  9. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. On one hand, we wish to verify the presence of correct amounts of nuclear materials that are in storage or in process; on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas; it is currently in the test and evaluation phase at the Idaho National Engineering Laboratory. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help define the target area of an inventory when change has been shown to occur

  10. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  11. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    International Nuclear Information System (INIS)

    Kostyuchenko, V.I.; Makarova, A.S.; Ryazantsev, O.B.; Samarin, S.I.; Uglov, A.S.

    2013-01-01

    Proton interaction with the material of an exposed object needs to be modeled taking into account three basic processes: electromagnetic stopping of protons in matter, multiple Coulomb scattering, and nuclear interactions. The last of these processes is the topic of this paper. Monte Carlo codes are often used to simulate high-energy particle interactions with matter. However, the nuclear interaction models implemented in these codes are rather extensive, and their use in treatment planning systems requires huge computational resources. We selected the IThMC code for its ability to reproduce experiments that measure the distribution of the projected ranges of nuclear secondary particles generated by proton beams in a multi-layer Faraday cup. Multi-layer Faraday cup detectors measure charge rather than dose and allow distinguishing between electromagnetic and nuclear interactions. The event generator used in the IThMC code is faster, but less accurate, than any other tested. Our model of nuclear reactions shows quite good agreement with experiment in terms of its effect on the Bragg peak in therapeutic applications

  12. Independent dose verification system with Monte Carlo simulations using TOPAS for passive scattering proton therapy at the National Cancer Center in Korea

    Science.gov (United States)

    Shin, Wook-Geun; Testa, Mauro; Kim, Hak Soo; Jeong, Jong Hwi; Lee, Se Byeong; Kim, Yeon-Joo; Min, Chul Hee

    2017-10-01

    For the independent validation of treatment plans, we developed a fully automated Monte Carlo (MC)-based patient dose calculation system using the tool for particle simulation (TOPAS) and the proton therapy machine installed at the National Cancer Center in Korea, enabling routine, automatic dose recalculation for each patient. The proton beam nozzle was modeled with TOPAS to simulate the therapeutic beam, and MC commissioning was performed by comparing percent depth dose curves with measurements. The beam set-up based on the prescribed beam range and modulation width was automated by modifying the vendor-specific method. The CT phantom was modeled from the DICOM CT files with a TOPAS built-in function, and an in-house-developed C++ code directly imports the CT files for positioning the CT phantom, the RT-plan file for simulating the treatment plan, and the RT-structure file for applying the Hounsfield unit (HU) assignment. The developed system was validated by comparing its dose distributions with those calculated by the treatment planning system (TPS) for a lung phantom and two patient cases (abdomen and internal mammary node). The results of the beam commissioning agreed to within 0.8 mm for the B8 option in both the beam range and the modulation width of the spread-out Bragg peaks. The beam set-up technique can predict the range and modulation width with an accuracy of 0.06% and 0.51%, respectively, with respect to the prescribed range and modulation at arbitrary points of the B5 option (ranges of 128.3, 132.0, and 141.2 mm). The dose distributions showed higher than 99% passing rates for the 3D gamma index (3 mm distance to agreement and 3% dose difference) between the MC simulations and the clinical TPS in the target volume. However, in normal tissues, less favorable agreement was obtained for the lung phantom and the internal mammary node case. The discrepancies might come from the
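
    The 3D gamma index used for this validation has a standard definition: a reference point passes if some nearby evaluated dose agrees within the combined dose-difference/distance-to-agreement ellipsoid. A simplified 1D global-gamma sketch in Python (an illustration of the metric, not the authors' implementation):

```python
import numpy as np

def gamma_1d(ref, ref_x, evl, evl_x, dd=0.03, dta=3.0):
    """Simplified global 1D gamma index.
    ref, evl    : dose arrays; ref_x, evl_x : positions in mm.
    dd          : dose-difference criterion, fraction of the global max.
    dta         : distance-to-agreement criterion in mm."""
    d_max = ref.max()
    gammas = np.empty_like(ref, dtype=float)
    for i, (d_r, x_r) in enumerate(zip(ref, ref_x)):
        dose_term = (evl - d_r) / (dd * d_max)
        dist_term = (evl_x - x_r) / dta
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

# Hypothetical profiles: a 1 mm shift and +1% dose scaling should pass
# comfortably under the 3%/3 mm criteria (gamma <= 1 at every point).
x = np.linspace(-50.0, 50.0, 201)
ref = np.exp(-((x / 30.0) ** 2))
evl = 1.01 * np.exp(-(((x - 1.0) / 30.0) ** 2))
g = gamma_1d(ref, x, evl, x)
print(f"gamma passing rate: {(g <= 1.0).mean():.1%}")
```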

  13. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  14. RapidArc treatment verification in 3D using polymer gel dosimetry and Monte Carlo simulation

    DEFF Research Database (Denmark)

    Ceberg, Sofie; Gagne, Isabel; Gustafsson, Helen

    2010-01-01

    The aim of this study was to verify the advanced inhomogeneous dose distribution produced by a volumetric arc therapy technique (RapidArc™) using 3D gel measurements and Monte Carlo (MC) simulations. The TPS (treatment planning system)-calculated dose distribution was compared with gel measurements… and MC simulations, thus investigating any discrepancy between the planned dose delivery and the actual delivery. Additionally, the reproducibility of the delivery was investigated using repeated gel measurements. A prostate treatment plan was delivered to a 1.3 liter nPAG gel phantom using one single… surface (VOI90), for the TPS versus gel and TPS versus MC. The differences between the verification methods, MC versus gel, and between two repeated gel measurements were investigated in the same way. For all volume comparisons, the mean value was within 1% and the standard deviation of the differences…

  15. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables… to the high complexity of both the dynamical system and the specification. Therefore, there is a need for methods capable of verifying complex specifications of complex systems. The verification of high dimensional continuous dynamical systems is the key to verifying general systems. In this thesis…, an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow...
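
    The soundness condition at the heart of this abstraction, a subdivisioning function that decreases along the vector field, can be checked symbolically through its Lie derivative. A small sympy sketch on a hypothetical two-dimensional system (not an example taken from the thesis):

```python
import sympy as sp

# Check that a candidate subdivisioning function decreases along the
# flow of a vector field by computing its Lie derivative L_f V.
x, y = sp.symbols('x y', real=True)
fx, fy = -x + y, -x - y      # hypothetical vector field dx/dt, dy/dt
V = x**2 + y**2              # candidate subdivisioning function

lie_derivative = sp.simplify(V.diff(x) * fx + V.diff(y) * fy)
print(lie_derivative)        # -2*x**2 - 2*y**2: nonpositive everywhere,
                             # so V never increases along trajectories
```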

  16. DEVELOPMENT OF A MULTIMODAL MONTE CARLO BASED TREATMENT PLANNING SYSTEM.

    Science.gov (United States)

    Kumada, Hiroaki; Takada, Kenta; Sakurai, Yoshinori; Suzuki, Minoru; Takata, Takushi; Sakurai, Hideyuki; Matsumura, Akira; Sakae, Takeji

    2017-10-26

    To establish boron neutron capture therapy (BNCT), the University of Tsukuba is developing a treatment device and the peripheral devices required in BNCT, such as a treatment planning system. We are developing a new multimodal Monte Carlo based treatment planning system (development code name: Tsukuba Plan). Tsukuba Plan allows for dose estimation in proton therapy, X-ray therapy and heavy ion therapy in addition to BNCT because the system employs PHITS as the Monte Carlo dose calculation engine. Regarding BNCT, several verifications of the system are being carried out for its practical usage. The verification results demonstrate that Tsukuba Plan allows for accurate estimation of thermal neutron flux and gamma-ray dose as fundamental radiations of dosimetry in BNCT. In addition to the practical use of Tsukuba Plan in BNCT, we are investigating its application to other radiation therapies.

  17. Enumeration Verification System (EVS)

    Data.gov (United States)

    Social Security Administration — EVS is a batch application that processes for federal, state, local and foreign government agencies, private companies and internal SSA customers and systems. Each...

  18. Vehicle usage verification system

    NARCIS (Netherlands)

    Scanlon, W.G.; McQuiston, Jonathan; Cotton, Simon L.

    2012-01-01

    A computer-implemented system for verifying vehicle usage comprising a server capable of communication with a plurality of clients across a communications network. Each client is provided in a respective vehicle and with a respective global positioning system (GPS) by which the client can

  19. Central Verification System

    Data.gov (United States)

    US Agency for International Development — CVS is a system managed by OPM that is designed to be the primary tool for verifying whether or not there is an existing investigation on a person seeking security...

  20. Moments of spectral functions: Monte Carlo evaluation and verification.

    Science.gov (United States)

    Predescu, Cristian

    2005-11-01

    The subject of the present study is the Monte Carlo path-integral evaluation of the moments of spectral functions. Such moments can be computed by formal differentiation of certain estimating functionals that are infinitely differentiable against time whenever the potential function is arbitrarily smooth. Here, I demonstrate that the numerical differentiation of the estimating functionals can be more successfully implemented by means of pseudospectral methods (e.g., exact differentiation of a Chebyshev polynomial interpolant), which utilize information from the entire interval. The algorithmic detail that leads to robust numerical approximations is the fact that the path-integral action, and not the actual estimating functional, is interpolated. Although the resulting approximation to the estimating functional is nonlinear, the derivatives can be computed from it in a fast and stable way by contour integration in the complex plane, with the help of the Cauchy integral formula (e.g., by Lyness' method). An interesting aspect of the present development is that Hamburger's conditions for a finite sequence of numbers to be a moment sequence provide the necessary and sufficient criteria for the computed data to be compatible with the existence of an inversion algorithm. Finally, the issue of the appearance of the sign problem in the computation of moments, albeit in a milder form than for other quantities, is addressed.
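
    The contour-integration step can be made concrete in a few lines: by the Cauchy integral formula, sampling an analytic function on a circle around the expansion point and taking an FFT yields all derivatives at once (the idea behind Lyness' method). A numpy sketch on a known analytic function, not on the paper's estimating functionals:

```python
import numpy as np
from math import factorial

def derivatives_via_contour(f, a=0.0, r=0.5, n_deriv=6, n_pts=64):
    """Approximate f^(k)(a), k = 0..n_deriv, by discretizing the Cauchy
    integral formula on a circle of radius r around a. f must be
    analytic on and inside the contour; the trapezoidal sum over the
    circle becomes a plain FFT of the samples."""
    theta = 2.0 * np.pi * np.arange(n_pts) / n_pts
    samples = f(a + r * np.exp(1j * theta))
    coeffs = np.fft.fft(samples) / n_pts          # = f^(k)(a) r^k / k!
    ks = np.arange(n_deriv + 1)
    facts = np.array([factorial(int(k)) for k in ks], dtype=float)
    return facts * coeffs[: n_deriv + 1] / r**ks

# Sanity check with f(t) = exp(2t): the k-th derivative at 0 is 2^k.
print(np.real_if_close(derivatives_via_contour(lambda z: np.exp(2.0 * z))))
```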

  1. A correlation-based fingerprint verification system

    NARCIS (Netherlands)

    Bazen, A.M.; Gerez, Sabih H.; Veelenturf, L.P.J.; van der Zwaag, B.J.; Verwaaijen, G.T.B.

    2000-01-01

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates
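
    The core operation of such a system is matching on gray-scale values rather than minutiae. A toy numpy sketch of zero-mean normalized cross-correlation with an exhaustive template search, illustrating the principle only (not the authors' template selection or matching pipeline):

```python
import numpy as np

def ncc(window: np.ndarray, template: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation of two equal-sized
    gray-scale patches; 1.0 indicates a perfect linear match."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return float((w * t).sum() / denom) if denom else 0.0

def best_match(image: np.ndarray, template: np.ndarray):
    """Slide the template over the image; return (offset, best score)."""
    th, tw = template.shape
    scores = {
        (r, c): ncc(image[r:r + th, c:c + tw], template)
        for r in range(image.shape[0] - th + 1)
        for c in range(image.shape[1] - tw + 1)
    }
    offset = max(scores, key=scores.get)
    return offset, scores[offset]

# A patch cut from the image must match itself exactly at its origin.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
print(best_match(img, img[8:16, 8:16]))   # ((8, 8), 1.0)
```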

  2. Application of a Monte Carlo linac model in routine verifications of dose calculations

    International Nuclear Information System (INIS)

    Linares Rosales, H. M.; Alfonso Laguardia, R.; Lara Mas, E.; Popescu, T.

    2015-01-01

    The analysis of several parameters of interest in radiotherapy medical physics, based on an experimentally validated Monte Carlo model of an Elekta Precise linear accelerator, was performed for 6 and 15 MV photon beams. The simulations were performed using the EGSnrc code. The optimal beam parameter values (energy and FWHM) obtained previously were used as the reference for the simulations. Dose calculations were performed in water phantoms for the complex geometries typically used in acceptance and quality control tests, such as irregular and asymmetric fields. Parameters such as MLC scatter, maximum opening or closing position, and the separation between leaves were analyzed from the calculations in water. Simulations were likewise performed on phantoms obtained from CT studies of real patients, comparing the dose distributions calculated with EGSnrc against those obtained from the computerized treatment planning systems (TPS) used for routine clinical plans. All results showed good agreement with measurements, all within tolerance limits. These results support using the developed model as a robust verification tool for validating calculations in very complex situations, where the accuracy of the available TPS could be questionable. (Author)

  3. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> over-interpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  4. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  5. Formal System Verification for Trustworthy Embedded Systems

    Science.gov (United States)

    2011-04-19

    step is for the first time formal and machine-checked. Contemporary OS verification projects include Verisoft, Verisoft XT, and Verve. The Verisoft...tens of thousands of lines of code. The Verve kernel [22] shows that type and memory safety properties can be established on the assembly level via type...systems and therefore with much lower cost. Verve contains a formally verified runtime system, in particular a garbage collector that the type system

  6. [Verification of the VEF photon beam model for dose calculations by the Voxel-Monte-Carlo-Algorithm].

    Science.gov (United States)

    Kriesen, Stephan; Fippel, Matthias

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tübingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning.

  7. Verification of the VEF photon beam model for dose calculations by the voxel-Monte-Carlo-algorithm

    International Nuclear Information System (INIS)

    Kriesen, S.; Fippel, M.

    2005-01-01

    The VEF linac head model (VEF, virtual energy fluence) was developed at the University of Tübingen to determine the primary fluence for calculations of dose distributions in patients by the Voxel-Monte-Carlo-Algorithm (XVMC). This analytical model can be fitted to any therapy accelerator head by measuring only a few basic dose data; therefore, time-consuming Monte-Carlo simulations of the linac head become unnecessary. The aim of the present study was the verification of the VEF model by means of water-phantom measurements, as well as the comparison of this system with a common analytical linac head model of a commercial planning system (TMS, formerly HELAX or MDS Nordion, respectively). The results show that both the VEF and the TMS models can very well simulate the primary fluence. However, the VEF model proved superior in the simulations of scattered radiation and in the calculations of strongly irregular MLC fields. Thus, an accurate and clinically practicable tool for the determination of the primary fluence for Monte-Carlo-Simulations with photons was established, especially for the use in IMRT planning. (orig.)

  8. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  9. FLEXible Damage Detection and Verification System

    Data.gov (United States)

    National Aeronautics and Space Administration — This project expands on the previously demonstrated Flat Surface Damage Detection System (FSDDS) capabilities.  The Flexible Damage Detection and Verification System...

  10. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.
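
    A classic algorithm of this family is the interactive convergence algorithm of Lamport and Melliar-Smith, in which each clock adjusts by the "egocentric mean" of its differences to the other clocks, with readings that differ too much replaced by zero. A toy single round in Python, with invented constants; this is an illustration of the idea, not the verified EHDM formalization:

```python
DELTA = 10.0   # largest clock difference accepted from a good clock

def egocentric_mean(own: float, readings: list[float]) -> float:
    """New value for one clock, given the readings of all clocks
    (its own included); outliers beyond DELTA count as difference 0."""
    diffs = [(r - own) if abs(r - own) <= DELTA else 0.0 for r in readings]
    return own + sum(diffs) / len(diffs)

clocks = [100.0, 103.0, 98.0, 250.0]   # the fourth clock is faulty
print([egocentric_mean(c, clocks) for c in clocks[:3]])
# -> [100.25, 101.0, 99.75]: the good clocks draw closer together
# (spread shrinks from 5.0 to 1.25) despite the faulty reading.
```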

  11. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  12. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare… these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  13. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    Science.gov (United States)

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation, and the potential mismatch between calculated control points and the detection grid in the verification process, are discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH in the patient CT is discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  14. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  15. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    based model checking style of verification. The next paper by D'Souza & Thiagarajan presents an automata-theoretic approach to analysing timing properties of systems. The last paper by Mohalik and Ramanujam presents the assumption.

  16. The PASCAL-HDM Verification System

    Science.gov (United States)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments carry empty parentheses.
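
    As a reminder of how verification conditions arise from Hoare rules (a generic textbook example, not one drawn from the manual): for an annotated assignment, the assignment axiom reduces correctness to a logical implication that a theorem prover such as STP can then discharge.

```latex
% Assignment axiom:      \{P[E/x]\}\; x := E \;\{P\}
% Annotated fragment:    \{x \ge 0\}\; x := x + 1 \;\{x > 0\}
% Generated verification condition:
(x \ge 0) \;\rightarrow\; (x + 1 > 0)
```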

  17. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
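
    The allocation step can be illustrated with a small numerical sketch: give each subsystem a probability that a non-conformity escapes detection, take the system-level verification loss to be the chance that at least one escapes, and rank subsystems by the marginal risk reduction an examination buys. All names and probabilities below are invented; this is the spirit of the method, not the paper's risk function.

```python
# name: (p(miss) if unexamined, p(miss) if examined) -- hypothetical
SUBSYSTEMS = {
    "ballast_control":     (0.020, 0.004),
    "power_management":    (0.050, 0.010),
    "dynamic_positioning": (0.080, 0.008),
}

def system_risk(p_miss: dict) -> float:
    """Probability that at least one non-conformity goes undetected."""
    all_ok = 1.0
    for p in p_miss.values():
        all_ok *= 1.0 - p
    return 1.0 - all_ok

baseline = {name: pu for name, (pu, _) in SUBSYSTEMS.items()}
print(f"risk with no examinations: {system_risk(baseline):.4f}")

# Marginal risk reduction from examining each subsystem on its own.
gains = {}
for name, (pu, pe) in SUBSYSTEMS.items():
    trial = dict(baseline)
    trial[name] = pe
    gains[name] = system_risk(baseline) - system_risk(trial)

for name, gain in sorted(gains.items(), key=lambda kv: kv[1], reverse=True):
    print(f"examine {name:20s} -> risk drops by {gain:.4f}")
```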

  18. Location Verification Systems Under Spatially Correlated Shadowing

    OpenAIRE

    Yan, Shihao; Nevat, Ido; Peters, Gareth W.; Malaney, Robert

    2014-01-01

    The verification of the location information utilized in wireless communication networks is a subject of growing importance. In this work we formally analyze, for the first time, the performance of a wireless Location Verification System (LVS) under the realistic setting of spatially correlated shadowing. Our analysis illustrates that anticipated levels of correlated shadowing can lead to a dramatic performance improvement of a Received Signal Strength (RSS)-based LVS. We also analyze the per...
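
    The essence of an RSS-based LVS can be sketched briefly: compare the observed received signal strengths against those predicted by a path-loss model at the claimed position, and accept the claim only if the residuals are plausible for the assumed shadowing. The toy below assumes independent log-normal shadowing and invented parameters; the paper's contribution is precisely the analysis under spatially correlated shadowing, which this sketch does not model.

```python
import numpy as np

PL0, GAMMA, SIGMA = -40.0, 3.0, 6.0   # dB at 1 m, path-loss exp., shadow std

def predicted_rss(pos, anchors):
    """Log-distance path-loss prediction at each anchor (no shadowing)."""
    d = np.linalg.norm(anchors - pos, axis=1)
    return PL0 - 10.0 * GAMMA * np.log10(d)

def verify(observed, claimed_pos, anchors, threshold=2.0):
    """Accept if the mean squared residual, in shadowing units, is small."""
    resid = (observed - predicted_rss(claimed_pos, anchors)) / SIGMA
    return bool(np.mean(resid**2) < threshold)

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
true_pos = np.array([30.0, 40.0])
rng = np.random.default_rng(1)
obs = predicted_rss(true_pos, anchors) + rng.normal(0.0, SIGMA, 3)

print(verify(obs, true_pos, anchors))                # honest claim: likely True
print(verify(obs, np.array([90.0, 10.0]), anchors))  # spoofed claim: likely False
```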

  19. Verification of Monte Carlo calculations of the neutron flux in typical irradiation channels of the TRIGA reactor, Ljubljana

    NARCIS (Netherlands)

    Jacimovic, R; Maucec, M; Trkov, A

    2003-01-01

    An experimental verification of Monte Carlo neutron flux calculations in typical irradiation channels in the TRIGA Mark II reactor at the Jozef Stefan Institute is presented. It was found that the flux, as well as its spectral characteristics, depends rather strongly on the position of the

  1. Quantum Monte Carlo approaches for correlated systems

    CERN Document Server

    Becca, Federico

    2017-01-01

    Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant to applications in correlated systems. It provides a clear overview of variational wave functions and features a detailed presentation of stochastic samplings, including Markov chains and Langevin dynamics, which are developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...

  2. Temporal logic runtime verification of dynamic systems

    CSIR Research Space (South Africa)

    Seotsanyana, M

    2010-07-01

    Robotic computer systems are increasingly ubiquitous in everyday life and this has led to a need to develop safe and reliable systems. To ensure the safety and reliability of these systems, the following three main verification techniques are usually...

  3. EDITORIAL: International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification

    Science.gov (United States)

    Verhaegen, Frank; Seuntjens, Jan

    2008-03-01

    Monte Carlo particle transport techniques offer exciting tools for radiotherapy research, where they play an increasingly important role. Topics of research related to clinical applications range from treatment planning, motion and registration studies, brachytherapy, verification imaging and dosimetry. The International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification took place in a hotel in Montreal in French Canada, from 29 May-1 June 2007, and was the third workshop to be held on a related topic, which now seems to have become a tri-annual event. About one hundred workers from many different countries participated in the four-day meeting. Seventeen experts in the field were invited to review topics and present their latest work. About half of the audience was made up of young graduate students. In a very full program, 57 papers were presented and 10 posters were on display during most of the meeting. On the evening of the third day a boat trip around the island of Montreal allowed participants to enjoy the city views, and to sample the local cuisine. The topics covered at the workshop included the latest developments in the most popular Monte Carlo transport algorithms, fast Monte Carlo, statistical issues, source modeling, MC treatment planning, modeling of imaging devices for treatment verification, registration and deformation of images, and a sizeable number of contributions on brachytherapy. In this volume you will find 27 short papers resulting from the workshop on a variety of topics, some of them on very new developments such as graphics processing units for fast computing, PET modeling, dual-energy CT, calculations in dynamic phantoms, and tomotherapy devices. We acknowledge the financial support of the National Cancer Institute of Canada, the Institute of Cancer Research of the Canadian Institutes of Health Research, the Association Québécoise des Physicien(ne)s Médicaux Clinique, the Institute of Physics, and Medical

  4. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  5. Runtime Verification for Decentralised and Distributed Systems

    NARCIS (Netherlands)

    Francalanza, Adrian; Pérez, Jorge A.; Sánchez, César; Bartocci, Ezio; Falcone, Yliès

    This chapter surveys runtime verification research related to distributed systems. We report solutions that study how to monitor systems with some distributed characteristic, solutions that use a distributed platform for performing a monitoring task, and foundational works that present semantics for

  6. Compositional Verification of Multi-Station Interlocking Systems

    DEFF Research Database (Denmark)

    Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth

    2016-01-01

    Because interlocking systems are highly safety-critical complex systems, their automated safety verification is an active research topic investigated by several groups, employing verification techniques to produce important cost and time savings in their certification. However, such systems also...

  7. High-level verification of system designs

    OpenAIRE

    Kundu, Sudipta

    2009-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. The growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high- level languages is to enable verification at a higher level of abstraction, allowing early exploration of system -level designs, the focus so far has ...

  8. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects. PMID:26217586

  9. Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling.

    Science.gov (United States)

    Kraan, Aafke Christine

    2015-01-01

    Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.

  10. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article. Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  11. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Vol. 18, No. 6 (2012), pp. 572-587. ISSN 0947-3580. R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020. Institutional research plan: CEZ:AV0Z10300504. Keywords: model checking; hybrid systems; formal verification. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.250, year: 2012

  12. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work began to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single-shell tanks, i.e., the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for each drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by ''smart'' drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings.

  13. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo experiment; it also encourages the proper design of Monte Carlo experiments, and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab, or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or Latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language.
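
    Although the paper's functional language compiles to Fortran, the core idea, enumerating an experimental design and running Monte Carlo replicates at each design point, can be sketched in Python; the factor names and toy response below are illustrative, not part of the original system.

        import itertools
        import random

        def full_factorial(factors):
            """Enumerate every combination of factor levels (a full factorial design)."""
            names = list(factors)
            for levels in itertools.product(*factors.values()):
                yield dict(zip(names, levels))

        def run_replicate(point, rng):
            """Toy Monte Carlo response: sample mean of n draws from N(mu, sigma)."""
            n, mu, sigma = point["sample_size"], point["mu"], point["sigma"]
            return sum(rng.gauss(mu, sigma) for _ in range(n)) / n

        factors = {"sample_size": [10, 100], "mu": [0.0, 1.0], "sigma": [1.0, 2.0]}
        rng = random.Random(42)
        for point in full_factorial(factors):
            estimates = [run_replicate(point, rng) for _ in range(5)]  # 5 replicates
            print(point, "->", round(sum(estimates) / len(estimates), 3))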

  14. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological.

  15. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  16. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  17. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, model checking is not yet common in industry, as it typically needs formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  18. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
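
    The checking step behind such a methodology can be illustrated, independently of PLCverif and the IEC 61131-3 translation, with a minimal explicit-state safety check. The toy interlock below (invented states and transitions, not CERN's tooling) verifies the invariant "pump on implies valve open".

        from collections import deque

        def check_invariant(initial, successors, invariant):
            """Explicit-state reachability: explore every state reachable from
            `initial` and return the first one violating `invariant`, if any."""
            seen, frontier = {initial}, deque([initial])
            while frontier:
                state = frontier.popleft()
                if not invariant(state):
                    return state                    # counterexample found
                for nxt in successors(state):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return None                             # invariant holds in all states

        def successors(state):
            """Toy interlock: state = (valve_open, pump_on)."""
            valve, pump = state
            yield (True, pump)      # command: open valve
            yield (False, False)    # command: close valve (interlock trips the pump)
            yield (valve, valve)    # command: start pump (succeeds only if valve open)
            yield (valve, False)    # command: stop pump

        bad = check_invariant((False, False), successors,
                              invariant=lambda s: s[0] or not s[1])  # pump => valve
        print("invariant holds" if bad is None else f"violated in {bad}")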

  19. Parametric Verification of Weighted Systems

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Hansen, Mikkel; Mariegaard, Anders

    2015-01-01

    This paper addresses the problem of parametric model checking for weighted transition systems. We consider transition systems labelled with linear equations over a set of parameters and we use them to provide semantics for a parametric version of weighted CTL where the until and next operators are themselves indexed with linear equations. The parameters change the model-checking problem into a problem of computing a linear system of inequalities that characterizes the parameters that guarantee satisfiability. To address this problem, we use parametric dependency graphs (PDGs) and we propose ...

  20. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    Science.gov (United States)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  1. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems which intermix between two components, discrete components and continuous components. The continuous components are usually called plants, subject to disturbances which cause the state variables of the systems changing continuously by physical laws and/or by the control laws. The discrete components can be digital computers, sensor and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, many papers concerning hybrid system have been published. This paper gives a design methodology for hybrid systems as an example to the specification and verification of hybrid systems. The design methodology is based on the cooperation between two disciplines, control engineering and computer science. The methodology brings into the design of control loops and decision loops. The external behavior of control loops are specified in a notation which is understandable by the two disciplines. The design of control loops which employed theory of differential equation is done by control engineers, and its correctness is also guaranteed analytically or experimentally by control engineers. The decision loops are designed in computing science based on the specifications of control loops. The verification of systems requirements can be done by computing scientists using a formal reasoning mechanism. For illustrating the proposed design, a problem of balancing an inverted pendulum which is a popular experiment device in control theory is considered, and the Mean Value Calculus is chosen as a formal notation for specifying the control loops and designing the decision loops

  2. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, Michael W.; Bunting, Marcus; Payne, Arthur C. Jr.; Trost, Lawrence C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
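
    The report does not give IVSEM's internals, but one simple fusion rule for integrated detection effectiveness, assuming independent subsystems, can be sketched as follows; the per-technology probabilities are invented for illustration.

        def integrated_detection_probability(subsystem_probs):
            """P(at least one subsystem detects), assuming independent subsystems."""
            p_miss = 1.0
            for p in subsystem_probs.values():
                p_miss *= 1.0 - p
            return 1.0 - p_miss

        # invented per-technology detection probabilities for one event scenario
        probs = {"seismic": 0.80, "infrasound": 0.35,
                 "radionuclide": 0.50, "hydroacoustic": 0.10}
        print(f"integrated P(detect) = {integrated_detection_probability(probs):.3f}")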

  3. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    Edenburn, Michael W.; Bunting, Marcus; Payne, Arthur C. Jr.; Trost, Lawrence C.

    2000-01-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  4. Validation and verification of the ORNL Monte Carlo codes for nuclear safety analysis

    International Nuclear Information System (INIS)

    Emmett, M.B.

    1993-01-01

    The process of ensuring the quality of computer codes can be very time consuming and expensive. The Oak Ridge National Laboratory (ORNL) Monte Carlo codes all predate the existence of quality assurance (QA) standards and configuration control. The number of person-years and the amount of money spent on code development make it impossible to adhere strictly to all the current requirements. At ORNL, the Nuclear Engineering Applications Section of the Computing Applications Division is responsible for the development, maintenance, and application of the Monte Carlo codes MORSE and KENO. The KENO code is used for doing criticality analyses; the MORSE code, which has two official versions, CGA and SGC, is used for radiation transport analyses. Because KENO and MORSE were very thoroughly checked out over the many years of extensive use both in the United States and in the international community, the existing codes were "baselined." This means that the versions existing at the time the original configuration plan is written are considered to be validated and verified code systems based on the established experience with them.

  5. Verification of Monte Carlo calculations of the neutron flux in the carousel channels of the TRIGA Mark II reactor, Ljubljana

    International Nuclear Information System (INIS)

    Jacimovic, R.; Maucec, M.; Trkov, A.

    2002-01-01

    In this work experimental verification of Monte Carlo neutron flux calculations in the carousel facility (CF) of the 250 kW TRIGA Mark II reactor at the Jozef Stefan Institute is presented. Simulations were carried out using the Monte Carlo radiation-transport code MCNP4B. The objective of the work was to model and verify experimentally the azimuthal variation of the neutron flux in the CF for core No. 176, set up in April 2002. 198Au activities of Al-Au(0.1%) disks irradiated in 11 channels of the CF, covering 180° around the perimeter of the core, were measured. The comparison between MCNP calculation and measurement shows relatively good agreement and demonstrates the overall accuracy with which the detailed spectral characteristics can be predicted by calculations. (author)
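
    Such calculation-versus-measurement agreement is commonly summarized with calculated-to-experimental (C/E) ratios per channel. The sketch below shows the computation with invented activity values; it is not the analysis code used in the paper.

        import numpy as np

        # invented 198Au specific activities (arbitrary units) for the 11 CF channels
        calculated = np.array([1.02, 1.05, 1.01, 0.99, 0.97, 0.98,
                               1.00, 1.03, 1.04, 1.01, 0.99])
        measured   = np.array([1.00, 1.02, 1.03, 1.00, 0.95, 0.99,
                               1.02, 1.01, 1.05, 0.98, 1.00])

        ce = calculated / measured                  # C/E ratio per channel
        print("C/E per channel:", np.round(ce, 3))
        print(f"mean C/E = {ce.mean():.3f} +/- {ce.std(ddof=1):.3f}")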

  6. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype of a robust biometric system for verification. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images. The system is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
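
    A rough idea of keypoint-based matching can be given with OpenCV. Since SURF is patent-encumbered and only available in opencv-contrib, the sketch below substitutes the freely available ORB detector; the file names, ratio-test value, and acceptance threshold are hypothetical, not taken from the paper.

        import cv2

        def match_score(enrolled_path, query_path, ratio=0.75):
            """Count good keypoint matches between an enrolled palmprint and a
            query image (ORB used here as a free stand-in for SURF)."""
            orb = cv2.ORB_create(nfeatures=1000)
            img1 = cv2.imread(enrolled_path, cv2.IMREAD_GRAYSCALE)
            img2 = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
            _, des1 = orb.detectAndCompute(img1, None)
            _, des2 = orb.detectAndCompute(img2, None)
            pairs = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(des1, des2, k=2)
            # Lowe's ratio test keeps only clearly best matches
            good = [m for m, n in (p for p in pairs if len(p) == 2)
                    if m.distance < ratio * n.distance]
            return len(good)

        # hypothetical decision rule: accept when enough keypoints agree
        score = match_score("enrolled_palm.png", "query_palm.png")
        print("accept" if score >= 25 else "reject", f"(score={score})")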

  7. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    Today formal verification is finding increasing acceptance in some areas, especially model abstraction and functional verification. Other major challenges, like timing verification, remain before this technology can be posed as a complete alternative to simulation. This special issue is devoted to presenting some of the ...

  8. Monte Carlo study of the multiquark systems

    International Nuclear Information System (INIS)

    Kerbikov, B.O.; Polikarpov, M.I.; Zamolodchikov, A.B.

    1986-01-01

    Random walks have been used to calculate the energies of the ground states in systems of N = 3, 6, 9, 12 quarks. Multiquark states with N > 3 are unstable with respect to spontaneous dissociation into color singlet hadrons. The modified Green's function Monte Carlo algorithm, which proved to be simpler and more accurate than the conventional few-body methods, has been employed. In contrast to other techniques, the same equations are used for any number of particles, while the computer time increases only linearly with the number of particles.

  9. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths and melodic characteristics as well. Due to the development of speech technology, there is an increased interest now in expert speaker verification systems which have high reliability and low labour intensiveness because of the automation of data processing for the expert analysis. System Description. We present a description of a novel system analyzing similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed system, based on a fusion of methods, is a weak correlation between the analyzed features, which leads to a decrease in the error rate of speaker recognition. The system advantage is the possibility to carry out rapid analysis of recordings since the processes of data preprocessing and decision making are automated. We describe the functioning methods as well as the fusion of methods used to combine their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of all the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable for the speaker recognition task in the course of phonoscopic examination.
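
    The paper's exact fusion rule is not reproduced in the record; a generic weighted-sum fusion of normalized per-method scores, with invented weights and decision threshold, might look like this.

        def fuse_scores(scores, weights, threshold=0.5):
            """Weighted-sum fusion of per-method similarity scores in [0, 1]; weakly
            correlated methods make the fused decision more reliable."""
            fused = sum(weights[m] * scores[m] for m in scores)
            fused /= sum(weights[m] for m in scores)
            return fused, fused >= threshold

        # invented normalized scores for one trial and invented method weights
        scores = {"formant": 0.82, "phone_length": 0.64, "melodic": 0.71}
        weights = {"formant": 0.5, "phone_length": 0.2, "melodic": 0.3}
        fused, same = fuse_scores(scores, weights)
        print(f"fused = {fused:.3f} -> {'same speaker' if same else 'different speaker'}")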

  10. Verification and Validation of Flight Critical Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  11. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  12. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  13. Verification and validation plan for the SFR system analysis module

    Energy Technology Data Exchange (ETDEWEB)

    Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-18

    This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.

  14. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  15. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a process, based on Model Driven Engineering, to automate the Verification and Validation of on-board satellite software. The process is implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verifications are defined as constraints in the form of finite timed automata. When the system is deployed on target, the verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
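
    Independently of the authors' toolchain, the final checking step can be illustrated: response times computed from (release, completion) timestamps at instrumentation points are compared against a deadline. The trace and deadline below are invented.

        def deadline_violations(trace, deadline_us):
            """Return (release, completion) pairs whose response time exceeds the
            deadline; `trace` comes from on-target instrumentation points."""
            return [(rel, done) for rel, done in trace if done - rel > deadline_us]

        # invented trace of task release/completion timestamps in microseconds
        trace = [(0, 180), (1000, 1210), (2000, 2150), (3000, 3420)]
        bad = deadline_violations(trace, deadline_us=250)
        print("constraint satisfied" if not bad else f"violations: {bad}")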

  16. Monte Carlo simulation as a method of verification of source characterization in ophthalmic brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz Lora, A.; Miras del Rio, H.; Terron Leon, J. A.

    2013-07-01

    Following the recommendations of the IAEA, and as a further check, Monte Carlo simulations have been performed for each of the plaques available at the hospital. The objective of the work is to verify the calibration certificates, and it aims to establish action criteria for their acceptance. (Author)

  17. Integrated safety management system verification: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, R.F.

    1998-08-10

    Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalization of an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR, 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System (ISMS). Guidance and expectations have been provided to PNNL by incorporation into the operating contract (Contract DE-AC06-76RLO 1830) and by letter. The contract requires that the contractor submit a description of their ISMS for approval by DOE. PNNL submitted their proposed Safety Management System Description for approval on November 25, 1997. RL tentatively approved acceptance of the description pursuant to a favorable recommendation from this review. The Integrated Safety Management System Verification is a review of the adequacy of the ISMS description in fulfilling the requirements of the DEAR and the DOE Policy. The purpose of this review is to provide the Richland Operations Office Manager with a recommendation for approval of the ISMS description of the Pacific Northwest Laboratory based upon compliance with the requirements of 48 CFR 970.5204(-2 and -78), and to verify the extent and maturity of ISMS implementation within the Laboratory. Further, the review will provide a model for other DOE laboratories managed by the Office of Assistant Secretary for Energy Research.

  18. Entropy Measurement for Biometric Verification Systems.

    Science.gov (United States)

    Lim, Meng-Hui; Yuen, Pong C

    2016-05-01

    Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intrauser variations in the biometric data. This is important to preserve reasonable acceptance rate of genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the imposter's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accepts multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justify the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
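
    Guessing entropy, the expected number of guesses an optimal adversary needs, is straightforward to compute once a probability distribution over system-acceptable measurements is assumed; the sketch below uses invented distributions, not the paper's attack model.

        def guessing_entropy(probabilities):
            """Expected number of guesses for an adversary who tries candidates in
            decreasing order of probability (probabilities must sum to 1)."""
            ranked = sorted(probabilities, reverse=True)
            return sum(rank * p for rank, p in enumerate(ranked, start=1))

        # invented distributions over system-acceptable measurements
        uniform = [1 / 8] * 8
        skewed = [0.50, 0.20, 0.10, 0.10, 0.05, 0.03, 0.01, 0.01]
        print(f"uniform: G = {guessing_entropy(uniform):.2f} guesses")
        print(f"skewed:  G = {guessing_entropy(skewed):.2f} guesses")  # easier to attack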

  19. Integrated safety management system verification: Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, R.F.

    1998-08-12

    Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System. The Manager, Richland Operations Office (RL), initiated a combined Phase 1 and Phase 2 Integrated Safety Management Verification review to confirm that PNNL had successfully submitted a description of their ISMS and had implemented ISMS within the laboratory facilities and processes. A combined review was directed by the Manager, RL, based upon the progress PNNL had made in the implementation of ISM. This report documents the results of the review conducted to verify: (1) that the PNNL integrated safety management system description and enabling documents and processes conform to the guidance provided by the Manager, RL; (2) that corporate policy is implemented by line managers; (3) that PNNL has provided tailored direction to the facility management; and (4) that the Manager, RL, has documented processes that integrate their safety activities and oversight with those of PNNL. The general conduct of the review was consistent with the direction provided by the Under Secretary's Draft Safety Management System Review and Approval Protocol. The purpose of this review was to provide the Manager, RL, with a recommendation as to the adequacy of the ISMS description of the Pacific Northwest Laboratory based upon compliance with the requirements of 48 CFR 970.5204(-2 and -78), and to provide an evaluation of the extent and maturity of ISMS implementation within the Laboratory. Further, this review was intended to provide a model for other DOE Laboratories. In an effort to reduce the time and travel costs associated with ISM verification, the team agreed to conduct preliminary training and orientation electronically and by phone.

  20. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutron Laboratory is developing a project aimed at the construction of a portable test system for verifying the working condition of neutron area monitors. This device will allow users to verify, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment whose response to neutron beams is inadequate.

  1. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications ...

  2. Verification of three dimensional triangular prismatic discrete ordinates transport code ENSEMBLE-TRIZ by comparison with Monte Carlo code GMVP

    International Nuclear Information System (INIS)

    Homma, Y.; Moriwaki, H.; Ikeda, K.; Ohdi, S.

    2013-01-01

    This paper deals with the verification of the three-dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe sodium-cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of an initial core and at the beginning and the end of cycle of an equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity. (authors)

  3. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer-based system). These software modules can be either custom-designed systems or systems based on commercial modules, such as last-generation redundant programmable logic controllers (PLCs). During this study it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan.

  4. Vega library for processing DICOM data required in Monte Carlo verification of radiotherapy treatment plans

    International Nuclear Information System (INIS)

    Locke, C.; Zavgorodni, S.; British Columbia Cancer Agency, Vancouver Island Center, Victoria BC

    2008-01-01

    Monte Carlo (MC) methods provide the most accurate dose calculations to date in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into the treatment planning quality assurance process. This involves MC dose calculations for clinically produced treatment plans. To perform these calculations, a number of treatment plan parameters specifying the radiation beam ...
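
    The Vega library's own API is not shown in the record. As a sketch of the kind of DICOM RT Plan parameters an MC recalculation needs, the snippet below uses the open-source pydicom package and standard RT Plan attributes; the file name is hypothetical and this is not Vega itself.

        import pydicom

        # hypothetical file name; the attributes below are standard RT Plan tags
        plan = pydicom.dcmread("rtplan.dcm")
        meterset = {rb.ReferencedBeamNumber: rb.BeamMeterset
                    for rb in plan.FractionGroupSequence[0].ReferencedBeamSequence}

        for beam in plan.BeamSequence:
            cp0 = beam.ControlPointSequence[0]  # first control point holds the setup
            print(f"beam {beam.BeamNumber} ({beam.BeamName}):",
                  f"E = {cp0.NominalBeamEnergy} MV,",
                  f"gantry = {cp0.GantryAngle} deg,",
                  f"collimator = {cp0.BeamLimitingDeviceAngle} deg,",
                  f"MU = {meterset.get(beam.BeamNumber)}")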

  5. Verification of Monte Carlo calculations around a Fletcher Suit Delclos ovoid with normoxic polymer gel dosimetry

    International Nuclear Information System (INIS)

    Gifford, K; Horton, J; Steger, T; Heard, M; Jackson, E; Ibbott, G

    2004-01-01

    The goal of this work is to calculate the effect of including the anterior and posterior ovoid shields on the dose distribution around a Fletcher Suit Delclos (FSD) ovoid (Nucletron Trading BV, Leersum, Netherlands) and verify these calculations with normoxic polymer gel dosimetry. To date, no Monte Carlo results verified with dosimetry have been published for this ovoid

  6. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  7. NES++: number system for encryption based privacy preserving speaker verification

    Science.gov (United States)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (i.e., smartphones, Google Glass), privacy preserving speaker verification receives much attention nowadays. Privacy preserving speaker verification can be achieved in many different ways, such as fuzzy vault and encryption. Encryption based solutions are promising as cryptography is based on solid mathematic foundations and the security properties can be easily analyzed in a well established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, and the computation overhead is greatly reduced. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system and the packing technique. Our findings show that the proposed solution can fill the gap between speaker verification and the encryption scheme very well, and the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption based privacy preserving speaker verification; the privacy protection and accuracy rate are not affected.
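
    The additive homomorphism the proposed number system builds on can be demonstrated with a toy Paillier implementation and a naive fixed-point encoding of floats. This is a sketch only: the primes are far too small for real security, and the paper's packing of several numbers into one plaintext is omitted.

        import math
        import random

        # toy Paillier keypair -- real systems use primes of >= 1024 bits each
        p, q = 1000003, 1000033
        n = p * q
        n2 = n * n
        lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
        mu = pow(lam, -1, n)                               # mod inverse (Python 3.8+)

        def encrypt(m):
            r = random.randrange(1, n)
            return pow(n + 1, m, n2) * pow(r, n, n2) % n2

        def decrypt(c):
            return (pow(c, lam, n2) - 1) // n * mu % n

        SCALE = 1 << 16            # fixed-point encoding for floating point values

        c = encrypt(round(1.25 * SCALE)) * encrypt(round(2.50 * SCALE)) % n2
        print(decrypt(c) / SCALE)  # ciphertext product decrypts to the sum: 3.75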

  8. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems. 1 Project background and state of the art: Over the next 10 years all Danish railway ... standards [2] for railways to use formal (i.e. mathematical) logic and models for the unambiguous description of requirements and designs as well as for exhaustive verification, as they give a higher assurance of safety compared to conventional methods. The use of domain-specific methods is another trend ... can be combined and used for an efficient development and verification of new fail-safe systems. The expected result is a methodology for using domain-specific, formal languages, techniques and tools for more efficient development and verification of robust software for railway control systems.

  9. Verification and Validation Issues in Systems of Systems

    Directory of Open Access Journals (Sweden)

    Eric Honour

    2013-11-01

    The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  10. Integrated testing and verification system for research flight software

    Science.gov (United States)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  11. An Optimized Signature Verification System for Vehicle Ad hoc NETwork

    OpenAIRE

    Mamun, Mohammad Saiful Islam; Miyaji, Atsuko

    2012-01-01

    This paper presents an efficient approach to an existing batch verification system on identity-based group signatures (IBGS) which can be applied to any mobile ad hoc network device, including Vehicle Ad hoc Networks (VANET). We propose an optimized way to batch signatures in order to get maximum throughput from a device in a runtime environment. In addition, we minimize the number of pairing computations in the batch verification proposed by B. Qin et al. for large scale VANET. We introduce a batch...

  12. IDEF method for designing seismic information system in CTBT verification

    International Nuclear Information System (INIS)

    Zheng Xuefeng; Shen Junyi; Jin Ping; Zhang Huimin; Zheng Jiangling; Sun Peng

    2004-01-01

    A seismic information system is of great importance for improving the capability of CTBT verification. A large amount of money has been appropriated for research in this field in the U.S. and some other countries in recent years. However, designing and developing a seismic information system involves various technologies of complex system design. This paper discusses the IDEF0 method to construct function models and the IDEF1x method to make information models systematically, as well as how they are used in designing a seismic information system for CTBT verification. (authors)

  13. Towards Verification of Constituent Systems through Automated Proof

    DEFF Research Database (Denmark)

    Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R

    2014-01-01

    This paper explores verification of constituent systems within the context of the Symphony tool platform for Systems of Systems (SoS). Our SoS modelling language, CML, supports various contractual specification elements, such as state invariants and operation preconditions, which can be used...

  14. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system ...

  15. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train ...

  16. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train ...

  17. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    The Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, the document submission requirements at each stage, the V and V activities, the checklists used for reviews at each stage, and the reports

  18. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    Kumar, Ranajit; Upreti, Anil; Mahaptra, U.; Bhattacharya, S.; Srivastava, G.P.

    2009-01-01

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against the database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details such as name, designation and division. All transactions are recorded in the Simputer with time and date for future reference. This system finds extensive application in mobile identity verification in nuclear and other industries. (author)

  19. Verification of IMRT dose distributions using a water beam imaging system

    International Nuclear Information System (INIS)

    Li, J.S.; Boyer, Arthur L.; Ma, C.-M.

    2001-01-01

    A water beam imaging system (WBIS) has been developed and used to verify dose distributions for intensity-modulated radiotherapy using a dynamic multileaf collimator. The system consisted of a water container, a scintillator screen, a charge-coupled device camera, and a portable personal computer. The scintillation image was captured by the camera; the pixel value in this image indicated the dose value in the scintillation screen. Images of radiation fields of known spatial distributions were used to calibrate the device. The verification was performed by comparing the image acquired from the measurement with a dose distribution from the IMRT plan. Because of light scattering in the scintillator screen, the image was blurred. A correction for this was developed by recognizing that the blur function could be fitted to a sum of Gaussians. The blur function was computed using the measured image of a 10 cm x 10 cm x-ray beam and the dose distribution calculated using the Monte Carlo method. Based on the blur function derived using this method, an iterative reconstruction algorithm was applied to recover the dose distribution for an IMRT plan from the measured WBIS image. The reconstructed dose distribution was compared with the Monte Carlo simulation result, and reasonable agreement was obtained. The proposed approach makes it possible to carry out a real-time comparison of the dose distribution in a transverse plane between the measurement and the reference during IMRT dose verification.
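
    The record describes the iterative reconstruction only generically; one standard choice consistent with a fitted multi-Gaussian blur kernel is Richardson-Lucy deconvolution, sketched below with invented kernel parameters (this is an assumption, not the paper's exact algorithm).

        import numpy as np
        from scipy.signal import fftconvolve

        def gaussian_psf(size, sigmas, weights):
            """Blur kernel modeled as a weighted sum of Gaussians (the fitted form)."""
            x = np.arange(size) - size // 2
            xx, yy = np.meshgrid(x, x)
            r2 = xx ** 2 + yy ** 2
            psf = sum(w * np.exp(-r2 / (2 * s ** 2)) for w, s in zip(weights, sigmas))
            return psf / psf.sum()

        def richardson_lucy(measured, psf, iterations=25):
            """Iteratively recover the dose image from the scatter-blurred image."""
            estimate = np.full_like(measured, measured.mean())
            psf_mirror = psf[::-1, ::-1]
            for _ in range(iterations):
                blurred = fftconvolve(estimate, psf, mode="same")
                ratio = measured / np.maximum(blurred, 1e-12)
                estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
            return estimate

        # invented kernel parameters; in practice they come from fitting the measured
        # image of a known 10 cm x 10 cm field against a Monte Carlo reference
        psf = gaussian_psf(size=51, sigmas=(2.0, 8.0), weights=(0.8, 0.2))
        # dose = richardson_lucy(wbis_image, psf)  # wbis_image: 2D float array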

  20. Total skin electron therapy treatment verification: Monte Carlo simulation and beam characteristics of large non-standard electron fields

    International Nuclear Information System (INIS)

    Pavon, Ester Carrasco; Sanchez-Doblado, Francisco; Leal, Antonio; Capote, Roberto; Lagares, Juan Ignacio; Perucha, Maria; Arrans, Rafael

    2003-01-01

    Total skin electron therapy (TSET) is a complex technique which requires non-standard measurements and dosimetric procedures. This paper investigates an essential first step towards TSET Monte Carlo (MC) verification. The non-standard 6 MeV 40 x 40 cm² electron beam at a source to surface distance (SSD) of 100 cm, as well as its horizontal projection behind a polymethylmethacrylate (PMMA) screen to SSD = 380 cm, were evaluated. The EGS4 OMEGA-BEAM code package running on a home-made Linux cluster of 47 PCs was used for the MC simulations. Percentage depth-dose curves and profiles were calculated and measured experimentally for the 40 x 40 cm² field at both SSD = 100 cm and the patient surface SSD = 380 cm. The output factor (OF) between the reference 40 x 40 cm² open field and its horizontal projection as TSET beam at SSD = 380 cm was also measured for comparison with MC results. The accuracy of the simulated beam was validated by the good agreement to within 2% between measured relative dose distributions, including the beam characteristic parameters (R50, R80, R100, Rp, E0), and the MC calculated results. The energy spectrum, fluence and angular distribution at different stages of the beam (at SSD = 100 cm, at SSD = 364.2 cm, behind the PMMA beam spoiler screen and at the treatment surface SSD = 380 cm) were derived from MC simulations. Results showed a final decrease in mean energy of almost 56% from the exit window to the treatment surface. A broader angular distribution (the FWHM of the angular distribution increased from 13° at SSD = 100 cm to more than 30° at the treatment surface) was fully attributable to the PMMA beam spoiler screen. OF calculations and measurements agreed to less than 1%. The effect of changing the electron energy cut-off from 0.7 MeV to 0.521 MeV and air density fluctuations in the bunker, which could affect the MC results, were shown to have a negligible impact on the beam fluence distributions. Results proved the applicability of using MC ...

  1. Measurability and Safety Verification for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    Fränzle, Martin; Hahn, Ernst Moritz; Hermanns, Holger

    2011-01-01

    Dealing with the interplay of randomness and continuous time is important for the formal verification of many real systems. Considering both facets is especially important for wireless sensor networks, distributed control applications, and many other systems of growing importance. An important...

  2. Verification and Validation for Flight-Critical Systems (VVFCS)

    Science.gov (United States)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA s Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academic (26%), small & large industry (47%) and government agency (27%).

  3. Formal verification of automated teller machine systems using SPIN

    Science.gov (United States)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.

  4. Morse Monte Carlo Radiation Transport Code System

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    1975-02-01

    The report contains sections with descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  5. MO-FG-BRA-01: 4D Monte Carlo Simulations for Verification of Dose Delivered to a Moving Anatomy

    Energy Technology Data Exchange (ETDEWEB)

    Gholampourkashi, S; Cygler, J E. [Carleton University, Ottawa, ON (Canada); The Ottawa Hospital Cancer Centre, Ottawa, ON (Canada)]; Belec, J; Vujicic, M [The Ottawa Hospital Cancer Centre, Ottawa, ON (Canada)]; Heath, Emily [Carleton University, Ottawa, ON (Canada)]

    2016-06-15

    Purpose: To validate 4D Monte Carlo (MC) simulations of dose delivery by an Elekta Agility linear accelerator to a moving phantom. Methods: Monte Carlo simulations were performed using the 4DdefDOSXYZnrc/EGSnrc user code, which samples a new geometry for each incident particle and calculates the dose in a continuously moving anatomy. A Quasar respiratory motion phantom with a lung insert containing a 3 cm diameter tumor was used for dose measurements on an Elekta Agility linac with the phantom in stationary and moving states. Dose to the center of the tumor was measured using calibrated EBT3 film and the RADPOS 4D dosimetry system. A VMAT plan covering the tumor was created on the static CT scan of the phantom using Monaco V.5.10.02. A validated BEAMnrc model of our Elekta Agility linac was used for Monte Carlo simulations on stationary and moving anatomies. To compare the planned and delivered doses, linac log files recorded during measurements were used for the simulations. For 4D simulations, deformation vectors that modeled the rigid translation of the lung insert were generated as input to the 4DdefDOSXYZnrc code, as well as the phantom motion trace recorded with RADPOS during the measurements. Results: Monte Carlo simulations and film measurements were found to agree within 2 mm/2% for 97.7% of points in the film in the static phantom and 95.5% in the moving phantom. Dose values based on film and RADPOS measurements are within 2% of each other and within 2σ of experimental uncertainties with respect to simulations. Conclusion: Our 4D Monte Carlo simulation using the defDOSXYZnrc code accurately calculates dose delivered to a moving anatomy. Future work will focus on further investigation of VMAT delivery on a moving phantom to improve the agreement between simulation and measurements, as well as establishing the accuracy of our method in a deforming anatomy. This work was supported by the Ontario Consortium of Adaptive Interventions in Radiation Oncology (OCAIRO
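
    The quoted 2 mm/2% agreement is a gamma analysis; a naive global gamma computation, shown here on invented 1D profiles standing in for the film and simulation data, captures the idea (clinical tools use 2D/3D data and interpolation).

        import numpy as np

        def gamma_pass_rate(ref, evl, coords_mm, dose_tol=0.02, dta_mm=2.0):
            """Naive global gamma analysis of two profiles on a common grid: a point
            passes when min over j of sqrt((dDose/tol)^2 + (dDist/DTA)^2) <= 1."""
            ref_max = ref.max()
            gammas = np.empty_like(ref)
            for i, (xi, di) in enumerate(zip(coords_mm, evl)):
                dose_term = ((di - ref) / (dose_tol * ref_max)) ** 2
                dist_term = ((xi - coords_mm) / dta_mm) ** 2
                gammas[i] = np.sqrt((dose_term + dist_term).min())
            return 100.0 * np.mean(gammas <= 1.0)

        # invented profiles standing in for the film measurement and the simulation
        x = np.linspace(-30, 30, 121)                   # position in mm
        mc = np.exp(-(x / 15.0) ** 2)
        film = 1.01 * np.exp(-((x - 0.5) / 15.0) ** 2)  # 1% scale, 0.5 mm shift
        print(f"2%/2 mm gamma pass rate: {gamma_pass_rate(mc, film, x):.1f}%")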

  6. A multi-microcomputer system for Monte Carlo calculations

    International Nuclear Information System (INIS)

    Hertzberger, L.O.; Berg, B.; Krasemann, H.

    1981-01-01

    We propose a microcomputer system which allows parallel processing for Monte Carlo calculations in lattice gauge theories, simulations of high energy physics experiments and presumably many other fields of current interest. The master-n-slave multiprocessor system is based on the Motorola MC 68000 microprocessor. One attraction of this processor is that it allows up to 16 M Byte random access memory. (orig.)
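
    The master-n-slave organization described above can be sketched in modern terms with a few lines of Python; the toy task here (estimating pi) stands in for a lattice-gauge or detector-simulation workload, and each slave receives its own random seed so the sample streams are independent:

        import multiprocessing as mp
        import random

        def slave(task):
            """One slave: run an independent batch of Monte Carlo samples."""
            seed, n = task
            rng = random.Random(seed)                 # private stream per slave
            return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

        if __name__ == "__main__":
            n_slaves, n_per_slave = 8, 200_000
            tasks = [(seed, n_per_slave) for seed in range(n_slaves)]
            with mp.Pool(n_slaves) as pool:           # the master farms tasks out ...
                hits = pool.map(slave, tasks)         # ... and only aggregates results
            print("pi ~", 4.0 * sum(hits) / (n_slaves * n_per_slave))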

  7. Multi-microcomputer system for Monte-Carlo calculations

    CERN Document Server

    Berg, B; Krasemann, H

    1981-01-01

    The authors propose a microcomputer system that allows parallel processing for Monte Carlo calculations in lattice gauge theories, simulations of high energy physics experiments and many other fields of current interest. The master-n-slave multiprocessor system is based on the Motorola MC 68000 microprocessor. One attraction of this processor is that it allows up to 16 M Byte random access memory.

  8. Automatic Verification of Railway Interlocking Systems: A Case Study

    DEFF Research Database (Denmark)

    Petersen, Jakob Lyng

    1998-01-01

    This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway Station Alingsås by using the decision procedure which is ba...

  9. Monte Carlo capabilities of the SCALE code system

    International Nuclear Information System (INIS)

    Rearden, B.T.; Petrie, L.M.; Peplow, D.E.; Bekar, K.B.; Wiarda, D.; Celik, C.; Perfetti, C.M.; Ibrahim, A.M.; Hart, S.W.D.; Dunn, M.E.; Marshall, W.J.

    2015-01-01

    Highlights: • Foundational Monte Carlo capabilities of SCALE are described. • Improvements in continuous-energy treatments are detailed. • New methods for problem-dependent temperature corrections are described. • New methods for sensitivity analysis and depletion are described. • Nuclear data, user interfaces, and quality assurance activities are summarized. - Abstract: SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  10. Applicability of quasi-Monte Carlo for lattice systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Physics; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Hartung, Tobias [King's College London (United Kingdom). Dept. of Mathematics; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leovey, Hernan; Griewank, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Mathematics; Mueller-Preussker, Michael [Berlin Humboldt-Univ. (Germany). Dept. of Physics

    2013-11-15

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^(-1/2), where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^(-1), or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.
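
    A self-contained toy comparison of the two scalings (a one-dimensional integrand and the base-2 van der Corput sequence; the lattice applications in the record are far richer) might look like this:

        import numpy as np

        def van_der_corput(n, base=2):
            """First n points of the base-b van der Corput low-discrepancy sequence."""
            seq = np.zeros(n)
            for i in range(n):
                k, f = i + 1, 1.0
                while k > 0:
                    f /= base
                    seq[i] += f * (k % base)
                    k //= base
            return seq

        def f(x):                      # test integrand with known integral 1/3
            return x * x

        rng = np.random.default_rng(0)
        for n in (2 ** 8, 2 ** 12, 2 ** 16):
            err_mc = abs(f(rng.random(n)).mean() - 1 / 3)        # ~ N^(-1/2)
            err_qmc = abs(f(van_der_corput(n)).mean() - 1 / 3)   # ~ N^(-1), up to log factors
            print(f"N={n:6d}  MC error={err_mc:.2e}  QMC error={err_qmc:.2e}")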

  11. Applicability of quasi-Monte Carlo for lattice systems

    International Nuclear Information System (INIS)

    Ammon, Andreas; Deutsches Elektronen-Synchrotron; Hartung, Tobias; Jansen, Karl; Leovey, Hernan; Griewank, Andreas; Mueller-Preussker, Michael

    2013-11-01

    This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^(-1/2), where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^(-1), or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.

  12. Software verification in on-line systems

    International Nuclear Information System (INIS)

    Ehrenberger, W.

    1980-01-01

    Operator assistance is increasingly provided by computers. Computers contain programs whose quality should be above a certain level before they are allowed to be used in reactor control rooms. Several possibilities for obtaining software reliability figures are discussed in this paper. By supervising the testing procedure of a program, one can estimate the number of remaining programming errors; such an estimate, however, is not very accurate. With mathematical proving procedures one can gain some knowledge of program properties. Such proving procedures are important for the verification of general WHILE-loops, which tend to be error prone. Program analysis decomposes a program into its parts. First the program structure is made visible, including the data movements and the control flow. From this analysis, test cases can be derived that lead to a complete test. Program analysis can be done by hand or automatically. A statistical program test normally requires a large number of test runs. This number is reduced if details concerning the program to be tested or its use are known in advance. (orig.)

  13. Verification of a Monte Carlo model of the Missouri S&T reactor

    Science.gov (United States)

    Richardson, Brad Paul

    The purpose of this research is to ensure that an MCNP model of the Missouri S&T reactor produces accurate results so that it may be used to predict the effects of some desired upgrades to the reactor. The desired upgrades are an increase in licensed power from 200 kW to 400 kW, and the installation of a secondary cooling system to prevent heating of the pool. This was performed by comparing simulations performed using the model with experiments performed using the reactor. The experiments performed were: the approach-to-criticality method of predicting the critical control rod height, measurement of the axial flux profile, the moderator temperature coefficient of reactivity, and the void coefficient of reactivity. The results of these experiments and results from the simulation show that the model produces a similar axial flux profile, and that it models the void and temperature coefficients of reactivity well. The model does, however, over-predict the criticality of the core, such that it predicts a lower critical rod height and a k_eff greater than one when simulating conditions in which the reactor was at a stable power. It is assumed that this is due to the model using fuel compositions from when the fuel was new, while in reality the reactor has been operating with this fuel for nearly 20 years. It has therefore been concluded that the fuel composition should be updated by performing a burnup analysis, and an accurate heat transfer and fluid flow analysis be performed to better represent the temperature profile before the model is used to simulate the effects of the desired upgrades.
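
    The approach-to-criticality experiment mentioned above is commonly evaluated with the inverse-multiplication (1/M) method; the following sketch, with made-up count rates and rod positions, extrapolates 1/M to zero to predict the critical rod height:

        import numpy as np

        # Hypothetical approach-to-criticality data: the detector count rate rises
        # as the control rod is withdrawn; M is normalized to the first measurement.
        rod_height_cm = np.array([0.0, 10.0, 20.0, 30.0, 35.0])
        count_rate = np.array([100.0, 140.0, 230.0, 560.0, 1100.0])

        inv_m = count_rate[0] / count_rate        # 1/M, equal to 1 at the reference point
        # Fit the last few points, where 1/M is most nearly linear in rod height,
        # and extrapolate to 1/M = 0 (criticality).
        slope, intercept = np.polyfit(rod_height_cm[-3:], inv_m[-3:], 1)
        print("predicted critical rod height ~", -intercept / slope, "cm")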

  14. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes losses to banks, traders, and individuals involved in financial transactions; hence, efficient and reliable techniques for the detection of counterfeit banknotes are needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing tasks on these phones. In addition, smartphone users have increased greatly, and the numbers continue to grow, which is a strong motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and the surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by the X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered, since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes differ greatly in terms of the printing process, the ingredients used in preparation of the banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.
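
    The classification stage described above (features in, genuine/counterfeit decision out) has the following generic shape; the feature matrix below is random placeholder data, not the statistical and surface-roughness features of the paper:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Placeholder feature matrix: one row per banknote image; in the paper the
        # columns would be statistical and surface-roughness features of the
        # serial-number and flag regions. Here they are synthetic.
        rng = np.random.default_rng(0)
        y = rng.integers(0, 2, size=200)                  # 1 = genuine, 0 = counterfeit
        X = rng.normal(size=(200, 6)) + 0.8 * y[:, None]  # classes made separable for the demo

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X_tr, y_tr)
        print("holdout accuracy:", clf.score(X_te, y_te))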

  15. Initial performance of the advanced inventory verification sample system (AVIS)

    Energy Technology Data Exchange (ETDEWEB)

    Marlow, Johnna B [Los Alamos National Laboratory; Swinhoe, Martyn T [Los Alamos National Laboratory; Menlove, Howard O [Los Alamos National Laboratory; Rael, Carlos D [Los Alamos National Laboratory

    2009-01-01

    This paper describes the requirements, design, and initial performance of the Advanced Inventory Verification Sample System (AVIS), a non-destructive assay (NDA) system to measure small samples of bulk mixed uranium-plutonium oxide (MOX) materials (powders and pellets). The AVIS design has evolved from previously developed conceptual physics and engineering designs for the Inventory Sample Verification System (INVS), a safeguards system for nondestructive assay of small samples. The AVIS is an integrated gamma-neutron system. Jointly designed by the Nuclear Material Control Center (NMCC) and the Los Alamos National Laboratory (LANL), AVIS is intended to meet a performance specification of a total measurement uncertainty of less than 0.5% in the neutron (240Pu-effective) measurement. This will allow the AVIS to replace destructive chemical analysis for many samples, with concomitant cost, exposure, and waste generation savings for the facility. Data taken to date confirming the performance of the AVIS are presented.

  16. Alien Registration Number Verification via the U.S. Citizenship and Immigration Service's Systematic Alien Verification for Entitlements System

    National Research Council Canada - National Science Library

    Ainslie, Frances M; Buck, Kelly R

    2008-01-01

    The purpose of this study was to evaluate the implications of conducting high-volume automated checks of the United States Citizenship and Immigration Services Systematic Alien Verification for Entitlements System (SAVE...

  17. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  18. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  19. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site, and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
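
    The duplicate and out-of-order errors reported above are the kind of property a trace monitor can check with very little state. The sketch below is not MESA's DSL, just a minimal stand-alone version of those two checks over (channel, sequence-number) pairs:

        def check_stream(messages):
            """Flag duplicate and out-of-order messages in a (channel, seq) stream.

            messages: iterable of (channel_id, sequence_number) tuples, a stand-in
            for whatever identifying fields the real feed carries.
            """
            seen, last = {}, {}
            violations = []
            for i, (chan, seq) in enumerate(messages):
                if seq in seen.setdefault(chan, set()):
                    violations.append((i, chan, seq, "duplicate"))
                elif last.get(chan, -1) > seq:
                    violations.append((i, chan, seq, "out-of-order"))
                seen[chan].add(seq)
                last[chan] = max(last.get(chan, -1), seq)
            return violations

        # e.g. check_stream([("wx", 1), ("wx", 2), ("wx", 2), ("wx", 5), ("wx", 4)])
        # -> [(2, 'wx', 2, 'duplicate'), (4, 'wx', 4, 'out-of-order')]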

  20. SU-F-T-364: Monte Carlo-Dose Verification of Volumetric Modulated Arc Therapy Plans Using AAPM TG-119 Test Patterns

    International Nuclear Information System (INIS)

    Onizuka, R; Araki, F; Ohno, T; Nakaguchi, Y

    2016-01-01

    Purpose: To investigate Monte Carlo (MC)-based dose verification for VMAT plans from a treatment planning system (TPS). Methods: The AAPM TG-119 test structure set was used for VMAT planning with the Pinnacle3 TPS (convolution/superposition), using a Synergy radiation head with a 6 MV beam and the Agility MLC. The Synergy was simulated with the EGSnrc/BEAMnrc code, and VMAT dose distributions were calculated with the EGSnrc/DOSXYZnrc code under the same irradiation conditions as the TPS. VMAT dose distributions from the TPS and MC were compared with those of EBT3 film by 2-D gamma analysis with ±3%/3 mm criteria and a threshold of 30% of the prescribed dose. VMAT dose distributions from the TPS and MC were also compared by DVHs and 3-D gamma analysis with ±3%/3 mm criteria and a threshold of 10%, and 3-D passing rates for PTVs and OARs were analyzed. Results: TPS dose distributions differed from those of film, especially for Head & neck. The dose difference between TPS and film results from the calculation accuracy for complex MLC motion, such as the tongue-and-groove effect. In contrast, MC dose distributions were in good agreement with those of film. This is because MC can fully model the MLC configuration and accurately reproduce the MLC motion between control points in VMAT plans. D95 of the PTV for Prostate, Head & neck, C-shaped, and Multi Target was 97.2%, 98.1%, 101.6%, and 99.7% for the TPS and 95.7%, 96.0%, 100.6%, and 99.1% for MC, respectively. Similarly, 3-D gamma passing rates of each PTV for TPS vs. MC were 100%, 89.5%, 99.7%, and 100%, respectively. 3-D passing rates of the TPS were reduced for complex VMAT fields like Head & neck because the MLCs are not modeled completely in the TPS. Conclusion: MC-calculated VMAT dose distributions are useful for the 3-D dose verification of VMAT plans from a TPS.
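
    The D95 values quoted above are read off the dose-volume histogram; a minimal sketch of the computation (doses assumed to be given per structure voxel, in percent of the prescription dose) follows:

        import numpy as np

        def dose_at_volume(dose_voxels, volume_pct):
            """D_v: the dose received by at least volume_pct % of the structure."""
            d = np.sort(np.asarray(dose_voxels))      # ascending
            # e.g. D95: drop the coldest 5% of voxels, take the minimum of the rest
            idx = int(np.ceil(d.size * (1 - volume_pct / 100.0)))
            return d[min(idx, d.size - 1)]

        # Usage: dose_at_volume(ptv_doses, 95) with ptv_doses in % of prescription
        # yields values directly comparable to the 95.7-101.6% range quoted above.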

  1. SU-F-T-364: Monte Carlo-Dose Verification of Volumetric Modulated Arc Therapy Plans Using AAPM TG-119 Test Patterns

    Energy Technology Data Exchange (ETDEWEB)

    Onizuka, R [Graduate School of Health Sciences, Kumamoto University (Japan); Araki, F; Ohno, T [Faculty of Life Sciences, Kumamoto University (Japan); Nakaguchi, Y [Kumamoto University Hospital (Japan)

    2016-06-15

    Purpose: To investigate Monte Carlo (MC)-based dose verification for VMAT plans from a treatment planning system (TPS). Methods: The AAPM TG-119 test structure set was used for VMAT planning with the Pinnacle3 TPS (convolution/superposition), using a Synergy radiation head with a 6 MV beam and the Agility MLC. The Synergy was simulated with the EGSnrc/BEAMnrc code, and VMAT dose distributions were calculated with the EGSnrc/DOSXYZnrc code under the same irradiation conditions as the TPS. VMAT dose distributions from the TPS and MC were compared with those of EBT3 film by 2-D gamma analysis with ±3%/3 mm criteria and a threshold of 30% of the prescribed dose. VMAT dose distributions from the TPS and MC were also compared by DVHs and 3-D gamma analysis with ±3%/3 mm criteria and a threshold of 10%, and 3-D passing rates for PTVs and OARs were analyzed. Results: TPS dose distributions differed from those of film, especially for Head & neck. The dose difference between TPS and film results from the calculation accuracy for complex MLC motion, such as the tongue-and-groove effect. In contrast, MC dose distributions were in good agreement with those of film. This is because MC can fully model the MLC configuration and accurately reproduce the MLC motion between control points in VMAT plans. D95 of the PTV for Prostate, Head & neck, C-shaped, and Multi Target was 97.2%, 98.1%, 101.6%, and 99.7% for the TPS and 95.7%, 96.0%, 100.6%, and 99.1% for MC, respectively. Similarly, 3-D gamma passing rates of each PTV for TPS vs. MC were 100%, 89.5%, 99.7%, and 100%, respectively. 3-D passing rates of the TPS were reduced for complex VMAT fields like Head & neck because the MLCs are not modeled completely in the TPS. Conclusion: MC-calculated VMAT dose distributions are useful for the 3-D dose verification of VMAT plans from a TPS.

  2. Orion GN&C Fault Management System Verification: Scope And Methodology

    Science.gov (United States)

    Brown, Denise; Weiler, David; Flanary, Ronald

    2016-01-01

    In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
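
    The rare event sequential Monte Carlo mentioned above belongs to the multilevel-splitting family. The following toy sketch estimates the probability that a random walk ever crosses a high threshold; the walk, the levels, and the fixed-effort resampling scheme are generic illustrations, not Orion's actual failure model:

        import numpy as np

        rng = np.random.default_rng(0)
        STEPS, N = 100, 1000
        levels = [2.0, 4.0, 6.0, 8.0]        # nested thresholds up to the rare one

        def run_until(x, t, level):
            """Advance a random walk until it crosses `level` or runs out of steps."""
            while t < STEPS:
                x += rng.normal(scale=0.3)
                t += 1
                if x >= level:
                    return True, x, t
            return False, x, t

        starts = [(0.0, 0)] * N              # stage 0: all walkers at the origin
        p_hat = 1.0
        for level in levels:
            outcomes = [run_until(x, t, level) for (x, t) in starts]
            survivors = [(x, t) for ok, x, t in outcomes if ok]
            if not survivors:
                p_hat = 0.0
                break
            p_hat *= len(survivors) / len(starts)     # conditional level probability
            # Fixed-effort splitting: resample survivors back up to N walkers
            idx = rng.integers(0, len(survivors), size=N)
            starts = [survivors[i] for i in idx]
        print("P(walk ever exceeds", levels[-1], ") ~", p_hat)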

  3. Expert system verification and validation for nuclear power industry applications

    International Nuclear Information System (INIS)

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  4. The verification of neutron activation analysis support system (cooperative research)

    International Nuclear Information System (INIS)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji

    2000-12-01

    The neutron activation analysis support system is a system with which even a user without much experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, the functions, usability, and the precision and accuracy of analysis of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer, and analyzer installed in the JRR-3M PN-3 facility, and the analysis software KAYZERO/SOLCOI based on the k0 method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field, and analysis of three kinds of environmental standard samples were carried out. The k0 method adopted in this system has recently been widely used in Europe; it is an analysis method that can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values certified as analytical data of the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed with an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)

  5. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour of a re...

  6. Monte Carlo capabilities of the SCALE code system

    International Nuclear Information System (INIS)

    Rearden, B.T.; Petrie, L.M.; Peplow, D.E.; Bekar, K.B.; Wiarda, D.; Celik, C.; Perfetti, C.M.; Ibrahim, A.M.; Dunn, M.E.; Hart, S.W.D.

    2013-01-01

    SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a 'plug-and-play' framework that includes three deterministic and three Monte Carlo radiation transport solvers (KENO, MAVRIC, TSUNAMI) that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE's graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2, to be released in 2014, will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2. (authors)

  7. Formal modelling and verification of interlocking systems featuring sequential release

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2017-01-01

    In this article, we present a method and an associated toolchain for the formal verification of the new Danish railway interlocking systems that are compatible with the European Train Control System (ETCS) Level 2. We have made a generic and reconfigurable model of the system behaviour and generic safety properties. This model accommodates sequential release - a feature in the new Danish interlocking systems. To verify the safety of an interlocking system, first a domain-specific description of interlocking configuration data is constructed and validated. Then the generic model and safety

  8. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination of key players; it allows them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  9. Specification and Verification of Secure Concurrent and Distributed Software Systems

    Science.gov (United States)

    1992-02-01


  10. Location Verification Systems for VANETs in Rician Fading Channels

    OpenAIRE

    Yan, Shihao; Malaney, Robert; Nevat, Ido; Peters, Gareth W.

    2014-01-01

    In this work we propose and examine Location Verification Systems (LVSs) for Vehicular Ad Hoc Networks (VANETs) in the realistic setting of Rician fading channels. In our LVSs, a single authorized Base Station (BS) equipped with multiple antennas aims to detect a malicious vehicle that is spoofing its claimed location. We first determine the optimal attack strategy of the malicious vehicle, which in turn allows us to analyze the optimal LVS performance as a function of the Rician $K$-factor o...

  11. Study of TXRF experimental system by Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Ana Cristina M.; Leitao, Roberta G.; Lopes, Ricardo T., E-mail: ricardo@lin.ufrj.br [Nuclear Instrumentation Laboratory, Nuclear Engineering Program/COPPE Federal University of Rio de Janeiro (UFRJ), RJ (Brazil); Anjos, Marcelino J., E-mail: marcelin@uerj.br [State University of Rio de Janeiro (UERJ/IFADT/DFAT), RJ (Brazil); Conti, Claudio C., E-mail: ccconti@ird.gov.br [Instituto de Radioprotecao e Dosimetria, (IRD/CNEN-RJ), Rio de janeiro, RJ (Brazil)

    2011-07-01

    The Total-Reflection X-ray Fluorescence (TXRF) technique offers unique possibilities for studying the concentrations of a wide range of trace elements in various types of samples. The TXRF technique is widely used to study trace elements in biological, medical, and environmental samples due to its multielemental character as well as the simplicity of the sample preparation and quantification methods used. In general, a TXRF experimental setup is not simple and might require substantial experimental effort. On the other hand, experimental portable TXRF systems have been developed in recent years, which has motivated us to develop our own portable TXRF system. In this work we present a first step toward optimizing a TXRF experimental setup using Monte Carlo simulation with the MCNP code. The results show that the Monte Carlo simulation method can be used to investigate the design of a TXRF experimental system before its assembly. (author)

  12. System Identification and Verification of Rotorcraft UAVs

    Science.gov (United States)

    Carlton, Zachary M.

    The task of a controls engineer is to design and implement control logic. To complete this task, it helps tremendously to have an accurate model of the system to be controlled. Obtaining a very accurate system model is not trivial, as much time and money are usually associated with the development of such a model. A typical physics-based approach can require hundreds of hours of flight time. In an iterative process, the model is tuned in such a way that it accurately models the physical system's response. This process becomes even more complicated for unstable and highly non-linear systems such as the dynamics of rotorcraft. An alternate approach to solving this problem is to extract an accurate model by analyzing the frequency response of the system. This process involves recording the system's responses for a frequency range of input excitations. From this data, an accurate system model can then be deduced. Furthermore, it has been shown that with use of the software package CIFER® (Comprehensive Identification from FrEquency Responses), this process can both greatly reduce the cost of modeling a dynamic system and produce very accurate results. The topic of this thesis is to apply CIFER® to a quadcopter to extract a system model for the flight condition of hover. The quadcopter itself is comprised of off-the-shelf components with a Pixhack flight controller board running open source Ardupilot controller logic. In this thesis, both the closed- and open-loop systems are identified. The model is next compared to dissimilar flight data and verified in the time domain. Additionally, the ESC (Electronic Speed Controller) motor/rotor subsystem, which is comprised of all the vehicle's actuators, is also identified. This process required the development of a test bench environment, which included a GUI (Graphical User Interface), data pre- and post-processing, as well as the augmentation of the flight controller source code. This augmentation of code allowed for
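
    The core of the frequency-response method is estimating H(f) from swept input/output data. A minimal non-parametric analogue using Welch cross-spectral estimates is sketched below; the plant is simulated here, and this H1 estimator with a coherence check only approximates what CIFER does with flight data:

        import numpy as np
        from scipy import signal

        fs = 100.0
        t = np.arange(0, 60, 1 / fs)
        # Swept-frequency input; in practice u and y would come from flight logs
        u = signal.chirp(t, f0=0.1, t1=t[-1], f1=10.0)
        b, a = signal.butter(2, 2.0, fs=fs)            # stand-in "true" plant for the demo
        y = signal.lfilter(b, a, u) + 0.01 * np.random.default_rng(0).normal(size=t.size)

        # Non-parametric H1 estimate: H(f) = Puy(f) / Puu(f)
        f, Puu = signal.welch(u, fs=fs, nperseg=1024)
        _, Puy = signal.csd(u, y, fs=fs, nperseg=1024)
        H = Puy / Puu
        coh = signal.coherence(u, y, fs=fs, nperseg=1024)[1]   # quality check, as in CIFER
        k = 10
        print(f"{f[k]:.2f} Hz: |H| = {20 * np.log10(abs(H[k])):.1f} dB, coherence = {coh[k]:.2f}")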

  13. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  14. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  15. Models and formal verification of multiprocessor system-on-chips

    DEFF Research Database (Denmark)

    Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan

    2008-01-01

    Analysis of multiprocessor system-on-chips is a major challenge due to the freedom of interrelated choices concerning the application level, the configuration of the execution platform, and the mapping of the application onto this platform. We present a discrete model of computation for such systems and characterize the size of the computation tree it suffices to consider when checking for schedulability. The computational model provides a basis for formal analysis of systems. The model is translated to timed automata and a tool for system verification and simulation has been developed using Uppaal as backend.

  16. A verification system survival probability assessment model test methods

    International Nuclear Information System (INIS)

    Jia Rui; Wu Qiang; Fu Jiwei; Cao Leituan; Zhang Junnan

    2014-01-01

    Owing to limitations of funding and test conditions, large complex systems can be tested with only a small number of sub-samples. Under such single-sample conditions, making an accurate evaluation of performance is important for the reinforcement of complex systems. The technical maturity of the assessment model can be significantly improved if the model can be experimentally validated. This paper presents a test method for verifying a system survival probability assessment model: using sample test results from the system under test, the method verifies the correctness of the assessment model and of the a priori information. (authors)

  17. On the Symbolic Verification of Timed Systems

    DEFF Research Database (Denmark)

    Moeller, Jesper; Lichtenberg, Jacob; Andersen, Henrik Reif

    1999-01-01

    This paper describes how to analyze a timed system symbolically. That is, given a symbolic representation of a set of (timed) states (as an expression), we describe how to determine an expression that represents the set of states that can be reached either by firing a discrete transition or by advancing time. These operations are used to determine the set of reachable states symbolically. We also show how to symbolically determine the set of states that can reach a given set of states (i.e., a backwards step), thus making it possible to verify TCTL-formulae symbolically. The analysis is fully symbolic in the sense that both the discrete and the continuous part of the state space are represented symbolically. Furthermore, both the synchronous and asynchronous concurrent composition of timed systems can be performed symbolically. The symbolic representations are given as formulae expressed
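
    The backwards fixpoint described above can be illustrated with explicit state sets standing in for symbolic expressions; a real tool would manipulate formulae or difference-bound matrices instead, and the time-advance predecessor is omitted here:

        def backward_reach(transitions, goal):
            """States that can reach `goal`, computed as a backwards fixpoint.

            transitions: iterable of (src, dst) pairs for the discrete transition
            relation; the enumerated sets below are a stand-in for the symbolic
            expressions a tool like this paper's would use.
            """
            reach = set(goal)
            while True:
                # pre(reach): states with some transition into the current set
                pre = {s for (s, d) in transitions if d in reach}
                if pre <= reach:      # fixpoint: nothing new can reach the goal
                    return reach
                reach |= pre

        # e.g. backward_reach({("a", "b"), ("b", "c"), ("x", "y")}, {"c"})
        # -> {'a', 'b', 'c'}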

  18. Verification station for Sandia/Rockwell Plutonium Protection system

    International Nuclear Information System (INIS)

    Nicholson, N.; Hastings, R.D.; Henry, C.N.; Millegan, D.R.

    1979-04-01

    A verification station has been designed to confirm the presence of plutonium within a container module. These container modules [about 13 cm (5 in.) in diameter and 23 cm (9 in.) high] hold sealed food-pack cans containing either plutonium oxide or metal and were designed by Sandia Laboratories to provide security and continuous surveillance and safety. After the plutonium is placed in the container module, it is closed with a solder seal. The verification station discussed here is used to confirm the presence of plutonium in the container module before it is placed in a carousel-type storage array inside the plutonium storage vault. This measurement represents the only technique that uses nuclear detectors in the plutonium protection system

  19. Voice activity detection for speaker verification systems

    Science.gov (United States)

    Borowski, Filip

    2008-01-01

    A complex algorithm for speech activity detection is presented in this article. It is based on speech enhancement, feature extraction, and a final detection algorithm. The first was published in the ETSI standard as a module of the "Advanced front-end feature extraction algorithm" for distributed speech recognition systems. It consists of two main parts, noise estimation and Wiener filtering. For the final detection, modified linear prediction coefficients and spectral entropy features are extracted from the denoised signal.
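
    Of the features mentioned, spectral entropy is the easiest to sketch: voiced speech has a peaky, harmonic spectrum and hence low entropy, while broadband noise scores high. The sketch below covers only that feature; the ETSI front end, Wiener filtering, and modified LPC features are omitted, and the threshold is arbitrary:

        import numpy as np

        def spectral_entropy(frame):
            """Spectral entropy of one windowed frame (bits)."""
            spec = np.abs(np.fft.rfft(frame * np.hanning(frame.size))) ** 2
            p = spec / (spec.sum() + 1e-12)            # normalize to a probability mass
            return -np.sum(p * np.log2(p + 1e-12))

        def simple_vad(x, frame_len=256, hop=128, thresh=6.0):
            """Flag a frame as speech when its spectral entropy drops below thresh."""
            return [spectral_entropy(x[s:s + frame_len]) < thresh
                    for s in range(0, x.size - frame_len, hop)]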

  20. Improved local lattice Monte Carlo simulation for charged systems

    Science.gov (United States)

    Jiang, Jian; Wang, Zhen-Gang

    2018-03-01

    Maggs and Rossetto [Phys. Rev. Lett. 88, 196402 (2002)] proposed a local lattice Monte Carlo algorithm for simulating charged systems based on Gauss's law, which scales with the particle number N as O(N). This method includes two degrees of freedom: the configuration of the mobile charged particles and the electric field. In this work, we consider two important issues in the implementation of the method, the acceptance rate of configurational change (particle move) and the ergodicity in the phase space sampled by the electric field. We propose a simple method to improve the acceptance rate of particle moves based on the superposition principle for electric field. Furthermore, we introduce an additional updating step for the field, named "open-circuit update," to ensure that the system is fully ergodic under periodic boundary conditions. We apply this improved local Monte Carlo simulation to an electrolyte solution confined between two low dielectric plates. The results show excellent agreement with previous theoretical work.

  1. Application of Monte Carlo Method to Test Fingerprinting System for Dry Storage Canister

    International Nuclear Information System (INIS)

    Ahn, Gil Hoon; Park, Il-Jin; Min, Gyung Sik

    2006-01-01

    From 1992, dry storage canisters have been used for long-term disposition of the CANDU spent fuel bundles at Wolsong. Periodic inspection of the dual seals is currently the only measure that exists to verify that the contents have not been altered, so verification of spent nuclear fuel in dry storage is an important safeguarding task because the spent fuel contains significant quantities of fissile material. Traditional non-destructive analysis and assay techniques for verifying the contents are ineffective due to shielding by the spent fuel and canister wall, the constrained positioning of the detector, etc. Manual measurement of the radiation levels present in the re-verification tubes that run along the length of the canister, which maps the radiation profile within the canister, is presently the most reliable method for ensuring that the stored materials are still present. Accordingly, the gamma-ray fingerprinting method has been used in Korea after a canister is sealed, to provide continuity of knowledge that the canister contents remain as loaded. The present study aims to test the current fingerprinting system using MCNPX, a well-known and widely used Monte Carlo radiation transport code, which may be useful in verification measures for spent fuel subject to the final disposal guidance criterion (4 kg of Pu, 0.5 SQ).

  2. Monte Carlo simulation of hybrid systems: An example

    International Nuclear Information System (INIS)

    Bacha, F.; D'Alencon, H.; Grivelet, J.; Jullien, E.; Jejcic, A.; Maillard, J.; Silva, J.; Zukanovich, R.; Vergnes, J.

    1997-01-01

    Simulation of hybrid systems needs tracking of particles from the GeV range (incident proton beam) down to a fraction of an eV (thermal neutrons). We show how a GEANT-based Monte Carlo program can achieve this with realistic computing time and accompanying tools. An example of a dedicated original actinide burner is simulated with this chain. 8 refs., 5 figs.

  3. Geometrical verification system using Adobe Photoshop in radiotherapy.

    Science.gov (United States)

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films. It is possible to scan images and then view them simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared the following two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured distances between the isocenter on simulation films and that on portal films by aligning the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured distances between the isocenter on simulation films and those on portal films by aligning the bony structures. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes. The system is available on any PC with this software and is useful for improving the accuracy of verification.

  4. LHC Beam Loss Monitoring System Verification Applications

    CERN Document Server

    Dehning, B; Zamantzas, C; Jackson, S

    2011-01-01

    The LHC Beam Loss Monitoring (BLM) system is one of the most complex instrumentation systems deployed in the LHC. In addition to protecting the collider, the system also needs to provide a means of diagnosing machine faults and deliver a feedback of losses to the control room as well as to several systems for their setup and analysis. It has to transmit and process signals from almost 4'000 monitors, and has nearly 3 million configurable parameters. The system was designed with reliability and availability in mind. The specified operation and the fail-safety standards must be guaranteed for the system to perform its function in preventing superconductive magnet destruction caused by particle flux. Maintaining the expected reliability requires extensive testing and verification. In this paper we report our most recent addit...

  5. Subtle Monte Carlo Updates in Dense Molecular Systems

    DEFF Research Database (Denmark)

    Bottaro, Sandro; Boomsma, Wouter; Johansson, Kristoffer E.

    2012-01-01

    Although Markov chain Monte Carlo (MC) simulation is a potentially powerful approach for exploring conformational space, it has been unable to compete with molecular dynamics (MD) in the analysis of high density structural states, such as the native state of globular proteins. Here, we introduce a kinetic algorithm, CRISP, that greatly enhances the sampling efficiency in all-atom MC simulations of dense systems. The algorithm is based on an exact analytical solution to the classic chain-closure problem, making it possible to express the interdependencies among degrees of freedom in the molecule as correlations in a multivariate Gaussian distribution. We demonstrate that our method reproduces structural variation in proteins with greater efficiency than current state-of-the-art Monte Carlo methods and has real-time simulation performance on par with molecular dynamics simulations. The presented results...

  6. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  7. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  8. Rule Systems for Runtime Verification: A Short Tutorial

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
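
    A data-parameterized rule of the kind RuleR and LogScope support can be mimicked in a few lines; the sketch below is not their syntax, only the shape of one such rule (every command(id) must be matched by a later complete(id)):

        def monitor(trace):
            """Tiny conditional-rule monitor in the spirit of RuleR/LogScope.

            Property checked: every command(id) must be followed by complete(id)
            before the trace ends; the rule state is a set of parameterized
            obligations, as in a data-parameterized temporal rule.
            """
            pending = set()
            for event, arg in trace:
                if event == "command":
                    pending.add(arg)            # rule fires: obligation complete(arg)
                elif event == "complete":
                    pending.discard(arg)        # obligation discharged
            return {f"missing complete({a})" for a in pending}

        # monitor([("command", 1), ("command", 2), ("complete", 1)])
        # -> {'missing complete(2)'}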

  9. An evaluation of the management system verification pilot at Hanford

    International Nuclear Information System (INIS)

    Briggs, C.R.; Ramonas, L.; Westendorf, W.

    1998-01-01

    The Chemical Management System (CMS), currently under development at Hanford, was used as the "test program" for pilot testing the value added aspects of the Chemical Manufacturers Association's (CMA) Management Systems Verification (MSV) process. The MSV process, which was developed by CMA's member chemical companies specifically as a tool to assist in the continuous improvement of environment, safety and health (ESH) performance, represents a commercial sector "best practice" for evaluating ESH management systems. The primary purpose of Hanford's MSV Pilot was to evaluate the applicability and utility of the MSV process in the Department of Energy (DOE) environment. However, because the Integrated Safety Management System (ISMS) is the framework for ESH management at Hanford and at all DOE sites, the pilot specifically considered the MSV process in the context of a possible future adjunct to Integrated Safety Management System Verification (ISMSV) efforts at Hanford and elsewhere within the DOE complex. The pilot involved the conduct of two-hour interviews with four separate panels of individuals with functional responsibilities related to the CMS including the Department of Energy Richland Operations (DOE-RL), Fluor Daniel Hanford (FDH) and FDH's major subcontractors (MSCS). A semi-structured interview process was employed by the team of three "verifiers" who directed open-ended questions to the panels regarding the development, integration and effectiveness of management systems necessary to ensure the sustainability of the CMS effort. An "MSV Pilot Effectiveness Survey" also was completed by each panel participant immediately following the interview

  10. Active Learning of Markov Decision Processes for System Verification

    DEFF Research Database (Denmark)

    Chen, Yingke; Nielsen, Thomas Dyhre

    2012-01-01

    Formal model verification has proven a powerful tool for verifying and validating the properties of a system. Central to this class of techniques is the construction of an accurate formal model for the system being investigated. Unfortunately, manual construction of such models can be a resource-demanding process, and this shortcoming has motivated the development of algorithms for automatically learning system models from observed system behaviors. Recently, algorithms have been proposed for learning Markov decision process representations of reactive systems based on alternating sequences of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning...
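
    To make the learning setting concrete, the following Python sketch passively estimates the transition probabilities of a Markov decision process from alternating state/action observations by frequency counting with Laplace smoothing. The trace format and smoothing are illustrative assumptions; the paper's contribution, the active selection of which inputs to try next, is omitted here.

    # Passive MDP estimation from observed traces (frequency counts with
    # Laplace smoothing); the active input-selection policy is omitted.
    from collections import defaultdict

    def estimate_mdp(traces, smoothing=1.0):
        """traces: list of [s0, a0, s1, a1, s2, ...] alternating state/action.
        Returns model[(s, a)][s'] = estimated transition probability."""
        counts = defaultdict(lambda: defaultdict(float))
        for t in traces:
            for i in range(0, len(t) - 2, 2):
                s, a, s_next = t[i], t[i + 1], t[i + 2]
                counts[(s, a)][s_next] += 1.0
        model = {}
        for sa, succ in counts.items():
            total = sum(succ.values()) + smoothing * len(succ)
            model[sa] = {s2: (n + smoothing) / total for s2, n in succ.items()}
        return model

    traces = [["idle", "start", "busy", "poll", "busy", "poll", "idle"],
              ["idle", "start", "busy", "poll", "idle"]]
    print(estimate_mdp(traces)[("busy", "poll")])
    # -> {'busy': 0.4, 'idle': 0.6}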

  11. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of a Requirements Tracking and Verification System (RTVS) for a distributed control system (DCS) was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)

  12. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Biometric-based authentication systems provide solutions to the high-security problems that remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good biometric parameters, but they do not guarantee that the person is present and alive: voice can be copied, a fingerprint can be lifted from a glass onto synthetic skin, and in a face recognition system identical twins, or a father and son, may have the same facial appearance due to genetic factors. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometrics verification system, developed using Laboratory Virtual Instrument Engineering Workbench (LabVIEW) version 7.1, is discussed. Experiments were conducted on a laboratory database of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.

  13. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Biometric-based identification/verification systems provide a solution to security concerns in the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. Minutiae-based algorithms depend upon the local discontinuities in the ridge flow pattern and are used when template size is important, while image-based matching algorithms use both the micro and macro features of a fingerprint and are used if fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database: datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 x 100 and a threshold value of 700 (1000 being a perfect match).
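
    The pseudo-random sub-sampling idea can be sketched in a few lines: compare the stored template and the candidate image only at a fixed pseudo-random subset of pixel positions, and score the result on a 0-1000 scale against a threshold. Everything below (image contents, the absolute-difference score, the sample fraction) is an illustrative assumption, not the paper's LabVIEW implementation.

    # Image matching with pseudo-random sub-sampling: a fixed seed picks the
    # same sample positions for every comparison, cutting work per match.
    import random

    def match_score(template, candidate, sample_fraction=0.1, seed=42):
        """Score similarity of two equal-size grayscale images on a
        0-1000 scale, comparing only a pseudo-random subset of pixels."""
        h, w = len(template), len(template[0])
        rng = random.Random(seed)          # fixed seed -> reproducible sample
        n = max(1, int(sample_fraction * h * w))
        positions = [(rng.randrange(h), rng.randrange(w)) for _ in range(n)]
        diff = sum(abs(template[r][c] - candidate[r][c]) for r, c in positions)
        return round(1000 * (1 - diff / (255.0 * n)))   # 1000 = perfect match

    template = [[10 * (r + c) % 256 for c in range(100)] for r in range(100)]
    noisy = [[min(255, v + 5) for v in row] for row in template]
    score = match_score(template, noisy)
    print(score, "accept" if score >= 700 else "reject")   # high score: accept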

  14. Developing a Verification and Training Phantom for Gynecological Brachytherapy System

    Directory of Open Access Journals (Sweden)

    Mahbobeh Nazarnejad

    2012-03-01

    Introduction: Dosimetric accuracy is a major issue in the quality assurance (QA) program for treatment planning systems (TPS). An important contribution to this process is a proper dosimetry method to guarantee the accuracy of the dose delivered to the tumor. In brachytherapy (BT) of gynecological (Gyn) cancer it is usual to insert a combination of tandem and ovoid applicators with a complicated geometry, which makes their dosimetric verification difficult and important. Therefore, evaluation and verification of the dose distribution is necessary for accurate dose delivery to the patients. Materials and Methods: A solid phantom was made from Perspex slabs as a tool for intracavitary brachytherapy dosimetric QA. Film dosimetry (EDR2) was performed for a combination of ovoid and tandem applicators introduced by the Flexitron brachytherapy system. Treatment planning was also done with the Flexiplan 3D-TPS to irradiate films sandwiched between the phantom slabs. Isodose curves obtained from the treatment planning system and from the films were compared with each other in 2D and 3D. Results: The brachytherapy solid phantom was constructed from slabs, into which tandems and ovoids loaded with a radioactive Ir-192 source could subsequently be inserted. The relative error was 3-8.6% and the average relative error was 5.08% in the comparison of the film and TPS isodose curves. Conclusion: Our results showed that the difference between the TPS and the measurements is well within the acceptable boundaries and below the action level according to AAPM TG-45. Our findings showed that this phantom, after minor corrections, can be used as a method of choice for inter-comparison analysis of TPS and to fill the existing gap for an accurate QA program in intracavitary brachytherapy. The constructed phantom also proved to be a valuable tool for verification of accurate dose delivery to the patients, as well as for training brachytherapy residents and physics students.

  15. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    DEFF Research Database (Denmark)

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand

    2014-01-01

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed-arc Petri nets, we explore new data structures for lowering the used memory: PTries for efficient storing of configurations and time darts for semi-symbolic description of the state-space. Both methods are implemented as a part of the tool TAPAAL and the experiments document at least one order of magnitude of memory savings while preserving comparable verification times.
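
    The prefix-sharing idea behind PTries can be illustrated with a plain trie over byte-encoded configurations: states that agree on a long prefix share storage for it. The sketch below shows only this idea; the actual PTrie layout in TAPAAL is considerably more compact.

    # Storing binary-encoded configurations in a trie so that common prefixes
    # are shared (a nested-dict stand-in for the PTrie data structure).

    class ConfigurationStore:
        def __init__(self):
            self.root = {}

        def insert(self, config_bytes):
            """Insert a configuration; return True if it was new."""
            node = self.root
            for b in config_bytes:
                node = node.setdefault(b, {})
            new = "end" not in node
            node["end"] = True
            return new

    store = ConfigurationStore()
    # Two markings encoded as byte strings sharing a long prefix:
    print(store.insert(b"\x01\x02\x03\x04\x05"))  # True  (newly stored)
    print(store.insert(b"\x01\x02\x03\x04\x06"))  # True  (shares 4-byte prefix)
    print(store.insert(b"\x01\x02\x03\x04\x05"))  # False (already seen)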

  16. Verification of Opacity and Diagnosability for Pushdown Systems

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2013-01-01

    In control theory of discrete event systems (DESs), one of the challenging topics is the extension of the theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems). First, we discuss opacity of pushdown systems and prove that opacity of pushdown systems is in general undecidable. In addition, a decidable class is clarified. Next, for diagnosability, we prove that under a certain assumption, which is different from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language). The obtained result is useful for developing control theory of infinite-state DESs.

  17. Specification and verification of the RTOS for plant protection systems

    International Nuclear Information System (INIS)

    Kim, Jin Hyun; Ahn, Young Ah; Lee, Su-Young; Choi, Jin Young; Lee, Na Young

    2004-01-01

    A PLC is a computer system for instrumentation and control (I and C) tasks such as the control of machinery on factory assembly lines and in nuclear power plants. In the nuclear power industry, systems are classified into three classes, non-safety, safety-related and safety-critical, according to the integrity required for the system's intended use. If a PLC is used for controlling the reactor in a nuclear power plant, it should be classified as safety-critical. A PLC implements several I and C logics in software, including a real-time operating system (RTOS). Hence, the RTOS must also be proven safe and reliable by various means and methods. In this paper, we apply formal methods to the development of an RTOS for a safety-critical PLC: Statecharts for specification and model checking for verification. We present the results of applying these formal methods to the RTOS. (author)

  18. Verification of the modeling of a 6 MV photon beam in a Monte Carlo planning system. Comparison with Collapsed Cone in a non-homogeneous medium

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez Ros, J. C.; Jerez Sainz, M. I.; Lobato Munoz, M.; Jodar Lopez, C. A.; Ruiz Lopez, M. A.; Carrasco rodriguez, J. L.; Pamos Urena, M.

    2013-07-01

    We evaluated the Monte Carlo planner Monaco v2.0.3 for dose calculation in a non-homogeneous, low-density medium (lung-equivalent), as a complement to the verification of the beam model in a homogeneous medium and prior to the introduction of the SBRT technique. We performed the same tests on Pinnacle v8.0m for the same purpose, and we compare the results obtained with the Monte Carlo algorithm of Monaco and the Collapsed Cone algorithm of Pinnacle. (Author)

  19. Methods and practices for verification and validation of programmable systems

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Haapanen, P.; Pulkkinen, U.

    1993-01-01

    Programmable systems deviate in their properties and behaviour from conventional non-programmable systems to such an extent that their verification and validation for safety-critical applications requires new methods and practices. The safety assessment cannot be based on conventional probabilistic methods due to the difficulties in quantifying the reliability of the software and hardware. The reliability estimate of the system must instead be based on qualitative arguments linked to a conservative claim limit. Due to the uncertainty of the quantitative reliability estimate, other means must be used to gain more assurance about the system's safety. Methods and practices based on research done by VTT for STUK are discussed in the paper, as well as the methods applicable in the reliability analysis of software-based safety functions. The most essential concepts and models of quantitative reliability analysis are described. The application of software models in probabilistic safety analysis (PSA) is evaluated. (author). 18 refs

  20. Development of prompt gamma measurement system for in vivo proton beam range verification

    International Nuclear Information System (INIS)

    Min, Chul Hee

    2011-02-01

    In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered to be one of the most promising radiation therapy methods owing to the physical characteristics of its dose distribution, delivering most of the dose just before the protons come to rest at the so-called Bragg peak; that is, proton therapy allows for a very high radiation dose to the tumor volume while effectively sparing adjacent critical organs. However, the uncertainty in the location of the Bragg peak, coming not only from the uncertainty in the beam delivery system and the treatment planning method but also from anatomical changes and organ motions of a patient, can be a critical problem in proton therapy. In spite of the importance of in vivo dose verification to prevent misapplication of the Bragg peak and to guarantee both successful treatment and patient safety, there is no practical methodology to monitor the in vivo dose distribution; only a few attempts have been made so far. The present dissertation suggests the prompt gamma measurement method for monitoring the in vivo proton dose distribution during treatment. As a key part of establishing the utility of this method, the verification of the clear relationship between the prompt gamma distribution and the proton dose distribution was accomplished by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly from proton-induced nuclear interactions, and are then emitted isotropically in less than 10^-9 s at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of a human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the

  1. Electroacoustic verification of frequency modulation systems in cochlear implant users.

    Science.gov (United States)

    Fidêncio, Vanessa Luisa Destro; Jacob, Regina Tangerino de Souza; Tanamati, Liége Franzini; Bucuvic, Érika Cristina; Moret, Adriane Lima Mortari

    2017-12-26

    The frequency modulation system is a device that helps to improve speech perception in noise and is considered the most beneficial approach to improving speech recognition in noise for cochlear implant users. According to guidelines, a check must be performed before fitting the frequency modulation system. Although there are recommendations regarding the behavioral tests that should be performed when fitting the frequency modulation system to cochlear implant users, there are no published recommendations regarding the electroacoustic test that should be performed. The objective was to perform and determine the validity of an electroacoustic verification test for frequency modulation systems coupled to different cochlear implant speech processors. The sample included 40 participants between 5 and 18 years of age, users of four different models of speech processors. For the electroacoustic evaluation, we used the Audioscan Verifit device with the HA-1 coupler and the listening check devices corresponding to each speech processor model. In cases where transparency was not achieved, a modification was made to the frequency modulation gain adjustment, and we used the Brazilian version of the "Phrases in Noise Test" to evaluate speech perception in competing noise. Transparency between the frequency modulation system and the cochlear implant was observed in 85% of the participants evaluated. After adjusting the gain of the frequency modulation receiver in the other participants, the devices showed transparency when the electroacoustic verification test was repeated. It was also observed that patients demonstrated better performance in speech perception in noise after the new adjustment; that is, in these cases the electroacoustic transparency produced behavioral transparency. The electroacoustic evaluation protocol suggested was effective in evaluating transparency between the frequency modulation system and the cochlear implant. Performing the adjustment of

  2. Crew Exploration Vehicle (CEV) Potable Water System Verification Description

    Science.gov (United States)

    Peterson, Laurie; DeVera, Jean; Vega, Leticia; Adam, Nik; Steele, John; Gazda, Daniel; Roberts, Michael

    2009-01-01

    The Crew Exploration Vehicle (CEV), also known as Orion, will ferry a crew of up to six astronauts to the International Space Station (ISS), or a crew of up to four astronauts to the moon. The first launch of the CEV is scheduled for approximately 2014. A stored-water system on the CEV will supply the crew with potable water for various purposes: drinking and food rehydration, hygiene, medical needs, sublimation, and various contingency situations. The current baseline biocide for the stored-water system is ionic silver, similar in composition to the biocide used to maintain the quality of the water transferred from the Orbiter to the ISS and stored in Contingency Water Containers (CWCs). In the CEV water system, the ionic silver biocide is expected to be depleted from solution by plating onto the surfaces of the materials within the system, thus negating its effectiveness as a biocide. Since this depletion is expected to occur within a short time after the water is loaded into the CEV water tanks at the Kennedy Space Center (KSC), an additional microbial control is a 0.1-micron point-of-use filter at the outlet of the Potable Water Dispenser (PWD). Because this may be the first time NASA is considering a stored-water system for long-term missions that does not maintain a residual biocide, a team of experts in materials compatibility, biofilms and point-of-use filters, surface treatment and coatings, and biocides has been created to pinpoint concerns and perform testing to help alleviate those concerns related to the CEV water system. Results from the test plans laid out in the paper presented to SAE last year (Crew Exploration Vehicle (CEV) Potable Water System Verification Coordination, 2008012083) will be detailed in this paper. Additionally, recommendations for the CEV verification will be described for risk mitigation in meeting the physicochemical and microbiological requirements on the CEV potable water system (PWS).

  3. Computational methods for the verification of adaptive control systems

    Science.gov (United States)

    Prasanth, Ravi K.; Boskovic, Jovan; Mehra, Raman K.

    2004-08-01

    Intelligent and adaptive control systems will significantly challenge current verification and validation (V&V) processes, tools, and methods for flight certification. Although traditional certification practices have produced safe and reliable flight systems, they will not be cost-effective for next-generation autonomous unmanned air vehicles (UAVs) due to inherent size and complexity increases from added functionality. Affordable V&V of intelligent control systems is by far the most important challenge in the development of UAVs faced by both the commercial and military aerospace industry in the United States. This paper presents a formal modeling framework for a class of adaptive control systems and an associated computational scheme. The class of systems considered includes neural network-based flight control systems and vehicle health management systems. This class of systems, and indeed all adaptive systems, are hybrid systems whose continuum dynamics is nonlinear. Our computational procedure is iterative and each iteration has two sequential steps. The first step is to derive an approximating finite-state automaton whose behaviors contain the behaviors of the hybrid system. The second step is to check if the language accepted by the approximating automaton is empty (emptiness checking). The iterations are terminated if the language accepted is empty; otherwise, the approximation is refined and the iteration is continued. This procedure will never produce an "error-free" certificate when the actual system contains errors, which is an important requirement in V&V of safety-critical systems.
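
    The two-step iteration described above has a simple skeleton, sketched below in Python. The abstraction, emptiness check, and refinement steps are assumed callables supplied by the user; each is a substantial piece of machinery in a real verifier.

    # Skeleton of the abstraction-refinement loop: abstract, check emptiness,
    # refine on failure. The callables are placeholders for real machinery.

    def verify(hybrid_system, abstract, accepts_errors, refine, max_iters=20):
        """Return 'verified' if some abstraction admits no error behaviors,
        else 'inconclusive' after max_iters refinements."""
        precision = 0
        for _ in range(max_iters):
            automaton = abstract(hybrid_system, precision)  # over-approximation
            if not accepts_errors(automaton):               # emptiness check
                return "verified"   # abstraction covers all behaviors -> safe
            precision = refine(precision)                   # tighten and retry
        return "inconclusive"   # errors may be real or abstraction artifacts

    # Toy demo: the abstraction stops admitting "errors" once fine enough.
    print(verify(None,
                 abstract=lambda sys, p: p,
                 accepts_errors=lambda a: a < 3,
                 refine=lambda p: p + 1))   # -> verified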

  4. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008: Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation (contract FA9453-15-1-0315). This report covers the development of a torque sensor for verification and validation (V&V) of spacecraft attitude control actuators. The developed sensor directly

  5. Improving system modeling accuracy with Monte Carlo codes

    International Nuclear Information System (INIS)

    Johnson, A.S.

    1996-01-01

    The use of computer codes based on Monte Carlo methods to perform criticality calculations has become commonplace. Although results frequently published in the literature report calculated k-eff values to four decimal places, people who use the codes in their everyday work say that they only believe the first two decimal places of any result. The lack of confidence in the computed k-eff values may be due to the tendency of the reported standard deviation to underestimate errors associated with the Monte Carlo process. The standard deviation as reported by the codes is the standard deviation of the mean of the k-eff values for individual generations in the computer simulation, not the standard deviation of the computed k-eff value compared with the physical system. A more subtle problem with the standard deviation of the mean as reported by the codes is that all the k-eff values from the separate generations are not statistically independent, since the k-eff of a given generation is a function of the k-eff of the previous generation, which is ultimately based on the starting source. To produce a standard deviation that is more representative of the physical system, statistically independent values of k-eff are needed
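
    The effect described above can be demonstrated numerically: for an autocorrelated generation-by-generation k-eff series, the naive standard error of the mean is optimistic, while grouping generations into batches and taking the standard error of the batch means gives a more honest estimate. The sketch below uses a synthetic AR(1) series as a stand-in for a real Monte Carlo run; all parameter values are illustrative.

    # Naive vs batched standard error on a correlated k-eff series.
    import random, statistics

    random.seed(1)
    keff, prev = [], 0.0
    for _ in range(20000):                  # correlated generation estimates
        prev = 0.9 * prev + random.gauss(0.0, 0.005)   # AR(1) correlation
        keff.append(1.0 + prev)

    naive_sem = statistics.stdev(keff) / len(keff) ** 0.5

    batch = 200                             # generations per batch >> correlation length
    means = [statistics.mean(keff[i:i + batch]) for i in range(0, len(keff), batch)]
    batch_sem = statistics.stdev(means) / len(means) ** 0.5

    print(f"naive SEM   = {naive_sem:.2e}")   # too small: ignores correlation
    print(f"batched SEM = {batch_sem:.2e}")   # larger, more realistic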

  6. Technology verification phase. Dynamic isotope power system. Final report

    International Nuclear Information System (INIS)

    Halsey, D.G.

    1982-01-01

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance

  7. Technology verification phase. Dynamic isotope power system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  8. Interacting multiagent systems kinetic equations and Monte Carlo methods

    CERN Document Server

    Pareschi, Lorenzo

    2014-01-01

    The description of emerging collective phenomena and self-organization in systems composed of large numbers of individuals has gained increasing interest from various research communities in biology, ecology, robotics and control theory, as well as sociology and economics. Applied mathematics is concerned with the construction, analysis and interpretation of mathematical models that can shed light on significant problems of the natural sciences as well as our daily lives. To this set of problems belongs the description of the collective behaviours of complex systems composed of a large enough number of individuals. Examples of such systems are interacting agents in a financial market, potential voters during political elections, or groups of animals with a tendency to flock or herd. Among other possible approaches, this book provides a step-by-step introduction to the mathematical modelling based on a mesoscopic description and the construction of efficient simulation algorithms by Monte Carlo methods. The ar...
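
    A flavor of the Monte Carlo algorithms the book constructs: in a kinetic exchange model, agents meet in random pairs and trade a conserved quantity. The sketch below implements a simple wealth-exchange interaction with a saving propensity; the model choice and all parameters are illustrative, not taken from the book.

    # Kinetic pairwise-exchange Monte Carlo: random pairs trade a conserved
    # quantity; a saving propensity LAM keeps part of each agent's holding.
    import random

    random.seed(0)
    N, STEPS, LAM = 1000, 200000, 0.5       # agents, interactions, saving
    wealth = [1.0] * N                      # everyone starts equal

    for _ in range(STEPS):
        i, j = random.randrange(N), random.randrange(N)
        if i == j:
            continue
        eps = random.random()               # random split of the traded pool
        pool = (1 - LAM) * (wealth[i] + wealth[j])
        wealth[i] = LAM * wealth[i] + eps * pool
        wealth[j] = LAM * wealth[j] + (1 - eps) * pool

    wealth.sort()
    print("total (conserved):", round(sum(wealth), 6))
    print("poorest decile mean:", sum(wealth[:N // 10]) / (N // 10))
    print("richest decile mean:", sum(wealth[-(N // 10):]) / (N // 10))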

  9. Application of verification and validation on safety parameter display systems

    International Nuclear Information System (INIS)

    Thomas, N.C.

    1983-01-01

    Offers some explanation of how verification and validation (V&V) can support the development and licensing of Safety Parameter Display Systems (SPDS). Advocates that V&V can be more readily accepted within the nuclear industry if a better understanding exists of what the objectives of V&V are and should be. Includes a discussion of a reasonable balance between the costs and benefits of V&V as applied to the SPDS and to other digital systems. Represents the author's perception of the regulator's perspective, based on background information and experience and on discussions with regulators about their current concerns and objectives. Suggests that the introduction of the SPDS into the control room is a first step towards a growing dependency on the use of computers.

  10. Beam intensity scanner system for three dimensional dose verification of IMRT

    International Nuclear Information System (INIS)

    Vahc, Young W.; Kwon, Ohyun; Park, Kwangyl; Park, Kyung R.; Yi, Byung Y.; Kim, Keun M.

    2003-01-01

    Patient dose verification is clinically one of the most important parts of treatment delivery in radiation therapy. The three-dimensional (3D) reconstruction of the dose distribution delivered to the target volume helps to verify the patient dose and to determine the physical characteristics of the beams used in IMRT. Here we present the beam intensity scanner (BInS) system for pre-treatment dosimetric verification of two-dimensional photon intensity. The BInS is a radiation detector with custom-made software for dose conversion of the fluorescence signals from a scintillator. The scintillator is used to produce fluorescence from the irradiation of 6 MV photons on a Varian Clinac 21EX. The digitized fluoroscopic signals obtained by the digital video camera-based scintillator (DVCS) are processed by our custom-made software to reproduce the 3D relative dose distribution. For the intensity modulated beam (IMB), the BInS calculates the absorbed dose in absolute beam fluence, which is used for the patient dose distribution. Using the BInS, we performed various measurements related to IMRT and found the following: (1) The 3D dose profiles of the IMBs measured by the BInS demonstrate good agreement with radiographic film, a pin-type ionization chamber and Monte Carlo simulation. (2) The delivered beam intensity is altered by the mechanical and dosimetric properties of the collimation of the dynamic and/or step MLC system. This is mostly due to leaf transmission, leaf penumbra, photons scattered from the rounded edges of the leaves, and the leaf geometry. (3) The delivered dose depends on the operational details of how the multileaf opening is formed. These phenomena result in a fluence distribution that can be substantially different from the initial calculated intensity modulation and should therefore be taken into account by the treatment planning system for accurate calculation of the dose delivered to the target volume in IMRT. (author)

  11. Monte Carlo simulations of quantum systems on massively parallel supercomputers

    International Nuclear Information System (INIS)

    Ding, H.Q.

    1993-01-01

    A large class of quantum physics applications uses operator representations that are discrete integers by nature. This class includes magnetic properties of solids, interacting bosons modeling superfluids and Cooper pairs in superconductors, and Hubbard models for strongly correlated electron systems. This kind of application typically uses integer data representations and the resulting algorithms are dominated entirely by integer operations. The authors implemented an efficient algorithm for one such application on the Intel Touchstone Delta and iPSC/860. The algorithm uses a multispin coding technique which allows significant data compactification and efficient vectorization of Monte Carlo updates. The algorithm regularly switches between two data decompositions, corresponding naturally to the different Monte Carlo updating processes and observable measurements, such that only nearest-neighbor communications are needed within a given decomposition. On 128 nodes of the Intel Delta, this algorithm updates 183 million spins per second (compared to 21 million on the CM-2 and 6.2 million on a Cray Y-MP). A systematic performance analysis shows a better than 90% efficiency in the parallel implementation
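
    The multispin-coding idea is worth a small illustration: with one spin per bit of a 64-bit word, a single bitwise operation acts on 64 spins at once. The sketch below shows only the packing and a bulk flip; the paper's algorithm adds the Metropolis acceptance logic and inter-node communication on top of this representation.

    # Multispin coding: pack one Ising spin per bit so one XOR updates 64 spins.
    import random

    random.seed(0)
    WORD = 64

    def pack(spins):
        """spins: list of 0/1 values; returns one integer word per 64 spins."""
        words = []
        for k in range(0, len(spins), WORD):
            w = 0
            for b, s in enumerate(spins[k:k + WORD]):
                w |= s << b
            words.append(w)
        return words

    spins = [random.randint(0, 1) for _ in range(256)]   # 256 spins -> 4 words
    words = pack(spins)

    flip_mask = (1 << WORD) - 1                 # flip all 64 spins in a word...
    flipped = [w ^ flip_mask for w in words]    # ...with a single XOR each

    up_before = sum(bin(w).count("1") for w in words)
    up_after = sum(bin(w).count("1") for w in flipped)
    print(up_before + up_after == 256)          # True: every spin inverted once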

  12. Research on Monte Carlo simulation method of industry CT system

    International Nuclear Information System (INIS)

    Li Junli; Zeng Zhi; Qui Rui; Wu Zhen; Li Chunyan

    2010-01-01

    There is a series of radiation physics problems in the design and production of industry CT systems (ICTS), including limit quality index analysis and the effects of scattering, detector efficiency and crosstalk on the system. Usually the Monte Carlo (MC) method is applied to resolve these problems. Most of them involve events of very low probability, so direct simulation is very difficult, and existing MC methods and programs cannot meet the needs. To resolve these difficulties, particle flux point auto-important sampling (PFPAIS) is presented on the basis of auto-important sampling. Then, on the basis of PFPAIS, a dedicated ICTS simulation method, MCCT, is realized. Compared with existing MC methods, MCCT is shown to be able to simulate the ICTS more accurately and effectively. Furthermore, the effects of various kinds of disturbances on the ICTS are simulated and analyzed by MCCT. To some extent, MCCT can guide research on the radiation physics problems in ICTS. (author)
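
    PFPAIS itself is a specialized scheme, but the importance-sampling idea it builds on is easy to demonstrate: sample rare events from a biased distribution and reweight each hit by the likelihood ratio. The sketch below estimates an exponential tail probability; the threshold and densities are illustrative, not taken from the paper.

    # Importance sampling for a rare event: P(X > T) with X ~ Exp(1).
    import random, math

    random.seed(3)
    T, N = 10.0, 100_000
    exact = math.exp(-T)                  # analytic answer: e^-10 ~ 4.54e-5

    # Naive Monte Carlo: very few samples ever exceed T.
    naive = sum(random.expovariate(1.0) > T for _ in range(N)) / N

    # Biased sampling from g(x) = lam*exp(-lam*x) with lam < 1 reaches the
    # tail often; each hit carries the weight f(x)/g(x).
    lam = 0.2
    est = 0.0
    for _ in range(N):
        x = random.expovariate(lam)
        if x > T:
            est += math.exp(-x) / (lam * math.exp(-lam * x))
    est /= N

    print(f"exact {exact:.3e}  naive {naive:.3e}  importance {est:.3e}")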

  13. OGRE, Monte-Carlo System for Gamma Transport Problems

    International Nuclear Information System (INIS)

    1984-01-01

    1 - Nature of physical problem solved: The OGRE programme system was designed to calculate, by Monte Carlo methods, any quantity related to gamma-ray transport. The system is represented by two examples, OGRE-P1 and OGRE-G. The OGRE-P1 programme is a simple prototype which calculates the dose rate on one side of a slab due to a plane source on the other side. The OGRE-G programme, a prototype of a programme utilizing a general-geometry routine, calculates the dose rate at arbitrary points. A very general source description in OGRE-G may be employed by reading a tape prepared by the user. 2 - Method of solution: Case histories of gamma rays in the prescribed geometry are generated and analyzed to produce averages of any desired quantity, which, in the case of the prototypes, are gamma-ray dose rates. The system is designed to achieve generality by ease of modification. No importance sampling is built into the prototypes; a very general geometry subroutine permits the treatment of complicated geometries. This is essentially the same routine used in the O5R neutron transport system. Boundaries may be either planes or quadratic surfaces, arbitrarily oriented and intersecting in arbitrary fashion. Cross-section data are prepared by the auxiliary master cross-section programme XSECT, which may be used to originate, update, or edit the master cross-section tape. The master cross-section tape is utilized in the OGRE programmes to produce detailed tables of macroscopic cross sections which are used during the Monte Carlo calculations. 3 - Restrictions on the complexity of the problem: Maximum cross-section array information may be estimated by a given formula for a specific problem. The number of regions must be less than or equal to 50.

  14. 75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations

    Science.gov (United States)

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-04] Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations AGENCY: Office of the Chief Information Officer... Following Information Title of Proposal: Enterprise Income Verification (EIV) System- Debts Owed to PHAs and...

  15. 75 FR 4101 - Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior...

    Science.gov (United States)

    2010-01-26

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5376-N-05] Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY... lists the following information: Title of Proposal: Enterprise Income Verification (EIV) System User...

  16. Control and verification of industrial hybrid systems using models specified with the formalism $\chi$

    NARCIS (Netherlands)

    J.J.H. Fey

    1996-01-01

    Control and verification of hybrid systems is studied using two industrial examples. The hybrid models of a conveyor-belt and of a biochemical plant for the production of ethanol are specified in the formalism $\chi$. A verification of the closed-loop systems for those examples,

  17. Microgravity Acceleration Measurement System (MAMS) Flight Configuration Verification and Status

    Science.gov (United States)

    Wagar, William

    2000-01-01

    The Microgravity Acceleration Measurement System (MAMS) is a precision spaceflight instrument designed to measure and characterize the microgravity environment existing in the US Lab Module of the International Space Station. Both vibratory and quasi-steady triaxial acceleration data are acquired and provided to an Ethernet data link. The MAMS Double Mid-Deck Locker (DMDL) EXPRESS Rack payload meets all the ISS IDD and ICD interface requirements, as discussed in the paper, which also presents flight configuration illustrations. The overall MAMS sensor and data acquisition performance and verification data are presented, in addition to a discussion of the command and data handling features implemented via the ISS, the downlink and the GRC Telescience Center displays.

  18. Dual-use benefits of the CTBT verification system

    International Nuclear Information System (INIS)

    Meade, C.E.F.

    1999-01-01

    Since its completion in September 1996, the CTBT has been signed by 151 countries. While awaiting the 44 required ratifications and entry into force, all of the nuclear powers have imposed unilateral moratoriums on nuclear test explosions. The end of these weapons development activities is often cited as the principal benefit of the CTBT. As the world begins to implement the Treaty, it has become clear that the development and operation of the CTBT verification system will provide a wide range of additional benefits if the data analysis products are made available for dual-purpose applications. As this paper describes, these could have economic and social implications, especially for countries with limited technical infrastructures. They involve seismic monitoring, mineral exploration, and scientific and technical training.

  19. SWAT2: The improved SWAT code system by incorporating the continuous energy Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Mochizuki, Hiroki; Suyama, Kenya; Okuno, Hiroshi

    2003-01-01

    SWAT is a code system which performs burnup calculations by combining the neutronics calculation code SRAC95 and the one-group burnup calculation code ORIGEN2.1. The SWAT code system can deal with the cell geometries available in SRAC95. However, a precise treatment of resonance absorption by SRAC95 using the ultra-fine-group cross-section library is not directly applicable to two- or three-dimensional geometry models because of restrictions in SRAC95. To overcome this problem, SWAT2, which newly introduces the continuous-energy Monte Carlo code MVP into SWAT, was developed. Burnup calculations with continuous energy in any geometry thereby became possible. Moreover, using the 147-group cross-section library called the SWAT library, reactions which are not dealt with by SRAC95 and MVP can be treated. The OECD/NEA burnup credit criticality safety benchmark problems Phase-IB (PWR, a single pin cell model) and Phase-IIIB (BWR, fuel assembly model) were calculated as a verification of SWAT2, and the results were compared with the average values of the results from the burnup calculation codes of each participating organization. Through the two benchmark problems, it was confirmed that SWAT2 is applicable to burnup calculations in complicated geometries. (author)

  20. A new approach for the verification of optical systems

    Science.gov (United States)

    Siddique, Umair; Aravantinos, Vincent; Tahar, Sofiène

    2013-09-01

    Optical systems are increasingly used in microsystems, telecommunication, aerospace and the laser industry. Due to the complexity and sensitivity of optical systems, their verification poses many challenges to engineers. Traditionally, the analysis of such systems has been carried out by paper-and-pencil based proofs and numerical computations. However, these techniques cannot provide perfectly accurate results due to the risk of human error and the inherent approximations of numerical algorithms. In order to overcome these limitations, we propose to use theorem proving (i.e., a computer-based technique that allows one to express mathematical expressions and reason about them by taking into account all the details of mathematical reasoning) as an alternative to computational and numerical approaches to improve optical system analysis in a comprehensive framework. In particular, this paper provides a higher-order logic (a language used to express mathematical theories) formalization of ray optics in the HOL Light theorem prover. Based on the multivariate analysis library of HOL Light, we formalize the notion of light ray and optical system (by defining medium interfaces, mirrors, lenses, etc.), i.e., we express these notions mathematically in the software. This allows us to derive general theorems about the behavior of light in such optical systems. In order to demonstrate the practical effectiveness, we present the stability analysis of a Fabry-Perot resonator.
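
    The kind of statement such a formalization makes precise can be seen in the standard ray-transfer-matrix treatment of resonator stability (textbook ray optics, not the paper's HOL Light development itself):

    \[
    \begin{pmatrix} y' \\ \theta' \end{pmatrix}
    =
    \begin{pmatrix} A & B \\ C & D \end{pmatrix}
    \begin{pmatrix} y \\ \theta \end{pmatrix},
    \qquad
    \text{resonator stable} \iff \left|\frac{A+D}{2}\right| \le 1,
    \]

    where $y$ and $\theta$ are the ray height and angle, and the matrix is the unit-determinant round-trip ray matrix of the cavity.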

  1. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, A.; Larsen, K.G.; Møller, M.H.

    2012-01-01

    ...of a leader election protocol: modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  2. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, results in low precision and low efficiency. A machine-vision-based verification system for steel rules was designed with reference to JJG1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and cleans contamination from the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly prove that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
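
    The pixel-equivalent calibration named above reduces to a ratio: image a reference gauge of known length and divide by its extent in pixels. A minimal sketch, with invented pixel coordinates standing in for the detected end marks:

    # Pixel-equivalent calibration and measurement (illustrative values only).

    def pixel_equivalent(ref_length_mm: float, x_left: float, x_right: float) -> float:
        """mm per pixel from a reference length seen between two pixel positions."""
        return ref_length_mm / abs(x_right - x_left)

    def measure_mm(x_a: float, x_b: float, mm_per_px: float) -> float:
        """Convert a pixel distance on the rule image to millimetres."""
        return abs(x_b - x_a) * mm_per_px

    mm_per_px = pixel_equivalent(100.0, 112.4, 2160.9)   # 100 mm gauge in view
    print(round(mm_per_px, 6), "mm/pixel")               # -> 0.048816 mm/pixel
    print(round(measure_mm(250.0, 1274.3, mm_per_px), 3), "mm")   # -> 50.002 mm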

  3. VALGALI: VERIFICATION AND VALIDATION TOOL FOR THE LIBRARIES PRODUCED BY THE GALILEE SYSTEM

    International Nuclear Information System (INIS)

    Mengelle, S.

    2011-01-01

    In this paper we present VALGALI, the verification and validation tool for the libraries produced by the nuclear data processing system GALILEE. The aim of this system is to provide libraries with consistent physical data for various application codes (the deterministic transport code APOLLO2, the Monte Carlo transport code TRIPOLI-4, the depletion code DARWIN, ...). For each library: are the data stored in the right place and in the right format, and so on? Are the libraries used by the various codes consistent? What is the physical quality of the cross sections and other data present in the libraries? These three types of tests correspond to the classic stages of V&V. The great strength of VALGALI is that it is generic and not dedicated to one application code. Consequently, it is based on a common physical validation database whose coverage is regularly increased. For all these test cases, the input data are declined for each relevant application code. Moreover, specific test cases can exist for each application code. At present, VALGALI can check and validate the libraries of APOLLO2 and TRIPOLI-4, but in the near future VALGALI will also treat the libraries of DARWIN.

  4. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

    An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was completed in March 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or to an industrial complex. This paper describes the advanced design features adopted in the SMART design and also introduces the design and engineering verification program. In the beginning stage of SMART development, top-level requirements for safety and economics were imposed on the SMART design features. To meet these requirements, highly advanced design features enhancing safety, reliability, performance, and operability are introduced in the SMART design. SMART consists of proven KOFA (Korea Optimized Fuel Assembly) fuel, helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. In order to enhance the safety characteristics, innovative design features adopted in the SMART system include low core power density, a large negative moderator temperature coefficient (MTC), high natural-circulation capability, and an integral arrangement that eliminates large-break loss-of-coolant accidents. The progression of emergency situations into accidents is prevented with a number of advanced engineered safety features such as a passive residual heat removal system, a passive emergency core cooling system, a safeguard vessel, and passive containment over-pressure protection. The preliminary

  5. A GIS support system for declaration and verification

    International Nuclear Information System (INIS)

    Poucet, A.; Contini, S.; Bellezza, F.

    2001-01-01

    The timely detection of a diversion of a significant amount of nuclear material from the civil cycle is a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declaration and verification activities. The information dealt with is that requested by both the INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform, and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ArcView GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. The main characteristics of SIT are: capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; selected access to open-source information via the Internet; capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; capability to perform aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; capability to easily add and run external models, e.g. for material data accounting, completeness checks, air dispersion and material flow graph generation, and to present the results in graphical form; and capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, in order to allow open-source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of

  6. Verification and Validation of Flight-Critical Systems

    Science.gov (United States)

    Brat, Guillaume

    2010-01-01

    For the first time in many years, the NASA budget presented to Congress calls for a focused effort on the verification and validation (V&V) of complex systems. This is mostly motivated by the results of the VVFCS (V&V of Flight-Critical Systems) study, which should materialize as a concrete effort under the Aviation Safety program. This talk will present the results of the study, from the requirements coming out of discussions with the FAA and the Joint Planning and Development Office (JPDO), to a technical plan addressing the issue, and its proposed current and future V&V research agenda, which will be addressed by NASA Ames, Langley, and Dryden as well as external partners through NASA Research Announcements (NRA) calls. This agenda calls for pushing V&V earlier in the life cycle and taking advantage of formal methods to increase safety and reduce the cost of V&V. I will present the on-going research work (especially the four main technical areas: Safety Assurance, Distributed Systems, Authority and Autonomy, and Software-Intensive Systems), possible extensions, and how VVFCS plans on grounding the research in realistic examples, including an intended V&V test-bench based on an Integrated Modular Avionics (IMA) architecture and hosted by Dryden.

  7. Starting a simple IMRT verification system with the Elekta Monaco planner and Elekta iViewGT

    International Nuclear Information System (INIS)

    Ayala Lazaro, R.; Garcia Hernandez, M. J.; Gomez Cores, S.; Jimenez Rojas, R.; Sendon del Rio, J. R.; Polo Cezon, R.; Gomez Calvar, R.

    2013-01-01

    The use of electronic portal imaging devices (EPIDs) is considered a fast and effective way, with no added cost, of verifying static or dynamic IMRT treatments. Their implementation as a verification tool, however, can be quite complicated. We present an easy way of setting up this system using the method of Lee et al. and the Elekta Monaco planner. (Author)

  8. ALGORITHM VERIFICATION FOR A TLD PERSONAL DOSIMETRY SYSTEM

    International Nuclear Information System (INIS)

    SHAHEIN, A.; SOLIMAN, H.A.; MAGHRABY, A.

    2008-01-01

    Dose algorithms are used in thermoluminescence personnel dosimetry for the interpretation of the dosimeter response in terms of equivalent dose. In the present study, an automated Harshaw 6600 reader was rigorously tested prior to its use for the dose calculation algorithm, according to the standard established by the US Department of Energy Laboratory Accreditation Program (DOELAP). A manual Harshaw 4500 reader was also used, along with the ICRU slab phantom and the RANDO phantom, in experimentally determining the personal photon doses in terms of deep dose, Hp(10); shallow dose, Hp(0.07); and eye lens dose, Hp(3). In addition, the free Monte Carlo simulation code VMC-dc was used to simulate the RANDO phantom irradiation process. The accuracy of the automated system lies well within the DOELAP tolerance limits in all test categories

  9. Monte Carlo Alpha Iteration Algorithm for a Subcritical System Analysis

    Directory of Open Access Journals (Sweden)

    Hyung Jin Shim

    2015-01-01

    The α-k iteration method, which searches for the fundamental-mode alpha-eigenvalue via iterative updates of the fission source distribution, has been successfully used for Monte Carlo (MC) alpha-static calculations of supercritical systems. However, the α-k iteration method for deep subcritical system analysis suffers from a gigantic number of neutron generations or a huge neutron weight, which leads to an abnormal termination of the MC calculations. In order to stably estimate the prompt neutron decay constant (α) of prompt subcritical systems regardless of subcriticality, we propose a new MC alpha-static calculation method named the α iteration algorithm. The new method is derived by directly applying the power method to the α-mode eigenvalue equation, and its calculational stability is achieved by controlling the number of time-source neutrons, which are generated in proportion to α divided by the neutron speed in MC neutron transport simulations. The effectiveness of the α iteration algorithm is demonstrated for two-group homogeneous problems with varying subcriticality by comparison with analytic solutions. The applicability of the proposed method is evaluated for an experimental benchmark of a thorium-loaded accelerator-driven system.
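
    The power method that the α iteration applies to the α-mode eigenvalue equation is simple to state in code. In the sketch below a small matrix stands in for the Monte Carlo transport operator; the actual method instead controls the production of time-source neutrons inside the transport simulation.

    # Power iteration: repeatedly apply the operator and renormalize, so the
    # iterate converges to the dominant (fundamental-mode) eigenpair.

    def power_iteration(matvec, x, iters=200):
        """Return (dominant eigenvalue, eigenvector) of the linear map matvec."""
        for _ in range(iters):
            y = matvec(x)
            eig = max(abs(v) for v in y)      # normalization = eigenvalue guess
            x = [v / eig for v in y]
        return eig, x

    A = [[0.9, 0.3],
         [0.2, 0.8]]
    matvec = lambda x: [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
    eig, vec = power_iteration(matvec, [1.0, 1.0])
    print(round(eig, 4))   # -> 1.1, the dominant eigenvalue of A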

  10. Development of NSSS Control System Performance Verification Tool

    International Nuclear Information System (INIS)

    Sohn, Suk Whun; Song, Myung Jun

    2007-01-01

    Thanks to its many control systems and control components, a nuclear power plant can be operated safely and efficiently under transient conditions as well as at steady state. If a fault or an error exists in the control systems, the nuclear power plant may experience unwanted and unexpected transients. Therefore, the performance of these control systems and control components should be completely verified through the power ascension tests of the startup period. However, there are many occasions on which control components must be replaced, control logic modified, or setpoints changed. To make such changes, it is important to verify the performance of the changed control system without redoing the power ascension tests. Up to now, a simulation method using the computer codes employed in the design of nuclear power plants was commonly used to verify performance. But if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not effective for verifying the performance of the changed control system. Many tests related to V&V (verification and validation) are performed in the factory as well as in the plant to eliminate errors that might be generated in hardware manufacturing or software coding. Experience reveals that these field tests and the simulation method are insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this fact. One occurred at 50% reactor power and caused a reactor trip. The other occurred during the 70% loss-of-main-feedwater-pump test and caused an excess turbine runback

  11. Experimental verification of EGSnrc Monte Carlo calculated depth doses within a realistic parallel magnetic field in a polystyrene phantom.

    Science.gov (United States)

    Ghila, Andrei; Steciw, Stephen; Fallone, B Gino; Rathee, Satyapal

    2017-09-01

    Integrating a linac with a magnetic resonance imager (MRI) will revolutionize the accuracy of external beam radiation treatments. Irradiating in the presence of a strong magnetic field, however, will modify the dose distribution. These dose modifications have been investigated previously, mainly using Monte Carlo simulations. The purpose of this work is to experimentally verify the use of the EGSnrc Monte Carlo (MC) package for calculating percent depth doses (PDDs) in a homogeneous phantom, in the presence of a realistic parallel magnetic field. Two cylindrical electromagnets were used to produce a 0.207 T magnetic field parallel to the central axis of a 6 MV photon beam from a clinical linac. The magnetic field was measured at discrete points along orthogonal axes, and these measurements were used to validate a full 3D magnetic field map generated using COMSOL Multiphysics. Using a small parallel plate ion chamber, the depth dose was measured in a polystyrene phantom placed inside the electromagnet bore at two separate locations: phantom top surface coinciding with top of bore, and phantom top surface coinciding with center of bore. BEAMnrc MC was used to model the linac head which was benchmarked against the linac's commissioning measurements. The depth dose in polystyrene was simulated using DOSXYZnrc MC. For the magnetic field case, the DOSXYZnrc code was slightly modified to implement the previously calculated 3D magnetic field map to be used in the standard electromagnetic macros. The calculated magnetic field matched the measurements within 2% of the maximum central field (0.207 T) with most points within the experimental uncertainty (1.5%). For the MC linac head model, over 93% of all simulated points passed the 2%, 2 mm γ acceptance criterion, when comparing measured and simulated lateral beam and depth dose profiles. The parallel magnetic field caused a surface dose increase, compared to the no magnetic field case, due to the Lorentz force confining

  12. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples are provided in support of the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies are introduced to provide the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...

  13. Clear-PEM system counting rates: a Monte Carlo study

    Science.gov (United States)

    Rodrigues, P.; Trindade, A.; Varela, J.

    2007-01-01

    Positron Emission Mammography (PEM) with 18F-Fluorodeoxyglucose (18F-FDG) is a functional imaging technique for breast cancer detection. The development of dedicated imaging systems with high sensitivity and spatial resolution is crucial for early breast cancer diagnosis and an efficient therapy. Clear-PEM is a dual planar scanner designed for high-resolution breast cancer imaging under development by the Portuguese PET Mammography consortium within the Crystal Clear Collaboration. It brings together a favorable combination of high-density scintillator crystals coupled to compact photodetectors, arranged in a double readout scheme capable of providing depth-of-interaction information. A Monte Carlo study of the Clear-PEM system counting rates is presented in this paper. Hypothetical breast exam scenarios were simulated to estimate the single event rates, true and random coincidence rates. A realistic description of the patient and detector geometry, radiation environment, physics and instrumentation factors was adopted in this work. Special attention was given to the 18F-FDG accumulation in the patient torso organs which, for the Clear-PEM scanner, represents significant activity outside the field-of-view (FOV) contributing to an increase of singles, randoms and scattered coincidences affecting the overall system performance. The potential benefits of patient shielding to minimize the influence of the out-of-field background were explored. The influence of LYSO:Ce crystal intrinsic natural activity due to the presence of the 176Lu isotope on the counting rate performance of the proposed scanner was also investigated.
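
    For intuition about why out-of-FOV activity matters, the standard accidental-coincidence estimate R = 2·τ·S1·S2 can be evaluated directly: randoms grow with the product of the singles rates on the two detector heads. The snippet below is a back-of-envelope sketch; the coincidence window and singles rates are invented numbers, not Clear-PEM results.

```python
def randoms_rate(singles1_hz, singles2_hz, tau_s):
    """Standard accidental-coincidence estimate R = 2*tau*S1*S2."""
    return 2.0 * tau_s * singles1_hz * singles2_hz

tau = 4.0e-9            # assumed 4 ns coincidence window
s_fov = 1.0e6           # singles per head from in-FOV activity (invented)
s_torso = 0.5e6         # extra singles from out-of-field torso uptake (invented)

print(randoms_rate(s_fov, s_fov, tau))                      # ~8.0e3 cps
print(randoms_rate(s_fov + s_torso, s_fov + s_torso, tau))  # ~1.8e4 cps
```

    The quadratic growth of randoms with the singles rate is exactly why out-of-field uptake and the 176Lu intrinsic background degrade the counting-rate performance, and why shielding pays off.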

  14. Secure stand alone positive personnel identity verification system (SSA-PPIV)

    International Nuclear Information System (INIS)

    Merillat, P.D.

    1979-03-01

    The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared to a central computer driven system are detailed

  15. Runtime verification of embedded real-time systems.

    Science.gov (United States)

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability, thus facilitating applications of the framework in both a prototyping and a post-deployment phase of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the time point at which the operator is executed and in the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
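
    As a software analogue of the hardware observers described above, the sketch below monitors a time-bounded past operator ("p held at some time within the last [a, b] ticks") over a discrete-time trace. It is only for intuition: the deque-based bookkeeping here does not have the doubly logarithmic complexity of the paper's hardware algorithms, and the trace is invented.

```python
from collections import deque

class OnceWithin:
    """Observer for: p was true at some tick t with n-b <= t <= n-a (a <= b)."""
    def __init__(self, a, b):
        self.a, self.b = a, b
        self.hits = deque()              # ticks at which p was observed true

    def step(self, n, p):
        if p:
            self.hits.append(n)
        while self.hits and self.hits[0] < n - self.b:
            self.hits.popleft()          # drop hits that fell out of the window
        return bool(self.hits) and self.hits[0] <= n - self.a

mon = OnceWithin(a=2, b=5)
trace = [False, True, False, False, False, False, False, False]
print([int(mon.step(n, p)) for n, p in enumerate(trace)])  # [0, 0, 0, 1, 1, 1, 1, 0]
```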

  16. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the

  17. Quantitative dosimetric verification of an IMRT planning and delivery system

    International Nuclear Information System (INIS)

    Low, D.A.; Mutic, S.; Dempsey, J.F.; Gerber, R.L.; Bosch, W.R.; Perez, C.A.; Purdy, J.A.

    1998-01-01

    Background and purpose: The accuracy of dose calculation and delivery of a commercial serial tomotherapy treatment planning and delivery system (Peacock, NOMOS Corporation) was experimentally determined. Materials and methods: External beam fluence distributions were optimized and delivered to test treatment plan target volumes, including three with cylindrical targets with diameters ranging from 2.0 to 6.2 cm and lengths of 0.9 through 4.8 cm, one using three cylindrical targets and two using C-shaped targets surrounding a critical structure, each with different dose distribution optimization criteria. Computer overlays of film-measured and calculated planar dose distributions were used to assess the dose calculation and delivery spatial accuracy. A 0.125 cm³ ionization chamber was used to conduct absolute point dosimetry verification. Thermoluminescent dosimetry chips, a small-volume ionization chamber and radiochromic film were used as independent checks of the ion chamber measurements. Results: Spatial localization accuracy was found to be better than ±2.0 mm in the transverse axes (with one exception of 3.0 mm) and ±1.5 mm in the longitudinal axis. Dosimetric verification using single slice delivery versions of the plans showed that the relative dose distribution was accurate to ±2% within and outside the target volumes (in high dose and low dose gradient regions) with a mean and standard deviation for all points of -0.05% and 1.1%, respectively. The absolute dose per monitor unit was found to vary by ±3.5% of the mean value due to the lack of consideration for leakage radiation and the limited scattered radiation integration in the dose calculation algorithm. To deliver the prescribed dose, adjustment of the monitor units by the measured ratio would be required. Conclusions: The treatment planning and delivery system offered suitably accurate spatial registration and dose delivery of serial tomotherapy generated dose distributions. The quantitative dose
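
    Planar dose comparisons of this kind are commonly scored with a gamma index combining a dose tolerance and a distance-to-agreement. The following one-dimensional sketch, with a 2%/2 mm criterion and made-up profiles, shows the essential computation; it is not the analysis software used in the study.

```python
import numpy as np

def gamma_1d(ref, meas, dx_mm, dose_tol=0.02, dist_mm=2.0):
    """Global gamma: for each reference point, the closest approach in
    (dose difference, distance) space, each scaled by its tolerance."""
    norm = dose_tol * ref.max()
    x = np.arange(len(ref)) * dx_mm
    out = []
    for i, d in enumerate(ref):
        dd = (meas - d) / norm                 # scaled dose differences
        dr = (x - x[i]) / dist_mm              # scaled spatial offsets
        out.append(np.sqrt(dd**2 + dr**2).min())
    return np.array(out)

pos = np.linspace(-30.0, 30.0, 61)             # 1 mm grid (invented)
ref = np.exp(-pos**2 / 200.0)                  # reference profile
meas = 1.01 * np.exp(-(pos - 0.5)**2 / 200.0)  # shifted, rescaled copy
g = gamma_1d(ref, meas, dx_mm=1.0)
print((g <= 1.0).mean())                       # fraction of points passing
```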

  18. Monte Carlo dose calculation algorithm on a distributed system

    International Nuclear Information System (INIS)

    Chauvie, Stephane; Dominoni, Matteo; Marini, Piergiorgio; Stasi, Michele; Pia, Maria Grazia; Scielzo, Giuseppe

    2003-01-01

    The main goal of modern radiotherapy, such as 3D conformal radiotherapy and intensity-modulated radiotherapy, is to deliver a high dose to the target volume while sparing the surrounding healthy tissue. The accuracy of dose calculation in a treatment planning system is therefore a critical issue. Among the many algorithms developed over recent years, those based on Monte Carlo have proven very promising in terms of accuracy. The most severe obstacle to their application in clinical practice is the long computation time required. We have studied a high-performance network of personal computers as a realistic alternative to high-cost dedicated parallel hardware, to be used routinely as an instrument for the evaluation of treatment plans. We set up a Beowulf cluster, configured with 4 nodes connected by a low-cost network, and installed the MC code Geant4 to describe our irradiation facility. The MC code, once parallelised, was run on the Beowulf cluster. The first run of the full simulation showed that the time required for calculation decreased linearly as the number of distributed processes increased. The good scalability trend allows both statistically significant accuracy and good time performance. The scalability of the Beowulf cluster system offers a new instrument for dose calculation that could be applied in clinical practice. This would be a good support particularly for highly challenging prescriptions that need good calculation accuracy in zones of high dose gradient and large inhomogeneities
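
    The linear scaling reported here follows from the embarrassingly parallel structure of MC dose calculation: independent histories can be split across nodes and the tallies summed at the end. A minimal sketch of that structure, with a toy per-history kernel standing in for a Geant4 run, might look as follows.

```python
import random
from multiprocessing import Pool

def run_batch(args):
    """Track one batch of histories; a real worker would call Geant4 here."""
    n_histories, seed = args
    rng = random.Random(seed)                  # independent stream per node
    return sum(rng.expovariate(1.0) for _ in range(n_histories))  # toy tally

if __name__ == "__main__":
    n_nodes, n_total = 4, 1_000_000
    jobs = [(n_total // n_nodes, seed) for seed in range(n_nodes)]
    with Pool(n_nodes) as pool:
        mean_deposit = sum(pool.map(run_batch, jobs)) / n_total
    print(mean_deposit)   # statistics of n_total histories in ~1/n_nodes the wall time
```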

  19. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Nets (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. In order to support the flow from modeling in CPN to mathematical proof in PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of the Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)

  20. VERIFICATION OF THE FOOD SAFETY MANAGEMENT SYSTEM IN DEEP FROZEN FOOD PRODUCTION PLANT

    Directory of Open Access Journals (Sweden)

    Peter Zajác

    2010-07-01

    Full Text Available This work presents verification of the food safety management system in a deep-frozen food production plant. The main emphasis is on creating a set of verification questions for the articles of the standard STN EN ISO 22000:2006 and on examining the effectiveness of the food safety management system. Information was acquired from scientific literature sources, and it points out the importance of implementing and maintaining an effective food safety management system. doi:10.5219/28

  1. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  2. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Directory of Open Access Journals (Sweden)

    Xinyan Gao

    2013-01-01

    Full Text Available We propose a verification solution based on the characteristic set of Wu's method towards SystemVerilog assertion checking over digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on the algebraic representations using the characteristic set of a polynomial system. This symbolic algebraic approach is a useful supplement to the existing verification methods based on simulation.
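
    The polynomial modeling idea can be shown in miniature: over GF(2), an AND gate becomes z = x·y, XOR becomes z = x + y, and an assertion holds iff its polynomial vanishes on every solution of the circuit polynomials. The brute-force check below is only a stand-in for the characteristic-set reduction of Wu's method, which avoids exhaustive enumeration; the half-adder circuit and assertion are invented examples.

```python
from itertools import product

def circuit(x, y):
    """A half adder written as GF(2) polynomials: s = x + y, c = x*y."""
    return (x + y) % 2, (x * y) % 2

def assertion_poly(x, y, s, c):
    """Claim: s + c equals x OR y = x + y + x*y; zero iff the claim holds.
    Over GF(2) subtraction equals addition, so the residual is a plain sum."""
    return (s + c + x + y + x * y) % 2

print(all(assertion_poly(x, y, *circuit(x, y)) == 0
          for x, y in product((0, 1), repeat=2)))   # True: assertion verified
```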

  3. Monte Carlo isotopic inventory analysis for complex nuclear systems

    Science.gov (United States)

    Phruksarojanakun, Phiphat

    Monte Carlo Inventory Simulation Engine (MCise) is a newly developed method for calculating isotopic inventory of materials. It offers the promise of modeling materials with complex processes and irradiation histories, which pose challenges for current, deterministic tools, and has strong analogies to Monte Carlo (MC) neutral particle transport. The analog method, including considerations for simple, complex and loop flows, is fully developed. In addition, six variance reduction tools provide unique capabilities of MCise to improve statistical precision of MC simulations. Forced Reaction forces an atom to undergo a desired number of reactions in a given irradiation environment. Biased Reaction Branching primarily focuses on improving statistical results of the isotopes that are produced from rare reaction pathways. Biased Source Sampling aims at increasing frequencies of sampling rare initial isotopes as the starting particles. Reaction Path Splitting increases the population by splitting the atom at each reaction point, creating one new atom for each decay or transmutation product. Delta Tracking is recommended for high-frequency pulsing to reduce the computing time. Lastly, Weight Window is introduced as a strategy to decrease large deviations of weight due to the uses of variance reduction techniques. A figure of merit is necessary to compare the efficiency of different variance reduction techniques. A number of possibilities for figure of merit are explored, two of which are robust and subsequently used. One is based on the relative error of a known target isotope (1/(R²T)) and the other on the overall detection limit corrected by the relative error (1/(DkR²T)). An automated Adaptive Variance-reduction Adjustment (AVA) tool is developed to iteratively define parameters for some variance reduction techniques in a problem with a target isotope. Sample problems demonstrate that AVA improves both precision and accuracy of a target result in an efficient manner
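
    The two figures of merit quoted above can be written out directly: with R the relative error of the target tally, T the computing time, and Dk the error-corrected detection limit, they are 1/(R²T) and 1/(DkR²T). A small sketch with invented numbers:

```python
def fom_relative_error(rel_err, cpu_time_s):
    """FOM = 1/(R^2 * T): constant for a given method as histories accumulate."""
    return 1.0 / (rel_err**2 * cpu_time_s)

def fom_detection_limit(det_limit, rel_err, cpu_time_s):
    """FOM = 1/(Dk * R^2 * T): additionally rewards a lower detection limit."""
    return 1.0 / (det_limit * rel_err**2 * cpu_time_s)

analog = fom_relative_error(rel_err=0.20, cpu_time_s=3600.0)
forced = fom_relative_error(rel_err=0.05, cpu_time_s=5400.0)
print(forced / analog)   # > 1 means the variance reduction paid off
```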

  4. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.

    2016-11-29

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations

  5. Clear-PEM system counting rates: a Monte Carlo study

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, P [Laboratorio de Instrumentacao e Fisica Experimental de Particulas (LIP), Av. Elias Garcia 14-1 1000-149 Lisbon (Portugal); Trindade, A [Laboratorio de Instrumentacao e Fisica Experimental de Particulas (LIP), Av. Elias Garcia 14-1 1000-149 Lisbon (Portugal); Varela, J [Laboratorio de Instrumentacao e Fisica Experimental de Particulas (LIP), Av. Elias Garcia 14-1 1000-149 Lisbon (Portugal)

    2007-01-15

    Positron Emission Mammography (PEM) with 18F-Fluorodeoxyglucose (18F-FDG) is a functional imaging technique for breast cancer detection. The development of dedicated imaging systems with high sensitivity and spatial resolution is crucial for early breast cancer diagnosis and an efficient therapy. Clear-PEM is a dual planar scanner designed for high-resolution breast cancer imaging under development by the Portuguese PET Mammography consortium within the Crystal Clear Collaboration. It brings together a favorable combination of high-density scintillator crystals coupled to compact photodetectors, arranged in a double readout scheme capable of providing depth-of-interaction information. A Monte Carlo study of the Clear-PEM system counting rates is presented in this paper. Hypothetical breast exam scenarios were simulated to estimate the single event rates, true and random coincidence rates. A realistic description of the patient and detector geometry, radiation environment, physics and instrumentation factors was adopted in this work. Special attention was given to the 18F-FDG accumulation in the patient torso organs which, for the Clear-PEM scanner, represents significant activity outside the field-of-view (FOV) contributing to an increase of singles, randoms and scattered coincidences affecting the overall system performance. The potential benefits of patient shielding to minimize the influence of the out-of-field background were explored. The influence of LYSO:Ce crystal intrinsic natural activity due to the presence of the 176Lu isotope on the counting rate performance of the proposed scanner was also investigated.

  6. Efficiencies of dynamic Monte Carlo algorithms for off-lattice particle systems with a single impurity

    KAUST Repository

    Novotny, M.A.

    2010-02-01

    The efficiency of dynamic Monte Carlo algorithms for off-lattice systems composed of particles is studied for the case of a single impurity particle. The theoretical efficiencies of the rejection-free method and of the Monte Carlo with Absorbing Markov Chains method are given. Simulation results are presented to confirm the theoretical efficiencies. © 2010.
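
    For readers unfamiliar with rejection-free dynamics: instead of proposing moves and discarding most of them, each step samples the next event directly from the catalogue of rates and advances the clock by an exponentially distributed waiting time. A generic sketch of one such step, with illustrative rates, and not the paper's specific algorithm:

```python
import random

def rejection_free_step(rates, rng=random):
    """Pick the next event in proportion to its rate and the waiting time."""
    total = sum(rates)
    dt = rng.expovariate(total)              # exponential waiting time
    r, acc = rng.random() * total, 0.0
    for i, rate in enumerate(rates):         # tower sampling of the event
        acc += rate
        if r < acc:
            return i, dt
    return len(rates) - 1, dt                # guard against float round-off

rates = [1e3, 5.0, 0.01]                     # fast host moves vs a rare impurity move
event, dt = rejection_free_step(rates)
print(event, dt)
```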

  7. Adjoint electron Monte Carlo calculations

    International Nuclear Information System (INIS)

    Jordan, T.M.

    1986-01-01

    Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: NEW CONDENSATOR, INC.--THE CONDENSATOR DIESEL ENGINE RETROFIT CRANKCASE VENTILATION SYSTEM

    Science.gov (United States)

    EPA's Environmental Technology Verification Program has tested New Condensator Inc.'s Condensator Diesel Engine Retrofit Crankcase Ventilation System. Brake specific fuel consumption (BSFC), the ratio of engine fuel consumption to the engine power output, was evaluated for engine...

  9. In pursuit of carbon accountability: the politics of REDD+ measuring, reporting and verification systems

    NARCIS (Netherlands)

    Gupta, A.; Lövbrand, E.; Turnhout, E.; Vijge, M.J.

    2012-01-01

    This article reviews critical social science analyses of carbon accounting and monitoring, reporting and verification (MRV) systems associated with reducing emissions from deforestation, forest degradation and conservation, sustainable use and enhancement of forest carbon stocks (REDD+). REDD+ MRV

  10. An Analysis of Specware and Its Usefulness in the Verification of High Assurance Systems

    National Research Council Canada - National Science Library

    DeCloss, Daniel P

    2006-01-01

    .... A verification system consists of a specification language that can express formal logic and an automated theorem tool that can be used to verify theorems and conjectures within the specifications...

  11. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...

  12. Verification of organ doses calculated by a dose monitoring software tool based on Monte Carlo Simulation in thoracic CT protocols.

    Science.gov (United States)

    Guberina, Nika; Suntharalingam, Saravanabavaan; Naßenstein, Kai; Forsting, Michael; Theysohn, Jens; Wetter, Axel; Ringelstein, Adrian

    2018-03-01

    Background: Monitoring of the radiation dose received by the human body during computed tomography (CT) examinations is of non-negligible importance. Several dose-monitoring software tools have emerged in order to monitor and control dose distribution during CT examinations. Some software tools incorporate Monte Carlo Simulation (MCS) and allow calculation of effective dose and organ dose apart from standard dose descriptors. Purpose: To verify the results of a dose-monitoring software tool based on MCS in the assessment of effective and organ doses in thoracic CT protocols. Material and Methods: Phantom measurements were performed with thermoluminescent dosimeters (TLD LiF:Mg,Ti) using two different thoracic CT protocols of the clinical routine: (I) standard CT thorax (CTT); and (II) CTT with high-pitch mode, P = 3.2. Radiation doses estimated with MCS and measured with TLDs were compared. Results: Inter-modality comparison showed an excellent correlation between MCS-simulated and TLD-measured doses ((I) after localizer correction r = 0.81; (II) r = 0.87). The following effective and organ doses were determined: (I) (a) effective dose = MCS 1.2 mSv, TLD 1.3 mSv; (b) thyroid gland = MCS 2.8 mGy, TLD 2.5 mGy; (c) thymus = MCS 3.1 mGy, TLD 2.5 mGy; (d) bone marrow = MCS 0.8 mGy, TLD 0.9 mGy; (e) breast = MCS 2.5 mGy, TLD 2.2 mGy; (f) lung = MCS 2.8 mGy, TLD 2.7 mGy; (II) (a) effective dose = MCS 0.6 mSv, TLD 0.7 mSv; (b) thyroid gland = MCS 1.4 mGy, TLD 1.8 mGy; (c) thymus = MCS 1.4 mGy, TLD 1.8 mGy; (d) bone marrow = MCS 0.4 mGy, TLD 0.5 mGy; (e) breast = MCS 1.1 mGy, TLD 1.1 mGy; (f) lung = MCS 1.2 mGy, TLD 1.3 mGy. Conclusion: Overall, in thoracic CT protocols, organ doses simulated by the dose-monitoring software tool were coherent with those measured by TLDs. Despite some challenges, the dose-monitoring software was capable of accurate dose calculation.

  13. Television system for verification and documentation of treatment fields during intraoperative radiation therapy

    International Nuclear Information System (INIS)

    Fraass, B.A.; Harrington, F.S.; Kinsella, T.J.; Sindelar, W.F.

    1983-01-01

    Intraoperative radiation therapy (IORT) involves direct treatment of tumors or tumor beds with large single doses of radiation. The verification of the area to be treated before irradiation and the documentation of the treated area are critical for IORT, just as for other types of radiation therapy. A television system which allows the target area to be directly imaged immediately before irradiation has been developed. Verification and documentation of treatment fields has made the IORT television system indispensable

  14. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated geo-located Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkouts and tuning, instrument and product calibration and data quality support, monitoring and data/products distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  15. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
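
    What correct TMR insertion must preserve can be reduced to a small executable property: a 2-of-3 majority voter masks any single corrupted replica while leaving fault-free behavior unchanged. The sketch below checks that property for a toy module in software; actual insertion and confirmation happen on the FPGA netlist, which is what makes the verification problem hard.

```python
def vote(a, b, c):
    """Bitwise 2-of-3 majority."""
    return (a & b) | (a & c) | (b & c)

def module(x):
    """Toy combinational block standing in for the protected logic."""
    return (x + 3) & 0xF

def tmr(x, faulty_replica=None):
    outs = [module(x) for _ in range(3)]     # three redundant copies
    if faulty_replica is not None:
        outs[faulty_replica] ^= 0x1          # inject a single-bit upset
    return vote(*outs)

# any single-replica fault is masked; fault-free output is unchanged
assert all(tmr(x, k) == module(x)
           for x in range(16) for k in (None, 0, 1, 2))
print("single faults masked")
```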

  16. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    Full Text Available With the development of biometric verification, we proposed a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes only a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application and overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also proposed a multiple-state solution to handle the heart-rate changes caused by sports activity. To our knowledge, this is the first work to address the issue of sports in ECG verification.
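
    A hedged sketch of what mean-interval style matching could look like (the paper's actual algorithm, features, and thresholds are not reproduced here): enrollment stores a mean beat interval per heart-rate "state", and verification picks the closest state before thresholding the distance, which captures the multiple-state idea for coping with exercise-induced rate changes. A real system would fuse this with waveform features; the intervals and tolerance below are invented.

```python
import statistics

def enroll(rr_by_state):
    """rr_by_state: e.g. {"rest": [...], "active": [...]} of R-R intervals (s)."""
    return {state: statistics.mean(rr) for state, rr in rr_by_state.items()}

def verify(template, rr_sample, tol=0.06):
    """Accept if the sample's mean interval is close to some enrolled state."""
    mean_rr = statistics.mean(rr_sample)
    state, ref = min(template.items(), key=lambda kv: abs(kv[1] - mean_rr))
    return abs(ref - mean_rr) <= tol, state

tmpl = enroll({"rest": [0.82, 0.85, 0.80], "active": [0.55, 0.57, 0.54]})
print(verify(tmpl, [0.56, 0.58, 0.55]))     # (True, 'active')
```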

  17. Verification and synthesis of optimal decision strategies for complex systems

    International Nuclear Information System (INIS)

    Summers, S. J.

    2013-01-01

    Complex systems make a habit of disagreeing with the mathematical models strategically designed to capture their behavior. A recursive process ensues where data is used to gain insight into the disagreement. A simple model may give way to a model with hybrid dynamics. A deterministic model may give way to a model with stochastic dynamics. In many cases, the modeling framework that sufficiently characterises the system is both hybrid and stochastic; these systems are referred to as stochastic hybrid systems. This dissertation considers the stochastic hybrid system framework for modeling complex systems and provides mathematical methods for analysing, and synthesizing decision laws for, such systems. We first propose a stochastic reach-avoid problem for discrete time stochastic hybrid systems. In particular, we present a dynamic programming based solution to a probabilistic reach-avoid problem for a controlled discrete time stochastic hybrid system. We address two distinct interpretations of the reach-avoid problem via stochastic optimal control. In the first case, a sum-multiplicative cost function is introduced along with a corresponding dynamic recursion that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an unsafe set at all preceding time steps. In the second case, we introduce a multiplicative cost function and a dynamic recursion that quantifies the probability of hitting a target set at the terminal time, while avoiding an unsafe set at all preceding time steps. In each case, optimal reach-avoid control policies are derived as the solution to an optimal control problem via dynamic programming. We next introduce an extension of the reach-avoid problem where we consider the verification of discrete time stochastic hybrid systems when there exists uncertainty in the reachability specifications themselves. A sum multiplicative cost function is introduced along with a corresponding dynamic recursion
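
    The sum-multiplicative recursion mentioned above can be made concrete on a toy finite-state example: with target set K and safe set A, V_N(x) = 1_K(x) and V_k(x) = 1_K(x) + 1_{A\K}(x) · max_u Σ_y Q(y|x,u) · V_{k+1}(y). The value iteration below, on an invented five-state chain with two actions, only illustrates the structure of the recursion, not the dissertation's hybrid-state machinery.

```python
import numpy as np

n, N = 5, 10
K = {4}                              # target set
A = {1, 2, 3, 4}                     # safe set (state 0 is "unsafe")
# one transition kernel per action: drift left or drift right (rows sum to 1)
Q = [np.array([[1.0, 0.0, 0.0, 0.0, 0.0],
               [0.6, 0.3, 0.1, 0.0, 0.0],
               [0.0, 0.6, 0.3, 0.1, 0.0],
               [0.0, 0.0, 0.6, 0.3, 0.1],
               [0.0, 0.0, 0.0, 0.6, 0.4]]),
     np.array([[0.3, 0.7, 0.0, 0.0, 0.0],
               [0.1, 0.2, 0.7, 0.0, 0.0],
               [0.0, 0.1, 0.2, 0.7, 0.0],
               [0.0, 0.0, 0.1, 0.2, 0.7],
               [0.0, 0.0, 0.0, 0.1, 0.9]])]

V = np.array([1.0 if x in K else 0.0 for x in range(n)])   # V_N
for _ in range(N):
    cont = np.maximum(*[q @ V for q in Q])                 # best action per state
    V = np.array([1.0 if x in K else (cont[x] if x in A else 0.0)
                  for x in range(n)])
print(V)   # V[x] = max probability of reaching K while staying in A
```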

  18. Compositional Verification of Interlocking Systems for Large Stations

    DEFF Research Database (Denmark)

    Fantechi, Alessandro; Haxthausen, Anne Elisabeth; Macedo, Hugo Daniel dos Santos

    2017-01-01

    -network: in this way granting the access to a route is essentially a decision local to the sub-network, and the interfaces with the rest of the network easily abstract away less interesting details related to the external world. Following up on previous work, where we defined a compositional verification method...

  19. Six types Monte Carlo for estimating the current unavailability of Markov system with dependent repair

    International Nuclear Information System (INIS)

    Xiao Gang; Li Zhizhong

    2004-01-01

    Based on an integral equation describing the life-history of a Markov system, six types of estimators of the current unavailability of a Markov system with dependent repair are proposed. Combined with biased sampling of the system's state transition times, six types of Monte Carlo methods for estimating the current unavailability are given. Two numerical examples are given to examine the variances and efficiencies of the six types of Monte Carlo methods. (authors)
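
    The plainest member of this estimator family can be sketched directly: simulate the alternating failure/repair process up to time t and score whether the system is down at t. The version below assumes exponential (independent) repair, which the paper's estimators generalise to dependent repair; the rates are invented.

```python
import random

def unavailability(t, lam, mu, n_hist=100_000, rng=random):
    """Analog MC estimate of q(t) for failure rate lam and repair rate mu."""
    down_at_t = 0
    for _ in range(n_hist):
        clock, up = 0.0, True
        while True:
            dwell = rng.expovariate(lam if up else mu)
            if clock + dwell > t:
                break                        # state at time t is the current one
            clock += dwell
            up = not up                      # alternate failure <-> repair
        down_at_t += not up
    return down_at_t / n_hist

# analytic check: q(t) = lam/(lam+mu) * (1 - exp(-(lam+mu)*t)) ~= 0.0909 here
print(unavailability(10.0, lam=0.1, mu=1.0))
```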

  20. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  1. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  2. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    Science.gov (United States)

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and the most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than that of passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC code FLUKA using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%, 3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists
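
    The degrader-dependent beam character is typically captured by tabulating spot size and energy spread against the delivered energy from commissioning measurements and interpolating per planned spot. The sketch below shows that pattern; the table is invented for illustration and is not Provision/IBA commissioning data.

```python
import numpy as np

energies = np.array([100.0, 140.0, 180.0, 226.0])    # MeV, measured points (invented)
sigma_mm = np.array([6.0, 4.6, 3.8, 3.2])            # spot sigma at isocentre
de_over_e = np.array([0.010, 0.007, 0.005, 0.002])   # relative energy spread

def beam_parameters(e_mev):
    """Interpolate the source parameters for one planned spot energy."""
    return (np.interp(e_mev, energies, sigma_mm),
            np.interp(e_mev, energies, de_over_e))

sigma, spread = beam_parameters(160.0)
print(sigma, spread)    # feed these into the MC source routine, spot by spot
```

    Lower energies pass through more degrader material, hence the larger spot size and energy spread at the low end of the table.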

  3. Dynamic Isotope Power System: technology verification phase, program plan, 1 October 1978

    International Nuclear Information System (INIS)

    1979-01-01

    The technology verification phase program plan of the Dynamic Isotope Power System (DIPS) project is presented. DIPS is a project to develop a 0.5 to 2.0 kW power system for spacecraft using an isotope heat source and a closed-cycle Rankine power system with an organic working fluid. The purposes of the technology verification phase are to increase the system efficiency to over 18%, to demonstrate system reliability, and to provide an estimate for flight test scheduling. Progress toward these goals is reported

  4. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    Full Text Available The signature of a person is one of the most popular and legally accepted behavioral biometrics, providing a secure means of verification and personal identification in many applications such as financial, commercial and legal transactions. The objective of a signature verification system is to discriminate between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features; it contains diacritics, ligatures, and overlapping. Because offline signatures lack any form of dynamic information about the writing process, it is more difficult to obtain high verification accuracy. This paper addresses this difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple-feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt the fuzzy set to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features to deal with the ambiguity of the questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).

  5. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In FPGA design verification, designers generally write test benches covering the verification activities at register-transfer level (RTL), gate level, and place-and-route. Writing the test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing the verification at each stage is a major bottleneck that demands many activities and much time; verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP that simultaneously verifies all the design modules using MATLAB/Simulink HDL co-simulation models. We introduce and discuss how this integrated verification technique can facilitate the verification processes and verify the entire set of design modules of the system simultaneously. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks

  6. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. In FPGA design verification, designers generally write test benches covering the verification activities at register-transfer level (RTL), gate level, and place-and-route. Writing the test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing the verification at each stage is a major bottleneck that demands many activities and much time; verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. In view of this, this work applied an integrated verification approach to the verification of an FPGA-based I and C system in an NPP that simultaneously verifies all the design modules using MATLAB/Simulink HDL co-simulation models. We introduce and discuss how this integrated verification technique can facilitate the verification processes and verify the entire set of design modules of the system simultaneously. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up the design verification and reduce the V and V tasks.

  7. Digital system verification a combined formal methods and simulation framework

    CERN Document Server

    Li, Lun

    2010-01-01

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5

  8. Cooling Tower (Evaporative Cooling System) Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Boyd, Brian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lewis, Taylor [Colorado Energy Office, Denver, CO (United States)

    2017-12-05

    This measurement and verification (M and V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with cooling tower efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M and V plan, and details the procedures to use to determine water savings.

  9. The inverse method parametric verification of real-time embedded systems

    CERN Document Server

    André , Etienne

    2013-01-01

    This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata.The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solv

  10. TLM.open: a SystemC/TLM Frontend for the CADP Verification Toolbox

    Directory of Open Access Journals (Sweden)

    Claude Helmstetter

    2014-04-01

    Full Text Available SystemC/TLM models, which are C++ programs, allow the simulation of embedded software before hardware low-level descriptions are available and are used as golden models for hardware verification. The verification of SystemC/TLM models is an important issue, since an error in the model can mislead the system designers or reveal an error in the specifications. An open-source simulator for SystemC/TLM is provided, but there are no tools for formal verification. In order to apply model checking to a SystemC/TLM model, a semantics for standard C++ code and for specific SystemC/TLM features must be provided. The usual approach relies on the translation of the SystemC/TLM code into a formal language for which a model checker is available. We propose another approach that suppresses the error-prone translation effort. Given a SystemC/TLM program, the transitions are obtained by executing the original code using g++ and an extended SystemC library, and we ask the user to provide additional functions to store the current model state. These additional functions generally represent less than 20% of the size of the original model, and make it possible to apply all CADP verification tools to the SystemC/TLM model itself.

  11. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    Abstract This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a monitored LSC chart. When the system is modeled as a network of TAs, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one-TA-per-instance line” approach, and then reduce the problem of scenario-based verification also to a CTL real-time model checking problem. We show how we exploit the expressivity of the TA formalism and the CTL query language of the real-time model checker UPPAAL to accomplish these tasks. The proposed two approaches

  12. A Cache System Design for CMPs with Built-In Coherence Verification

    Directory of Open Access Journals (Sweden)

    Mamata Dalui

    2016-01-01

    Full Text Available This work reports an effective design of a cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing a directory-based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA, referred to as single-length-cycle 2-attractor cellular automata (TACA), is employed to detect inconsistencies in the cache line states of the processors' private caches. The TACA module captures the coherence status of the CMPs' cache system and memorizes any inconsistent recording of the cache line states during the processors' references to a memory block. Theory has been developed to enable a TACA to analyse the cache state updates and then settle to an attractor state, indicating a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs' processor pool ensures better efficiency in determining the inconsistencies by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic shows that the overhead of the proposed coherence verification module is much less than that of conventional verification units and is insignificant with respect to the cost involved in the CMPs' cache system.

  13. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor...

  14. An Improved Constraint-Based System for the Verification of Security Protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov [30]. Our system features (1) a significantly more efficient implementation, and (2) a monotonic behavior, which also allows the detection of flaws associated with partial

  15. ATLANTIDES: An Architecture for Alert Verification in Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Crispo, Bruno; Etalle, Sandro

    2007-01-01

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  16. Proceedings of the 7th International Workshop on Verification of Infinite-State Systems (INFINITY'05)

    DEFF Research Database (Denmark)

    2005-01-01

    The aim of the workshop is, to provide a forum for researchers interested in the development of mathematical techniques for the analysis and verification of systems with infinitely many states. Topics: Techniques for modeling and analysis of infinite-state systems; Equivalence-checking and model-...

  17. ATLANTIDES: Automatic Configuration for Alert Verification in Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Crispo, B.; Etalle, Sandro

    2008-01-01

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  18. ATLANTIDES: An Architecture for Alert Verification in Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Crispo, Bruno; Etalle, Sandro

    We present an architecture designed for alert verification (i.e., to reduce false positives) in network intrusion-detection systems. Our technique is based on a systematic (and automatic) anomaly-based analysis of the system output, which provides useful context information regarding the network

  19. A Formal Approach for the Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian

    2011-01-01

    This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed...

  20. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, and (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs

  1. Formal verification and validation of the safety-critical software in a digital reactor protection system

    International Nuclear Information System (INIS)

    Kwon, K. C.; Park, G. Y.

    2006-01-01

    This paper describes the Verification and Validation (V and V) activities for the safety-critical software in a Digital Reactor Protection System (DRPS) that is being developed through the Korea nuclear instrumentation and control system project. The main activities of the DRPS V and V process are the preparation of the software planning documentation, verification of the software according to the software life cycle, software safety analysis, and software configuration management. The verification work for the Software Requirement Specification (SRS) of the DRPS consists of a technical evaluation, a licensing suitability evaluation, an inspection and traceability analysis, a formal verification, and the preparation of a test plan and procedure. In particular, the SRS is specified by a formal specification method in the development phase, and the formal SRS is verified by a formal verification method. Through these activities, we believe we can achieve the functionality, performance, reliability, and safety that are the major V and V objectives of nuclear safety-critical software in a DRPS. (authors)

  2. MONTE CARLO SIMULATION OF MULTIFOCAL STOCHASTIC SCANNING SYSTEM

    Directory of Open Access Journals (Sweden)

    LIXIN LIU

    2014-01-01

    Full Text Available Multifocal multiphoton microscopy (MMM) has greatly improved the utilization of excitation light and imaging speed due to parallel multiphoton excitation of the samples and simultaneous detection of the signals, which allows it to perform three-dimensional fast fluorescence imaging. Stochastic scanning can provide continuous, uniform and high-speed excitation of the sample, which makes it a suitable scanning scheme for MMM. In this paper, the graphical programming language LabVIEW is used to achieve stochastic scanning of the two-dimensional galvo scanners by using white noise signals to control the x and y mirrors independently. Moreover, the stochastic scanning process is simulated by using the Monte Carlo method. Our results show that MMM can avoid oversampling or subsampling in the scanning area and meet the requirements of uniform sampling by stochastically scanning the individual units of the N × N foci array. Therefore, continuous and uniform scanning in the whole field of view is implemented.
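
    As a toy illustration of the Monte Carlo check described above, the following Python sketch (all parameters are hypothetical, not the paper's) simulates white-noise scan positions within one unit cell of the foci array and reports how uniformly the pixels are hit:

      # Sketch: Monte Carlo estimate of scan-coverage uniformity for white-noise
      # scanning of an N x N foci array; simulating one unit cell suffices because
      # every focus visits the same relative position.
      import numpy as np

      rng = np.random.default_rng(0)
      unit = 32           # assumed pixels per unit cell scanned by each focus
      n_steps = 200000    # random scan positions

      x = rng.integers(0, unit, n_steps)
      y = rng.integers(0, unit, n_steps)
      hits = np.zeros((unit, unit))
      np.add.at(hits, (y, x), 1)   # accumulate per-pixel visit counts

      print("mean hits/pixel:", hits.mean())
      print("relative nonuniformity:", hits.std() / hits.mean())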

  3. Methods for identification and verification using vacuum XRF system

    Science.gov (United States)

    Schramm, Fred (Inventor); Kaiser, Bruce (Inventor)

    2005-01-01

    Apparatus and methods in which one or more elemental taggants that are intrinsically located in an object are detected by x-ray fluorescence analysis under vacuum conditions to identify or verify the object's elemental content for elements with lower atomic numbers. By using x-ray fluorescence analysis, the apparatus and methods of the invention are simple and easy to use, and provide detection by a non-line-of-sight method to establish the origin of objects, as well as their point of manufacture, authenticity, verification, security, and the presence of impurities. The invention is extremely advantageous because it provides the capability to measure lower atomic number elements in the field with a portable instrument.

  4. Evaluation of an electrocardiograph-based PICC tip verification system.

    Science.gov (United States)

    Oliver, Gemma; Jones, Matt

    Performing a chest x-ray after insertion of a peripherally inserted central catheter (PICC) is recognised as the gold standard for checking that the tip of the catheter is correctly positioned in the lower third of the superior vena cava at the right atrial junction; however, numerous problems are associated with this practice. A recent technological advancement utilises changes in a patient's electrocardiogram (ECG), recorded from the tip of the PICC, as a more reliable method. This evaluation discusses how a vascular access team in a large acute NHS Trust safely and successfully incorporated the use of ECG guidance technology for verification of PICC tip placement into their practice.

  5. NI PXI-Based Automated Measurement System for Digital ASICs Verification

    Directory of Open Access Journals (Sweden)

    Sorokoumov Georgiy

    2016-01-01

    Full Text Available The paper describes the structure of an automated measuring system used to test the electrical and functional parameters of digital ASICs. The automated measuring system is based on National Instruments PXI modules, of which the PXI-7954R module is the most significant. Hardware and software operation of the measuring system is discussed in the paper. The measuring system uses test vectors for the functional verification of digital ASICs.

  6. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    Science.gov (United States)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been used extensively to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for the verification of these large systems. DoE reduces the number of computer runs necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, about various operating scenarios, and about the identification of those scenarios with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to the analysis and verification of the ISS power system are provided.
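
    A minimal sketch of the DoE idea follows; the two-level factorial design is standard, but the factor names and the toy stability-margin function are invented placeholders rather than the ISS converter models:

      # Sketch: two-level factorial screening of (hypothetical) power-system
      # parameters against a placeholder stability margin, with main effects
      # computed as the mean response difference between the +1 and -1 levels.
      import itertools
      import numpy as np

      factors = ["filter_L", "filter_C", "load_P", "bus_R"]   # assumed factors
      levels = {-1: 0.8, +1: 1.2}                             # +/-20% of nominal

      def stability_margin(fL, fC, p, r):
          # Placeholder standing in for a full converter simulation.
          return 1.0 + 0.3*fL + 0.25*fC - 0.4*p + 0.1*r - 0.15*fL*p

      design = list(itertools.product([-1, 1], repeat=len(factors)))  # 2^4 runs
      y = np.array([stability_margin(*(levels[s] for s in run)) for run in design])

      X = np.array(design)
      for j, name in enumerate(factors):
          effect = y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
          print(f"{name:10s} main effect: {effect:+.3f}")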

  7. A complementary dual-modality verification for tumor tracking on a gimbaled linac system.

    Science.gov (United States)

    Poels, Kenneth; Depuydt, Tom; Verellen, Dirk; Engels, Benedikt; Collen, Christine; Heinrich, Steffen; Duchateau, Michael; Reynders, Truus; Leysen, Katrien; Boussaer, Marlies; Steenbeke, Femke; Tournel, Koen; Gevaert, Thierry; Storme, Guy; De Ridder, Mark

    2013-12-01

    For dynamic tracking of moving tumors, robust intra-fraction verification is required to assure that tumor motion is properly managed during the course of radiotherapy. A dual-modality verification system, consisting of an on-board orthogonal kV and planar MV imaging device, was validated and applied retrospectively to patient data. Real-time tumor tracking (RTTT) was managed by applying PAN and TILT angular corrections to the therapeutic beam using a gimbaled linac. In this study, orthogonal X-ray imaging and MV EPID fluoroscopy were acquired simultaneously. The tracking beam position was derived from real-time gimbals log files and from the detected field outline on the EPID, respectively. For both imaging modalities, the moving target was localized by detection of an implanted fiducial. The dual-modality tracking verification was validated against a high-precision optical camera in phantom experiments and applied to clinical tracking data from one liver and two lung cancer patients. Both verification modalities showed high accuracy in the phantom validation. During beam-on, clinical tracking showed a 90th percentile error (E90) of 3.45 mm (liver), 2.44 mm (lung A) and 3.40 mm (lung B) based on EPID fluoroscopy, in good agreement with XR log-file data (E90 of 3.13, 1.92 and 3.33 mm, respectively). Dual-modality verification was successfully implemented, offering the possibility of detailed reporting on RTTT performance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
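
    For reference, an E90 value of the kind reported above is simply the 90th percentile of the per-frame tracking errors; the sketch below uses synthetic data in place of the patient traces:

      # Sketch: E90 from per-frame 2D tracking errors (synthetic Rayleigh data).
      import numpy as np

      rng = np.random.default_rng(1)
      errors_mm = np.hypot(rng.normal(0, 1.2, 5000), rng.normal(0, 1.2, 5000))
      print(f"E90 = {np.percentile(errors_mm, 90):.2f} mm")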

  8. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)

    2012-05-15

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-µm-wide microbeams spaced by 200-400 µm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, the CT image voxel grid (a few cubic millimeters in volume) was decoupled from the dose bin grid, which has micrometer dimensions in the direction transversal to the microbeams. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at

  9. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  10. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available This article analyses the reliability and verification problems of widespread electric power system (EPS) simulation tools, all of which are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the reference model for the verification of any EPS simulation tool.

  11. Towards the Formal Verification of a Distributed Real-Time Automotive System

    Science.gov (United States)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, formally and pervasively verifying a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  12. RIACS Workshop on the Verification and Validation of Autonomous and Adaptive Systems

    OpenAIRE

    Pecheur, Charles; Visser, Willem; Simmons, Reid

    2001-01-01

    The long-term future of space exploration at the National Aeronautics and Space Administration (NASA) is dependent on the full exploitation of autonomous and adaptive systems, but mission managers are worried about the reliability of these more intelligent systems. The main focus of the workshop was to address these worries; hence, we invited NASA engineers working on autonomous and adaptive systems and researchers interested in the verification and validation of software systems. The dual pu...

  13. Monte Carlo simulations of lattice models for single polymer systems

    International Nuclear Information System (INIS)

    Hsu, Hsiao-Ping

    2014-01-01

    Single linear polymer chains in dilute solutions under good solvent conditions are studied by Monte Carlo simulations with the pruned-enriched Rosenbluth method up to the chain length N ∼ O(10^4). Based on the standard simple cubic lattice model (SCLM) with fixed bond length and the bond fluctuation model (BFM) with bond lengths in a range between 2 and √(10), we investigate the conformations of polymer chains described by self-avoiding walks on the simple cubic lattice, and by random walks and non-reversible random walks in the absence of excluded volume interactions. In addition to flexible chains, we extend our study to semiflexible chains with different stiffnesses controlled by a bending potential. The persistence lengths of the chains, extracted from the orientational correlations, are estimated for all cases. We show that chains based on the BFM are more flexible than those based on the SCLM for a fixed bending energy. The microscopic differences between these two lattice models are discussed, and the theoretical predictions of scaling laws given in the literature are checked and verified. Our simulations clarify that a different mapping ratio between the coarse-grained models and the atomistically realistic description of polymers is required in a coarse-graining approach due to the different crossovers to the asymptotic behavior.
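
    The persistence-length extraction mentioned above can be illustrated with a short sketch; for brevity it uses an off-lattice freely rotating chain (an assumption made here, not one of the paper's two lattice models), whose bond correlations decay exponentially:

      # Sketch: persistence length from bond orientational correlations,
      # <u(i).u(i+s)> ~ exp(-s/l_p), for a freely rotating chain with a fixed
      # bending angle theta (l_p in units of the bond length).
      import numpy as np

      rng = np.random.default_rng(2)
      N, theta = 2000, np.deg2rad(30.0)   # number of bonds, assumed bond angle

      u = np.zeros((N, 3)); u[0] = (0.0, 0.0, 1.0)
      for i in range(1, N):
          prev = u[i-1]
          a = np.cross(prev, rng.normal(size=3))   # random direction normal to prev
          a /= np.linalg.norm(a)
          u[i] = np.cos(theta)*prev + np.sin(theta)*a

      s = np.arange(1, 30)
      corr = np.array([np.mean(np.sum(u[:-k]*u[k:], axis=1)) for k in s])
      lp = -1.0 / np.polyfit(s, np.log(corr), 1)[0]   # slope of log-correlation
      print(f"fitted l_p = {lp:.2f} bonds (theory {-1/np.log(np.cos(theta)):.2f})")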

  14. Preface of Special issue on Automated Verification of Critical Systems (AVoCS'14)

    NARCIS (Netherlands)

    Huisman, Marieke; van de Pol, Jaco

    2016-01-01

    AVoCS 2014, the 14th International Conference on Automated Verification of Critical Systems has been hosted by the University of Twente, and has taken place in Enschede, Netherlands, on 24–26 September, 2014. The aim of the AVoCS series is to contribute to the interaction and exchange of ideas among

  15. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov

    2012-05-01

    Full Text Available This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms, which is used to verify the inference engine. This module is designed to build the specification of requirements whose feasibility on the algorithm must be proved (tested).

  16. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V and V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report the results of the software qualification.

  17. Safety verification of non-linear hybrid systems is quasi-decidable

    Czech Academy of Sciences Publication Activity Database

    Ratschan, Stefan

    2014-01-01

    Vol. 44, No. 1 (2014), pp. 71-90. ISSN 0925-9856. R&D Projects: GA ČR GCP202/12/J060. Institutional support: RVO:67985807. Keywords: hybrid systems; safety verification; decidability; robustness. Subject RIV: IN - Informatics, Computer Science. Impact factor: 0.875, year: 2014

  18. Mathematical verification of a nuclear power plant protection system function with combined CPN and PVS

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun

    1999-01-01

    In this work, an automatic software verification method for a Nuclear Power Plant (NPP) protection system is developed. This method utilizes Colored Petri Nets (CPN) for system modeling and the Prototype Verification System (PVS) for mathematical verification. In order to ease the flow-through from modeling in CPN to mathematical proof in PVS, an information extractor for CPN models has been developed in this work. In order to convert the extracted information to the PVS specification language, a translator has also been developed. The information extractor and translator are programmed in ML, a higher-order functional language. This combined method has been applied to a protection system function of Wolsung NPP SDS2 (Steam Generator Low Level Trip). As a result of this application, we could logically prove the completeness and consistency of the requirement. In short, through this work an axiom- or lemma-based analysis method for CPN models is newly suggested in order to complement CPN analysis methods, and a guideline for the use of formal methods is proposed in order to apply them to NPP software verification and validation. (author). 9 refs., 15 figs

  19. A new verification film system for routine quality control of radiation fields: Kodak EC-L

    International Nuclear Information System (INIS)

    Hermann, A.; Bratengeier, K.; Priske, A.; Flentje, M.

    2000-01-01

    Background: The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirements of a new verification film system compared to a conventional portal film system. Material and Methods: For conventional verification we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirements were checked. Results: In this investigation, 68% of 175 Kodak EC-L ap/pa films were judged 'good', 18% 'moderate' and only 14% 'poor', whereas only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged to be 'good'. Conclusions: The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved when compared with standard portal films. The films could be read more accurately, and the detection of set-up deviations was facilitated. (orig.) [de]

  20. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation

    DEFF Research Database (Denmark)

    Wendt, Fabian F.; Yu, Yi-Hsiang; Nielsen, Kim

    2017-01-01

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 ...

  1. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    NARCIS (Netherlands)

    Joseph, S.; Herold, M.; Sunderlin, W.D.; Verchot, L.V.

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20

  2. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    Directory of Open Access Journals (Sweden)

    FAHIM AZIZ UMRANI

    2010-10-01

    Full Text Available This paper presents a Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems and analyses their performance in terms of the BER (Bit Error Rate). The spreading sequence chosen for CDMA is Perfect Difference Codes. Furthermore, this paper derives the expressions for the noise variances from first principles, in order to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling as required for the Monte-Carlo simulation. The simulated results conform to the theory and show that receiver gain mismatch and splitter loss at the transceiver degrade the system performance.
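
    The core of such a Monte-Carlo BER estimate is compact; the sketch below uses an illustrative noise variance and decision threshold rather than the calibrated values derived in the paper:

      # Sketch: Monte-Carlo BER for unipolar on-off signalling with additive
      # Gaussian receiver noise; compare against the theoretical Q(0.5/sigma).
      import numpy as np

      rng = np.random.default_rng(3)
      n_bits, sigma, threshold = 1_000_000, 0.25, 0.5   # assumed values

      bits = rng.integers(0, 2, n_bits)
      rx = bits + rng.normal(0.0, sigma, n_bits)   # received decision statistic
      ber = np.mean((rx > threshold).astype(int) != bits)
      print(f"simulated BER = {ber:.2e}")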

  3. Application of plutonium inventory measurement system (PIMS) and temporary canister verification system (TCVS) at RRP

    International Nuclear Information System (INIS)

    Noguchi, Yoshihiko; Nakamura, Hironobu; Adachi, Hideto; Iwamoto, Tomonori

    2004-01-01

    In the U-Pu co-denitration area at the Rokkasho Reprocessing Plant (RRP), a Plutonium Inventory Measurement System (PIMS) and a Temporary Canister Verification System (TCVS) are installed to provide efficient and effective safeguards. PIMS measures the Pu quantity inside pipes and vessels installed in glove boxes by the total neutron counting method. It consists of a total of 142 neutron detectors attached to the walls and tops of the glove boxes, and the neutron count rates of the detectors are related to each other to calculate the Pu quantity in each process area. At present, an inactive calibration using a Cf source has been completed. TCVS, on the other hand, measures the Pu quantity of canisters in temporary storage by the coincidence counting method and will be installed before the active test. Both systems also have a monitoring function as an additional measure. This paper describes the specification, performance and measurement principles of PIMS and TCVS. (author)

  4. Quasi-Monte Carlo methods for lattice systems. A first look

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Leovey, H.; Griewank, A. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Mathematik; Nube, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Mueller-Preussker, M. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2013-02-15

    We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^(-1/2), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to N^(-1). We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
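
    The claimed error scalings can be demonstrated on a simple observable; the sketch below (assuming SciPy's qmc module is available) compares plain Monte Carlo with a scrambled Sobol sequence when estimating <x^2> = 1/2 for the harmonic oscillator ground state, a Gaussian with variance 1/2:

      # Sketch: MC vs quasi-MC error for <x^2> under a Gaussian; MC errors decay
      # like N^(-1/2), the scrambled Sobol estimates decay faster (up to N^(-1)).
      import numpy as np
      from scipy.stats import qmc, norm

      exact = 0.5
      for m in (10, 14, 18):                  # N = 2^m samples
          n = 2**m
          u_mc = np.random.default_rng(4).random((n, 1))
          u_qmc = qmc.Sobol(d=1, scramble=True, seed=4).random(n)
          for name, u in (("MC ", u_mc), ("QMC", u_qmc)):
              x = norm.ppf(u[:, 0], scale=np.sqrt(0.5))   # uniforms -> Gaussian
              print(f"{name} N=2^{m:2d}  |error| = {abs(np.mean(x**2) - exact):.2e}")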

  5. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  6. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  7. Verification of the safety communication protocol in train control system using colored Petri net

    International Nuclear Information System (INIS)

    Chen Lijie; Tang Tao; Zhao Xianqiong; Schnieder, Eckehard

    2012-01-01

    This paper deals with formal and simulation-based verification of the safety communication protocol in ETCS (European Train Control System). The safety communication protocol controls the establishment of a safety connection between train and trackside. Because of its graphical user interface and its modeling flexibility with respect to changes in the system conditions, this paper proposes a composite Colored Petri Net (CPN) representation for both the logic and the timed model. The logic of the protocol is proved to be safe by means of state space analysis: the dead markings are correct, there are no dead transitions, and the net is fair. Further analysis results have been obtained using formal and simulation-based verification approaches. The timed models for the open transmission system and the application process are created for the purpose of performance analysis of the safety communication protocol. The models describe the procedure of data transmission and processing, and also provide the relevant timed and stochastic factors, such as time delay and packet loss, which may influence the time needed to establish a safety connection. The time for establishment of a safety connection in the normal state is verified by formal verification, and the time for establishment of a safety connection under different packet-loss probabilities is then simulated. After verification it is found that the time for establishment of a safety connection satisfies the safety requirements.
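
    A drastically simplified stand-in for the simulation-based part is sketched below; it is not the CPN model, and the timing constants and loss model are assumptions, but it shows how the connection-establishment time grows with the packet-loss probability:

      # Sketch: Monte Carlo of safety-connection establishment time when each
      # handshake message may be lost and is retried after a timeout.
      import numpy as np

      rng = np.random.default_rng(5)

      def establishment_time(p_loss, n_messages=3, latency=0.05, timeout=1.0):
          t = 0.0
          for _ in range(n_messages):
              while rng.random() < p_loss:   # each lost attempt costs one timeout
                  t += timeout
              t += latency                   # the successful attempt costs latency
          return t

      for p in (0.0, 0.01, 0.05, 0.1):
          times = [establishment_time(p) for _ in range(20000)]
          print(f"p_loss={p:4.2f}  mean={np.mean(times):.3f}s  "
                f"95th={np.percentile(times, 95):.3f}s")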

  8. New concept of a range verification system for proton therapy using a photon counting detector

    International Nuclear Information System (INIS)

    Kim, Jin Sung; An, Su Jung; Chung, Yong Hyun

    2012-01-01

    A range verification method plays an important role in the quality assurance of proton therapy, which offers high conformity and a reduction in radiation dose. To localize the distal falloff of the dose distribution, secondary particles (C-11, O-15, and N-13) produced by proton interactions within the patient body can be used as a measure of the beam range. We proposed a multi-modality imaging system for X-ray and gamma-ray coincidence imaging using CdZnTe detectors for proton range verification. The detector system consists of two parallel planes of detectors and an X-ray generator. An X-ray image is acquired using one detector for the verification of the two-dimensional anatomical structure of the patient, and the paired gamma rays from annihilation are imaged with the two modules to determine the maximum range of proton penetration. Image registration is intrinsic because the X-ray and gamma-ray images are acquired in the same geometry. 110 and 140 MeV proton beams, a cylindrical tissue phantom, and two rectangular CdZnTe detectors were modeled, and the imaging performance of the system was evaluated using a GATE simulation. The results showed the potential benefits of X-ray/gamma-ray imaging with photon-counting detectors for range verification in proton therapy.

  9. Monte Carlo-derived TLD cross-calibration factors for treatment verification and measurement of skin dose in accelerated partial breast irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Garnica-Garza, H M [Centro de Investigacion y de Estudios Avanzados del Instituto Politecnico Nacional Unidad Monterrey, VIa del Conocimiento 201 Parque de Investigacion e Innovacion Tecnologica, Apodaca NL C.P. 66600 (Mexico)], E-mail: hgarnica@cinvestav.mx

    2009-03-21

    Monte Carlo simulation was employed to calculate the response of TLD-100 chips under irradiation conditions such as those found during accelerated partial breast irradiation with the MammoSite radiation therapy system. The absorbed dose versus radius in the last 0.5 cm of the treated volume was also calculated, employing a resolution of 20 µm, and a function that fits the observed data was determined. Several clinically relevant irradiation conditions were simulated for different combinations of balloon size, balloon-to-surface distance and contents of the contrast solution used to fill the balloon. The thermoluminescent dosemeter (TLD) cross-calibration factors were derived assuming that the calibration of the dosemeters was carried out using a Cobalt-60 beam, and in such a way that they provide a set of parameters that reproduce the function describing the behavior of the absorbed dose versus radius curve. Such factors may also prove useful for standards laboratories that provide postal dosimetry services.

  10. A Tool for Automatic Verification of Real-Time Expert Systems

    Science.gov (United States)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  11. Development of array-type prompt gamma measurement system for in vivo range verification in proton therapy.

    Science.gov (United States)

    Min, Chul Hee; Lee, Han Rim; Kim, Chan Hyeong; Lee, Se Byeong

    2012-04-01

    In vivo range verification is one of the most important parts of proton therapy needed to fully utilize its benefit of delivering a high radiation dose to the tumor while sparing normal tissue with the so-called Bragg peak. Currently, however, no range verification method is used in clinics. The purpose of the present study is to optimize and evaluate the configuration of an array-type prompt gamma measurement system for determining the distal dose edge for in vivo range verification in proton therapy. To effectively measure the prompt gammas against the background gammas, Monte Carlo simulations with the MCNPX code were employed in optimizing the configuration of the measurement system, and the Monte Carlo method was also used to understand the effect of the background gammas, mainly neutron capture gammas, on the measured gamma distribution. To reduce the effect of the background gammas, an optimized energy window of 4-10 MeV was employed in measuring the prompt gammas. A parameterized source was used to maximize computation speed in the optimization study. A simplified test measurement system, using only one detector moving from one measurement location to the next, was constructed and applied to therapeutic proton beams of 80-220 MeV. For accurate determination of the distal dose edge, a sigmoidal curve-fitting method was applied to the measured prompt gamma distributions; the location of the half-value between the maximum and minimum values of the fitted curve was determined as the distal dose edge and compared with the beam range assessed from the proton dose distribution. The parameterized source term employed in the optimization process improved the calculation speed by up to ∼300 times. The optimization study indicates that an array-type measurement system with 3, 2, 2, and 150 mm for scintillator thickness, slit width, septal thickness, and slit length, respectively, can effectively measure the prompt gamma distributions, minimizing the contribution
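
    The sigmoidal curve-fitting step lends itself to a short sketch; the profile below is synthetic and the four-parameter sigmoid is one plausible parameterization, with the fitted half-value depth taken as the distal dose edge:

      # Sketch: fit a sigmoid to a (synthetic) prompt-gamma depth profile and
      # report the half-value depth between its maximum and minimum as the edge.
      import numpy as np
      from scipy.optimize import curve_fit

      def sigmoid(z, top, bottom, z50, w):
          return bottom + (top - bottom) / (1.0 + np.exp((z - z50) / w))

      rng = np.random.default_rng(6)
      z = np.linspace(0, 200, 101)                        # depth [mm]
      counts = sigmoid(z, 1000, 100, 150, 4) + rng.normal(0, 20, z.size)

      popt, _ = curve_fit(sigmoid, z, counts, p0=(900, 200, 120, 5))
      print(f"distal edge (half-value depth) = {popt[2]:.1f} mm")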

  12. The performance of a hybrid analytical-Monte Carlo system response matrix in pinhole SPECT reconstruction

    International Nuclear Information System (INIS)

    El Bitar, Z; Pino, F; Candela, C; Ros, D; Pavía, J; Rannou, F R; Ruibal, A; Aguiar, P

    2014-01-01

    It is well known that in pinhole SPECT (single-photon-emission computed tomography), iterative reconstruction methods including accurate estimations of the system response matrix can lead to submillimeter spatial resolution. There are two different methods for obtaining the system response matrix: those that model the system analytically, using an approach that includes an experimental characterization of the detector response, and those that make use of Monte Carlo simulations. Methods based on analytical approaches are faster and handle the statistical noise better than those based on Monte Carlo simulations, but they require tedious experimental measurements of the detector response. One suggested approach for avoiding an experimental characterization, while circumventing the problem of the statistical noise introduced by Monte Carlo simulations, is to perform an analytical computation of the system response matrix combined with a Monte Carlo characterization of the detector response. Our findings showed that this approach can achieve high spatial resolution similar to that obtained when the system response matrix computation includes an experimental characterization. Furthermore, we have shown that using simulated detector responses has the advantage of yielding a precise estimate of the shift between the point of entry of the photon beam into the detector and the point of interaction inside the detector. Considering this, it was possible to slightly improve the spatial resolution at the edge of the field of view. (paper)

  13. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan; FINAL

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II Verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I Verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, as committed to be complete by the end of FY-2000.

  14. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    CLARK, D.L.

    1999-01-01

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II Verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I Verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, as committed to be complete by the end of FY-2000.

  15. Acceptance and implementation of a system of planning computerized based on Monte Carlo

    International Nuclear Information System (INIS)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-01-01

    Acceptance for clinical use of the Monaco computerized treatment planning system has been carried out. The system is based on a virtual model of the energy output of the linear electron accelerator head and performs the dose calculation with an X-ray voxel Monte Carlo (XVMC) algorithm. (Author)

  16. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    ias

    Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems (Resonance, General Article, August 2014). Keywords: variational methods, Monte Carlo techniques, harmonic oscillators, quantum mechanical systems.
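
    The technique itself fits in a few lines; the sketch below is a minimal Metropolis implementation for the 1D harmonic oscillator (hbar = m = omega = 1) with trial wavefunction exp(-alpha x^2), whose variational energy is minimized, at the exact value 1/2, by alpha = 1/2:

      # Sketch: variational Monte Carlo for the 1D harmonic oscillator; the
      # local energy for psi = exp(-a x^2) is E_L(x) = a + x^2 (1/2 - 2 a^2).
      import numpy as np

      rng = np.random.default_rng(7)

      def vmc_energy(a, n_steps=100000, step=1.0):
          x, e_sum, n_kept = 0.0, 0.0, 0
          for i in range(n_steps):
              x_new = x + rng.uniform(-step, step)
              # Metropolis acceptance with |psi|^2 = exp(-2 a x^2)
              if rng.random() < np.exp(-2.0 * a * (x_new**2 - x**2)):
                  x = x_new
              if i > n_steps // 10:                  # discard burn-in
                  e_sum += a + x*x*(0.5 - 2.0*a*a)   # accumulate local energy
                  n_kept += 1
          return e_sum / n_kept

      for a in (0.3, 0.5, 0.7):
          print(f"alpha={a:.1f}  <E> = {vmc_energy(a):.4f}")   # minimum at 0.5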

  17. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    Science.gov (United States)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  18. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  19. Validation for application of the Monte Carlo simulation code for 235U mass content verification for large size samples of nuclear materials

    Directory of Open Access Journals (Sweden)

    M.S. El Tahawy

    2014-03-01

    Full Text Available In this work, a new semi-absolute non-destructive assay technique has been developed to verify the 235U mass content in large-size nuclear material samples of different enrichments through a combination of experimental measurements and Monte Carlo calculations (MCNP5). Good agreement was found between the calculated and declared values of the 235U mass content of uranium oxide (UO2) samples. The results obtained from the Monte Carlo calculations showed that the semi-absolute technique can be used with sufficient reliability to verify the uranium mass content in large-size nuclear material samples of different enrichments.

  20. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    International Nuclear Information System (INIS)

    Folkerts, M; Graves, Y; Tian, Z; Gu, X; Jia, X; Jiang, S

    2014-01-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform an MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  1. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    Energy Technology Data Exchange (ETDEWEB)

    Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); University of California, San Diego, La Jolla, CA (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States); Tian, Z; Gu, X; Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform an MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run an MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.
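
    The gamma-index computation at the heart of both records can be sketched in one dimension (the web app computes the full 3D version on GPU); the 3%/3 mm criteria below are assumed for illustration:

      # Sketch: 1D global gamma index; for each reference point, minimize the
      # combined dose-difference / distance-to-agreement metric over the
      # evaluated profile.
      import numpy as np

      def gamma_1d(x, d_ref, d_eval, dd=0.03, dta=3.0):
          d_max = d_ref.max()                 # global normalization
          gam = np.empty_like(d_ref)
          for i, (xi, di) in enumerate(zip(x, d_ref)):
              dose_term = ((d_eval - di) / (dd * d_max))**2
              dist_term = ((x - xi) / dta)**2
              gam[i] = np.sqrt((dose_term + dist_term).min())
          return gam

      x = np.linspace(0, 100, 201)                     # position [mm]
      d_ref = np.exp(-((x - 50) / 15)**2)
      d_eval = 1.01 * np.exp(-((x - 51) / 15)**2)      # 1 mm shift, 1% scaling
      g = gamma_1d(x, d_ref, d_eval)
      print(f"gamma pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f}%")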

  2. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    Science.gov (United States)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  3. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 & Vol 2

    Energy Technology Data Exchange (ETDEWEB)

    PARSONS, J.E.

    2000-07-15

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  4. Sensitivity Verification of PWR Monitoring System Using Neuro-Expert For LOCA Detection

    International Nuclear Information System (INIS)

    Muhammad Subekti

    2009-01-01

    The present research verifies a previously developed method for loss-of-coolant accident (LOCA) detection and performs simulations to determine the sensitivity of a PWR monitoring system that applies the neuro-expert method. The previous research, continued in the present work, developed and tested the neuro-expert method for several anomaly detections in nuclear power plants (NPP) of the pressurized water reactor (PWR) type. The neuro-expert system can detect a LOCA anomaly at a primary coolant leakage of 7 gallon/min, whereas the conventional method could not detect a primary coolant leakage of 30 gallon/min. The neuro-expert method also detects a LOCA anomaly significantly faster than the conventional system in the Surry-1 NPP, so that the impact risk is reduced. (author)

  5. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    CERN Document Server

    Parsons, J E

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.

  6. Advanced control and instrumentation systems in nuclear power plants. Design, verification and validation

    International Nuclear Information System (INIS)

    Haapanen, P.

    1995-01-01

    The Technical Committee Meeting on design, verification and validation of advanced control and instrumentation systems in nuclear power plants was held in Espoo, Finland on 20-23 June 1994. The meeting was organized by the International Atomic Energy Agency's (IAEA) International Working Groups (IWG) on Nuclear Power Plant Control and Instrumentation (NPPCI) and on Advanced Technologies for Water Cooled Reactors (ATWR). VTT Automation, together with Imatran Voima Oy and Teollisuuden Voima Oy, was responsible for the practical arrangements of the meeting. In total, 96 participants from 21 countries and the Agency took part in the meeting, and 34 full papers and 8 posters were presented. The following topics were covered in the papers: (1) experience with advanced and digital systems, (2) safety and reliability analysis, (3) advanced digital systems under development and implementation, (4) verification and validation methods and practices, (5) future development trends. (orig.)

  7. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Tsao, Jeffrey Y. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trucano, Timothy G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kleban, Stephen D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naugle, Asmeret Bier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Curtis M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Mark A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flanagan, Tatiana Paz [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vugrin, Eric D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gabert, Kasimir Georg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lave, Matthew Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Chen, Wei [Northwestern Univ., Evanston, IL (United States); DeLaurentis, Daniel [Purdue Univ., West Lafayette, IN (United States); Hubler, Alfred [Univ. of Illinois, Urbana, IL (United States); Oberkampf, Bill [WLO Consulting, Austin, TX (United States)

    2016-08-01

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  8. Verification of a climate convention: role and limitations of a space based remote sensing system

    International Nuclear Information System (INIS)

    Lanchbery, J.F.; Fischer, W.; Katscher, R.; Stein, G.

    1992-01-01

    Techniques currently under discussion for verifying compliance with an international climate convention and its protocols include space-based remote sensing systems. This paper indicates present and potential verification sectors for such systems, together with the likely technical demands of the verification system user. Space-based remote sensors currently offer the prospect of being used to monitor changes in vegetation on the Earth's surface and could thus be used as tools to verify compliance with a forest protocol and possible land-use or agriculture agreements. After discussing the capabilities and limitations of currently available remote sensing techniques in this type of application, some of the technical requirements of remote sensors in other areas likely to require verification under a climate convention are explored. 10 refs., 2 tabs

  9. Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2

    International Nuclear Information System (INIS)

    PARSONS, J.E.

    2000-01-01

    The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented

  10. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    Science.gov (United States)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  11. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    Science.gov (United States)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for the Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the damage detection capabilities of the Flat Surface Damage Detection System (FSDDS) sensory panels and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.

  12. Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems

    NARCIS (Netherlands)

    Esmaeil Zadeh Soudjani, S.

    2014-01-01

    Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality,

  13. Fault Risk Assessment of Underwater Vehicle Steering System Based on Virtual Prototyping and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    He Deyu

    2016-09-01

    Assessing the risks of steering system faults in underwater vehicles is a human-machine-environment (HME) systematic safety problem that covers faults in the steering system itself, the driver's human reliability (HR), and various environmental conditions. This paper proposes a fault risk assessment method for an underwater vehicle steering system based on virtual prototyping and Monte Carlo simulation. A virtual steering system prototype was established and validated to compensate for the lack of historical fault data. Fault injection and simulation were conducted to acquire fault simulation data. A Monte Carlo simulation was adopted that integrates randomness due to the human operator and the environment. Randomness and uncertainty of the human, machine, and environment are combined in the method to obtain a probabilistic risk indicator. To verify the proposed method, a case study of stuck rudder fault (SRF) risk assessment was performed. The method may provide a novel solution for fault risk assessment of a vehicle or other general HME systems.
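
    A minimal Monte Carlo sketch of this idea, combining machine, human, and environment randomness into one probabilistic risk indicator, is given below. All distributions, parameters, and the unsafe-event criterion are illustrative assumptions, not values from the paper.

    import random

    def estimate_srf_risk(n_trials=100_000, seed=1):
        """Fraction of trials in which recovery from a stuck rudder fault
        (SRF) completes too late; a stand-in probabilistic risk indicator."""
        random.seed(seed)
        unsafe = 0
        for _ in range(n_trials):
            # Machine: seconds until the stuck rudder drives the vehicle into
            # an unsafe state (would come from fault-injection simulation data).
            time_to_unsafe = random.gauss(30.0, 5.0)
            # Human: operator detection plus corrective-action time; a lognormal
            # distribution is a common human-reliability choice.
            operator_response = random.lognormvariate(2.5, 0.4)
            # Environment: delay caused by current and sea state.
            env_delay = random.uniform(0.0, 8.0)
            if operator_response + env_delay > time_to_unsafe:
                unsafe += 1
        return unsafe / n_trials

    print(f"Estimated SRF risk: {estimate_srf_risk():.4f}")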

  14. Acceptance and commissioning of a computerized treatment planning system based on Monte Carlo; Aceptacion y puesta en marcha de un sistema de planificacion computarizada basado en Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-07-01

    The acceptance for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy yield of the linear electron accelerator head, and it performs the dose calculation with an X-ray voxel Monte Carlo (XVMC) algorithm. (Author)

  15. Abstractions for Fault-Tolerant Distributed System Verification

    Science.gov (United States)

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.

  16. Rigorous Verification for the Solution of Nonlinear Interval System ...

    African Journals Online (AJOL)

    We survey a general method for solving nonlinear interval systems of equations. In particular, we pay special attention to the computational aspects of linear interval systems, since the bulk of the computation is done while computing an outer estimation of the enclosing linear interval systems. The height of our ...

  17. Experimental verification by means of thermoluminescent dosimetry of the Monte Carlo simulated absorbed dose distribution in water for a 137Cs Amersham CDCS-M-3 source

    International Nuclear Information System (INIS)

    Fragoso Valdez, F. R.; Alvarez Romero, J. T.

    2001-01-01

    This work verifies experimentally the Monte Carlo simulation results (PENELOPE algorithm) for the water absorbed dose distribution imparted by a 137Cs Amersham source (model CDCS-M-3). The simulated results are expressed in terms of the functions A(r,z), g(r) and F(r,θ), according to the recommendations of the AAPM TG-43.

  18. Unmanned Aerial Systems in the Process of Juridical Verification of Cadastral Border

    Science.gov (United States)

    Rijsdijk, M.; van Hinsbergh, W. H. M.; Witteveen, W.; ten Buuren, G. H. M.; Schakelaar, G. A.; Poppinga, G.; van Persie, M.; Ladiges, R.

    2013-08-01

    Quite often in the verification of cadastral borders, the owners of the parcels involved are not able to attend at the appointed moment. New appointments then have to be made to complete the verification process, and as a result costs and throughput times often grow beyond what is considered acceptable. To improve the efficiency of the verification process, an experiment was set up that refrains from the conventional terrestrial methods of border verification. The central research question was formulated as "How useful are Unmanned Aerial Systems in the juridical verification process of cadastral borders of ownership at het Kadaster in the Netherlands?". For the experiment, operational evaluations were executed at two different locations. The first took place at the Pyramid of Austerlitz, a flat area with a 30 m high pyramid built by troops of Napoleon, with low civilian attendance. Two subsequent evaluations were situated in a small neighbourhood in the city of Nunspeet, where the cadastral situation had recently changed as a result of twenty newly built houses. Initially a mini-UAS of the KLPD was used to collect photo datasets with less than 1 cm spatial resolution. In a later stage the commercial service provider Orbit GIS was hired. During the experiment four different software packages were used to process the photo datasets into accurately geo-referenced ortho-mosaics. In this article more details of the experiments are described. Attention is paid to the mini-UAS platforms (AscTec Falcon 8, Microdrone MD-4), the cameras used, the photo collection plan, the use of ground control markers, and the calibration of the cameras. Furthermore, the results of and experiences with the different SFM software packages used (VisualSFM/Bundler, PhotoScan, PhotoModeler and the Orbit software) are shared.
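
    The sub-centimetre resolution quoted above follows from the standard photogrammetric ground sample distance (GSD) relation; the worked numbers below are illustrative, not taken from the flights described:

    \[ \mathrm{GSD} = \frac{p \, H}{f} \]

    where p is the sensor pixel pitch, H the flying height above ground, and f the focal length. For instance, p = 2.4 µm, H = 30 m and f = 8 mm give GSD = (2.4e-6 × 30) / 0.008 = 0.9 cm per pixel.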

  19. Verification of discrete-event control systems using algebraic specification

    International Nuclear Information System (INIS)

    Vaeliwuo, H.; Sivertsen, T.

    1990-01-01

    In this paper power plant operating procedures and automatics are shown to be closely related to discrete-event systems, that is, systems characterized by instantaneous discrete changes in their state. It is discussed how to model plant automatics, operating procedures, and power plant systems as discrete-event systems so that the model can be used as a basis for formal proofs of various operational aspects. Algebraic specification is presented as an appropriate modelling formalism. Proving theorems on the safety and operationality of power plant systems controlled by discrete-event systems is then discussed. A theorem prover has been developed for our dialect of algebraic specification. This program is based on the close relationship between algebraic specifications and logic programs. The algebraic specifications are automatically translated to a set of Horn clauses, and hence represented as a PROLOG program. By using this representation it is possible to evaluate expressions from the corresponding algebraic specification.

  20. Verification and Validation of Neural Networks for Aerospace Systems

    Science.gov (United States)

    Mackall, Dale; Nelson, Stacy; Schumann, Johann

    2002-01-01

    The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: Overview of Adaptive Systems and V&V Processes/Methods.

  1. Verification of authentication protocols for mobile satellite communication systems

    OpenAIRE

    Reham Abdellatif Abouhogail

    2014-01-01

    In recent times, many protocols have been proposed to provide security for mobile satellite communication systems. Such protocols must be tested for their functional correctness before they are used in practice. Many security protocols for the mobile satellite communication system have been presented. This paper analyzes three of the most famous authentication protocols for mobile satellite communication system from the security viewpoint of data desynchronization attack. Based on strand spac...

  2. Verification and validation of the safety parameter display system for nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Yuanfang

    1993-05-01

    During the design and development phase of the safety parameter display system for a nuclear power plant, a verification and validation (V and V) plan was implemented to improve the quality of the system design. The V and V activities, executed in four stages (feasibility research, system design, code development, and system integration and regulation), are briefly introduced. The evaluation plan, the process of its implementation, and the conclusions of the final technical validation of the system are also presented in detail.

  3. Monte Carlo Analysis of the Accelerator-Driven System at Kyoto University Research Reactor Institute

    Directory of Open Access Journals (Sweden)

    Wonkyeong Kim

    2016-04-01

    An accelerator-driven system consists of a subcritical reactor and a controllable external neutron source. The reactor in an accelerator-driven system can sustain fission reactions in a subcritical state using an external neutron source, which is an intrinsic safety feature of the system. The system can provide efficient transmutation of nuclear wastes such as minor actinides and long-lived fission products, and it can generate electricity. Recently at Kyoto University Research Reactor Institute (KURRI; Kyoto, Japan), a series of reactor physics experiments was conducted with the Kyoto University Critical Assembly and a Cockcroft–Walton type accelerator, which generates the external neutron source by deuterium–tritium reactions. In this paper, neutronic analyses of the series of experiments have been re-evaluated using the latest Monte Carlo code and nuclear data libraries. This feasibility study is presented through comparison of the Monte Carlo simulation results with measurements.
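
    For context, the subcritical operation described above obeys the standard source multiplication relation of point kinetics (stated here as background, with k_eff used as the usual approximation):

    \[ M = \frac{1}{1 - k_{\mathrm{eff}}} \]

    so a core held at, say, k_eff = 0.97 multiplies the external neutron source roughly 33-fold, and the fission chains die out as soon as the accelerator is switched off.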

  4. Seismic monitoring: a unified system for research and verifications

    International Nuclear Information System (INIS)

    Thigpen, L.

    1979-01-01

    A system for characterizing either a seismic source or geologic media from observational data was developed. This resulted from an examination of the forward and inverse problems of seismology. The system integrates many seismic monitoring research efforts into a single computational capability. Its main advantage is that it unifies computational and research efforts in seismic monitoring. 173 references, 9 figures, 3 tables

  5. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  6. Spaceport Command and Control System Automated Verification Software Development

    Science.gov (United States)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems perform as expected.

  7. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic (Monte Carlo) nuclear calculation code.

  8. Verification of authentication protocols for mobile satellite communication systems

    Directory of Open Access Journals (Sweden)

    Reham Abdellatif Abouhogail

    2014-12-01

    In recent times, many protocols have been proposed to provide security for mobile satellite communication systems. Such protocols must be tested for their functional correctness before they are used in practice. Many security protocols for the mobile satellite communication system have been presented. This paper analyzes three of the most famous authentication protocols for mobile satellite communication system from the security viewpoint of data desynchronization attack. Based on strand spaces testing model, data desynchronization attacks on these protocols were tested and analyzed. Furthermore, improvements to overcome the security vulnerabilities of two protocols are mentioned.

  9. MONTE CARLO METHOD AND APPLICATION IN @RISK SIMULATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Gabriela Ižaríková

    2015-12-01

    The article is an example of using the simulation software @Risk, designed for simulation in a Microsoft Excel spreadsheet, to demonstrate a universal method of solving problems. Simulation means experimenting with computer models based on a real production process in order to optimize the production processes or the system. A simulation model allows performing a number of experiments, analysing them, evaluating, optimizing, and afterwards applying the results to the real system. A simulation model in general represents the modelled system by mathematical formulations and logical relations. In the model it is possible to distinguish controlled inputs (for instance investment costs) and random inputs (for instance demand), which are transformed by the model into outputs (for instance the mean value of profit). In a simulation experiment, the controlled inputs are chosen at the beginning and the random (stochastic) inputs are generated randomly. Simulations belong among the quantitative tools which can be used as a support for decision making.
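
    The input-output scheme of the abstract (a controlled input such as investment cost, a random input such as demand, and an output such as the mean profit) can be reproduced without @Risk in a few lines; every figure and distribution below is invented for the illustration.

    import random
    import statistics

    def simulate_profit(investment, capacity, unit_margin, n_trials=50_000, seed=7):
        """Transform a controlled input (investment, which buys capacity) and a
        random input (demand) into an output distribution (profit)."""
        random.seed(seed)
        profits = []
        for _ in range(n_trials):
            demand = random.triangular(800, 2000, 1200)   # random input
            sold = min(demand, capacity)                  # capacity is controlled
            profits.append(sold * unit_margin - investment)
        return statistics.mean(profits), statistics.stdev(profits)

    mean_p, sd_p = simulate_profit(investment=10_000, capacity=1500, unit_margin=10.0)
    print(f"mean profit = {mean_p:,.0f}, standard deviation = {sd_p:,.0f}")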

  10. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    Science.gov (United States)

    2003-03-01


  11. 10451 Abstracts Collection -- Runtime Verification, Diagnosis, Planning and Control for Autonomous Systems

    OpenAIRE

    Havelund, Klaus; Leucker, Martin; Sachenbacher, Martin; Sokolsky, Oleg; Williams, Brian C.

    2011-01-01

    From November 7 to 12, 2010, the Dagstuhl Seminar 10451 "Runtime Verification, Diagnosis, Planning and Control for Autonomous Systems" was held in Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, 35 participants presented their current research and discussed ongoing work and open problems. This document puts together abstracts of the presentations given during the seminar, and provides links to extended abstracts or full papers, if available.

  12. Identity verification using voice and its use in a privacy preserving system

    OpenAIRE

    Çamlıkaya, Eren

    2008-01-01

    Since security has been a growing concern in recent years, the field of biometrics has gained popularity and become an active research area. Besides new identity authentication and recognition methods, protection against theft of biometric data and potential privacy loss are current directions in biometric systems research. Biometric traits used for verification can be grouped into two classes: physical and behavioral traits. Physical traits such as fingerprints and iris patterns are charact...

  13. A Verification and Validation Tool for Diagnostic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced diagnostic systems have the potential to improve safety, increase availability, and reduce maintenance costs in aerospace vehicle and a variety of other...

  14. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... Our fundamental approach actively assists subject-matter experts in organizing their knowledge inclusive of uncertainty to build such embedded systems in a consistent and correct as well as effective fashion...

  15. A Verification and Validation Tool for Diagnostic Systems, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced diagnostic systems have the potential to improve safety, increase availability, and reduce maintenance costs in aerospace vehicle and a variety of other...

  16. Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Jiafeng Zhang

    2015-01-01

    This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs), which are considered as reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during the system design stage. This paper extends the formalism of reconfigurable timed net condition/event systems (R-TNCESs) in order to model all possible dynamic behavior in such systems. After that, the designed system based on extended R-TNCESs is verified with the help of the software tool SESA for functional, temporal, and energy-efficiency properties. The paper is illustrated by an automatic assembly system.

  17. Acquisition System Verification for Energy Efficiency Analysis of Building Materials

    Directory of Open Access Journals (Sweden)

    Natalia Cid

    2017-08-01

    Climate change and fossil fuel depletion foster interest in improving energy efficiency in buildings. There are different methods to achieve improved efficiency; one of them is the use of additives, such as phase change materials (PCMs). To prove this method's effectiveness, a building's behaviour should be monitored and analysed. This paper describes an acquisition system developed for monitoring buildings, based on Supervisory Control and Data Acquisition (SCADA) with a 1-wire bus network as the communication system. The system is empirically tested to prove that it works properly. For this purpose, two experimental cubicles were made of self-compacting concrete panels, one of which has a PCM as an additive to improve its energy storage properties. Both cubicles have the same dimensions and orientation, and they are separated by six feet to avoid shadows. The behaviour of the PCM was observed with the acquisition system, achieving results that illustrate the differences between the cubicles directly related to the PCM's characteristics. Data collection devices included in the system were temperature sensors, some of which were embedded in the walls, as well as humidity sensors, heat flux density sensors, a weather station, and energy counters. The analysis of the results shows agreement with previous studies of PCM addition; therefore, the acquisition system is suitable for this application.

  18. Monte-Carlo-simulation of 4π-detector systems

    International Nuclear Information System (INIS)

    Kunze, M.

    1986-08-01

    For experiments at LEAR (Low Energy Antiproton Ring) at CERN a detection system is proposed which allows measurement of charged and neutral annihilation products over nearly the full solid angle. The main physics topics are the understanding of proton-antiproton annihilation at the quark-gluon level and new contributions to meson spectroscopy; furthermore, a sensitive search for exotic states (glueballs, hybrids, baryonia, etc.) is possible. The neutral components are measured in a spherical modular CsI detector (1500 modules); the charged particles are measured in a vertex chamber with high spatial resolution in a magnetic field. (orig.)

  19. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    Science.gov (United States)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is ensured through a rigorous process that involves human factors, low- and high-fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models, which are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), which is a computer-based specification language and theorem proving assistant.

  20. Accelerated Monte Carlo system reliability analysis through machine-learning-based surrogate models of network connectivity

    International Nuclear Information System (INIS)

    Stern, R.E.; Song, J.; Work, D.B.

    2017-01-01

    The two-terminal reliability problem in system reliability analysis is known to be computationally intractable for large infrastructure graphs. Monte Carlo techniques can estimate the probability of a disconnection between two points in a network by selecting a representative sample of network component failure realizations and determining the source-terminal connectivity of each realization. To reduce the runtime required for the Monte Carlo approximation, this article proposes an approximate framework in which the connectivity check of each sample is estimated using a machine-learning-based classifier. The framework is implemented using both a support vector machine (SVM) and a logistic-regression-based surrogate model. Numerical experiments are performed on the California gas distribution network using the epicenter and magnitude of the 1989 Loma Prieta earthquake as well as randomly generated earthquakes. It is shown that the SVM and logistic regression surrogate models are able to predict network connectivity with accuracies of 99% for both methods, and are 1–2 orders of magnitude faster than using a Monte Carlo method with an exact connectivity check. - Highlights: • Surrogate models of network connectivity are developed with machine-learning algorithms. • The developed surrogate models reduce the runtime required for Monte Carlo simulations. • Support vector machines and logistic regression are employed to develop the surrogate models. • A numerical example on the California gas distribution network demonstrates the proposed approach. • The developed models have accuracies of 99% and are 1–2 orders of magnitude faster than MCS.
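
    The surrogate idea can be sketched in a few lines: an exact (expensive) connectivity check labels a small training set, and a cheap classifier then labels the bulk of the Monte Carlo samples. The 20-component network and its failure model below are toy assumptions for illustration, not the California gas network of the paper.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_comp = 20

    def exact_connectivity(x):
        # Stand-in for an exact source-terminal check (in practice a graph
        # search): here the network survives if fewer than 5 components fail.
        return int(x.sum() < 5)

    # Small training set with exact labels (the expensive step).
    X_train = rng.binomial(1, 0.2, size=(500, n_comp))
    y_train = np.array([exact_connectivity(x) for x in X_train])
    surrogate = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Bulk Monte Carlo samples labelled by the cheap surrogate.
    X_mc = rng.binomial(1, 0.2, size=(100_000, n_comp))
    print(f"Estimated connectivity probability: {surrogate.predict(X_mc).mean():.3f}")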

  1. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Science.gov (United States)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    In the article, the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex switches the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the programmed algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. This system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without one). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering unit secondary equipment. Automatic verification with the hardware and software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
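
    The verification cycle described above (set a reference value on the calibrator, read the controller's measurement, compute the error, append a protocol row) might look as follows; Calibrator and Controller are hypothetical driver classes standing in for the real hardware interfaces, and the test points and tolerance are assumed values.

    TEST_POINTS_MA = [4.0, 8.0, 12.0, 16.0, 20.0]  # a typical 4-20 mA loop range
    TOLERANCE_PCT = 0.25                           # illustrative accuracy class

    def verify_channel(calibrator, controller, channel):
        """One verification pass over a single measuring channel."""
        protocol = []
        for setpoint in TEST_POINTS_MA:
            calibrator.set_output(channel, setpoint)     # reference signal
            measured = controller.read_channel(channel)  # value seen by controller
            error_pct = (measured - setpoint) / setpoint * 100.0
            protocol.append({
                "setpoint_mA": setpoint,
                "measured_mA": measured,
                "error_pct": round(error_pct, 3),
                "passed": abs(error_pct) <= TOLERANCE_PCT,
            })
        return protocol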

  2. System analysis approach to verification of site characterization parameters

    International Nuclear Information System (INIS)

    Romine, D.T.

    1987-01-01

    Early in the transition of the Basalt Waste Isolation Project (BWIP) from a preliminary geologic investigation to a part of a major system acquisition program, the following project needs were recognized: (1) site-specific system functional requirements, i.e., the capabilities a deep geologic basalt system must provide to ensure long-term isolation of wastes, (2) complete list of design variables and site characteristics (information and data needs) that could affect system capabilities; and (3) relative importance, availability, and uncertainty of these information and data needs. The first project need was satisfied by a conventional functional analysis. The second was answered by a unique extension of that functional analysis. The results of these two efforts have been released in the BWIP System Functional Analysis (SFA) Document. The third need is presently under study. With the advent of a formalized issue resolution strategy (IRS) process as the basis for the BWIP site characterization program, a subset of the SFA information and data needs was used to verify (a) that no significant variable was omitted from consideration in the IRS process, (b) the necessity of IRS site characterization parameters, and (c) the sufficiency of each issue-related set of IRS parameters to address that issue. An example of a SFA branch is discussed

  3. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    Science.gov (United States)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, semi-active RFID watt-hour meters are applied to automatic verification lines and intelligent warehouse management. Through the transmission, test, auxiliary, and monitoring systems, the approach realizes scheduling, binding, control, and data exchange for watt-hour meters, providing more accurate positioning, more efficient management, and rapid data updates, with all information available at a glance. It effectively improves the quality, efficiency, and automation of verification, and realizes more efficient data and warehouse management.

  4. Analytical Methods for Verification and Validation of Adaptive Systems in Safety-Critical Aerospace Applications, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — A major challenge of the use of adaptive systems in safety-critical applications is the software life-cycle: requirement engineering through verification and...

  5. How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    Science.gov (United States)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  6. Characterization of a dose verification system dedicated to radiotherapy treatments, based on a multi-strip silicon detector

    International Nuclear Information System (INIS)

    Bocca, A.; Cortes Giraldo, M. A.; Gallardo, M. I.; Espino, J. M.; Aranas, R.; Abou Haidar, Z.; Alvarez, M. A. G.; Quesada, J. M.; Vega-Leal, A. P.; Perez Neto, F. J.

    2011-01-01

    In this paper, we present the characterization of a multi-strip silicon detector (SSSSD: Single Sided Silicon Strip Detector), developed by the company Micron Semiconductors Ltd., for use as a verification system for radiotherapy treatments.

  7. Verification of Security Policy Enforcement in Enterprise Systems

    Science.gov (United States)

    Gupta, Puneet; Stoller, Scott D.

    Many security requirements for enterprise systems can be expressed in a natural way as high-level access control policies. A high-level policy may refer to abstract information resources, independent of where the information is stored; it controls both direct and indirect accesses to the information; it may refer to the context of a request, i.e., the request’s path through the system; and its enforcement point and enforcement mechanism may be unspecified. Enforcement of a high-level policy may depend on the system architecture and the configurations of a variety of security mechanisms, such as firewalls, host login permissions, file permissions, DBMS access control, and application-specific security mechanisms. This paper presents a framework in which all of these can be conveniently and formally expressed, a method to verify that a high-level policy is enforced, and an algorithm to determine a trusted computing base for each resource.

  8. Verification of Continuous Dynamical Systems by Timed Automata

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2011-01-01

    This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The abstraction is intended..., which is generated utilizing sub-level sets of Lyapunov functions, as they are positive invariant sets. It is shown that this partition generates sound and complete abstractions. Furthermore, the complete abstractions can be composed of multiple timed automata, allowing parallelization...

  9. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    International Nuclear Information System (INIS)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor

    2013-01-01

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter. Films were cut and placed inside the film insert of the phantom for treatment dose delivery; meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used to scan the exposed films in transparency mode, and the scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurements against the TPS-calculated dose resulted in high pass rates (>90%) for a tolerance criterion of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.

  10. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter. Films were cut and placed inside the film insert of the phantom for treatment dose delivery; meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used to scan the exposed films in transparency mode, and the scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurements against the TPS-calculated dose resulted in high pass rates (>90%) for a tolerance criterion of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.
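
    The 1%/1 mm gamma-index analysis referred to in both records can be sketched in one dimension as follows; this brute-force illustration is not the authors' in-house MATLAB code.

    import numpy as np

    def gamma_1d(dose_ref, dose_eval, dx_mm, dta_mm=1.0, dd_pct=1.0):
        """Gamma index of a measured profile (dose_ref, e.g. film) against a
        calculated one (dose_eval, e.g. TPS), both sampled on the same grid
        with spacing dx_mm; global dose-difference normalisation."""
        dd = dd_pct / 100.0 * dose_ref.max()
        x = np.arange(len(dose_ref)) * dx_mm
        gamma = np.empty(len(dose_ref))
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dist2 = ((x - xi) / dta_mm) ** 2   # distance-to-agreement term
            dose2 = ((dose_eval - di) / dd) ** 2  # dose-difference term
            gamma[i] = np.sqrt(dist2 + dose2).min()
        return gamma

    # Pass rate = fraction of points with gamma <= 1, e.g.:
    # g = gamma_1d(film_profile, tps_profile, dx_mm=0.5); print((g <= 1).mean())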

  11. Formal Verification of the Danish Railway Interlocking Systems

    DEFF Research Database (Denmark)

    Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan

    2014-01-01

    in the new Danish interlocking systems. Instantiating the generic model with interlocking configuration data results in a concrete model and high-level safety properties. Using bounded model checking and inductive reasoning, we are able to verify safety properties for model instances corresponding to railway...

  12. Specification styles in distributed systems design and verification

    NARCIS (Netherlands)

    Vissers, C.A.; Scollo, Giuseppe; van Sinderen, Marten J.; Brinksma, Hendrik

    1991-01-01

    Substantial experience with the use of formal specification languages in the design of distributed systems has shown that finding appropriate structures for formal specifications presents a serious, and often underestimated problem. Its solutions are of great importance for ensuring the quality of

  13. Verification of structures for utilization of waste multicomponent electrolytic systems

    Directory of Open Access Journals (Sweden)

    Suljkanović Midhat

    2008-01-01

    Determination of the process structure for thermal utilization of mineral matters from waste streams is a multi-variant problem. These processes are energy-intensive, so it is important to determine the process structures for realizing the required processes in the early phases of process development. The structure of the process system, besides the system equilibrium, depends on the vector parameters of the feed stream. In this work a newly developed methodology for determining process variants of thermal utilization of mineral salts from a hypothetical three-component AX-BX-H2O system is presented. The methodology starts from a synthesis problem for which a set of process-unit types and the type of desired crystal product are determined. It decomposes the process into two subsystems: a concentration (saturation) subsystem and a crystallization subsystem. Concentration of the feed stream is realized under isothermal conditions of water evaporation, while the crystallization process uses various techniques: isothermal water evaporation, cooling of the solution in vacuum, and cooling of the solution through a contact surface. Physically feasible processes are determined by simulating a process superstructure in which each particular process structure is a special case of the created superstructure. This is supported by algorithms and software (a process simulator) in which the equation system of the superstructure mathematical model is solved for various sets of specified variables. The methodology and the capabilities of the created process simulator are presented in an illustrative case study of waste stream utilization for the NaCl-KCl-H2O system. In addition, for conditions of total heat integration of the subsystems, it is demonstrated that a small change of the salt concentration of the feed stream can require transfer non...

  14. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that adapt traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct.

  15. Coarsening in Solid Liquid Systems: A Verification of Fundamental Theory

    Science.gov (United States)

    Thompson, John D.

    Coarsening is a process that occurs in nearly all multi-phase materials in which the total energy of a system is reduced through the reduction of total interfacial energy. The theoretical description of this process is of central importance to materials design, yet remains controversial. In order to directly compare experiment to theoretical predictions, low solid volume fraction PbSn alloys were coarsened in a microgravity environment aboard the International Space Station (ISS) as part of the Coarsening in Solid Liquid Mixtures (CSLM) project. PbSn samples with solid volume fractions of 15%, 20% and 30% were characterized in 2D and 3D using mechanical serial sectioning. The systems were observed in the self-similar regime predicted by theory and the particle size and particle density obeyed the temporal power laws predicted by theory. However, the magnitudes of the rate constants governing those temporal laws as well as the forms of the particle size distributions were not described well by theoretical predictions. Additionally, in the 30% solid volume fraction system, the higher volume fraction results in a non-spherical particle shape and a more closely packed spatial distribution. The presence of slow particle motion induced by vibrations on the ISS is presented as an explanation for this discrepancy. To model the effect of this particle motion, the Akaiwa-Voorhees multiparticle diffusion simulations are modified to treat coarsening in the presence of a small convection term, such as that of sedimentation, corresponding to low Peclet numbers. The simulations indicate that the particle size dependent velocity of the sedimentation increases the rate at which the system coarsens. This is due to the larger particles traveling farther than normal, resulting in them encountering more small particles, which favors their growth. Additionally, sedimentation resulted in broader PSDs with a peak located at the average particle size. When the simulations are modified to
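
    For reference, the temporal power laws against which the measurements were compared are the classical Lifshitz-Slyozov-Wagner (LSW) predictions for diffusion-limited coarsening:

    \[ \langle R(t) \rangle^{3} - \langle R(0) \rangle^{3} = K t, \qquad N_V(t) \propto t^{-1} \]

    where ⟨R⟩ is the mean particle radius, N_V the particle number density, and K the coarsening rate constant. As the abstract notes, it is the magnitude of K and the form of the particle size distribution, not these exponents, that the microgravity data failed to match.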

  16. Geophysical System Verification (GSV): A Physics-Based Alternative to Geophysical Prove-Outs for Munitions Response. Addendum

    Science.gov (United States)

    2015-09-24

    Final report, July 2009. This document highlights a more rigorous physics-based alternative to geophysical prove-outs for munitions response.

  17. Geophysical System Verification (GSV): A Physics-Based Alternative to Geophysical Prove-Outs for Munitions Response

    Science.gov (United States)

    2015-09-24

    Final report, July 2009. This document highlights a more rigorous physics-based alternative to geophysical prove-outs for munitions response.

  18. Simulated coal gas MCFC power plant system verification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-30

    The objective of the main project is to identify the current developmental status of MCFC systems and address those technical issues that need to be resolved to move the technology from its current status to the demonstration stage in the shortest possible time. The specific objectives are separated into five major tasks as follows: Stack research; Power plant development; Test facilities development; Manufacturing facilities development; and Commercialization. This Final Report discusses the M-C Power Corporation effort, which is part of a general program for the development of commercial MCFC systems. It covers the entire subject of the Unocal 250-cell stack. Certain project activities have been funded by organizations other than DOE and are included in this report to provide a comprehensive overview of the work accomplished.

  19. Integrated testing and verification system for research flight software design document

    Science.gov (United States)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data-flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  20. Behavioural Verification: Preventing Report Fraud in Decentralized Advert Distribution Systems

    Directory of Open Access Journals (Sweden)

    Stylianos S. Mamais

    2017-11-01

    Service commissions, which are claimed by Ad-Networks and Publishers, are susceptible to forgery as non-human operators are able to artificially create fictitious traffic on digital platforms for the purpose of committing financial fraud. This places a significant strain on Advertisers who have no effective means of differentiating fabricated Ad-Reports from those which correspond to real consumer activity. To address this problem, we contribute an advert reporting system which utilizes opportunistic networking and a blockchain-inspired construction in order to identify authentic Ad-Reports by determining whether they were composed by honest or dishonest users. What constitutes a user’s honesty for our system is the manner in which they access adverts on their mobile device. Dishonest users submit multiple reports over a short period of time while honest users behave as consumers who view adverts at a balanced pace while engaging in typical social activities such as purchasing goods online, moving through space and interacting with other users. We argue that it is hard for dishonest users to fake honest behaviour and we exploit the behavioural patterns of users in order to classify Ad-Reports as real or fabricated. By determining the honesty of the user who submitted a particular report, our system offers a more secure reward-claiming model which protects against fraud while still preserving the user’s anonymity.

  1. Requirements Verification Report AN Farm to 200E Waste Transfer System for Project W-314, Tank Farm Restoration and Safe Operations

    International Nuclear Information System (INIS)

    MCGREW, D.L.

    1999-01-01

    This Requirements Verification Report (RVR) for the Project W-314 "AN Farm to 200E Waste Transfer System" package provides documented verification of design compliance with all the applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate

  2. Verification of failover effects from distributed control system communication networks in digitalized nuclear power plants

    Directory of Open Access Journals (Sweden)

    Moon-Gi Min

    2017-08-01

    Distributed Control System (DCS) communication networks, which use Fast Ethernet with redundant networks for the transmission of information, have been installed in digitalized nuclear power plants. Normally, failover tests are performed to verify the reliability of redundant networks during the design and manufacturing phases; however, systematic integrity tests of DCS networks cannot be fully performed during these phases because not all relevant equipment is installed during these two phases. In addition, practical verification tests are insufficient, and there is a need to test the actual failover function of DCS redundant networks in the target environment. The purpose of this study is to verify that the failover functions work correctly in certain abnormal conditions during the installation and commissioning phase and to identify the influence of network failover on the entire DCS. To quantify the effects of network failover in the DCS, the packets (Protocol Data Units) must be collected and the resource usage of the system has to be monitored and analyzed. This study introduces a new methodology for verification of DCS network failover during the installation and commissioning phases. This study is expected to provide insight into verification methodology and the failover effects of DCS redundant networks. It also provides test results of network performance from DCS network failover in digitalized domestic nuclear power plants (NPPs).

  3. Verification of failover effects from distributed control system communication networks in digitalized nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Min, Moon Gi; Lee, Jae Ki; Lee, Kwang Hyun; Lee, Dong Il; Lim, Hee Taek [Korea Hydro and Nuclear Power Co., Ltd, Daejeon (Korea, Republic of)

    2017-08-15

    Distributed Control System (DCS) communication networks, which use Fast Ethernet with redundant networks for the transmission of information, have been installed in digitalized nuclear power plants. Normally, failover tests are performed to verify the reliability of redundant networks during the design and manufacturing phases; however, systematic integrity tests of DCS networks cannot be fully performed during these phases because not all relevant equipment is installed during these two phases. In addition, practical verification tests are insufficient, and there is a need to test the actual failover function of DCS redundant networks in the target environment. The purpose of this study is to verify that the failover functions work correctly in certain abnormal conditions during the installation and commissioning phase and to identify the influence of network failover on the entire DCS. To quantify the effects of network failover in the DCS, the packets (Protocol Data Units) must be collected and the resource usage of the system has to be monitored and analyzed. This study introduces a new methodology for verification of DCS network failover during the installation and commissioning phases. This study is expected to provide insight into verification methodology and the failover effects of DCS redundant networks. It also provides test results of network performance from DCS network failover in digitalized domestic nuclear power plants (NPPs)

  4. The SAMS: Smartphone Addiction Management System and verification.

    Science.gov (United States)

    Lee, Heyoung; Ahn, Heejune; Choi, Samwook; Choi, Wanbok

    2014-01-01

    While the popularity of smartphones has given enormous convenience to our lives, their pathological use has created a new mental health concern in the community. Hence, intensive research is being conducted on the etiology and treatment of the condition. However, the traditional clinical approach based on surveys and interviews has serious limitations: health professionals cannot perform continual assessment and intervention for the affected group, and the subjectivity of assessment is questionable. To cope with these limitations, a comprehensive ICT (Information and Communications Technology) system called SAMS (Smartphone Addiction Management System) is developed for objective assessment and intervention. The SAMS system consists of an Android smartphone application and a web application server. The SAMS client monitors the user's application usage together with GPS location and Internet access location, and transmits the data to the SAMS server. The SAMS server stores the usage data and performs key statistical data analysis and usage intervention according to the clinicians' decisions. To verify the reliability and efficacy of the developed system, a comparison study with survey-based screening using the K-SAS (Korean Smartphone Addiction Scale), as well as self-field trials, was performed. The comparison study used usage data from 14 users, 19- to 50-year-old adults who left at least 1 week of usage logs and completed the survey questionnaires. The field trial fully verified the accuracy of the time, location, and Internet access information in the usage measurement and the reliability of the system operation over more than 2 weeks. The comparison study showed that daily use count has a strong correlation with K-SAS scores, whereas daily use times do not strongly correlate for potentially addicted users. The correlation coefficients of count and times with total K-SAS score are CC = 0.62 and CC = 0.07, respectively, and the t-test analysis for the

  5. Experiences with a prototype tracking and verification system implemented within an imaging center.

    Science.gov (United States)

    Guo, Bing; Documet, Jorge; Lee, Jasper; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H K; Grant, Edward G

    2007-03-01

    Most health care facilities currently struggle with protecting medical data privacy, misidentification of patients, and long patient waiting times. This article demonstrates a novel system for a clinical environment that uses wireless tracking and facial biometric technologies to automatically monitor and identify staff and patients in order to address these problems. The design of the location tracking and verification system (LTVS) was based on a workflow study performed to observe the physical location and movement of patients and staff at the Healthcare Consultation Center II (HCC II), which runs hospital information systems, radiology information systems, picture archive and communication systems, and a voice recognition system. Based on the results of this workflow study, the LTVS was designed using a wireless real-time location system and a facial biometric system integrated with the radiology information system. The LTVS was tested for its functionality in a laboratory environment, then evaluated at HCC II. Experimental results in the laboratory and clinical environments demonstrated that patient and staff real-time location information and identity verification can be obtained from the LTVS. Warning messages can immediately be sent to alert staff when a patient's waiting time exceeds a predefined limit, and unauthorized access to a security area can be audited. Additionally, patient misidentification can be prevented during the course of examinations. The system enables health care providers to streamline the patient workflow, protect against erroneous examinations, and create a security zone to prevent, and audit, unauthorized access to patient health care data, as required by the Health Insurance Portability and Accountability Act mandate.

  6. Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Adina Aniculaesei

    2016-12-01

    Full Text Available There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two-phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata and the safety property as a TCTL formula, and to verify the correctness of these models with respect to this property. For the runtime phase, we build a monitor to check whether the assumptions made at design time also hold at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.
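
    A minimal sketch of the runtime phase, assuming toy assumption predicates (a maximum obstacle speed and a minimum clearance) in place of the paper's UPPAAL models:

```python
# Runtime monitor sketch: check design-time environment assumptions
# against current observations; on violation, force a passive safe state.
# The predicates and thresholds are illustrative toy assumptions.
from dataclasses import dataclass

@dataclass
class Observation:
    obstacle_distance_m: float
    obstacle_speed_mps: float

class AssumptionMonitor:
    def __init__(self, max_obstacle_speed_mps, min_clearance_m):
        self.max_speed = max_obstacle_speed_mps   # assumed at design time
        self.min_clearance = min_clearance_m      # assumed at design time

    def assumptions_hold(self, obs: Observation) -> bool:
        return (obs.obstacle_speed_mps <= self.max_speed
                and obs.obstacle_distance_m >= self.min_clearance)

def control_step(monitor, obs, move):
    if monitor.assumptions_hold(obs):
        move()                                    # verified model still valid
    else:
        print("assumption violated -> entering passive safe state (stop)")

monitor = AssumptionMonitor(max_obstacle_speed_mps=2.0, min_clearance_m=1.5)
control_step(monitor, Observation(3.0, 1.0), lambda: print("moving"))
control_step(monitor, Observation(0.8, 1.0), lambda: print("moving"))
```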

  7. Validation of variance reduction techniques in Mediso (SPIRIT DH-V) SPECT system by Monte Carlo

    International Nuclear Information System (INIS)

    Rodriguez Marrero, J. P.; Diaz Garcia, A.; Gomez Facenda, A.

    2015-01-01

    Monte Carlo simulation of nuclear medical imaging systems is a widely used method for reproducing their operation in a real clinical environment. There are several Single Photon Emission Computed Tomography (SPECT) systems in Cuba, so it is clearly necessary to introduce a reliable and fast simulation platform in order to obtain consistent image data that reproduce the original measurement conditions. To fulfill these requirements, the Monte Carlo platform GAMOS (Geant4 Medicine Oriented Architecture for Applications) has been used. Due to the very size and complex configuration of parallel-hole collimators in real clinical SPECT systems, Monte Carlo simulation usually consumes excessive time and computing resources. The main goal of the present work is to optimize the efficiency of the calculation by means of new GAMOS functionality. Two GAMOS variance reduction techniques were developed and validated to speed up the calculations; these procedures focus and limit the transport of gamma quanta inside the collimator. The obtained results were assessed experimentally on the Mediso (SPIRIT DH-V) SPECT system. The main quality control parameters, such as sensitivity and spatial resolution, were determined. Differences of 4.6% in sensitivity and 8.7% in spatial resolution were found with respect to the manufacturer's values. Simulation time was decreased by a factor of up to 650, making it possible to perform several studies in almost 8 hours each. (Author)
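
    The GAMOS implementation itself is not reproduced here; the toy sketch below only illustrates the underlying idea of such a variance reduction technique, discarding photons emitted outside the collimator acceptance cone before any expensive transport. The acceptance angle and statistics are illustrative.

```python
# Toy angular culling: photons outside the collimator acceptance cone are
# killed before transport, which is where the CPU-time saving comes from.
# Geometry and the acceptance half-angle are illustrative values.
import math
import random

ACCEPT_HALF_ANGLE = math.radians(2.0)   # toy parallel-hole acceptance

def isotropic_direction():
    cos_t = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    return sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t

def transport(n_photons):
    tracked = 0
    for _ in range(n_photons):
        _dx, _dy, dz = isotropic_direction()
        # keep only photons heading toward the detector (+z) inside the cone
        if dz <= 0.0 or math.acos(dz) > ACCEPT_HALF_ANGLE:
            continue                     # culled: no expensive transport
        tracked += 1                     # full physics would happen here
    return tracked

n = 1_000_000
print(f"transported {transport(n)} of {n:,} source photons")
```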

  8. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results

    Science.gov (United States)

    Burken, John J.; Larson, Richard R.

    2009-01-01

    F-15 IFCS project goals are: (a) demonstrate control approaches that can efficiently optimize aircraft performance in both normal and failure conditions ([A] and [B] failures); (b) advance neural-network-based flight control technology for new aerospace system designs with a pilot in the loop. Gen II objectives include: (a) implement and fly a direct adaptive neural-network-based flight controller; (b) demonstrate the ability of the system to adapt to simulated system failures: (1) suppress transients associated with failure, (2) re-establish sufficient control and handling of the vehicle for safe recovery; and (c) provide flight experience for development of verification and validation processes for flight-critical neural network software.

  9. Burnup verification tests with the FORK measurement system-implementation for burnup credit

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. It was designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program and is well suited to verifying burnup and cooling time records at commercial Pressurized Water Reactor (PWR) sites. This report deals with the application of the FORK system to burnup credit operations.

  10. Validation, verification and evaluation of a Train to Train Distance Measurement System by means of Colored Petri Nets

    International Nuclear Information System (INIS)

    Song, Haifeng; Liu, Jieyu; Schnieder, Eckehard

    2017-01-01

    Validation, verification and evaluation are necessary processes to assure the safety and functionality of a system before its application in practice. This paper presents a Train to Train Distance Measurement System (TTDMS), which can provide distance information independently of existing onboard equipment. We then propose a new process using Colored Petri Nets to verify the TTDMS functional safety and to evaluate the system performance. The paper makes three main contributions: firstly, it proposes a formalized TTDMS model, whose correctness is validated using state space analysis and simulation-based verification. Secondly, corresponding checking queries are proposed for the purpose of functional safety verification, and the TTDMS performance is further evaluated by applying parameters in the formal model. Thirdly, the reliability of a functional TTDMS prototype is estimated. The procedure can accompany system development, and both formal and simulation-based verifications are performed. Compared to executable code and purely mathematical methods, using this process to evaluate and verify a system is easier to read and more reliable. - Highlights: • A new Train to Train Distance Measurement System. • A new approach verifying system functional safety and evaluating system performance by means of CPN. • System formalization using the system property concept. • Verification of system functional safety using state space analysis. • Evaluation of system performance applying simulation-based analysis.

  11. Proton therapy treatment monitoring with the DoPET system: activity range, positron emitters evaluation and comparison with Monte Carlo predictions

    Science.gov (United States)

    Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.

    2017-12-01

    Ion beam irradiations can deliver conformal dose distributions that minimize damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainty, so treatment verification is a critical issue. During treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used for treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate radiation transport and interaction with matter. In this work, the FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered to phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads and designed to be installed along the beam line so that data can be acquired also during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detectors' response for activity range verification.

  12. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    International Nuclear Information System (INIS)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M.

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  13. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  14. Verification of fast neutron spectrum calculation in coupled system HERBE

    International Nuclear Information System (INIS)

    Avdic, S.; Pesic, M.; Marinkovic, P.

    1995-01-01

    A high-resolution semiconductor spectrometer filled with ³He gas, in a diode coincidence arrangement, is applied to measure the neutron spectrum in the centre of the fast core of the coupled fast-thermal system HERBE at the 'Vinca' Institute. The neutron spectrum is evaluated from the measured pulse-height distribution using the HE3 computer code developed in the Nuclear Engineering Laboratory of the Institute of Nuclear Sciences VINCA. Experimental results are compared with the relevant multigroup calculations in the energy range from 2.5 MeV to 10.5 MeV. The measured spectrum shows sufficient overlap with the calculated one, and no serious divergence is found in the measured energy range. (author)

  15. Robust control design verification using the modular modeling system

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ben-Abdennour, A.; Lee, K.Y.

    1991-01-01

    The Modular Modeling System (B&W MMS) is being used as a design tool to verify robust controller designs for improving power plant performance while also providing fault-accommodating capabilities. These controllers are designed based on optimal control theory and are thus model-based controllers targeted for implementation in a computer-based digital control environment. The MMS is being successfully used to verify that the controllers are tolerant of uncertainties between the plant model employed in the controller and the actual plant, i.e., that they are robust. The two areas in which the MMS is being used for this purpose are the design of (1) a reactor power controller with improved reactor temperature response, and (2) a multiple-input multiple-output (MIMO) robust fault-accommodating controller for a deaerator level and pressure control problem.

  16. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. These practices are not intended to serve as complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  17. Development and Verification of the Taiwan Ocean Prediction System

    Science.gov (United States)

    Liau, J. M.; Lai, J. W.; Yang, Y.; Chen, S. H.

    2016-02-01

    Taiwan is an island state surrounded by the sea, to which its economic activity and ecological environment are closely related. The aim of the Taiwan ocean prediction system is to support the assessment of the marine environment and the reduction of risks in the coastal zone of Taiwan. The high-resolution wave and ocean numerical models, the data assimilation and the observing database are established and operated by the Taiwan Ocean Research Institute (TORI). In order to reduce computational time and increase grid resolution, one-way nested grids are adopted for the Northwestern Pacific Ocean and the sea around Taiwan. The WAVEWATCH-III and nested SWAN wave models and the multi-scale Princeton Ocean Model (POM) are used for predictions of the wind waves and the three-dimensional ocean current, respectively. A joint effect composed of the tide and the ocean circulation is also taken into consideration to resolve the complex current field around Taiwan. Performance evaluations are carried out to assess the numerical models against observed data, such as temperature and current velocity profiles, wave height and wave period from data buoys, and sea surface currents from the TORI high-frequency radar observing system. The well-verified results of the numerical ocean model, assimilated with sea surface height data, are used to investigate characteristics of the Kuroshio in the waters off eastern Taiwan, and several value-added modules, such as storm surge simulation and particle tracking, have also been developed for maritime search and rescue, hazard mitigation and disaster assistance.

  18. Fuzzy Controllers for a Gantry Crane System with Experimental Verifications

    Directory of Open Access Journals (Sweden)

    Naif B. Almutairi

    2016-01-01

    Full Text Available The control problem of gantry cranes has attracted the attention of many researchers because of the various applications of these cranes in the industry. In this paper we propose two fuzzy controllers to control the position of the cart of a gantry crane while suppressing the swing angle of the payload. Firstly, we propose a dual PD fuzzy controller where the parameters of each PD controller change as the cart moves toward its desired position, while maintaining a small swing angle of the payload. This controller uses two fuzzy subsystems. Then, we propose a fuzzy controller which is based on heuristics. The rules of this controller are obtained taking into account the knowledge of an experienced crane operator. This controller is unique in that it uses only one fuzzy system to achieve the control objective. The validity of the designed controllers is tested through extensive MATLAB simulations as well as experimental results on a laboratory gantry crane apparatus. The simulation results as well as the experimental results indicate that the proposed fuzzy controllers work well. Moreover, the simulation and the experimental results demonstrate the robustness of the proposed control schemes against output disturbances as well as against uncertainty in some of the parameters of the crane.
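
    As a rough illustration of the gain-scheduling idea behind the dual PD fuzzy controller (not the authors' rule base or crane model), the sketch below blends 'near' and 'far' PD gains through triangular memberships on the remaining cart distance:

```python
# Fuzzy-scheduled PD sketch: blend "far" and "near" PD gains through
# triangular memberships on the distance to the target cart position.
# Gains and membership breakpoints are illustrative, not the paper's.
def tri(x, a, b, c):
    """Triangular membership with peak at b, support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def scheduled_gains(distance):
    d = abs(distance)
    far = tri(d, 0.5, 2.0, 10.0) + (1.0 if d >= 10.0 else 0.0)
    near = tri(d, -0.1, 0.0, 1.0)
    w = (far + near) or 1.0
    kp = (far * 8.0 + near * 3.0) / w   # travel fast when far ...
    kd = (far * 4.0 + near * 6.0) / w   # ... damp payload swing when near
    return kp, kd

def control(x, v, x_ref):
    kp, kd = scheduled_gains(x_ref - x)
    return kp * (x_ref - x) - kd * v

print(control(x=0.0, v=0.0, x_ref=5.0))   # far: aggressive action
print(control(x=4.8, v=0.3, x_ref=5.0))   # near: gentler, swing-damping
```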

  19. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  20. Verification of the CNGS timing system using fast diamond detectors

    Science.gov (United States)

    Jansen, H.; Alvarez Sanchez, P.; Pedersen, S. Bart; Dehning, B.; Dobos, D.; Effinger, E.; Ferrari, A.; Griesmayer, E.; Gschwendtner, E.; Kozsar, I.; Missiaen, D.; Pernegger, H.; Sala, P. R.; Serrano, J.; Ward, C.

    2013-01-01

    A new fast diagnostic tool was installed in the CNGS facility in 2011 following the neutrino time-of-flight results published by OPERA in September 2011. Among others, four polycrystalline CVD (pCVD) diamond detectors were placed in the secondary beam line about 1200 m downstream of the CNGS target in order to measure the beam structure of the muons which are produced together with the muon neutrinos. Upstream of the CNGS target, a fast beam current transformer measures the proton beam structure. The sub-nanosecond single-pulse time resolution of pCVD diamond for a minimum ionising particle, in combination with a GPS system, allows the measurement of the GPS timing of individual secondary particle bunches crossing these detectors with a precision of < 1 ns. The complicated structure of the CNGS muon beam in 2011 necessitates the combination of adjacent bunches in order to compare the proton beam structure with the muon beam structure. An analysis of the detector signals was carried out, which provides an independent timing measurement at CERN with a precision of 1.2 ns. Uncertainties from other sources, such as cable lengths, add up to 3.4 ns, resulting in an overall precision of 3.6 ns. The distance between the beam current transformer and the diamond detectors has been measured to be (1859.95±0.02) m. The nominal time-of-flight of (6205.3±1.7) ns for a 17 GeV/c muon, as present in the CNGS muon beam, falls within the uncertainties of the measured time-of-flight of (6205.2±3.6) ns. Hence, the GPS timing measurements performed at CERN are consistent.
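
    As a rough cross-check of the quoted nominal value, the time of flight of a 17 GeV/c muon over the measured distance follows from the relativistic velocity; the small residual difference from 6205.3 ns is presumably absorbed by the exact beamline path and survey details.

```latex
% Back-of-the-envelope time-of-flight check (illustrative):
\beta = \frac{p}{\sqrt{p^{2} + m_{\mu}^{2}}}
      = \frac{17}{\sqrt{17^{2} + 0.1057^{2}}} \approx 0.999981, \qquad
t = \frac{d}{\beta c}
  \approx \frac{1859.95\ \mathrm{m}}{0.999981 \times 0.299792\ \mathrm{m/ns}}
  \approx 6204\ \mathrm{ns}
```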

  1. Nondestructive verification and assay systems for spent fuels. Technical appendixes

    Energy Technology Data Exchange (ETDEWEB)

    Cobb, D.D.; Phillips, J.R.; Baker, M.P.

    1982-04-01

    Six technical appendixes are presented that provide important supporting technical information for the study of the application of nondestructive measurements to spent-fuel storage. Each appendix addresses a particular technical subject in a reasonably self-contained fashion. Appendix A is a comparison of spent-fuel data predicted by reactor operators with measured data from reprocessors. This comparison indicates a rather high level of uncertainty in previous burnup calculations. Appendix B describes a series of nondestructive measurements at the GE-Morris Operation Spent-Fuel Storage Facility. This series of experiments successfully demonstrated a technique for reproducible positioning of fuel assemblies for nondestructive measurement. The experimental results indicate the importance of measuring the axial and angular burnup profiles of irradiated fuel assemblies for quantitative determination of spent-fuel parameters. Appendix C is a reasonably comprehensive bibliography of reports and symposia papers on spent-fuel nondestructive measurements to April 1981. Appendix D is a compendium of spent-fuel calculations that includes isotope production and depletion calculations using the EPRI-CINDER code, calculations of neutron and gamma-ray source terms, and correlations of these sources with burnup and plutonium content. Appendix E describes the pulsed-neutron technique and its potential application to spent-fuel measurements. Although not yet developed, the technique holds the promise of providing separate measurements of the uranium and plutonium fissile isotopes. Appendix F describes the experimental program and facilities at Los Alamos for the development of spent-fuel nondestructive measurement systems. Measurements are reported showing that the active neutron method is sensitive to the replacement of a single fuel rod with a dummy rod in an unirradiated uranium fuel assembly.

  2. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of logical primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
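
    A minimal sketch of the pattern-to-formula idea with two illustrative LTL-style templates; the paper's BPMN pattern catalogue and exact temporal-logic encoding may differ.

```python
# Sketch: instantiate temporal-logic templates over workflow activities.
# The two patterns and the LTL-style syntax are illustrative only.
def seq(a: str, b: str) -> list[str]:
    """Sequence pattern: whenever a completes, b eventually starts."""
    return [f"G(done_{a} -> F start_{b})"]

def xor_split(cond: str, a: str, b: str) -> list[str]:
    """Exclusive choice: each branch taken only under its condition."""
    return [f"G(({cond} & decided) -> F start_{a})",
            f"G((!{cond} & decided) -> F start_{b})",
            f"G(!(start_{a} & start_{b}))"]

# Toy workflow: receive -> decide -> (approve | reject)
spec = seq("receive", "decide") + xor_split("ok", "approve", "reject")
print("\n".join(spec))
```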

  3. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    International Nuclear Information System (INIS)

    Saotome, Naoya; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji

    2016-01-01

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility, where a rotating gantry equipped with the scanning irradiation system has been developed. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to, and communicates with, the measurement system that is part of the scanning system. The range was determined by image processing, comparing a threshold method and a difference-of-Gaussian (DOG) method; the reference range for each beam energy was defined as the 80% distal dose point of the depth-dose distribution measured with a large parallel-plate ionization chamber. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility for the same energy is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.
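
    A one-dimensional illustration of the DOG edge detection on a synthetic light-output profile, assuming NumPy/SciPy are available; the Gaussian widths are placeholders, and only the 0.2 mm/pixel pitch is taken from the abstract.

```python
# 1-D difference-of-Gaussian edge detection on a synthetic light profile.
# Assumes NumPy/SciPy; sigmas are placeholders, pitch is 0.2 mm/pixel.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def dog_range_mm(profile, px_mm=0.2, sigma_narrow=2.0, sigma_wide=6.0):
    dog = (gaussian_filter1d(profile, sigma_narrow)
           - gaussian_filter1d(profile, sigma_wide))
    return np.argmin(dog) * px_mm   # extremum marks the distal falling edge

depth_px = np.arange(160)                                # 0.2 mm pixels
profile = 1.0 / (1.0 + np.exp((depth_px - 120) / 3.0))   # synthetic edge
print(f"detected range: {dog_range_mm(profile):.1f} mm") # just past 24 mm
```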

  4. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    Energy Technology Data Exchange (ETDEWEB)

    Saotome, Naoya, E-mail: naosao@nirs.go.jp; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji [Department of Research Center for Charged Particle Therapy, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan)

    2016-04-15

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors' facility, where a rotating gantry equipped with the scanning irradiation system has been developed. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to, and communicates with, the measurement system that is part of the scanning system. The range was determined by image processing, comparing a threshold method and a difference-of-Gaussian (DOG) method; the reference range for each beam energy was defined as the 80% distal dose point of the depth-dose distribution measured with a large parallel-plate ionization chamber. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility for the same energy is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors' range check system is capable of quick and easy range verification with sufficient accuracy.

  5. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  6. Verification of operational weather forecasts from the POSEIDON system across the Eastern Mediterranean

    Directory of Open Access Journals (Sweden)

    A. Papadopoulos

    2009-07-01

    Full Text Available The POSEIDON weather forecasting system became operational at the Hellenic Centre for Marine Research (HCMR in October 1999. The system with its nesting capability provided 72-h forecasts in two different model domains, i.e. 25- and 10-km grid spacing. The lower-resolution domain covered an extended area that included most of Europe, Mediterranean Sea and N. Africa, while the higher resolution domain focused on the Eastern Mediterranean. A major upgrade of the system was recently implemented in the framework of the POSEIDON-II project (2005–2008. The aim was to enhance the forecasting skill of the system through improved model parameterization schemes and advanced numerical techniques for assimilating available observations to produce high resolution analysis fields. The configuration of the new system is applied on a horizontal resolution of 1/20°×1/20° (~5 km covering the Mediterranean basin, Black Sea and part of North Atlantic providing up to 5-day forecasts. This paper reviews and compares the current with the previous weather forecasting systems at HCMR presenting quantitative verification statistics from the pre-operational period (from mid-November 2007 to October 2008. The statistics are based on verification against surface observations from the World Meteorological Organization (WMO network across the Eastern Mediterranean region. The results indicate that the use of the new system can significantly improve the weather forecasts.

  7. Preparation of a program for the independent verification of the brachytherapy planning systems calculations

    International Nuclear Information System (INIS)

    V Carmona, V.; Perez-Calatayud, J.; Lliso, F.; Richart Sancho, J.; Ballester, F.; Pujades-Claumarchirant, M.C.; Munoz, M.

    2010-01-01

    In this work, a program is presented that independently checks, for each patient, the treatment planning system calculations in low-dose-rate, high-dose-rate and pulsed-dose-rate brachytherapy. The treatment planning system output text files are automatically loaded into this program in order to obtain the source coordinates, the desired calculation point coordinates and, where applicable, the dwell times. The source strength and the reference dates are entered by the user. The program allows the recommendations on independent verification of clinical brachytherapy dosimetry to be implemented in a simple and accurate way, in a few minutes. (Author).
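
    Such independent checks are commonly built on the AAPM TG-43 formalism; the sketch below shows a point-source variant with rough placeholder data, not the presented program's actual implementation or clinical consensus values.

```python
# Point-source TG-43-style check: D(r) = Sk * Lambda * (r0/r)^2 * g(r) * F(r).
# All tabulated data and the TPS value below are rough placeholders.
R0_CM = 1.0   # reference distance

G_R    = {0.5: 1.04, 1.0: 1.00, 2.0: 0.92, 5.0: 0.65}  # radial dose function
PHI_AN = {0.5: 0.97, 1.0: 0.98, 2.0: 0.99, 5.0: 0.99}  # 1-D anisotropy factor

def interp(table, r):
    """Linear interpolation in a {radius: value} table, clamped at the ends."""
    xs = sorted(table)
    if r <= xs[0]:
        return table[xs[0]]
    if r >= xs[-1]:
        return table[xs[-1]]
    for lo, hi in zip(xs, xs[1:]):
        if lo <= r <= hi:
            f = (r - lo) / (hi - lo)
            return table[lo] + f * (table[hi] - table[lo])

def dose_rate_cgy_h(sk_u, lambda_cgy_h_u, r_cm):
    return (sk_u * lambda_cgy_h_u * (R0_CM / r_cm) ** 2
            * interp(G_R, r_cm) * interp(PHI_AN, r_cm))

tps_value = 2050.0                                   # cGy/h reported by TPS
check = dose_rate_cgy_h(sk_u=20000.0, lambda_cgy_h_u=1.108, r_cm=3.0)
print(f"independent: {check:.0f} cGy/h ({100 * (check / tps_value - 1):+.1f}%)")
```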

  8. Verification of secure distributed systems in higher order logic: A modular approach using generic components

    Energy Technology Data Exchange (ETDEWEB)

    Alves-Foss, J.; Levitt, K.

    1991-01-01

    In this paper we present a generalization of McCullough's restrictiveness model as the basis for proving security properties about distributed system designs. We mechanize this generalization and an event-based model of computer systems in the HOL (Higher Order Logic) system to prove the composability of the model and several other of its properties. We then develop a set of generalized classes of system components and show for which families of user views they satisfy the model. Using these classes we develop a collection of general system components that are instantiations of one of these classes and show that the instantiations also satisfy the security property. We conclude with a sample distributed secure system, based on the Rushby and Randell distributed system design and built using our collection of components, and show how our mechanized verification system can be used to verify such designs. 16 refs., 20 figs.

  9. Monte Carlo approach to the decay rate of a metastable system with an arbitrarily shaped barrier

    International Nuclear Information System (INIS)

    Bao, Jing-Dong; Bi, Lei; Jia, Ying

    2007-01-01

    A path integral Monte Carlo method based on the fast Fourier transform technique, combined with importance sampling, is proposed to calculate the decay rate of a metastable quantum system with an arbitrarily shaped potential barrier. The contribution of all fluctuation actions is included, which can be used to check the accuracy of the usual steepest-descent approximation, namely, the perturbation expansion of the potential. The analytical approximation is found to produce a decay rate for a particle in a cubic potential that is about 20% larger than the Monte Carlo data at the crossover temperature, and this disagreement increases with increasing complexity of the potential shape. We also demonstrate via Langevin simulation that the postsaddle potential strongly influences the classical escape rate.
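
    To give a flavour of the classical part of the study, the following toy overdamped Langevin run estimates the escape rate of a particle over a cubic-potential barrier; units and parameters are arbitrary illustrative choices, not the paper's setup.

```python
# Toy overdamped Langevin estimate of the escape rate over a cubic barrier:
# V(x) = a*x^2/2 - b*x^3/3, with the barrier top at x = a/b.
import math
import random

def v_prime(x, a=1.0, b=0.3):
    return a * x - b * x * x

def escape_time(temperature=0.5, dt=1e-3, x_escape=5.0, max_steps=5_000_000):
    x = 0.0
    noise = math.sqrt(2.0 * temperature * dt)  # fluctuation-dissipation
    for step in range(max_steps):
        x += -v_prime(x) * dt + noise * random.gauss(0.0, 1.0)
        if x > x_escape:             # particle has left the metastable well
            return step * dt
    return None                      # no escape within the step budget

times = [t for t in (escape_time() for _ in range(20)) if t is not None]
if times:
    print(f"estimated escape rate ~ {len(times) / sum(times):.3g}")
```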

  10. Improving the efficiency of Monte Carlo simulations of systems that undergo temperature-driven phase transitions

    Science.gov (United States)

    Velazquez, L.; Castro-Palacio, J. C.

    2013-07-01

    Recently, Velazquez and Curilef proposed a methodology to extend Monte Carlo algorithms based on the canonical ensemble, which aims to overcome slow sampling problems associated with temperature-driven discontinuous phase transitions. We show in this work that Monte Carlo algorithms extended with this methodology also exhibit a remarkable efficiency near a critical point. Our study is performed for the particular case of a two-dimensional four-state Potts model on a square lattice with periodic boundary conditions. This analysis reveals that the extended version of Metropolis importance sampling is more efficient than the usual Swendsen-Wang and Wolff cluster algorithms. These results demonstrate the effectiveness of this methodology in improving the efficiency of MC simulations of systems that undergo any type of temperature-driven phase transition.
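
    For reference, a plain Metropolis baseline for the two-dimensional four-state Potts model with periodic boundaries (the extended-ensemble and cluster algorithms themselves are not reproduced here); the temperature is set near the critical value 1/ln(1+sqrt(4)) ≈ 0.91 in units of J/k_B.

```python
# Plain Metropolis baseline for the 2-D four-state Potts model,
# J = 1, periodic boundaries; T_c = 1/ln(3) ~ 0.91 (toy lattice size).
import math
import random

L, Q, T = 16, 4, 0.91
spins = [[random.randrange(Q) for _ in range(L)] for _ in range(L)]

def bond_energy(i, j, s):
    """-1 per nearest neighbour aligned with state s (periodic)."""
    nb = (spins[(i + 1) % L][j], spins[(i - 1) % L][j],
          spins[i][(j + 1) % L], spins[i][(j - 1) % L])
    return -sum(1 for n in nb if n == s)

def metropolis_sweep():
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        new = random.randrange(Q)
        d_e = bond_energy(i, j, new) - bond_energy(i, j, spins[i][j])
        if d_e <= 0 or random.random() < math.exp(-d_e / T):
            spins[i][j] = new

for _ in range(200):
    metropolis_sweep()

e_site = sum(bond_energy(i, j, spins[i][j])
             for i in range(L) for j in range(L)) / (2 * L * L)
print(f"energy per site after 200 sweeps: {e_site:.3f}")
```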

  11. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: one simulating low penetration and one simulating high penetration. The load-shedding strategy and the simulation process are introduced in detail for each FMEA process. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
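
    A minimal sequential Monte Carlo sketch of the underlying reliability sampling, drawing exponential failure and repair times per component and accumulating SAIFI/SAIDI-like indices; the rates are placeholders, and the paper's DG, load-shedding and FMEA logic is not modelled.

```python
# Sequential Monte Carlo sketch: exponential failure/repair sampling per
# component, accumulating average interruption frequency and outage hours.
# Component rates are illustrative placeholders.
import random

COMPONENTS = {                 # (failures per year, mean repair hours)
    "feeder_1": (0.10, 4.0),
    "feeder_2": (0.15, 6.0),
    "transformer": (0.02, 24.0),
}

def simulate(n_years=10_000):
    interruptions, outage_hours = 0, 0.0
    for _name, (fail_rate, mttr_h) in COMPONENTS.items():
        t = 0.0
        while True:
            t += random.expovariate(fail_rate)      # years to next failure
            if t >= n_years:
                break
            interruptions += 1
            outage_hours += random.expovariate(1.0 / mttr_h)
    return interruptions / n_years, outage_hours / n_years

freq, hours = simulate()
print(f"~{freq:.3f} interruptions/yr, ~{hours:.2f} outage hours/yr")
```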

  12. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  13. Characterizing a Proton Beam Scanning System for Monte Carlo Dose Calculation in Patients

    Science.gov (United States)

    Grassberger, C; Lomax, Tony; Paganetti, H

    2015-01-01

    The presented work has two goals. First, to demonstrate the feasibility of accurately characterizing a proton radiation field at treatment head exit for Monte Carlo dose calculation of active scanning patient treatments. Second, to show that this characterization can be done based on measured depth dose curves and spot size alone, without consideration of the exact treatment head delivery system. This is demonstrated through calibration of a Monte Carlo code to the specific beam lines of two institutions, Massachusetts General Hospital (MGH) and Paul Scherrer Institute (PSI). Comparison of simulations modeling the full treatment head at MGH to ones employing a parameterized phase space of protons at treatment head exit reveals the adequacy of the method for patient simulations. The secondary particle production in the treatment head is typically below 0.2% of primary fluence, except for low-energy electrons (protons), whose contribution to skin dose is negligible. However, there is a significant difference between the two methods in the low-dose penumbra, making full treatment head simulations necessary to study out-of-field effects such as secondary cancer induction. To calibrate the Monte Carlo code to measurements in a water phantom, we use an analytical Bragg peak model to extract the range-dependent energy spread at the two institutions, as this quantity is usually not available through measurements. Comparison of the measured with the simulated depth dose curves demonstrates agreement within 0.5 mm over the entire energy range. Subsequently, we simulate three patient treatments with varying anatomical complexity (liver, head and neck, and lung) to give an example of how this approach can be employed to investigate site-specific discrepancies between the treatment planning system and Monte Carlo simulations. PMID:25549079

  14. Comparison of criticality benchmark evaluations for U+Pu system. JACS code system and the other Monte Carlo codes

    International Nuclear Information System (INIS)

    Takada, Tomoyuki; Yoshiyama, Hiroshi; Miyoshi, Yoshinori; Katakura, Jun-ichi

    2003-01-01

    The criticality safety evaluation code system JACS was developed by JAERI, and its accuracy was evaluated in the 1980s. Although the evaluation of JACS was performed for various critical systems, comparisons with a continuous-energy Monte Carlo code were not performed because no such code was available at that time. This paper presents such comparisons for heterogeneous and homogeneous systems containing U+Pu nitrate solutions. (author)

  15. Practical requirements for software tools to assist in the validation and verification of hybrid expert systems

    International Nuclear Information System (INIS)

    Singh, G.P.; Cadena, D.; Burgess, J.

    1992-01-01

    Any practical software development effort must remain focused on verification and validation of user requirements. Knowledge-based system development is no different in this regard. In industry today, most expert systems being produced are, in reality, hybrid software systems which, in addition to those components that provide the knowledge base and expert reasoning over the problem domain using various rule-based and object-oriented paradigms, incorporate significant bodies of code based on more traditional software techniques such as database management, graphical user interfaces, hypermedia, spreadsheets, as well as specially developed sequential code. Validation and verification of such hybrid systems must perforce integrate suitable methodologies from all such fields. This paper attempts to provide a broad overview of the practical requirements for methodologies and the concomitant groupware tools which would assist in such an enterprise. These methodologies and groupware tools would facilitate the teamwork efforts necessary to validate and verify all components of such hybrid systems by emphasizing cooperative recording of requirements and negotiated resolutions of any conflicts grounded in a solid understanding of the semantics of such a system

  16. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Science.gov (United States)

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System, which automates the incubation and reading of membrane-filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar, differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification rather than a full method validation. LAY ABSTRACT: The Growth Direct™ System, which automates the incubation and reading of microbial counts on membranes on solid agar, differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification rather than a full method validation. © PDA, Inc. 2018.

  17. Dynamic Calibration and Verification Device of Measurement System for Dynamic Characteristic Coefficients of Sliding Bearing

    Science.gov (United States)

    Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang

    2016-01-01

    The identification accuracy of dynamic characteristics coefficients is difficult to guarantee because of the errors of the measurement system itself. A novel dynamic calibration method for measurement systems for dynamic characteristics coefficients is proposed in this paper to eliminate the errors of the measurement system itself. Unlike calibration methods based on a suspended mass, this method uses a spring-mass verification device that can simulate the dynamic characteristics of a sliding bearing. The verification device was built, and the calibration experiment was implemented over a wide frequency range, with the bearing stiffness simulated by disc springs. The experimental results show that the amplitude errors of this measurement system are small in the frequency range of 10 Hz–100 Hz, while the phase errors increase with increasing frequency. A simulated identification experiment for dynamic characteristics coefficients in the range of 10 Hz–30 Hz preliminarily verifies that the calibration data in this range can adequately support dynamic characteristics tests of sliding bearings. Bearing experiments over greater frequency ranges require higher manufacturing and installation precision of the calibration device, and the calibration experiment procedures should be further improved. PMID:27483283

  18. Coarse-grained Monte Carlo simulations of non-equilibrium systems.

    Science.gov (United States)

    Liu, Xiao; Crocker, John C; Sinno, Talid

    2013-06-28

    We extend the scope of a recent method for generating coarse-grained lattice Metropolis Monte Carlo simulations [X. Liu, W. D. Seider, and T. Sinno, Phys. Rev. E 86, 026708 (2012); and J. Chem. Phys. 138, 114104 (2013)] from continuous interaction potentials to non-equilibrium situations. The original method has been shown to satisfy detailed balance at the coarse scale and to provide a good representation of various equilibrium properties in both atomic and molecular systems. However, we show here that the original method is inconsistent with non-equilibrium trajectories generated by full-resolution Monte Carlo simulations, which, under certain conditions, have been shown to correspond to Langevin dynamics. The modified coarse-grained method is generated by simultaneously biasing the forward and backward transition probability for every possible move, thereby preserving the detailed balance of the original method. The resulting coarse-grained Monte Carlo simulations are shown to provide trajectories that are consistent with overdamped Langevin (Smoluchowski) dynamics using a sequence of simple non-equilibrium examples. We first consider the purely diffusional spreading of a Gaussian pulse of ideal-gas particles and then include an external potential to study the influence of drift. Finally, we validate the method using a more general situation in which the particles interact via a Lennard-Jones interparticle potential.
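
    A quick sanity check in the spirit of the first validation example above: non-interacting random walkers started from a narrow pulse should spread with variance growing as 2Dt under Smoluchowski dynamics. Parameters are illustrative.

```python
# Random-walk check of Smoluchowski spreading: Var(x) ~ var0 + 2*D*t.
import random

D, dt, steps, n = 1.0, 0.01, 500, 10_000
step_sigma = (2.0 * D * dt) ** 0.5
var0 = 0.1 ** 2

walkers = [random.gauss(0.0, 0.1) for _ in range(n)]  # narrow initial pulse
for _ in range(steps):
    walkers = [x + random.gauss(0.0, step_sigma) for x in walkers]

m = sum(walkers) / n
var = sum((x - m) ** 2 for x in walkers) / n
print(f"measured var = {var:.2f}, expected ~ {var0 + 2.0 * D * dt * steps:.2f}")
```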

  19. Design and evaluation of a Monte Carlo based model of an orthovoltage treatment system

    International Nuclear Information System (INIS)

    Penchev, Petar; Maeder, Ulf; Fiebich, Martin; Zink, Klemens; University Hospital Marburg

    2015-01-01

    The aim of this study was to develop a flexible framework of an orthovoltage treatment system capable of calculating and visualizing dose distributions in different phantoms and CT datasets. The framework provides a complete set of various filters, applicators and X-ray energies and can therefore be adapted to varying studies or be used for educational purposes. A dedicated user-friendly graphical interface was developed, allowing for easy setup of the simulation parameters and visualization of the results. The EGSnrc Monte Carlo code package was used for the Monte Carlo simulations, and the geometry was built with the help of the EGSnrc C++ class library. The deposited dose was calculated according to the KERMA approximation using the track-length estimator. The validation against measurements showed good agreement, within 4-5% deviation, down to depths of 20% of the depth-dose maximum. Furthermore, to show its capabilities, the validated model was used to calculate the dose distribution on two CT datasets. The typical Monte Carlo calculation time for these simulations was about 10 minutes on a standard PC, achieving an average statistical uncertainty of 2%. However, this calculation time depends strongly on the CT dataset used, the tube potential, the filter material/thickness and the applicator size.

  20. Management of Information in Radiation Oncology: An Integrated System for Scheduling, Treatment, Billing, and Verification.

    Science.gov (United States)

    Herman; Williams; Dicello

    1997-01-01

    An effective information system is an essential prerequisite to delivering quality patient care at competitive costs. From scheduling and billing to complex treatment machine control and verification, the quality of the information system strongly affects the efficiency and accuracy with which patient care is delivered. The standard paper-based information system used in many clinics suffers from inefficiencies and incompleteness in scheduling and billing, lacks a centralized database, and cannot generate routine reports or communicate with other information systems. Many of these problems are resolved by the introduction of an electronic information system. The implementation, gains, and limitations of two electronic information systems are discussed. While limitations such as the lack of complete, seamless integration of all information still exist, major improvements have been made in efficiency, accuracy, data integrity, and reporting and billing completeness.

  1. Future Combat System Spinout 1 Technical Field Test - Establishing and Implementing Models and Simulations System of Systems Verification, Validation and Accreditation Practices, Methodologies and Procedures

    Science.gov (United States)

    2009-11-24

    IV&V: Independent Verification and Validation; JTRS: Joint Tactical Radio System; JVMF: Joint Variable Message Format; LDAP: Lightweight Directory Access Protocol; LDIF: LDAP Data Interchange Format; LSI: Lead Systems Integrator; LUT: Limited User Test; MCS: Mounted Combat System / Mobility Computer System

  2. Dose calculations for a simplified Mammosite system with the Monte Carlo Penelope and MCNPX simulation codes

    International Nuclear Information System (INIS)

    Rojas C, E.L.; Varon T, C.F.; Pedraza N, R.

    2007-01-01

    The treatment of breast cancer at early stages is of vital importance, and most investigations are therefore dedicated to the early detection and treatment of this disease. As a consequence of such investigations and of clinical practice, a high-dose-rate irradiation system known as MammoSite was developed in the U.S.A. in 2002. In this work we carry out dose calculations for a simplified MammoSite system with the Monte Carlo simulation codes PENELOPE and MCNPX, varying the concentration of the contrast material used in it. (Author)

  3. Dosimetric verification in water of a Monte Carlo treatment planning tool for proton, helium, carbon and oxygen ion beams at the Heidelberg Ion Beam Therapy Center

    Science.gov (United States)

    Tessonnier, T.; Böhlen, T. T.; Ceruti, F.; Ferrari, A.; Sala, P.; Brons, S.; Haberer, T.; Debus, J.; Parodi, K.; Mairani, A.

    2017-08-01

    The introduction of ‘new’ ion species in particle therapy needs to be supported by a thorough assessment of their dosimetric properties and by treatment planning comparisons with clinically used proton and carbon ion beams. In addition to the latter two ions, helium and oxygen ion beams are foreseen at the Heidelberg Ion Beam Therapy Center (HIT) as potential assets for improving clinical outcomes in the near future. We present in this study a dosimetric validation of a FLUKA-based Monte Carlo treatment planning tool (MCTP) for protons, helium, carbon and oxygen ions for spread-out Bragg peaks in water. The comparisons between the ions show the dosimetric advantages of helium and heavier ion beams in terms of their distal and lateral fall-offs with respect to protons, reducing the lateral size of the region receiving 50% of the planned dose up to 12 mm. However, carbon and oxygen ions showed significant doses beyond the target due to the higher fragmentation tail compared to lighter ions (p and He), up to 25%. The Monte Carlo predictions were found to be in excellent geometrical agreement with the measurements, with deviations below 1 mm for all parameters investigated such as target and lateral size as well as distal fall-offs. Measured and simulated absolute dose values agreed within about 2.5% on the overall dose distributions. The MCTP tool, which supports the usage of multiple state-of-the-art relative biological effectiveness models, will provide a solid engine for treatment planning comparisons at HIT.

  4. Dosimetric verification in water of a Monte Carlo treatment planning tool for proton, helium, carbon and oxygen ion beams at the Heidelberg Ion Beam Therapy Center.

    Science.gov (United States)

    Tessonnier, T; Böhlen, T T; Ceruti, F; Ferrari, A; Sala, P; Brons, S; Haberer, T; Debus, J; Parodi, K; Mairani, A

    2017-07-31

    The introduction of 'new' ion species in particle therapy needs to be supported by a thorough assessment of their dosimetric properties and by treatment planning comparisons with clinically used proton and carbon ion beams. In addition to the latter two ions, helium and oxygen ion beams are foreseen at the Heidelberg Ion Beam Therapy Center (HIT) as potential assets for improving clinical outcomes in the near future. We present in this study a dosimetric validation of a FLUKA-based Monte Carlo treatment planning tool (MCTP) for protons, helium, carbon and oxygen ions for spread-out Bragg peaks in water. The comparisons between the ions show the dosimetric advantages of helium and heavier ion beams in terms of their distal and lateral fall-offs with respect to protons, reducing the lateral size of the region receiving 50% of the planned dose up to 12 mm. However, carbon and oxygen ions showed significant doses beyond the target due to the higher fragmentation tail compared to lighter ions (p and He), up to 25%. The Monte Carlo predictions were found to be in excellent geometrical agreement with the measurements, with deviations below 1 mm for all parameters investigated such as target and lateral size as well as distal fall-offs. Measured and simulated absolute dose values agreed within about 2.5% on the overall dose distributions. The MCTP tool, which supports the usage of multiple state-of-the-art relative biological effectiveness models, will provide a solid engine for treatment planning comparisons at HIT.

  5. Monte Carlo analysis of a control technique for a tunable white lighting system

    DEFF Research Database (Denmark)

    Chakrabarti, Maumita; Thorseth, Anders; Jepsen, Jørgen

    2017-01-01

    A simulated colour control mechanism for a multi-coloured LED lighting system is presented. The system achieves adjustable and stable white light output and allows for system-to-system reproducibility after application of the control mechanism. The control unit works using a pre-calibrated lookup...... peak wavelength, the LED rated luminous flux bin, the influence of the operating conditions, ambient temperature, driving current, and the spectral response of the colour sensor. The system performance is investigated by evaluating the outputs from the Monte Carlo simulation. The outputs show...... that the applied control system yields an uncertainty on the luminous flux of 2.5% within a 95% coverage interval which is a significant reduction from the 8% of the uncontrolled system. A corresponding uncertainty reduction in Δu′v′ is achieved from an average of 0.0193 to 0.00125 within 95% coverage range after...
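
    A minimal sketch of this style of Monte Carlo tolerance analysis, with all component spreads invented rather than taken from the study: each trial draws the LED flux bin, temperature effect, driver current and sensor gain at random, and the 95% coverage interval is read off the simulated ensemble.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      # Hypothetical relative effects on luminous flux (placeholder tolerances)
      flux_bin = rng.uniform(0.96, 1.04, n)                  # rated-flux bin spread
      temperature = rng.normal(35.0, 8.0, n)                 # ambient temperature (C)
      temp_effect = 1.0 - 0.0015 * (temperature - 25.0)      # flux droop with temperature
      drive_current = rng.normal(1.0, 0.01, n)               # driver tolerance
      sensor_gain = rng.normal(1.0, 0.005, n)                # colour-sensor gain error

      flux = flux_bin * temp_effect * drive_current * sensor_gain
      lo, hi = np.percentile(flux, [2.5, 97.5])              # 95% coverage interval
      half_width = 100.0 * (hi - lo) / (2.0 * flux.mean())
      print("relative flux in [%.3f, %.3f] -> +/- %.1f%% (95%% coverage)" % (lo, hi, half_width))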

  6. Experimental study on design verification of new concept for integral reactor safety system

    International Nuclear Information System (INIS)

    Chung, Moon Ki; Choi, Ki Yong; Park, Hyun Sik; Cho, Seok; Park, Choon Kyung; Lee, Sung Jae; Song, Chul Hwa

    2004-01-01

    The pressurized light-water-cooled, medium-power (330 MWt) SMART (System-integrated Modular Advanced ReacTor) has been under development at KAERI for a dual purpose: seawater desalination and electricity generation. The SMART design verification phase comprised various separate-effect tests and comprehensive integral-effect tests. The high-temperature/high-pressure thermal-hydraulic test facility VISTA (Experimental Verification by Integral Simulation of Transients and Accidents) has been constructed by KAERI to simulate SMART-P (the one-fifth-scale pilot plant). Experimental tests have been performed to investigate the thermal-hydraulic dynamic characteristics of the primary and secondary systems. Heat transfer characteristics and natural circulation performance of the PRHRS (Passive Residual Heat Removal System) of SMART-P were also investigated using the VISTA facility. The coolant flows steadily in the natural circulation loop, which is composed of the steam generator (SG) primary side, the secondary system, and the PRHRS. The heat transfers through the PRHRS heat exchanger and ECT are sufficient to enable natural circulation of the coolant.

  7. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...... possible automated verification of large industrial designs with the use of only modest resources (less than 5 minutes on a standard PC for a model with 1421 concurrent machines). The results of the paper are being implemented in the next version of the commercial tool visualSTATE™....

  8. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models....... This technique makes possible automated verification of large industrial designs with the use of only modest resources (less than one hour on a standard PC for a model with 1421 concurrent machines). The results of the paper are being implemented in the next version of the commercial tool visualSTATE....

  9. Flexible prototype of modular multilevel converters for experimental verification of DC transmission and multiterminal systems

    DEFF Research Database (Denmark)

    Konstantinou, Georgios; Ceballos, Salvador; Gabiola, Igor

    2017-01-01

    Testing and verification of high-level and low-level control, modulation, fault handling and converter co-ordination for modular multilevel converters (MMCs) requires the development of experimental prototype converters. In this paper, we provide a complete overview of the MMC-based experimental...... prototype at UNSW Sydney (The University of New South Wales), including the structure of the sub-modules, communication, control and protection functions as well as the possible configurations of the system. The prototype is rated at a dc voltage of up to 800 V and a power of 20 kVA, and can be used to study...
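
    For a flavour of the low-level control such a prototype exercises, the following sketch implements nearest-level modulation for one MMC phase leg, a common sub-module insertion scheme; the 800 V dc link matches the rating quoted above, but the 8-sub-module arm and the modulation scheme itself are illustrative assumptions, not the UNSW design.

      import numpy as np

      VDC, N = 800.0, 8                      # dc-link voltage, sub-modules per arm (assumed)

      def nearest_level(v_ref):
          """Inserted sub-modules in the (upper, lower) arm for a reference voltage."""
          n_low = int(np.clip(np.rint(N * (0.5 + v_ref / VDC)), 0, N))
          return N - n_low, n_low

      # One coarse 50 Hz fundamental cycle
      for t in np.linspace(0.0, 0.02, 9):
          v_ref = 0.45 * VDC * np.sin(2.0 * np.pi * 50.0 * t)
          up, low = nearest_level(v_ref)
          print("v_ref = %7.1f V -> inserted upper/lower: %d/%d" % (v_ref, up, low))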

  10. Hybrid Decompositional Verification for Discovering Failures in Adaptive Flight Control Systems

    Science.gov (United States)

    Thompson, Sarah; Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    Adaptive flight control systems hold tremendous promise for maintaining the safety of a damaged aircraft and its passengers. However, most currently proposed adaptive control methodologies rely on online learning neural networks (OLNNs), which necessarily have the property that the controller is changing during the flight. These changes tend to be highly nonlinear, and difficult or impossible to analyze using standard techniques. In this paper, we approach the problem with a variant of compositional verification. The overall system is broken into components. Undesirable behavior is fed backwards through the system. Components whose safe and unsafe input ranges can be solved for explicitly with formal methods techniques are treated as white-box components. The remaining black-box components are analyzed with heuristic techniques that try to predict a range of component inputs that may lead to unsafe behavior. The composition of these component inputs throughout the system leads to overall system test vectors that may elucidate the undesirable behavior.
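
    The backward propagation of undesirable behaviour can be pictured with a toy interval analysis; the component functions and the unsafe output range below are invented for illustration. For monotone white-box components, an unsafe output interval maps back to an input interval by inverting each component in turn.

      # Toy backward reachability over a chain of monotone components.
      # Each entry is (forward function, exact inverse); both are invented examples.
      components = [
          (lambda x: 2.0 * x + 1.0, lambda y: (y - 1.0) / 2.0),   # "actuator" gain
          (lambda x: x ** 3,        lambda y: y ** (1.0 / 3.0)),  # "plant" nonlinearity
      ]

      def backward(unsafe_lo, unsafe_hi):
          """Map an unsafe output interval back to the system-input interval."""
          lo, hi = unsafe_lo, unsafe_hi
          for _, inverse in reversed(components):
              lo, hi = sorted((inverse(lo), inverse(hi)))  # monotone: endpoints suffice
          return lo, hi

      print("inputs leading to unsafe outputs:", backward(27.0, 125.0))  # -> (1.0, 2.0)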

  11. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1993-01-01

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single-processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared-responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above will have a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes results of efforts to identify, describe, contrast, and compare these issues.

  12. Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit

    Science.gov (United States)

    Sartori, John

    2005-01-01

    The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rule out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Description Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction which allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.

  13. Verification of improved patient outcomes with a partially implantable hearing aid, The SOUNDTEC direct hearing system.

    Science.gov (United States)

    Roland, P S; Shoup, A G; Shea, M C; Richey, H S; Jones, D B

    2001-10-01

    Partially implantable hearing devices have been developed to address some of the user-perceived shortcomings of standard amplification systems. Partially implantable devices are purported to provide improved sound quality as a result of decreased occlusion, decreased feedback, and enhanced clarity resulting from increased high-frequency gain. Such improvements may result in greater user satisfaction. To justify selection of a partially implantable device and undergoing a minor surgical procedure, verification techniques must be used to document user improvement or increased satisfaction over conventional amplification. The aim of this study was to evaluate patient satisfaction with the SOUNDTEC direct hearing system, using a within-subjects repeated-measures design with objective and subjective evaluation pre- and post-implantation of the SOUNDTEC device. Verification techniques included tonal functional gain measures with traditional amplification and the SOUNDTEC device, word recognition in quiet (NU-6) and in noise (SPIN), the Abbreviated Profile of Hearing Aid Benefit (APHAB), and the Hough Ear Institute Profile (HEIP). Although there was no significant difference between optimal traditional amplification and the SOUNDTEC device for speech perception measures, the SOUNDTEC device yielded statistically significant increased high-frequency functional gain. Subjective reports indicated that the SOUNDTEC device provides a cleaner, more natural sound without feedback compared with traditional amplification. Partially implantable hearing aids may address some of the limitations of traditional amplification systems.

  14. CARMEN: a Monte Carlo planning system based on linear programming from direct apertures; CARMEN: Un sistema de planficiacion Monte Carlo basado en programacion lineal a partir de aberturas directas

    Energy Technology Data Exchange (ETDEWEB)

    Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.

    2013-07-01

    The use of Monte Carlo (MC) methods has been shown to improve the accuracy of dose calculation compared with the analytical algorithms installed in commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, called CARMEN, is based on full MC simulation of both the beam transport in the accelerator head and in the patient, and is designed for efficient operation in terms of the accuracy of the estimates and the required computation times. (Author)
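
    One common way to cast planning from direct apertures as a linear program, sketched here with toy numbers rather than CARMEN's actual formulation: given a matrix D mapping non-negative aperture weights to voxel doses, minimize the worst absolute deviation from the prescription p.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(0)
      n_vox, n_ap = 40, 6
      D = rng.uniform(0.0, 1.0, (n_vox, n_ap))    # dose per unit aperture weight (toy)
      p = np.full(n_vox, 2.0)                     # prescribed dose per voxel

      # Variables x = [w_1..w_n_ap, t]: minimize t subject to |D w - p| <= t, w >= 0
      c = np.r_[np.zeros(n_ap), 1.0]
      A_ub = np.block([[ D, -np.ones((n_vox, 1))],
                       [-D, -np.ones((n_vox, 1))]])
      b_ub = np.r_[p, -p]
      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, None)] * (n_ap + 1))
      w, t = res.x[:n_ap], res.x[-1]
      print("aperture weights:", np.round(w, 3), "| worst deviation: %.3f" % t)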

  15. Formal specification and verification of interactive systems with plasticity: Applications to nuclear-plant supervision

    International Nuclear Information System (INIS)

    Oliveira, Raquel Araujo de

    2015-01-01

    The advent of ubiquitous computing and the increasing variety of platforms and devices change user expectations in terms of user interfaces. Systems should be able to adapt themselves to their context of use, i.e., the platform (e.g. a PC or a tablet), the users who interact with the system (e.g. administrators or regular users), and the environment in which the system executes (e.g. a dark room or outdoors). The capacity of a UI to withstand variations in its context of use while preserving usability is called plasticity. Plasticity provides users with different versions of a UI. Although it enhances UI capabilities, plasticity adds complexity to the development of user interfaces: the consistency between multiple versions of a given UI should be ensured. Given the large number of possible versions of a UI, it is time-consuming and error-prone to check these requirements by hand. Some automation must be provided to verify plasticity. This complexity is further increased when it comes to UIs of safety-critical systems. Safety-critical systems are systems in which a failure has severe consequences. The complexity of such systems is reflected in the UIs, which are now expected not only to provide correct, intuitive, non-ambiguous and adaptable means for users to accomplish a goal, but also to cope with safety requirements aiming to make sure that systems are reasonably safe before they enter the market. Several techniques to ensure the quality of systems in general exist, and they can also be applied to safety-critical systems. Formal verification provides a rigorous way to perform verification, which is suitable for safety-critical systems. Our contribution is an approach to verify safety-critical interactive systems provided with plastic UIs using formal methods. Using powerful tool support, our approach permits the verification of sets of properties over a model of the system. Using model checking, our approach permits the verification of properties over the system formal

  16. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates, thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
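
    Stripped of the homomorphic-encryption layer, the final matching rule is a Hamming-distance threshold on fixed-size binary templates. The sketch below shows only that plaintext rule; the 256-bit template size follows the example above, while the threshold and templates are arbitrary placeholders.

      import numpy as np

      rng = np.random.default_rng(7)

      def hamming(a, b):
          """Hamming distance between two packed bit arrays (dtype uint8)."""
          return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

      def verify(query, enrolled, threshold=48):   # threshold is an arbitrary choice
          return hamming(query, enrolled) < threshold

      enrolled = rng.integers(0, 256, 32, dtype=np.uint8)     # 256-bit template
      query = enrolled.copy()
      noisy = rng.choice(32, size=5, replace=False)           # corrupt a few bytes
      query[noisy] ^= rng.integers(1, 256, 5).astype(np.uint8)

      print("distance:", hamming(query, enrolled), "| accepted:", verify(query, enrolled))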

  17. CAD-based Monte Carlo Program for Integrated Simulation of Nuclear System SuperMC

    Science.gov (United States)

    Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin

    2014-06-01

    The Monte Carlo (MC) method has distinct advantages for simulating complicated nuclear systems and is envisioned as a routine method for nuclear design and analysis in the future. High-fidelity simulation with the MC method, coupled with multi-physics simulation, has a significant impact on the safety, economy and sustainability of nuclear systems. However, great challenges to current MC methods and codes prevent their application in real engineering projects. SuperMC is a CAD-based Monte Carlo program for integrated simulation of nuclear systems developed by the FDS Team, China, making use of hybrid MC-deterministic methods and advanced computer technologies. The design aims, architecture and main methodology of SuperMC are presented in this paper. SuperMC2.1, the latest version for neutron, photon and coupled neutron-photon transport calculation, has been developed and validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. SuperMC is still evolving toward a general and routine tool for nuclear systems.

  18. A multi-transputer system for parallel Monte Carlo simulations of extensive air showers

    International Nuclear Information System (INIS)

    Gils, H.J.; Heck, D.; Oehlschlaeger, J.; Schatz, G.; Thouw, T.

    1989-01-01

    A multiprocessor computer system has been brought into operation at the Kernforschungszentrum Karlsruhe. It is dedicated to Monte Carlo simulations of extensive air showers induced by ultra-high energy cosmic rays. The architecture consists of two independently working VMEbus systems, each with a 68020 microprocessor as host computer and twelve T800 transputers for parallel processing. The two systems are linked via Ethernet for data exchange. The T800 transputers are equipped with 4 Mbyte RAM each, sufficient to run rather large codes. The host computers are operated under UNIX 5.3. On the transputers, compilers for PARALLEL FORTRAN, C, and PASCAL are available. The simple modular architecture of this parallel computer reflects the single purpose for which it is intended. The hardware of the multiprocessor computer is described, as well as the way the user software is handled and distributed to the 24 working processors. The performance of the parallel computer is demonstrated by well-known benchmarks and by realistic Monte Carlo simulations of air showers. Comparisons with other types of microprocessors and with large universal computers are made. It is demonstrated that a cost reduction by more than a factor of 20 is achieved by this system compared to a universal computer. (orig.)

  19. SU-E-J-60: Efficient Monte Carlo Dose Calculation On CPU-GPU Heterogeneous Systems

    International Nuclear Information System (INIS)

    Xiao, K; Chen, D. Z; Hu, X. S; Zhou, B

    2014-01-01

    Purpose: It is well-known that the performance of GPU-based Monte Carlo dose calculation implementations is bounded by memory bandwidth. One major cause of this bottleneck is the random memory writing patterns in dose deposition, which leads to several memory efficiency issues on GPU such as un-coalesced writing and atomic operations. We propose a new method to alleviate such issues on CPU-GPU heterogeneous systems, which achieves overall performance improvement for Monte Carlo dose calculation. Methods: Dose deposition is to accumulate dose into the voxels of a dose volume along the trajectories of radiation rays. Our idea is to partition this procedure into the following three steps, which are fine-tuned for CPU or GPU: (1) each GPU thread writes dose results with location information to a buffer on GPU memory, which achieves fully-coalesced and atomic-free memory transactions; (2) the dose results in the buffer are transferred to CPU memory; (3) the dose volume is constructed from the dose buffer on CPU. We organize the processing of all radiation rays into streams. Since the steps within a stream use different hardware resources (i.e., GPU, DMA, CPU), we can overlap the execution of these steps for different streams by pipelining. Results: We evaluated our method using a Monte Carlo Convolution Superposition (MCCS) program and tested our implementation for various clinical cases on a heterogeneous system containing an Intel i7 quad-core CPU and an NVIDIA TITAN GPU. Comparing with a straightforward MCCS implementation on the same system (using both CPU and GPU for radiation ray tracing), our method gained 2-5X speedup without losing dose calculation accuracy. Conclusion: The results show that our new method improves the effective memory bandwidth and overall performance for MCCS on the CPU-GPU systems. Our proposed method can also be applied to accelerate other Monte Carlo dose calculation approaches. This research was supported in part by NSF under Grants CCF
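
    The three-step deposition scheme can be mimicked in plain NumPy as a CPU-only cartoon (not the actual CUDA implementation): rays append (voxel, dose) records to a flat buffer instead of scattering atomically into the dose volume, and the volume is built in a single accumulation pass afterwards.

      import numpy as np

      rng = np.random.default_rng(3)
      n_vox, n_deposits = 1_000, 50_000

      # Step 1: the "GPU" pass writes (voxel index, dose) records to a flat buffer
      idx_buffer = rng.integers(0, n_vox, n_deposits)   # stand-in for ray trajectories
      dose_buffer = rng.exponential(0.1, n_deposits)

      # Step 2: buffer transfer (a no-op here); step 3: CPU accumulation pass
      volume = np.zeros(n_vox)
      np.add.at(volume, idx_buffer, dose_buffer)        # correct for repeated voxels

      # Sanity check against a direct scatter loop
      reference = np.zeros(n_vox)
      for i, d in zip(idx_buffer, dose_buffer):
          reference[i] += d
      print("max |volume - reference| =", np.abs(volume - reference).max())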

  20. Conserved directed percolation: exact quasistationary distribution of small systems and Monte Carlo simulations

    Science.gov (United States)

    César Mansur Filho, Júlio; Dickman, Ronald

    2011-05-01

    We study symmetric sleepy random walkers, a model exhibiting an absorbing-state phase transition in the conserved directed percolation (CDP) universality class. Unlike most examples of this class studied previously, this model possesses a continuously variable control parameter, facilitating analysis of critical properties. We study the model using two complementary approaches: analysis of the numerically exact quasistationary (QS) probability distribution on rings of up to 22 sites, and Monte Carlo simulation of systems of up to 32 000 sites. The resulting estimates for the critical exponents β, β/ν, ...
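
    A model in the same conserved directed percolation class can be simulated in a few lines; the sketch below uses the closely related Manna-style stochastic sandpile on a ring rather than the sleepy-walker model itself, with illustrative size and density.

      import numpy as np

      rng = np.random.default_rng(5)
      L, n_particles, max_steps = 200, 200, 20_000      # ring size, particles, cutoff
      n = np.bincount(rng.integers(0, L, n_particles), minlength=L)

      activity = []
      for _ in range(max_steps):
          active = np.flatnonzero(n >= 2)               # sites that can topple
          if active.size == 0:                          # absorbing state reached
              break
          s = rng.choice(active)                        # random-sequential update
          n[s] -= 2                                     # topple: two particles hop
          for step in rng.choice([-1, 1], size=2):
              n[(s + step) % L] += 1
          activity.append(active.size / L)

      mean_act = float(np.mean(activity)) if activity else 0.0
      print("steps survived: %d, mean activity: %.3f" % (len(activity), mean_act))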

  1. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  2. River Protection Project Integrated safety management system phase II verification report, volumes I and II - 8/19/99

    Energy Technology Data Exchange (ETDEWEB)

    SHOOP, D.S.

    1999-09-10

    The Department of Energy policy (DOE P 450.4) is that safety is integrated into all aspects of the management and operations of its facilities. In simple and straightforward terms, the Department will "Do work safely." The purpose of this River Protection Project (RPP) Integrated Safety Management System (ISMS) Phase II Verification was to determine whether ISMS programs and processes are implemented within RPP to accomplish the goal of "Do work safely." The goal of an implemented ISMS is to have a single integrated system that includes Environment, Safety, and Health (ES&H) requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the RPP life cycle. The ISMS is comprised of (1) the described functions, components, processes, and interfaces (system map or blueprint) and (2) the personnel who are executing their assigned roles and responsibilities to manage and control the ISMS. Therefore, this review evaluated both the "paper" and "people" aspects of the ISMS to ensure that the system is implemented within RPP. The Richland Operations Office (RL) conducted an ISMS Phase I Verification of the TWRS from September 28 to October 9, 1998. The resulting verification report recommended that TWRS-RL and the contractor proceed with Phase II of ISMS verification, given that the concerns identified in the Phase I review were incorporated into the Phase II implementation plan.

  3. Independent verification of monitor unit calculation for radiation treatment planning system.

    Science.gov (United States)

    Chen, Li; Chen, Li-Xin; Huang, Shao-Min; Sun, Wen-Zhao; Sun, Hong-Qiang; Deng, Xiao-Wu

    2010-02-01

    Ensuring the accuracy of dose calculation for radiation treatment plans is an important part of quality assurance (QA) procedures for radiotherapy. This study evaluated the monitor unit (MU) calculation accuracy of a third-party QA software package and a 3-dimensional treatment planning system (3D TPS), to investigate the feasibility and reliability of independent verification for radiation treatment planning. Test plans in a homogeneous phantom were designed with the 3D TPS, according to the International Atomic Energy Agency (IAEA) Technical Report No. 430, including open, blocked, wedge, and multileaf collimator (MLC) fields. Test plans were delivered and measured in the phantom. The delivered doses were input to the QA software and the independently calculated MUs were compared with delivery. All test plans were verified with independent calculation and phantom measurements separately, and the differences between the two kinds of verification were then compared. The deviation of the independent calculation from the measurements was (0.1 ± 0.9)%; the biggest difference fell on the plans that used block and wedge fields (2.0%). The mean MU difference between the TPS and the QA software was (0.6 ± 1.0)%, ranging from -0.8% to 2.8%. The deviation in dose of the TPS calculation compared to the measurements was (-0.2 ± 1.7)%, ranging from -3.9% to 2.9%. The MU accuracy of the third-party QA software is clinically acceptable. Similar results were achieved with the independent calculations and the phantom measurements for all test plans. The tested independent calculation software can be used as an efficient tool for TPS plan verification.
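
    The quoted figures are simple relative-deviation statistics between paired MU or dose values; a sketch of the bookkeeping, with invented plan values:

      import numpy as np

      measured = np.array([100.0, 102.3, 98.7, 150.2, 121.0])      # invented MUs
      independent = np.array([100.1, 101.5, 99.9, 151.0, 120.2])   # invented MUs

      deviation = 100.0 * (independent - measured) / measured
      print("deviation: (%.1f +/- %.1f)%%, range [%.1f%%, %.1f%%]"
            % (deviation.mean(), deviation.std(ddof=1), deviation.min(), deviation.max()))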

  4. Verification of a Real Time Weather Forecasting System in Southern Italy

    Directory of Open Access Journals (Sweden)

    Luca Tiriolo

    2015-01-01

    Full Text Available This paper shows the performance of an operational forecasting system, based on the regional atmospheric modeling system (RAMS), at 3 km horizontal resolution over southern Italy. The model is initialized from the 12 UTC operational analysis/forecasting cycle of the European Centre for Medium-Range Weather Forecasts (ECMWF). The forecast is issued for the following three days. The performance is evaluated for a whole year for the surface parameters: temperature, relative humidity, wind speed and direction, and precipitation. The verification has been performed against SYNOP stations over southern Italy. A dense non-GTS network over Calabria is used for precipitation. Results show that RMSE is about 2–3 K for temperature, 12–16% for relative humidity, 2.0–2.8 m/s for wind speed, and 55–75° for wind direction, the performance varying with the season and with the forecasting time. The error increases between the first and third forecast days. The verification of the rainfall forecast shows that the model underestimates the area of the precipitation. The model output statistics (MOS) is applied to all parameters but precipitation. Results show that the MOS reduces the RMSE by 0–30%, depending on the forecasting time, on the season and on the meteorological parameter.
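
    The scores quoted are standard point-verification statistics. The sketch below computes RMSE for a scalar variable and for wind direction, where differences must first be wrapped onto [-180°, 180°); all observation and forecast arrays are synthetic.

      import numpy as np

      def rmse(forecast, observed):
          return float(np.sqrt(np.mean((forecast - observed) ** 2)))

      def rmse_direction(forecast_deg, observed_deg):
          """RMSE of wind direction with differences wrapped to [-180, 180)."""
          d = (forecast_deg - observed_deg + 180.0) % 360.0 - 180.0
          return float(np.sqrt(np.mean(d ** 2)))

      rng = np.random.default_rng(2)
      t_obs = 15.0 + 5.0 * rng.standard_normal(100)          # synthetic observations
      t_fc = t_obs + rng.normal(0.5, 2.0, 100)               # biased, noisy forecast
      dir_obs = rng.uniform(0.0, 360.0, 100)
      dir_fc = (dir_obs + rng.normal(0.0, 40.0, 100)) % 360.0

      print("temperature RMSE: %.2f K" % rmse(t_fc, t_obs))
      print("wind direction RMSE: %.1f deg" % rmse_direction(dir_fc, dir_obs))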

  5. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
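
    As a reference point for the reduction methods named above, classical Guyan reduction condenses a stiffness matrix onto a set of master DOFs by statically eliminating the slave DOFs; the 4-DOF spring chain below is a toy example, and MGR/HR add corrections beyond this static condensation.

      import numpy as np

      # Toy 4-DOF spring-chain stiffness matrix (unit stiffnesses, free end)
      K = np.array([[ 2.0, -1.0,  0.0,  0.0],
                    [-1.0,  2.0, -1.0,  0.0],
                    [ 0.0, -1.0,  2.0, -1.0],
                    [ 0.0,  0.0, -1.0,  1.0]])
      masters, slaves = [0, 3], [1, 2]        # retained and eliminated DOFs

      Kmm = K[np.ix_(masters, masters)]
      Kms = K[np.ix_(masters, slaves)]
      Ksm = K[np.ix_(slaves, masters)]
      Kss = K[np.ix_(slaves, slaves)]

      # Classical Guyan (static) condensation: K_red = Kmm - Kms Kss^-1 Ksm
      K_red = Kmm - Kms @ np.linalg.solve(Kss, Ksm)
      print(K_red)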

  6. Penelope-2006: a code system for Monte Carlo simulation of electron and photon transport

    International Nuclear Information System (INIS)

    2006-01-01

    The computer code system PENELOPE (version 2006) performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials for a wide energy range, from a few hundred eV to about 1 GeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A geometry package called PENGEOM permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the PENELOPE code system, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. These proceedings contain the corresponding manual and teaching notes of the PENELOPE-2006 workshop and training course, held on 4-7 July 2006 in Barcelona, Spain. (author)

  7. Experimental verification of mathematical model of the heat transfer in exhaust system

    Directory of Open Access Journals (Sweden)

    Petković Snežana

    2011-01-01

    Full Text Available A Catalyst convertor has maximal efficiency when it reaches working temperature. In a cold start phase efficiency of the catalyst is low and exhaust emissions have high level of air pollutants. The exhaust system optimization, in order to decrease time of achievement of the catalyst working temperature, caused reduction of the total vehicle emission. Implementation of mathematical models in development of exhaust systems decrease total costs and reduce time. Mathematical model has to be experimentally verified and calibrated, in order to be useful in the optimization process. Measurement installations have been developed and used for verification of the mathematical model of unsteady heat transfer in exhaust systems. Comparisons between experimental results and the mathematical model are presented in this paper. Based on obtained results, it can be concluded that there is a good agreement between the model and the experimental results.

  8. Issues of verification and validation of application-specific integrated circuits in reactor trip systems

    International Nuclear Information System (INIS)

    Battle, R.E.; Alley, G.T.

    1993-01-01

    Concepts of using application-specific integrated circuits (ASICs) in nuclear reactor safety systems are evaluated. The motivation for this evaluation stems from the difficulty of proving that software-based protection systems are adequately reliable. Important issues concerning the reliability of computers and software are identified and used to evaluate features of ASICs. These concepts indicate that ASICs have several advantages over software for simple systems. The primary advantage of ASICs over software is that verification and validation (V&V) of ASICs can be done with much higher confidence than can be done with software. A method of performing this V&V on ASICs is being developed at Oak Ridge National Laboratory. The purpose of the method being developed is to help eliminate design and fabrication errors. It will not solve problems with incorrect requirements or specifications.

  9. Research on MRV system of iron and steel industry and verification mechanism establishment in China

    Science.gov (United States)

    Guo, Huiting; Chen, Liang; Chen, Jianhua

    2017-12-01

    The national carbon emissions trading market will be launched in 2017 in China, with the iron and steel industry covered as one of the first industries. Establishing its MRV (monitoring, reporting and verification) system is critical to promoting the participation of the iron and steel industry in the carbon trading market. This paper studies the requirements and procedures for accounting, monitoring, reporting and verification in the seven carbon trading pilots that cover the iron and steel industry. The construction and operating mechanisms of their MRV systems are also analyzed. Combining these with the emission features of the iron and steel industry, we study a suitable national MRV system for the whole iron and steel industry to complete the future national carbon trading framework for the sector.

  10. The grout/glass performance assessment code system (GPACS) with verification and benchmarking

    International Nuclear Information System (INIS)

    Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.

    1994-12-01

    GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.

  11. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    International Nuclear Information System (INIS)

    Mori, Takamasa; Nakagawa, Masayuki; Kaneko, Kunio.

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author)

  12. Neutron cross section library production code system for continuous energy Monte Carlo code MVP. LICEM

    Energy Technology Data Exchange (ETDEWEB)

    Mori, Takamasa; Nakagawa, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Kunio

    1996-05-01

    A code system has been developed to produce neutron cross section libraries for the MVP continuous energy Monte Carlo code from an evaluated nuclear data library in the ENDF format. The code system consists of 9 computer codes, and can process nuclear data in the latest ENDF-6 format. By using the present system, MVP neutron cross section libraries for important nuclides in reactor core analyses, shielding and fusion neutronics calculations have been prepared from JENDL-3.1, JENDL-3.2, JENDL-FUSION file and ENDF/B-VI data bases. This report describes the format of MVP neutron cross section library, the details of each code in the code system and how to use them. (author).

  13. Development of decommissioning management system. 9. Remodeling to PC system and system verification by evaluation of real work

    International Nuclear Information System (INIS)

    Kondo, Hitoshi; Fukuda, Seiji; Okubo, Toshiyuki

    2004-03-01

    When planning the decommissioning of facilities such as nuclear fuel cycle facilities and small-scale research reactors, it is necessary to select the technology and the sequence of the work procedure, and to optimize the indices concerning dismantling the facility (such as the radiation dose, the cost, the amount of waste, the number of workers, and the duration of the work). Our waste management section is developing a decommissioning management system called 'DECMAN' to support the preparation of decommissioning plans. DECMAN automatically calculates the indices from the facility data and the dismantling method. This paper describes the porting of the program to a personal computer and the verification of the system by evaluation of real work (dismantling of the liquor dissolver in the old JOYO Waste Treatment Facility (the old JWTF), the glove boxes in the Deuterium Critical Assembly (DCA), and the incinerator in the Waste Dismantling Facility (WDF)). The outline of the remodeling and verification is as follows. (1) Additional functions: 1) equipment arrangement mapping, 2) evaluation of the radiation dose using the air dose rate, 3) I/O of data using EXCEL (software). (2) Comparison of the amount of work between calculated and actual values: the calculated value was 222.67 man-hours against the actual value of 249.40 man-hours in the old JWTF evaluation. (3) Accompanying work can be forecast by multiplying the calculated value by a certain coefficient. (4) A new approach for estimating the amount of work was constructed using the calculated values of DECMAN. (author)

  14. Full modelling of the MOSAIC animal PET system based on the GATE Monte Carlo simulation code

    International Nuclear Information System (INIS)

    Merheb, C; Petegnief, Y; Talbot, J N

    2007-01-01

    Positron emission tomography (PET) systems dedicated to animal imaging are now widely used for biological studies. The scanner performance strongly depends on the design and the characteristics of the system. Many parameters must be optimized like the dimensions and type of crystals, geometry and field-of-view (FOV), sampling, electronics, lightguide, shielding, etc. Monte Carlo modelling is a powerful tool to study the effect of each of these parameters on the basis of realistic simulated data. Performance assessment in terms of spatial resolution, count rates, scatter fraction and sensitivity is an important prerequisite before the model can be used instead of real data for a reliable description of the system response function or for optimization of reconstruction algorithms. The aim of this study is to model the performance of the Philips Mosaic™ animal PET system using a comprehensive PET simulation code in order to understand and describe the origin of important factors that influence image quality. We use GATE, a Monte Carlo simulation toolkit, for a realistic description of the ring PET model, the detectors, shielding, cap, electronic processing and dead times. We incorporate new features to adjust signal processing to the Anger logic underlying the Mosaic™ system. Special attention was paid to dead time and energy spectra descriptions. Sorting of simulated events in a list-mode format similar to the system outputs was developed to compare experimental and simulated sensitivity and scatter fractions for different energy thresholds using various models of phantoms describing rat and mouse geometries. Count rates were compared for both cylindrical homogeneous phantoms. Simulated spatial resolution was fitted to experimental data for 18F point sources at different locations within the FOV with an analytical blurring function for electronic processing effects. Simulated and measured sensitivities differed by less than 3%, while scatter fractions agreed

  15. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  16. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  17. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, Fabian F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Yu, Yi-Hsiang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nielsen, Kim [Ramboll, Copenhagen (Denmark); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bunnik, Tim [MARIN (Netherlands); Touzon, Imanol [Tecnalia (Spain); Nam, Bo Woo [KRISO (Korea, Rep. of); Kim, Jeong Seok [KRISO (Korea, Rep. of); Janson, Carl Erik [Chalmers University (Sweden); Jakobsen, Ken-Robert [EDRMedeso (Norway); Crowley, Sarah [WavEC (Portugal); Vega, Luis [Hawaii Natural Energy Institute (United States); Rajagopalan, Krishnakimar [Hawaii Natural Energy Institute (United States); Mathai, Thomas [Glosten (United States); Greaves, Deborah [Plymouth University (United Kingdom); Ransley, Edward [Plymouth University (United Kingdom); Lamont-Kane, Paul [Queen's University Belfast (United Kingdom); Sheng, Wanan [University College Cork (Ireland); Costello, Ronan [Wave Venture (United Kingdom); Kennedy, Ben [Wave Venture (United Kingdom); Thomas, Sarah [Floating Power Plant (Denmark); Heras, Pilar [Floating Power Plant (Denmark); Bingham, Harry [Technical University of Denmark (Denmark); Kurniawan, Adi [Aalborg University (Denmark); Kramer, Morten Mejlhede [Aalborg University (Denmark); Ogden, David [INNOSEA (France); Girardin, Samuel [INNOSEA (France); Babarit, Aurelien [EC Nantes (France); Wuillaume, Pierre-Yves [EC Nantes (France); Steinke, Dean [Dynamic Systems Analysis (Canada); Roy, Andre [Dynamic Systems Analysis (Canada); Beatty, Scott [Cascadia Coast Research (Canada); Schofield, Paul [ANSYS (United States); Kim, Kyong-Hwan [KRISO (Korea, Rep. of); Jansson, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden); BCAM (Spain); Hoffman, Johan [KTH Royal Inst. of Technology, Stockholm (Sweden)

    2017-10-16

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  18. Verification of absorbed dose calculation with XIO Radiotherapy Treatment Planning System

    International Nuclear Information System (INIS)

    Bokulic, T.; Budanec, M.; Frobe, A.; Gregov, M.; Kusic, Z.; Mlinaric, M.; Mrcela, I.

    2013-01-01

    Modern radiotherapy relies on computerized treatment planning systems (TPS) for absorbed dose calculation. Most TPS require a detailed model of a given machine and its therapy beams. The International Atomic Energy Agency (IAEA) recommends acceptance testing for the TPS (IAEA-TECDOC-1540). In this study we present a customization of those tests, with measurements, for the verification of beam models intended for clinical use in our department. Installation of the Elekta Synergy S linear accelerator and data acquisition for the Elekta CMS XiO 4.62 TPS were completed in 2011. After the completion of beam modelling in the TPS, tests were conducted in accordance with the IAEA protocol for TPS dose calculation verification. The deviations between the measured and calculated dose were recorded for 854 points and 11 groups of tests in a homogeneous phantom. Most of the deviations were within tolerance. Similar to previously published results, the results for the irregular L-shaped field and asymmetric wedged fields were out of tolerance for certain groups of points. (author)

  19. Verification and Validation Challenges for Adaptive Flight Control of Complex Autonomous Systems

    Science.gov (United States)

    Nguyen, Nhan T.

    2018-01-01

    Autonomy of aerospace systems requires the ability of flight control systems to adapt to complex, uncertain, dynamic environments. In spite of five decades of research in adaptive control, the fact remains that no adaptive control system has yet been deployed on any safety-critical or human-rated production system such as a passenger transport aircraft. The problem lies in the difficulty of certifying adaptive control systems, since existing certification methods cannot readily be used for nonlinear adaptive control systems. Research addressing the notion of metrics for adaptive control began to appear in recent years. These metrics, if accepted, could pave a path towards certification that would potentially lead to the adoption of adaptive control as a future control technology for safety-critical and human-rated production systems. Development of certifiable adaptive control systems represents a major challenge to overcome. Adaptive control systems with learning algorithms will never become part of the future unless it can be proven that they are highly safe and reliable. Rigorous methods for adaptive control software verification and validation must therefore be developed to ensure that adaptive control system software failures will not occur, to verify that the adaptive control system functions as required, to eliminate unintended functionality, and to demonstrate that certification requirements imposed by regulatory bodies such as the Federal Aviation Administration (FAA) can be satisfied. This presentation will discuss some of the technical issues with adaptive flight control and related V&V challenges.

  20. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in a distributed control system. To improve the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to solve the state-space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains online, owing to the real-time nature of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters needed to avoid collisions between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal and abnormal conditions influenced by a packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic time Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE — BAYSAVER TECHNOLOGIES, INC. BAYSAVER SEPARATION SYSTEM, MODEL 10K

    Science.gov (United States)

    Verification testing of the BaySaver Separation System, Model 10K was conducted on a 10-acre drainage basin near downtown Griffin, Georgia. The system consists of two watertight pre-cast concrete manholes and a high-density polyethylene BaySaver Separator Unit. The BaySaver Mod...

  2. Monte Carlo calculations on the magnetization profile and domain wall structure in bulk systems and nanoconstrictions

    Energy Technology Data Exchange (ETDEWEB)

    Serena, P. A. [Instituto de Ciencias de Materiales de Madrid, Madrid (Spain); Costa-Kraemer, J. L. [Instituto de Microelectronica de Madrid, Madrid (Spain)

    2001-03-01

    A Monte Carlo algorithm suitable for studying systems described by an anisotropic Heisenberg Hamiltonian is presented. This technique has been tested successfully with 3D and 2D systems, illustrating how magnetic properties depend on the dimensionality and the coordination number. We have found that the magnetic properties of constrictions differ from those appearing in bulk. In particular, spin fluctuations are considerably larger than those calculated for bulk materials. In addition, domain walls are strongly modified when a constriction is present, with a decrease of the domain-wall width. This decrease is explained in terms of previous theoretical works.

  3. Design and verification of computer-based reactor control system modification at Bruce-A CANDU nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  4. IMRT head and neck treatment planning with a commercially available Monte Carlo based planning system

    International Nuclear Information System (INIS)

    Boudreau, C; Heath, E; Seuntjens, J; Ballivy, O; Parker, W

    2005-01-01

    The PEREGRINE Monte Carlo dose-calculation system (North American Scientific, Cranberry Township, PA) is the first commercially available Monte Carlo dose-calculation code intended specifically for intensity modulated radiotherapy (IMRT) treatment planning and quality assurance. In order to assess the impact of Monte Carlo based dose calculations for IMRT clinical cases, dose distributions for 11 head and neck patients were evaluated using both PEREGRINE and the CORVUS (North American Scientific, Cranberry Township, PA) finite size pencil beam (FSPB) algorithm with equivalent path-length (EPL) inhomogeneity correction. For the target volumes, PEREGRINE calculations predict, on average, a less than 2% difference in the calculated mean and maximum doses to the gross tumour volume (GTV) and clinical target volume (CTV). An average 16% ± 4% and 12% ± 2% reduction in the volume covered by the prescription isodose line was observed for the GTV and CTV, respectively. Overall, no significant differences were noted in the doses to the mandible and spinal cord. For the parotid glands, PEREGRINE predicted a 6% ± 1% increase in the volume of tissue receiving a dose greater than 25 Gy and an increase of 4% ± 1% in the mean dose. Similar results were noted for the brainstem where PEREGRINE predicted a 6% ± 2% increase in the mean dose. The observed differences between the PEREGRINE and CORVUS calculated dose distributions are attributed to secondary electron fluence perturbations, which are not modelled by the EPL correction, issues of organ outlining, particularly in the vicinity of air cavities, and differences in dose reporting (dose to water versus dose to tissue type)

  5. Commissioning of a Monte Carlo treatment planning system for clinical use in radiation therapy; Evaluacion de un sistema de planificacion Monte Carlo de uso clinico para radioterapia

    Energy Technology Data Exchange (ETDEWEB)

    Zucca Aparcio, D.; Perez Moreno, J. M.; Fernandez Leton, P.; Garcia Ruiz-Zorrila, J.

    2016-10-01

    The commissioning procedures of a Monte Carlo (MC) treatment planning system for photon beams from a dedicated stereotactic body radiosurgery (SBRT) unit are reported in this document. XVMC is the MC code available in the evaluated treatment planning system (BrainLAB iPlan RT Dose), which is based on virtual source models that simulate the primary and scattered radiation, as well as the electron contamination, using Gaussian components whose modelling requires measurements of dose profiles, percentage depth doses and output factors, performed both in water and in air. The dosimetric accuracy of the particle transport simulation has been analyzed by validating the calculations in homogeneous and heterogeneous media against measurements made under the same conditions as the dose calculation, and by checking the stochastic behaviour of the Monte Carlo calculations when different statistical variances are used. Likewise, it has been verified how the planning system performs the conversion from dose to medium to dose to water, applying the water-to-medium stopping power ratio, in the presence of heterogeneities where this phenomenon is relevant, such as high-density media (cortical bone). (Author)

  6. SAFTAC, Monte-Carlo Fault Tree Simulation for System Design Performance and Optimization

    International Nuclear Information System (INIS)

    Crosetti, P.A.; Garcia de Viedma, L.

    1976-01-01

    1 - Description of problem or function: SAFTAC is a Monte Carlo fault tree simulation program that provides a systematic approach for analyzing system design, performing trade-off studies, and optimizing system changes or additions. 2 - Method of solution: SAFTAC assumes an exponential failure distribution for basic input events and a choice of either Gaussian-distributed or constant repair times. The program views the system represented by the fault tree as a statistical assembly of independent basic input events, each characterized by an exponential failure distribution and, if used, a constant or normal repair distribution. 3 - Restrictions on the complexity of the problem: The program is dimensioned to handle 1100 basic input events and 1100 logical gates. It can be re-dimensioned to handle up to 2000 basic input events and 2000 logical gates within the existing core memory.
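
    As a rough sketch of the sampling scheme SAFTAC is described as using (exponential failure distributions, constant repair times), the toy program below estimates the probability that a two-component AND gate is ever true within a mission. The failure rates, repair time and mission time are invented, and only each component's first failure is sampled, which is a simplification of the full simulation.

```python
import random

def simulate_and_gate(lam1, lam2, repair, mission_time, trials=100_000):
    """Monte Carlo estimate of the probability that two redundant
    components (exponential time-to-failure, constant repair time)
    are down simultaneously at some point within the mission time."""
    failures = 0
    for _ in range(trials):
        t1 = random.expovariate(lam1)          # first failure of component 1
        t2 = random.expovariate(lam2)          # first failure of component 2
        if t1 < mission_time and t2 < mission_time:
            # the AND gate fires if the two outage windows overlap
            if max(t1, t2) < min(t1 + repair, t2 + repair):
                failures += 1
    return failures / trials

# hypothetical failure rates (per hour) and a 1000-hour mission
print(simulate_and_gate(1e-3, 2e-3, repair=24.0, mission_time=1000.0))
```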

  7. A Tool for Verification and Validation of Neural Network Based Adaptive Controllers for High Assurance Systems

    Science.gov (United States)

    Gupta, Pramod; Schumann, Johann

    2004-01-01

    High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as for monitoring the network performance during operation. The tool has been implemented in Simulink, and simulation results on an F-15 aircraft are presented.
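
    The abstract does not spell out how the error bar is constructed, so the sketch below substitutes a simple ensemble-based stand-in: evaluate several networks (or bootstrap replicas) on the same input and report the mean output with a z-based confidence interval. All numbers are hypothetical.

```python
import numpy as np

def prediction_error_bar(predictions, z=1.96):
    """Mean prediction and a simple confidence interval
    (mean +/- z * standard error) from ensemble outputs."""
    p = np.asarray(predictions, dtype=float)
    mean = p.mean()
    stderr = p.std(ddof=1) / np.sqrt(p.size)
    return mean, (mean - z * stderr, mean + z * stderr)

# hypothetical ensemble outputs for one control command
mean, (lo, hi) = prediction_error_bar([0.52, 0.48, 0.50, 0.55, 0.47])
print(f"output {mean:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```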

  8. Current status of the verification and processing system GALILÉE-1 for evaluated data

    Science.gov (United States)

    Coste-Delclaux, Mireille; Jouanne, Cédric; Moreau, Frédéric; Mounier, Claude; Visonneau, Thierry; Dolci, Florence

    2017-09-01

    This paper describes the current status of GALILÉE-1, the new verification and processing system for evaluated data developed at CEA. It consists of various components respectively dedicated to reading/writing the evaluated data whatever the format, to diagnosing inconsistencies in the evaluated data, and to providing continuous-energy and multigroup data as well as probability tables for transport and depletion codes. All these components are written in the C++ language and share the same objects. Cross-comparisons with other processing systems (NJOY, CALENDF or PREPRO) are systematically carried out at each step in order to fully master possible discrepancies. Some results of such comparisons are provided.

  9. Method and practice on safety software verification and validation for digital reactor protection system

    International Nuclear Information System (INIS)

    Li Duo; Zhang Liangju; Feng Junting

    2010-01-01

    The key issue arising from the digitalization of a reactor protection system for a Nuclear Power Plant (NPP) is, in essence, how to carry out Verification and Validation (V and V) to demonstrate and confirm that the software is reliable enough to perform reactor safety functions. Among others, the most important activity of the software V and V process is unit testing. This paper discusses the basic concepts of safety software V and V and the appropriate techniques for software unit testing, focusing on such aspects as how to ensure test completeness, how to establish a test platform, how to develop test cases and how to carry out unit testing. The technique discussed herein was successfully used in the unit testing of the safety software of a digital reactor protection system. (author)

  10. Specification and Verification of Distributed Embedded Systems: A Traffic Intersection Product Family

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.

  11. BrachyView, A novel inbody imaging system for HDR prostate brachytherapy: Design and Monte Carlo feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Safavi-Naeini, M.; Han, Z.; Cutajar, D.; Guatelli, S.; Petasecca, M.; Lerch, M. L. F. [Centre for Medical Radiation Physics, University of Wollongong, Wollongong, NSW 2522 (Australia); Franklin, D. R. [Faculty of Engineering and Information Technology, University of Technology, Sydney, NSW 2007 (Australia); Jakubek, J.; Pospisil, S. [Institute of Experimental and Applied Physics (IEAP), Czech Technical University in Prague (CTU) (Czech Republic); Bucci, J.; Zaider, M.; Rosenfeld, A. B. [St. George Hospital Cancer Care Centre, Gray Street, Kogarah, NSW 2217 (Australia); Memorial Sloan Kettering Cancer Center, 1275 York Avenue, New York, New York 10065 (United States); Centre for Medical Radiation Physics, University of Wollongong, Wollongong, NSW 2522 (Australia)

    2013-07-15

    Purpose: High dose rate (HDR) brachytherapy is a form of radiation therapy for treating prostate cancer whereby a high activity radiation source is moved between predefined positions inside applicators inserted within the treatment volume. Accurate positioning of the source is essential in delivering the desired dose to the target area while avoiding radiation injury to the surrounding tissue. In this paper, HDR BrachyView, a novel inbody dosimetric imaging system for real time monitoring and verification of the radioactive seed position in HDR prostate brachytherapy treatment is introduced. The current prototype consists of a 15 × 60 mm² silicon pixel detector with a multipinhole tungsten collimator placed 6.5 mm above the detector. Seven identical pinholes allow full imaging coverage of the entire treatment volume. The combined pinhole and pixel sensor arrangement is geometrically designed to be able to resolve the three-dimensional location of the source. The probe may be rotated to keep the whole prostate within the transverse plane. The purpose of this paper is to demonstrate the efficacy of the design through computer simulation, and to estimate the accuracy in resolving the source position (in detector plane and in 3D space) as part of the feasibility study for the BrachyView project. Methods: Monte Carlo simulations were performed using the GEANT4 radiation transport model, with a ¹⁹²Ir source placed in different locations within a prostate phantom. A geometrically accurate model of the detector and collimator was constructed. Simulations were conducted with a single pinhole to evaluate the pinhole design and the signal-to-background ratio obtained. Second, a pair of adjacent pinholes was simulated to evaluate the error in the calculated source location. Results: Simulation results show that accurate determination of the true source position is easily obtainable within the typical one-second source dwell time. The maximum error in

  12. BrachyView, a novel inbody imaging system for HDR prostate brachytherapy: design and Monte Carlo feasibility study.

    Science.gov (United States)

    Safavi-Naeini, M; Han, Z; Cutajar, D; Guatelli, S; Petasecca, M; Lerch, M L F; Franklin, D R; Jakubek, J; Pospisil, S; Bucci, J; Zaider, M; Rosenfeld, A B

    2013-07-01

    High dose rate (HDR) brachytherapy is a form of radiation therapy for treating prostate cancer whereby a high activity radiation source is moved between predefined positions inside applicators inserted within the treatment volume. Accurate positioning of the source is essential in delivering the desired dose to the target area while avoiding radiation injury to the surrounding tissue. In this paper, HDR BrachyView, a novel inbody dosimetric imaging system for real time monitoring and verification of the radioactive seed position in HDR prostate brachytherapy treatment is introduced. The current prototype consists of a 15 × 60 mm(2) silicon pixel detector with a multipinhole tungsten collimator placed 6.5 mm above the detector. Seven identical pinholes allow full imaging coverage of the entire treatment volume. The combined pinhole and pixel sensor arrangement is geometrically designed to be able to resolve the three-dimensional location of the source. The probe may be rotated to keep the whole prostate within the transverse plane. The purpose of this paper is to demonstrate the efficacy of the design through computer simulation, and to estimate the accuracy in resolving the source position (in detector plane and in 3D space) as part of the feasibility study for the BrachyView project. Monte Carlo simulations were performed using the GEANT4 radiation transport model, with a (192)Ir source placed in different locations within a prostate phantom. A geometrically accurate model of the detector and collimator was constructed. Simulations were conducted with a single pinhole to evaluate the pinhole design and the signal-to-background ratio obtained. Second, a pair of adjacent pinholes was simulated to evaluate the error in the calculated source location. Simulation results show that accurate determination of the true source position is easily obtainable within the typical one second source dwell time. The maximum error in the estimated projection position was found to be

  13. SU-E-T-24: A Simple Correction-Based Method for Independent Monitor Unit (MU) Verification in Monte Carlo (MC) Lung SBRT Plans

    Energy Technology Data Exchange (ETDEWEB)

    Pokhrel, D; Badkul, R; Jiang, H; Estes, C; Kumar, P; Wang, F [University of Kansas Medical Center, Kansas City, KS (United States)

    2014-06-01

    Purpose: Lung SBRT uses hypo-fractionated doses in small non-IMRT fields with tissue-heterogeneity-corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MUs obtained from the iPlan XVMC algorithm against spreadsheet-based hand calculations using the most commonly used simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions for PTV V100% = 95% were studied. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1 to 106.5 cc (average = 48.6 cc). MC-SBRT plans were generated using a combination of non-coplanar conformal arcs/beams with the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX consisting of micro-MLCs and a 6 MV-SRS (1000 MU/min) beam. These plans were re-computed using the heterogeneity-corrected Pencil-Beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): individual correction. For an independent second check, MC MUs were verified using TMR-based hand calculations, from which an average ICF was obtained: average correction; the TMR-based hand calculation systematically underestimated MC MUs by ∼5%. Also, the first 10 MC plans were verified with ion-chamber measurements using a homogeneous phantom. Results: For both beams/arcs, the mean PB-hete dose was systematically overestimated by 5.5±2.6% and the mean hand-calculated MUs were systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC within -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After the average 5% correction, hand-calculated MUs matched XVMC within 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. A smaller dependence on tumor volume (TV)/field size (FS) was also observed. Ion-chamber measurements were within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to
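
    A minimal sketch of the correction described, assuming the ICF is the beam-wise dose ratio PB-hete/MC applied multiplicatively to the TMR-based hand-calculated MU; the numbers are invented to mirror the roughly 5% effect reported.

```python
def corrected_mu(hand_mu, dose_pb_hete, dose_mc):
    """Scale a TMR-based hand-calculated MU by the inhomogeneity
    correction factor ICF = PB-hete dose / MC dose for that beam."""
    return hand_mu * (dose_pb_hete / dose_mc)

# hypothetical beam: PB-hete predicts ~5.5% more dose per MU than MC,
# so the ICF inflates the hand-calculated MU toward the MC value.
print(corrected_mu(hand_mu=250.0, dose_pb_hete=2.11, dose_mc=2.00))
```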

  14. A study of Monte Carlo methods for weak approximations of stochastic particle systems in the mean-field?

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-01-08

    I discuss using single-level and multilevel Monte Carlo methods to compute quantities of interest of a stochastic particle system in the mean-field. In this context, the stochastic particles follow a coupled system of Ito stochastic differential equations (SDEs). Moreover, this stochastic particle system converges to a stochastic mean-field limit as the number of particles tends to infinity. I start by recalling the results of applying different versions of Multilevel Monte Carlo (MLMC) to particle systems, both with respect to time steps and the number of particles, and using a partitioning estimator. Next, I expand on these results by proposing the use of our recent Multi-index Monte Carlo method to obtain improved convergence rates.
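
    For readers unfamiliar with the method, the telescoping MLMC estimator fits in a few lines. The level "payoff" below is a toy stand-in rather than the particle-system SDE of the talk, and the samples per level are fixed arbitrarily instead of being optimized from variance estimates.

```python
import numpy as np

def mlmc(sample_difference, levels, samples_per_level):
    """Multilevel Monte Carlo: sum over levels of the sample mean of
    Y_l = P_l - P_{l-1}, with Y_0 = P_0, so the sum telescopes to E[P_L]."""
    return sum(np.mean([sample_difference(l) for _ in range(m)])
               for l, m in zip(levels, samples_per_level))

rng = np.random.default_rng(0)

def y(l):
    """Coupled fine/coarse sample of a toy 'discretized' payoff of
    X ~ N(0,1) whose bias shrinks like 2**-l (purely illustrative)."""
    x = rng.normal()
    fine = x * x * (1 - 2.0 ** -(l + 1))
    coarse = 0.0 if l == 0 else x * x * (1 - 2.0 ** -l)
    return fine - coarse

print(mlmc(y, levels=range(5),
           samples_per_level=[4000, 2000, 1000, 500, 250]))  # ~ E[X^2] = 1
```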

  15. Managing the Verification Trajectory

    NARCIS (Netherlands)

    Ruys, T.C.; Brinksma, Hendrik

    In this paper we take a closer look at the automated analysis of designs, in particular of verification by model checking. Model checking tools are increasingly being used for the verification of real-life systems in an industrial context. In addition to ongoing research aimed at curbing the

  16. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can effectively improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  17. Practical experience with a local verification system for containment and surveillance sensors

    International Nuclear Information System (INIS)

    Lauppe, W.D.; Richter, B.; Stein, G.

    1984-01-01

    With the growing number of nuclear facilities and a number of large commercial bulk-handling facilities steadily coming into operation, the International Atomic Energy Agency is faced with increasing requirements to reduce its inspection effort. One means of meeting these requirements will be to deploy facility-based remote interrogation methods for its containment and surveillance instrumentation. Such a technical concept of remote interrogation was realized through the so-called LOVER system development, a local verification system for electronic safeguards seal systems. In the present investigations, the application was extended to radiation monitoring by introducing an electronic interface between the electronic safeguards seal and the neutron detector electronics of a waste monitoring system. The paper discusses the safeguards motivation and background, the experimental setup of the safeguards system and the performance characteristics of this LOVER system. First conclusions can be drawn from the performance results with respect to applicability in international safeguards. This comprises in particular the definition of design specifications for an integrated remote interrogation system for various types of containment and surveillance instruments and the specification of safeguards applications employing such a system.

  18. Verification and validation issues for digitally-based NPP safety systems

    International Nuclear Information System (INIS)

    Ets, A.R.

    1993-01-01

    The trend toward standardization, integration and reduced costs has led to increasing use of digital systems in reactor protection systems. While digital systems provide maintenance and performance advantages, their use also introduces new safety issues, in particular with regard to software. Current practice relies on verification and validation (V and V) to ensure the quality of safety software. However, effective V and V must be done in conjunction with a structured software development process and must consider the context of the safety system application. This paper presents some of the issues and concerns that impact on the V and V process. These include documentation of system requirements, common mode failures, hazards analysis and independence. These issues and concerns arose during evaluations of NPP safety systems for advanced reactor designs and digital I and C retrofits for existing nuclear plants in the United States. The pragmatic lessons from actual system reviews can provide a basis for further refinement and development of guidelines for applying V and V to NPP safety systems. (author). 14 refs

  19. Space applications of the MITS electron-photon Monte Carlo transport code system

    International Nuclear Information System (INIS)

    Kensek, R.P.; Lorence, L.J.; Halbleib, J.A.; Morel, J.E.

    1996-01-01

    The MITS multigroup/continuous-energy electron-photon Monte Carlo transport code system has matured to the point that it is capable of addressing more realistic three-dimensional adjoint applications. It is first employed to efficiently predict point doses as a function of source energy for simple three-dimensional experimental geometries exposed to simulated uniform isotropic planar sources of monoenergetic electrons up to 4.0 MeV. Results are in very good agreement with experimental data. It is then used to efficiently simulate dose to a detector in a subsystem of a GPS satellite due to its natural electron environment, employing a relatively complex model of the satellite. The capability for survivability analysis of space systems is demonstrated, and results are obtained with and without variance reduction

  20. Absolute calibration of in vivo measurement systems using magnetic resonance imaging and Monte Carlo computations

    International Nuclear Information System (INIS)

    Mallett, M.W.

    1991-01-01

    Lawrence Livermore National Laboratory (LLNL) is currently investigating a new method for obtaining absolute calibration factors for radiation measurement systems used to measure internally deposited radionuclides in vivo. This method uses magnetic resonance imaging (MRI) to determine the anatomical makeup of an individual. A new MRI technique is also employed that is capable of resolving the fat and water content of the human tissue. This anatomical and biochemical information is used to model a mathematical phantom. Monte Carlo methods are then used to simulate the transport of radiation throughout the phantom. By modeling the detection equipment of the in vivo measurement system into the code, calibration factors are generated that are specific to the individual. Furthermore, this method eliminates the need for surrogate human structures in the calibration process. A demonstration of the proposed method is being performed using a fat/water matrix.

  1. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography

    International Nuclear Information System (INIS)

    Saha, Krishnendu; Straus, Kenneth J.; Glick, Stephen J.; Chen, Yu.

    2014-01-01

    To maximize sensitivity, it is desirable that ring Positron Emission Tomography (PET) systems dedicated for imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo Simulation software was utilized to accurately model the system matrix for a breast PET system. A strategy for increasing the count statistics in the system matrix computation and for reducing the system element storage space was used by calculating only a subset of matrix elements and then estimating the rest of the elements by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at 45% reduced noise level and 1.5 to 3 times resolution performance improvement when compared to MLEM reconstruction using a simple line-integral model. The GATE based system matrix reconstruction technique promises to improve resolution and noise performance and reduce image distortion at FOV periphery compared to line-integral based system matrix reconstruction.
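
    A minimal MLEM sketch showing where a precomputed (e.g., Monte Carlo generated) system matrix enters the reconstruction. The 3-detector/2-voxel system is a toy stand-in; the polar-voxel, block-circulant storage scheme described above is not implemented here.

```python
import numpy as np

def mlem(system_matrix, sinogram, iterations=200):
    """Standard MLEM update x <- x * A^T(y / Ax) / A^T 1 with a
    precomputed system matrix A."""
    a = np.asarray(system_matrix, dtype=float)
    y = np.asarray(sinogram, dtype=float)
    x = np.ones(a.shape[1])                    # uniform initial image
    sensitivity = a.sum(axis=0)                # A^T 1, per-voxel sensitivity
    for _ in range(iterations):
        projection = a @ x
        projection[projection == 0] = 1e-12    # guard against divide-by-zero
        x *= (a.T @ (y / projection)) / sensitivity
    return x

# tiny hypothetical system: 3 detector pairs, 2 voxels
A = np.array([[0.8, 0.1], [0.1, 0.8], [0.3, 0.3]])
true_image = np.array([2.0, 5.0])
print(mlem(A, A @ true_image))                 # converges toward [2, 5]
```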

  2. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Saha, Krishnendu [Ohio Medical Physics Consulting, Dublin, Ohio 43017 (United States); Straus, Kenneth J.; Glick, Stephen J. [Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts 01655 (United States); Chen, Yu. [Department of Radiation Oncology, Columbia University, New York, New York 10032 (United States)

    2014-08-28

    To maximize sensitivity, it is desirable that ring Positron Emission Tomography (PET) systems dedicated for imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo Simulation software was utilized to accurately model the system matrix for a breast PET system. A strategy for increasing the count statistics in the system matrix computation and for reducing the system element storage space was used by calculating only a subset of matrix elements and then estimating the rest of the elements by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at 45% reduced noise level and 1.5 to 3 times resolution performance improvement when compared to MLEM reconstruction using a simple line-integral model. The GATE based system matrix reconstruction technique promises to improve resolution and noise performance and reduce image distortion at FOV periphery compared to line-integral based system matrix reconstruction.

  3. Speaker verification system using acoustic data and non-acoustic data

    Science.gov (United States)

    Gable, Todd J [Walnut Creek, CA; Ng, Lawrence C [Danville, CA; Holzrichter, John F [Berkeley, CA; Burnett, Greg C [Livermore, CA

    2006-03-21

    A method and system for speech characterization. One embodiment includes a method for speaker verification which includes collecting data from a speaker, wherein the data comprises acoustic data and non-acoustic data. The data is used to generate a template that includes a first set of "template" parameters. The method further includes receiving a real-time identity claim from a claimant, and using acoustic data and non-acoustic data from the identity claim to generate a second set of parameters. The method further includes comparing the first set of parameters to the second set of parameters to determine whether the claimant is the speaker. The first set of parameters and the second set of parameters include at least one purely non-acoustic parameter, including a non-acoustic glottal shape parameter derived from averaging multiple glottal cycle waveforms.
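
    The patent text stays at the level of "compare the two parameter sets", so the sketch below is only an illustrative stand-in: it scores a claimant's parameter vector against the enrolled template with cosine similarity and a fixed acceptance threshold. The feature values and threshold are hypothetical.

```python
import numpy as np

def verify(template_params, claim_params, threshold=0.95):
    """Toy accept/reject decision from cosine similarity between the
    enrolled template vector and the claimant's vector; both are assumed
    to mix acoustic features with non-acoustic (e.g., glottal) ones."""
    t = np.asarray(template_params, dtype=float)
    c = np.asarray(claim_params, dtype=float)
    score = t @ c / (np.linalg.norm(t) * np.linalg.norm(c))
    return score >= threshold, score

accepted, score = verify([0.31, 1.7, 0.42, 5.1], [0.30, 1.8, 0.40, 5.0])
print(accepted, round(float(score), 4))
```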

  4. System design verification of a hybrid geothermal/coal fired power plant

    Energy Technology Data Exchange (ETDEWEB)

    1978-09-01

    This hybrid plant utilizes geothermal fluid for feedwater heating. With respect to the extraction of available work from the geothermal fluids, this cycle is approximately twice as efficient as the all-geothermal plant. The System Design Verification Study presented verifies the technical and economic feasibility of the hybrid plant. This report comprises a conceptual design, cost estimate, and economic analysis of a one-unit 715 MW hybrid geothermal/coal-fired power plant. In addition to the use of geothermal fluid for feedwater heating, its use is also investigated for additional power generation, condensate and cooling tower makeup water, coal beneficiation, air preheating, flue gas reheating and plant space heating requirements. An engineering and construction schedule for the hybrid plant is also included.

  5. Verification and implications of the multiple pin treatment in the SASSYS-1 LMR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1994-01-01

    As part of a program to obtain realistic, as opposed to excessively conservative, analysis of reactor transients, a multiple-pin treatment for the analysis of intra-subassembly thermal hydraulics has been included in the SASSYS-1 liquid metal reactor systems analysis code. This new treatment has made possible a whole new level of verification for the code. The code can now predict the steady-state and transient responses of individual thermocouples within instrumented subassemblies in a reactor, rather than just predicting average temperatures for a subassembly. Very good agreement has been achieved between code predictions and the experimental measurements of steady-state and transient temperatures and flow rates in the Shutdown Heat Removal Tests in the EBR-II reactor. Detailed multiple-pin calculations for blanket subassemblies in the EBR-II reactor demonstrate that the actual steady-state and transient peak temperatures in these subassemblies are significantly lower than those that would be calculated by simpler models.

  6. Verification of Strength of the Welded Joints by using of the Aramis Video System

    Directory of Open Access Journals (Sweden)

    Pała Tadeusz

    2017-03-01

    In the paper, the results of a strength analysis are presented for two types of welded joints of high-strength steel S960QC, made according to conventional and laser technologies. The hardness distributions, tensile properties and fracture toughness were determined for the weld material and the heat-affected-zone material for both types of welded joints. The test results showed the advantage of the laser welded joints over the conventional ones: the tensile properties and fracture toughness in all areas of the laser joints are at a higher level than in the conventional ones. The heat-affected zone of the conventional welded joints is a weak area, where the tensile properties are lower than in the base material. Verification of the tensile tests, carried out using the Aramis video system, confirmed this assumption. The highest level of strain was observed in the HAZ material, and the destruction process also occurred in the HAZ of the conventional welded joint.

  7. Tank waste remediation system FSAR hazard identification/facility configuration verification report

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, D.P., Westinghouse Hanford

    1996-05-01

    This document provides the results of the Tank Waste Remediation System Final Safety Analysis Report (TWRS FSAR) hazards identification/facility configuration activities undertaken from March 7, 1996 to May 31, 1996. The purpose of this activity was to provide an independent overview of the TWRS facility-specific hazards and configurations that were used in support of the TWRS FSAR hazards and accident analysis development. It was based on a review of existing published documentation and field inspections. The objective of the verification effort was to provide a 'snapshot' in time of the existing TWRS facility hazards and configurations, and it will be used to support hazards and accident analysis activities.

  8. ECG-based PICC tip verification system: an evaluation 5 years on.

    Science.gov (United States)

    Oliver, Gemma; Jones, Matt

    2016-10-27

    In 2011, the vascular access team at East Kent Hospitals University NHS Foundation Trust safely and successfully incorporated the use of electrocardiogram (ECG) guidance technology for verification of peripherally inserted central catheters (PICC) tip placement into their practice. This study, 5 years on, compared the strengths and limitations of using this ECG method with the previous gold-standard of post-procedural chest X-ray. The study was undertaken using an embedded case study approach, and the cost, accuracy and efficiency of both systems were evaluated and compared. Using ECG to confirm PICC tip position was found to be cheaper, quicker and more accurate than post-procedural chest X-ray.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT-A AND A ENVIRONMENTAL SEALS, INC., SEAL ASSIST SYSTEM (SAS) PHASE II REPORT

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of Seal Assist System (SAS) for natural gas reciprocating compressor rod packing manufactured by A&A Environmental Seals, Inc. The SAS uses a secondary containment gland to collect natural g...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: RESIDENTIAL ELECTRIC POWER GENERATION USING THE PLUG POWER SU1 FUEL CELL SYSTEM

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Plug Power SU1 Fuel Cell System manufactured by Plug Power. The SU1 is a proton exchange membrane fuel cell that requires hydrogen (H2) as fuel. H2 is generally not available, so the ...

  11. Video-based cargo fire verification system with fuzzy inference engine for commercial aircraft

    Science.gov (United States)

    Sadok, Mokhtar; Zakrzewski, Radek; Zeliff, Bob

    2005-02-01

    Conventional smoke detection systems currently installed onboard aircraft are often subject to high rates of false alarms. Under current procedures, whenever an alarm is issued the pilot is obliged to release fire extinguishers and to divert to the nearest airport. Aircraft diversions are costly and dangerous in some situations. A reliable detection system that minimizes the false-alarm rate and allows continuous monitoring of cargo compartments is highly desirable. A video-based system has recently been developed by Goodrich Corporation to address this problem. The Cargo Fire Verification System (CFVS) is a multi-camera system designed to provide live stream video to the cockpit crew and to perform hotspot, fire, and smoke detection in aircraft cargo bays. In addition to video frames, the CFVS uses other sensor readings to discriminate between genuine events such as fire or smoke and nuisance alarms such as fog or dust. A Mamdani-type fuzzy inference engine is developed to provide approximate reasoning for decision making. In one implementation, Gaussian membership functions for frame intensity-based features, relative humidity, and temperature are constructed using experimental data to form the system inference engine. The CFVS performed better than conventional aircraft smoke detectors in all standardized tests.
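
    A toy Mamdani-style fragment along the lines described: Gaussian membership functions over frame intensity, relative humidity and temperature feed min-combined rule firing strengths. The membership parameters and the two rules are invented, not the values fitted from the paper's experimental data.

```python
import numpy as np

def gaussian_mf(x, mean, sigma):
    """Gaussian membership function."""
    return float(np.exp(-0.5 * ((x - mean) / sigma) ** 2))

def alarm_degrees(intensity, humidity, temperature):
    """Two hypothetical rules with min() as the AND operator:
      R1: IF intensity is high AND humidity is low    THEN fire
      R2: IF humidity is high  AND temperature is low THEN nuisance (fog)"""
    fire = min(gaussian_mf(intensity, 200.0, 30.0),
               gaussian_mf(humidity, 20.0, 15.0))
    nuisance = min(gaussian_mf(humidity, 90.0, 10.0),
                   gaussian_mf(temperature, 10.0, 8.0))
    return fire, nuisance

print(alarm_degrees(intensity=190.0, humidity=25.0, temperature=40.0))
```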

  12. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    Science.gov (United States)

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for the modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can also be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulae, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.

  13. Requirements verification and validation of operating system software for a PLC-based plant protection system prototype

    Energy Technology Data Exchange (ETDEWEB)

    Cha, K. H.; Lee, Y. J.; Cheon, S. W.; Son, H. S.; Kim, J. Y.; Lee, J. S.; Kwon, K. C. [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    This paper describes the Requirements Verification and Validation (V and V) of the operating system (OS) software developed for the Programmable Logic Controller (PLC)-based digital Plant Protection System (PPS) prototype in the Korea Nuclear Instrumentation and Control System (KNICS) project. Since the OS is newly developed software, lifecycle V and V is applied, and the software V and V criteria and requirements in the Software Review Plan (SRP)/BTP-14, IEEE Std. 7-4.3.2, IEEE Std. 1012, and IEEE Std. 1028 are applied systematically and strictly at each lifecycle phase. Checklist-based Fagan inspection has mainly been applied for requirements V and V, while model checking is applied for formal verification and HAZOP is applied for the identification of safety requirements. The checklist-based V and V procedure was very effective for systematic requirements V and V of the OS software, and the applied V and V techniques and their tools can also be applied for systematic design V and V of the OS software.

  14. Validation of the Monte Carlo Criticality Program KENO V. a for highly-enriched uranium systems

    Energy Technology Data Exchange (ETDEWEB)

    Knight, J.R.

    1984-11-01

    A series of calculations based on critical experiments has been performed using the KENO V.a Monte Carlo Criticality Program for the purpose of validating KENO V.a for use in evaluating Y-12 Plant criticality problems. The experiments were reflected and unreflected systems of single units and arrays containing highly enriched uranium metal or uranium compounds. Various geometrical shapes were used in the experiments. The SCALE control module CSAS25 with the 27-group ENDF/B-4 cross-section library was used to perform the calculations. Some of the experiments were also calculated using the 16-group Hansen-Roach library. Results are presented in a series of tables and discussed. They show that the criteria established for the safe application of the KENO IV program may also be used for KENO V.a results.

  15. Validation and simulation of a regulated survey system through Monte Carlo techniques

    Directory of Open Access Journals (Sweden)

    Asier Lacasta Soto

    2015-07-01

    Channel flow covers long distances and exhibits variable temporal behaviour. It is usually regulated by hydraulic elements such as lateral gates to provide a correct water supply. The dynamics of this kind of flow is governed by a system of partial differential equations named the shallow water model, which has to be complemented with a simplified formulation for the gates. The full set of equations forms a non-linear system that can only be solved numerically. Here, an explicit upwind finite-volume numerical scheme able to solve all types of flow regimes is used. The formulation of the hydraulic structures (lateral gates) introduces parameters with some uncertainty. Hence, these parameters are calibrated with a Monte Carlo algorithm, obtaining the coefficients associated with each gate. They are then checked using real cases provided by the monitoring equipment of the Pina de Ebro channel, located in Zaragoza.
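
    A rough sketch of the calibration idea: draw candidate gate coefficients at random, score each against the observed flows, and keep the best. The closed-form "simulator" below stands in for the shallow-water solver, and every value is invented.

```python
import random

def calibrate_gate(observations, simulate, n_samples=10_000):
    """Random-search Monte Carlo calibration of a single gate
    discharge coefficient against observed flows."""
    best_c, best_err = None, float("inf")
    for _ in range(n_samples):
        c = random.uniform(0.3, 0.9)           # plausible coefficient range
        err = sum((simulate(c, t) - q) ** 2 for t, q in observations)
        if err < best_err:
            best_c, best_err = c, err
    return best_c

# stand-in model Q = c * sqrt(head(t)); a real run would call the solver
head = {0: 4.0, 1: 3.6, 2: 3.1}
simulate = lambda c, t: c * head[t] ** 0.5
observations = [(0, 1.30), (1, 1.23), (2, 1.14)]
print(round(calibrate_gate(observations, simulate), 3))   # ~0.65
```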

  16. The COSMO-LEPS mesoscale ensemble system: validation of the methodology and verification

    Directory of Open Access Journals (Sweden)

    C. Marsigli

    2005-01-01

    The limited-area ensemble prediction system COSMO-LEPS has been running every day at ECMWF since November 2002. A number of runs of the non-hydrostatic limited-area model Lokal Modell (LM) are available every day, nested on members of the ECMWF global ensemble. The limited-area ensemble forecasts range up to 120 h, and LM-based probabilistic products are disseminated to several national and regional weather services. Some changes to the operational suite have recently been made on the basis of the results of a statistical analysis of the methodology. The analysis is presented in this paper, showing the benefit of increasing the number of ensemble members. The system has been designed to provide probabilistic support at the mesoscale, focusing attention on extreme precipitation events. In this paper, the performance of COSMO-LEPS in forecasting precipitation is presented. An objective verification in terms of probabilistic indices is made, using a dense network of observations covering part of the COSMO domain. The system is compared with the ECMWF EPS, showing an improvement of the limited-area high-resolution system with respect to the global ensemble system in the forecast of high precipitation values. The impact of using different schemes for the parametrisation of convection in the limited-area model is also assessed, showing that this has a minor impact compared with running the model with different initial and boundary conditions.
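
    One common probabilistic index for verifying such ensemble precipitation forecasts is the Brier score; the abstract does not list the exact indices used, so the sketch below (with invented data) only indicates the kind of computation involved.

```python
import numpy as np

def brier_score(forecast_prob, observed):
    """Mean squared difference between the forecast probability of an
    event and the binary observed outcome (0 is a perfect score)."""
    p = np.asarray(forecast_prob, dtype=float)
    o = np.asarray(observed, dtype=float)
    return float(np.mean((p - o) ** 2))

# event probability = fraction of ensemble members above a threshold
members = np.array([[12.0, 3.0, 25.0],      # precipitation (mm) forecast
                    [8.0, 22.0, 30.0]])     # by 2 members at 3 stations
probs = (members > 20.0).mean(axis=0)       # P(precip > 20 mm)
observed = np.array([0, 0, 1])              # what actually happened
print(brier_score(probs, observed))         # 0.0833...
```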

  17. Verification and validation guidelines for high integrity systems: Appendices A--D, Volume 2

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    The following material is furnished as an experimental guide for the use of risk-based classification for nuclear plant protection systems. As shown in Sections 2 and 3 of this report, safety classifications for the nuclear field are application based (using the function served as the primary criterion), whereas those in use by the process industry and the military are risk based. There are obvious obstacles to the use of risk-based classifications (and the associated integrity levels) for nuclear power plants, yet there are also many potential benefits: such a classification considers all capabilities provided for dealing with a specific hazard, thus assigning a lower risk where multiple protection is provided (either at the same or at lower layers), which permits plant management to perform trade-offs between systems that meet the highest qualification levels and multiple diverse systems at lower qualification levels; it motivates the use (and therefore also the development) of protection systems with demonstrated low failure probability; and it may permit lower-cost process industry equipment of an established integrity level to be used in nuclear applications (subject to verification of the integrity level and regulatory approval). The totality of these benefits may reduce the cost of digital protection systems significantly and motivate utilities to upgrade their capabilities much more rapidly than is currently the case. Therefore the outline of a risk-based classification is presented here, to serve as a starting point for further investigation and possible trial application.

  18. Monte Carlo Techniques for the Comprehensive Modeling of Isotopic Inventories in Future Nuclear Systems and Fuel Cycles. Final Report

    International Nuclear Information System (INIS)

    Paul P.H. Wilson

    2005-01-01

    The development of Monte Carlo techniques for isotopic inventory analysis has been explored in order to facilitate the modeling of systems with flowing streams of material through varying neutron irradiation environments. This represents a novel application of Monte Carlo methods to a field that has traditionally relied on deterministic solutions to systems of first-order differential equations. The Monte Carlo techniques were based largely on the known modeling techniques of Monte Carlo radiation transport, but with important differences, particularly in the area of variance reduction and efficiency measurement. The software that was developed to implement and test these methods now provides a basis for validating approximate modeling techniques that are available to deterministic methodologies. The Monte Carlo methods have been shown to be effective in reproducing the solutions of simple problems that are possible using both stochastic and deterministic methods. The Monte Carlo methods are also effective for tracking flows of materials through complex systems including the ability to model removal of individual elements or isotopes in the system. Computational performance is best for flows that have characteristic times that are large fractions of the system lifetime. As the characteristic times become short, leading to thousands or millions of passes through the system, the computational performance drops significantly. Further research is underway to determine modeling techniques to improve performance within this range of problems. This report describes the technical development of Monte Carlo techniques for isotopic inventory analysis. The primary motivation for this solution methodology is the ability to model systems of flowing material being exposed to varying and stochastically varying radiation environments. The methodology was developed in three stages: analog methods which model each atom with true reaction probabilities (Section 2), non-analog methods
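
    The "analog" approach mentioned above (each atom followed with its true reaction probabilities) can be sketched compactly. The two-step chain and rates below are hypothetical, and collapsing the irradiation history into one transmutation test per step is a simplification.

```python
import math
import random

def analog_inventory(chain, step_rates, flux_time, n_atoms=100_000):
    """Analog Monte Carlo inventory: each atom transmutes to the next
    isotope in the chain with probability p = 1 - exp(-rate * t)."""
    counts = {iso: 0 for iso in chain}
    for _ in range(n_atoms):
        state = 0                               # start as the first isotope
        for rate in step_rates:
            if random.random() < 1.0 - math.exp(-rate * flux_time):
                state += 1                      # transmutation happened
            else:
                break                           # atom stays where it is
        counts[chain[state]] += 1
    return {iso: c / n_atoms for iso, c in counts.items()}

# hypothetical chain A -> B -> C with per-step effective reaction rates
print(analog_inventory(["A", "B", "C"], step_rates=[0.5, 0.1], flux_time=1.0))
```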

  19. Understanding product cost vs. performance through an in-depth system Monte Carlo analysis

    Science.gov (United States)

    Sanson, Mark C.

    2017-08-01

    The manner in which an optical system is toleranced and compensated greatly affects the cost to build it. By having a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed, phased-approach Monte Carlo analysis can be used to demonstrate the trade-offs between cost and performance. In complex high-performance optical systems, performance is fine-tuned by making adjustments to the optical systems after they are initially built. This process enables the best overall system performance, without the need for fabricating components to stringent tolerance levels that can often be outside a fabricator's manufacturing capabilities. A good simulation of as-built performance can interrogate different steps of the fabrication and build process. Such a simulation may aid the evaluation of whether the measured parameters are within the acceptable range of system performance at that stage of the build process. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
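
    A bare-bones version of such a tolerance Monte Carlo: perturb each toleranced parameter within its range, root-sum-square the wavefront contributions, and report the as-built yield against a specification. Sensitivities, tolerances and the spec are invented, and a real phased analysis would also model the compensating adjustments made at each build stage.

```python
import numpy as np

def as_built_yield(nominal_rms, sensitivities, tolerances, spec,
                   trials=20_000, seed=1):
    """Fraction of simulated builds whose wavefront error meets the spec,
    with each parameter perturbed uniformly within its tolerance."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(-1.0, 1.0, size=(trials, len(sensitivities)))
    contrib = u * np.asarray(tolerances) * np.asarray(sensitivities)
    rms = np.sqrt(nominal_rms ** 2 + (contrib ** 2).sum(axis=1))
    return float((rms <= spec).mean())

# hypothetical: three toleranced parameters, spec of 0.04 waves RMS
print(as_built_yield(0.02, sensitivities=[0.5, 0.8, 0.3],
                     tolerances=[0.05, 0.03, 0.10], spec=0.04))
```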

  20. Quality control of the treatment planning systems dose calculations in external radiation therapy using the Penelope Monte Carlo code; Controle qualite des systemes de planification dosimetrique des traitements en radiotherapie externe au moyen du code Monte-Carlo Penelope

    Energy Technology Data Exchange (ETDEWEB)

    Blazy-Aubignac, L

    2007-09-15

    The treatment planning systems (T.P.S.) occupy a key position in the radiotherapy department: they perform the predictive calculation of the dose distribution and of the treatment duration. Traditionally, the quality control of the calculated dose distributions relies on comparisons with dose distributions measured on the treatment machine. This thesis proposes to replace these dosimetric measurements with reference dosimetric calculations obtained with the PENELOPE Monte Carlo code. The Monte Carlo simulations offer a broad choice of test configurations and make it possible to envisage quality control of the dosimetric aspects of a T.P.S. without monopolizing the treatment machines. This quality control, based on Monte Carlo simulations, has been tested on a clinical T.P.S. and has made it possible to simplify the T.P.S. quality procedures. This quality control, which is more thorough, more precise and simpler to implement, could be generalized to every radiotherapy centre. (N.C.)

  1. The timing resolution of scintillation-detector systems: Monte Carlo analysis

    International Nuclear Information System (INIS)

    Choong, Woon-Seng

    2009-01-01

    Recent advancements in fast scintillating materials and fast photomultiplier tubes (PMTs) have stimulated renewed interest in time-of-flight (TOF) positron emission tomography (PET). It is well known that the improvement in the timing resolution in PET can significantly reduce the noise variance in the reconstructed image resulting in improved image quality. In order to evaluate the timing performance of scintillation detectors used in TOF PET, we use Monte Carlo analysis to model the physical processes (crystal geometry, crystal surface finish, scintillator rise time, scintillator decay time, photoelectron yield, PMT transit time spread, PMT single-electron response, amplifier response and time pick-off method) that can contribute to the timing resolution of scintillation-detector systems. In the Monte Carlo analysis, the photoelectron emissions are modeled by a rate function, which is used to generate the photoelectron time points. The rate function, which is simulated using Geant4, represents the combined intrinsic light emissions of the scintillator and the subsequent light transport through the crystal. The PMT output signal is determined by the superposition of the PMT single-electron response resulting from the photoelectron emissions. The transit time spread and the single-electron gain variation of the PMT are modeled in the analysis. Three practical time pick-off methods are considered in the analysis. Statistically, the best timing resolution is achieved with the first photoelectron timing. The calculated timing resolution suggests that a leading edge discriminator gives better timing performance than a constant fraction discriminator and produces comparable results when a two-threshold or three-threshold discriminator is used. For a typical PMT, the effect of detector noise on the timing resolution is negligible. The calculated timing resolution is found to improve with increasing mean photoelectron yield, decreasing scintillator decay time and
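
    The core loop of such an analysis (draw photoelectron times from a rate function, smear them by the PMT transit-time spread, then apply a time pick-off) fits in a short sketch. The rate function, jitter and bin width below are invented, and triggering on the n-th earliest photoelectron is only a crude proxy for a leading-edge discriminator acting on the summed PMT pulse.

```python
import numpy as np

rng = np.random.default_rng(0)

def event_timestamp(rate_profile, dt, n_th=1, tts_ps=150.0):
    """One Monte Carlo event: Poisson-sample photoelectrons per time bin
    from the rate function, add transit-time jitter, and return the
    arrival time of the n-th earliest photoelectron (picoseconds)."""
    counts = rng.poisson(rate_profile * dt)
    times = np.repeat(np.arange(rate_profile.size) * dt, counts)
    times = np.sort(times + rng.normal(0.0, tts_ps, size=times.size))
    return times[n_th - 1] if times.size >= n_th else None

# toy rate function: fast rise, ~400 ps decay, ~8 photoelectrons per event
t = np.arange(0.0, 3000.0, 10.0)                       # 10 ps bins
rate = 0.02 * np.exp(-t / 400.0) * (1.0 - np.exp(-t / 5.0))
stamps = [event_timestamp(rate, 10.0) for _ in range(2000)]
print("timing spread (ps):", np.std([s for s in stamps if s is not None]))
```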

  2. Portable system for periodical verification of area monitors for neutrons; Sistema portatil para verificacao periodica de monitores de area para neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu, E-mail: rluciane@ird.gov.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Energia Nuclear; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W., E-mail: karla@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI). Lab. de Neutrons

    2009-07-01

    The Neutron Laboratory is developing a project for the construction of a portable test system for verifying the operating condition of neutron area monitors. This device will allow users to check, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment whose response to the neutron beam is inadequate.

  3. A formal design verification and validation on the human factors of a computerized information system in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Park, Jae Chang; Cheon, Se Woo; Jung, Kwang Tae; Baek, Seung Min; Han, Seung; Park, Hee Suk; Son, Ki Chang; Kim, Jung Man; Jung Yung Woo

    1999-11-01

    This report describes a technology transfer under the title ''A formal design verification and validation on the human factors of a computerized information system in nuclear power plants''. Human factors requirements for information system designs are extracted from various regulatory and industrial standards and guidelines, and interpreted into more specific procedures and checklists for verifying that those requirements are satisfied. A formalized implementation plan is established for the human factors verification and validation of a computerized information system in nuclear power plants. Additionally, a computer support system, named DIMS-Web (Design Issue Management System), has been developed in a web environment to enhance the implementation of the human factors activities. DIMS-Web has three main functions: supporting requirements review, tracking design issues, and managing issue screening evaluations. DIMS-Web has shown its benefits in practice through a trial application to the design review of the CFMS for YGN nuclear units 5 and 6. (author)

  4. Software Verification and Validation Report for the 244-AR Vault Interim Stabilization Ventilation System

    International Nuclear Information System (INIS)

    YEH, T.

    2002-01-01

    This document reports on the analysis, testing and conclusions of the software verification and validation for the 244-AR Vault Interim Stabilization ventilation system. The automation control system will use Allen-Bradley software tools for programming and programmable logic controller (PLC) configuration. The 244-AR Interim Stabilization Ventilation System will be used to control the release of radioactive particles to the environment in the containment tent, located inside the canyon of the 244-AR facility, and to assist the waste stabilization efforts. The HVAC equipment, ducts, instruments, PLC hardware, the ladder logic executable software (documented code), and the message display terminal are considered part of the temporary ventilation system. The system consists of a supply air skid, temporary ductwork (to distribute airflow), and two skid-mounted, 500-cfm exhausters connected to the east filter building and the vessel vent system. The Interim Stabilization Ventilation System is a temporary, portable ventilation system consisting of a supply side and an exhaust side. Air is supplied to the containment tent from an air supply skid. This skid contains a constant-speed fan, a pre-filter, an electric heating coil, a cooling coil, and a constant flow device (CFD). The CFD uses a passive component that allows a constant flow of air to pass through the device. Air is drawn out of the containment tent, cells, and tanks by two 500-cfm exhauster skids running in parallel. These skids are equipped with fans, filters, a stack, stack monitoring instrumentation, and a PLC for control. The 500-cfm exhaust skids were fabricated and tested previously for saltwell pumping activities. The objective of the temporary ventilation system is to maintain a higher pressure in the containment tent, relative to the canyon and cell areas, to prevent contaminants from reaching the containment tent.

  5. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  6. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    Science.gov (United States)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated using Monte Carlo (MC) methods. Our recently reported work modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids computed by the recently reported new MC codes against experimental results and results previously reported in the literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system agree strongly with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined with this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.
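    These figures of merit have standard definitions, so a minimal sketch of how they are formed from tallied primary and scatter components may help; the variable names are assumptions, not the authors' code.

```python
# Standard grid figures of merit from primary/scatter components tallied
# with and without the grid in place (illustrative variable names).
def grid_metrics(primary_no_grid, scatter_no_grid, primary_grid, scatter_grid):
    Tp = primary_grid / primary_no_grid                    # primary transmission
    Ts = scatter_grid / scatter_no_grid                    # scatter transmission
    Tt = (primary_grid + scatter_grid) / (primary_no_grid + scatter_no_grid)
    spr = scatter_grid / primary_grid                      # SPR behind the grid
    return Tp, Ts, Tt, spr

print(grid_metrics(primary_no_grid=1.0, scatter_no_grid=1.5,
                   primary_grid=0.70, scatter_grid=0.15))
```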

  7. A Monte Carlo dosimetric quality assurance system for dynamic intensity-modulated radiotherapy

    International Nuclear Information System (INIS)

    Takegawa, Hideki; Yamamoto, Tokihiro; Miyabe, Yuki; Teshima, Teruki; Kunugi, Tomoaki; Yano, Shinsuke; Mizowaki, Takashi; Nagata, Yasushi; Hiraoka, Masahiro

    2005-01-01

    We are developing a Monte Carlo (MC) dose calculation system that can resolve dosimetric issues derived from multileaf collimator (MLC) design for routine dosimetric quality assurance (QA) of intensity-modulated radiotherapy (IMRT). The treatment head of a medical linear accelerator equipped with an MLC was modeled using the EGS4 MC code. A graphical user interface (GUI) application was developed to implement MC dose computation in the CT-based patient model and to compare the MC-calculated results with those of a commercial radiotherapy treatment planning (RTP) system, Varian Eclipse. To reduce computation time, the EGS4 MC code has been parallelized on a massively parallel processing (MPP) system using the message passing interface (MPI). The MC treatment head model and the MLC model were validated against measured data sets of percentage depth dose (PDD) and off-center ratio (OCR) in a water phantom, and against film measurements for static and dynamic test patterns, respectively. In the treatment head model, the MC-calculated results agreed with the measurements for both PDD and OCR. The MC could reproduce all of the MLC dosimetric effects. A quantitative comparison between the results of MC and Eclipse was successfully performed with the GUI application. The parallel speed-up was almost linear. An MC dosimetric QA system for dynamic IMRT has thus been developed; however, there were large dose discrepancies between the MC and the measurements in the MLC model simulation, which are now being investigated. (author)
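    The near-linear speed-up reported comes from the embarrassingly parallel nature of MC: histories are split across ranks and the tallies reduced. A toy mpi4py illustration of that structure follows (the dummy scoring kernel and history count are assumptions; the actual system parallelizes EGS4).

```python
# Toy history-splitting across MPI ranks; run e.g.: mpiexec -n 8 python mc_mpi.py
import random
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_HISTORIES = 1_000_000
local_n = N_HISTORIES // size
rng = random.Random(1234 + rank)       # independent random stream per rank

local_tally = 0.0
for _ in range(local_n):
    # Placeholder "transport": a real code samples interactions and deposits dose.
    local_tally += rng.random()

total = comm.reduce(local_tally, op=MPI.SUM, root=0)
if rank == 0:
    print(f"mean score per history: {total / N_HISTORIES:.6f}")
```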

  8. Structure and atomic correlations in molecular systems probed by XAS reverse Monte Carlo refinement

    Science.gov (United States)

    Di Cicco, Andrea; Iesari, Fabio; Trapananti, Angela; D'Angelo, Paola; Filipponi, Adriano

    2018-03-01

    The Reverse Monte Carlo (RMC) algorithm for structure refinement has been applied to x-ray absorption spectroscopy (XAS) multiple-edge data sets for six gas-phase molecular systems (SnI2, CdI2, BBr3, GaI3, GeBr4, GeI4). Sets of thousands of molecular replicas were involved in the refinement process, driven by the XAS data and constrained by available electron diffraction results. The equilibrated configurations were analysed to determine the average three-dimensional structure and obtain reliable bond and bond-angle distributions. Detectable deviations from Gaussian models were found in some cases. This work shows that an RMC refinement of XAS data is able to provide geometrical models for molecular structures compatible with present experimental evidence. The validation of this approach on simple molecular systems is particularly important in view of its possible simple extension to more complex and extended systems, including metal-organic complexes, biomolecules, or nanocrystalline systems.
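    As a sketch of the algorithm itself, one RMC iteration perturbs a single atom of a replica and accepts or rejects the move based on the change in data misfit; `compute_signal` is a stand-in for the real XAS model, and all parameters are illustrative.

```python
# Schematic reverse Monte Carlo step with Metropolis-like acceptance on the
# misfit between the model signal and the experimental data.
import math
import random

def chi2(model, expt, sigma):
    return sum((m - e) ** 2 for m, e in zip(model, expt)) / sigma ** 2

def rmc_step(coords, chi2_old, expt, sigma, compute_signal, max_move=0.02):
    i = random.randrange(len(coords))
    saved = coords[i]
    coords[i] = tuple(c + random.uniform(-max_move, max_move) for c in saved)
    chi2_new = chi2(compute_signal(coords), expt, sigma)
    if chi2_new <= chi2_old or random.random() < math.exp(-(chi2_new - chi2_old) / 2.0):
        return chi2_new        # accept the trial move
    coords[i] = saved          # reject: restore the old position
    return chi2_old
```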

  9. Control Performance Verification of Power System Stabilizer with an EDLC in Islanded Microgrid

    Science.gov (United States)

    Tanabe, Takayuki; Suzuki, Shigeyuki; Ueda, Yoshinobu; Ito, Takamitsu; Numata, Shigeo; Shimoda, Eisuke; Funabashi, Toshihisa; Yokoyama, Ryuichi

    We developed a power system stabilizer with an EDLC (electric double-layer capacitor) that makes it possible to operate microgrids autonomously from utility grids and to maintain electric power quality in an islanded microgrid. This paper proposes two types of control systems, composed of a PFC (power flow compensator) and a CVCF (constant voltage, constant frequency) compensator. Installation locations of the system with the CVCF compensator are not limited by hardware requirements, and the system can maintain the quality of electricity in the islanded microgrid. The CVCF compensator can also manage a dynamic load-sharing function; therefore, the equipment does not necessarily require a central controller communicating over information networks. The EDLC is capable of repeatedly charging and discharging stored electricity over short cycles, which keeps the storage resource needed to maintain power quality to a minimum. This paper presents specifications and verification results obtained from simulation studies and from demonstration experiments with this equipment. In order to verify the practicability of the proposed control, these experiments were carried out in a microgrid supplying electric power to actual loads.

  10. Validation and Verification of Future Integrated Safety-Critical Systems Operating under Off-Nominal Conditions

    Science.gov (United States)

    Belcastro, Christine M.

    2010-01-01

    Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.

  11. Development of verification program for safety evaluation of KNGR on-site and off-site power system design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kem Joong; Ryu, Eun Sook; Choi, Jang Hong; Lee, Byung Il; Han, Hyun Kyu; Oh, Seong Kyun; Kim, Han Kee; Park, Chul Woo; Kim, Min Jeong [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-04-15

    In order to verify the adequacy of the design and analysis of the on-site and off-site power systems, we developed a regulatory analysis program. We established the methodology for the electric power system and constructed algorithms for steady-state load flow analysis, fault analysis, and transient stability analysis. The developed program takes advantage of a GUI and C++ programming techniques. The input design allows easy access to the commonly used PSS/E format, and the output design lets users work with Excel spreadsheets. The performance of the program was verified by comparison with PSS/E results. The case studies are as follows: verification of the load flow analysis of the KNGR on-site power system; evaluation of the load flow and transient stability analysis of the KNGR off-site power system; verification of the load flow and transient stability analysis; and frequency drop analysis for loss of generation.
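    The program implements full AC load flow; as a much-simplified illustration of the computation's structure, the sketch below solves the linear DC approximation B'·θ = P for an assumed 3-bus case. It is not the program's actual Newton-Raphson solver, just the shape of the data it works on.

```python
# DC load flow for an assumed 3-bus system (bus 0 is the slack bus).
import numpy as np

B_reduced = np.array([[ 25.0, -10.0],      # per-unit susceptance matrix,
                      [-10.0,  20.0]])     # rows/cols for buses 1 and 2
P_injection = np.array([0.5, -0.8])        # net active power at buses 1, 2 (p.u.)

theta = np.linalg.solve(B_reduced, P_injection)   # bus voltage angles (rad)
flow_12 = 10.0 * (theta[0] - theta[1])            # line 1-2 flow = b12 * (theta1 - theta2)
print(theta, flow_12)
```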

  12. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • The newly developed CAD-based Monte Carlo program SuperMC for integrated simulation of nuclear systems makes use of hybrid MC-deterministic methods and advanced computer technologies. SuperMC is designed to perform transport calculations of various types of particles; depletion and activation calculations including isotope burn-up, material activation and shutdown dose; and multi-physics coupling calculations including thermo-hydraulics, fuel performance and structural mechanics. Bi-directional automatic conversion between general CAD models and physical settings and calculation models is well supported. Results and the simulation process can be visualized with dynamic 3D datasets and geometry models. Continuous-energy cross section, burnup, activation, irradiation damage and material data etc. are used to support the multi-process simulation. An advanced cloud computing framework makes computation- and storage-intensive simulations more attractive, offered as a network service to support design optimization and assessment. The modular design and generic interfaces promote flexible manipulation and coupling of external solvers. • The newly developed and incorporated advanced methods in SuperMC are introduced, including the hybrid MC-deterministic transport method, particle physical interaction treatment method, multi-physics coupling calculation method, geometry automatic modeling and processing method, intelligent data analysis and visualization method, elastic cloud computing technology and parallel calculation method. • The functions of SuperMC2.1, integrating automatic modeling, neutron and photon transport calculation, and results and process visualization, are introduced. It has been validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. - Abstract: The Monte Carlo (MC) method has distinct advantages in simulating complicated nuclear systems and is envisioned as a routine

  13. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required-integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each was then assessed for the identified components of knowledge-based and expert systems, as well as the system as a whole.
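    The exact weighting scheme in the guideline is not reproduced here, but a schematic of how per-method ratings could be folded into the two measures and rank-ordered might look as follows (method names and scores are placeholders):

```python
# Placeholder combination of four ease-of-use ratings and four defect-detection
# ratings into comparison scores; the guideline's actual weights are not used.
def cost_benefit(ease_of_use):
    return sum(ease_of_use) / len(ease_of_use)

def effectiveness(defect_power, stringency_weight=1.0):
    return stringency_weight * sum(defect_power) / len(defect_power)

methods = {  # (ease-of-use ratings, defect-detection ratings), 1-5 scale
    "static code analysis": ([3, 4, 4, 3], [4, 3, 4, 4]),
    "random testing":       ([4, 4, 3, 4], [2, 3, 2, 3]),
}
ranked = sorted(methods, key=lambda m: effectiveness(methods[m][1]), reverse=True)
print(ranked)
```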

  14. Improving Power System Risk Evaluation Method Using Monte Carlo Simulation and Gaussian Mixture Method

    Directory of Open Access Journals (Sweden)

    GHAREHPETIAN, G. B.

    2009-06-01

    Full Text Available The analysis of the risk of partial and total blackouts plays a crucial role in determining safe limits in power system design, operation and upgrade. Due to the huge cost of blackouts, it is very important to improve risk assessment methods. In this paper, Monte Carlo simulation (MCS) was used to analyze the risk and the Gaussian Mixture Method (GMM) has been used to estimate the probability density function (PDF) of the load curtailment, in order to improve the power system risk assessment method. In this improved method, the PDF and a suggested index have been used to analyze the risk of loss of load. The effect of considering the number of generation units of power plants in the risk analysis has been studied too. The improved risk assessment method has been applied to the IEEE 118-bus system and the network of Khorasan Regional Electric Company (KREC), and the PDF of the load curtailment has been determined for both systems. The effect of various network loadings, transmission unavailability, transmission capacity and generation unavailability conditions on blackout risk has been investigated too.
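    A minimal sketch of the proposed pairing, with synthetic curtailment samples standing in for real Monte Carlo contingency output and scikit-learn's GaussianMixture providing the PDF estimate:

```python
# MCS output -> GMM fit -> PDF of load curtailment (synthetic, illustrative data).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Assumed samples: mostly small curtailments plus a heavy tail of large blackouts.
curtailment = np.concatenate([rng.normal(5.0, 2.0, 9000),
                              rng.normal(60.0, 15.0, 1000)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(curtailment)
grid = np.linspace(0.0, 120.0, 200).reshape(-1, 1)
pdf = np.exp(gmm.score_samples(grid))   # score_samples returns the log-density
print(float(pdf.max()))
```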

  15. Quality assurance and verification of the MACCS [MELCOR Accident Consequence Code System] code, Version 1.5

    International Nuclear Information System (INIS)

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E.

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors, which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs

  16. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    National Research Council Canada - National Science Library

    Broy, Manfred; Leucker, Martin

    2007-01-01

    The objective of this project is the development of an integrated suite of technologies focusing on end-to-end software development supporting requirements analysis, design, implementation, and verification...

  17. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  18. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    International Nuclear Information System (INIS)

    Chung, Bub Dong

    2008-03-01

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous works at KAERI for a two-phase, three-field pilot code. Input and output design, the TH solver, component models, special TH models, the heat structure solver, general tables, trip and control, and on-line graphics have all been implemented. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The C computer language was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter than C++. The code has simple and essential models and correlations, special components, special TH models and heat structure models. The input features are, however, able to simulate various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to establish the physical validity of SYSTF code simulations

  19. IMPACTS OF TIMBER LEGALITY VERIFICATION SYSTEM IMPLEMENTATION ON THE SUSTAINABILITY OF TIMBER INDUSTRY AND PRIVATE FOREST

    Directory of Open Access Journals (Sweden)

    Elvida Yosefi Suryandari

    2017-04-01

    Full Text Available The international market requires producers to prove the legality of their wood products to address the issues of illegal logging and illegal trade. The Timber Legality Verification System (TLVS) has been prepared by the Government of Indonesia, covering the upstream and downstream wood industries. This paper aims to evaluate gaps in the implementation of the TLVS policy and its impact on the sustainability of the timber industry. This study used gap, descriptive and cost-structure analyses. The study was conducted in three provinces, namely DKI Jakarta, West Java and D.I. Yogyakarta. The research found that the effectiveness of TLVS implementation was low due to relatively rapid policy changes. This situation became a disincentive for investments in the timber business. The private sector perceived that the TLVS policy should be applied in the upstream part of the timber business; hence, the industry and market downstream have not fully supported this system. Furthermore, TLVS policy implementation was considered ineffective by the timber industry as well as by private forest managers, especially micro industries and smallholder private forests. This situation threatens the sustainability of the timber industry and private forests. Therefore, institutions should be strengthened in order to improve the quality of human resources and the competitiveness of products.

  20. Understanding Measurement Reporting and Verification Systems for REDD+ as an Investment for Generating Carbon Benefits

    Directory of Open Access Journals (Sweden)

    Giulio Di Lallo

    2017-07-01

    Full Text Available Reducing emissions from forests—generating carbon credits—in return for REDD+ (Reducing Emissions from Deforestation and forest Degradation) payments represents a primary objective of forestry and development projects worldwide. Setting reference levels (RLs), establishing a target for emission reductions from avoided deforestation and degradation, and implementing an efficient monitoring system underlie effective REDD+ projects, as they are key factors that affect the generation of carbon credits. We analyzed the interdependencies among these factors and their respective weights in generating carbon credits. Our findings show that the amounts of avoided emissions under a REDD+ scheme mainly vary according to the monitoring technique adopted; nevertheless, RLs have a nearly equal influence. The target for reduction of emissions showed a relatively minor impact on the generation of carbon credits, particularly when coupled with low RLs. Uncertainties in forest monitoring can severely undermine the derived allocation of benefits, such as the REDD+ results-based payments to developing countries. Combining statistically-sound sampling designs with Lidar data provides a means to reduce uncertainties and likewise increases the amount of accountable carbon credits that can be claimed. This combined approach requires large financial resources; we found that results-based payments can potentially pay off the necessary investment in technologies that would enable accurate and precise estimates of activity data and emission factors. Conceiving of measurement, reporting and verification (MRV) systems as investments is an opportunity for tropical countries in particular to implement well-defined, long-term forest monitoring strategies.

  1. Verification of an interaction model of an ultrasonic oscillatory system with periodontal tissues

    Directory of Open Access Journals (Sweden)

    V. A. Karpuhin

    2014-01-01

    Full Text Available Verification of an interaction model of an ultrasonic oscillatory system with biological tissues, developed in COMSOL Multiphysics, was carried out. It was shown that the calculation results in COMSOL Multiphysics obtained using the “Finer” grid (ratio of the grid step to the minimum transversal section area of the model ≤ 0.3 mm⁻¹) corresponded best, both qualitatively and quantitatively, to the practical results. The average relative error of the obtained results in comparison with the experimental ones did not exceed 4.0%. The influence of geometrical parameters (thickness of the load) on the electrical admittance of the ultrasonic oscillatory system interacting with biological tissues was investigated. It was shown that an increase in the thickness of the load within the range from 0 to 95 mm led to a decrease in the calculated values of the natural resonance frequency of longitudinal fluctuations and of the electrical admittance, from 26.58 to 26.35 kHz and from 0.86 to 0.44 mS, respectively.

  2. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    The system safety analysis codes, such as RELAP5, TRAC and CATHARE, have been developed in the Fortran language over the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance; the TRACE, RELAP5-3D and MARS codes are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has changed toward object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although recent Fortran also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of the design feasibility
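    For orientation, a small Python illustration of the OOP techniques named above (encapsulation, inheritance, polymorphism) applied to a component-style system model; the classes are invented for illustration and do not reflect the paper's actual design.

```python
# Encapsulation: each component owns its state. Inheritance/polymorphism:
# the driver advances any Component without knowing its concrete type.
from abc import ABC, abstractmethod

class Component(ABC):
    def __init__(self, name: str):
        self.name = name

    @abstractmethod
    def advance(self, dt: float) -> None:
        """Advance this component's state by one time step."""

class Pipe(Component):
    def __init__(self, name: str, pressure: float):
        super().__init__(name)
        self.pressure = pressure

    def advance(self, dt: float) -> None:
        self.pressure *= 1.0 - 0.01 * dt    # placeholder physics

class Pump(Component):
    def advance(self, dt: float) -> None:
        pass                                # placeholder: impose head/flow

loop = [Pipe("hot-leg", 15.5e6), Pump("rcp-1")]
for comp in loop:                           # polymorphic driver loop
    comp.advance(dt=0.01)
```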

  3. A program for the independent verification of brachytherapy planning system calculations

    Directory of Open Access Journals (Sweden)

    Facundo Ballester

    2010-10-01

    Full Text Available Purpose: In this work a spreadsheet-based program is presented that, to a large extent, independently verifies the calculations of individual plans from brachytherapy treatment planning systems for low dose rate, high dose rate and pulsed dose rate techniques. Material and methods: The verification program has been developed based on workbooks/spreadsheets. The treatment planning system output text files are automatically loaded into the new program, allowing the use of the source coordinates, the desired calculation point coordinates, and the dwell times of a patient plan. The source strength and the reference dates are entered by the user, and dose point calculations are then performed independently. The program presents its results as a comparison of its calculated point dose data with the corresponding TPS outcome. Results: Results of 250 clinical cases show agreement with the TPS outcome within a 2% level. Conclusions: The program allows the implementation of the recommendations to verify clinical brachytherapy dosimetry in a simple and accurate way, in only a few minutes and with a minimum of user interaction.
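    Schematically, such an independent check sums a per-dwell dose-rate term over all source positions; in the sketch below the full TG-43 formalism is collapsed to an inverse-square kernel, so the numbers are illustrative only.

```python
# Independent point-dose check, reduced to an inverse-square kernel.
# Real verification would apply the TG-43 dose-rate constant, radial dose
# function and anisotropy function; those factors are omitted here.

def point_dose(dwells, point, sk):
    """dwells: [(x, y, z, dwell_time_s), ...] in cm and s; sk: air-kerma strength."""
    dose = 0.0
    for x, y, z, t in dwells:
        r2 = (x - point[0]) ** 2 + (y - point[1]) ** 2 + (z - point[2]) ** 2
        dose += sk * t / r2        # inverse-square term only
    return dose

plan = [(0.0, 0.0, 0.0, 12.4), (0.5, 0.0, 0.0, 9.1)]   # assumed dwell positions/times
print(point_dose(plan, point=(2.0, 0.0, 0.0), sk=40.7e-3))
```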

  4. Design of an Active Multispectral SWIR Camera System for Skin Detection and Face Verification

    Directory of Open Access Journals (Sweden)

    Holger Steiner

    2016-01-01

    Full Text Available Biometric face recognition is becoming more frequently used in different application scenarios. However, spoofing attacks with facial disguises are still a serious problem for state-of-the-art face recognition algorithms. This work proposes an approach to face verification based on spectral signatures of material surfaces in the short wave infrared (SWIR) range. They allow distinguishing authentic human skin reliably from other materials, independent of the skin type. We present the design of an active SWIR imaging system that acquires four-band multispectral image stacks in real time. The system uses pulsed small-band illumination, which allows for fast image acquisition and high spectral resolution and renders it widely independent of ambient light. After extracting the spectral signatures from the acquired images, detected faces can be verified or rejected by classifying the material as “skin” or “no-skin”. The approach is extensively evaluated with respect to both acquisition and classification performance. In addition, we present a database containing RGB and multispectral SWIR face images, as well as spectrometer measurements, of a variety of subjects, which is used to evaluate our approach and will be made available to the research community by the time this work is published.
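    A toy version of the final classification step, assuming a reference skin signature and a distance threshold (both invented for illustration; the paper's actual classifier is not reproduced here):

```python
# Per-pixel "skin"/"no-skin" decision on a four-band SWIR stack by distance
# of the normalized spectral signature to an assumed skin reference.
import numpy as np

SKIN_REF = np.array([0.45, 0.30, 0.15, 0.10])   # assumed normalized signature

def skin_mask(stack: np.ndarray, threshold: float = 0.08) -> np.ndarray:
    """stack: (4, H, W) multispectral image; returns a boolean skin mask."""
    sig = stack / (stack.sum(axis=0, keepdims=True) + 1e-9)
    dist = np.linalg.norm(sig - SKIN_REF[:, None, None], axis=0)
    return dist < threshold

frame = np.random.rand(4, 64, 64)               # stand-in image stack
print(float(skin_mask(frame).mean()))           # fraction classified as skin
```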

  5. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and it is a key property for CPSs to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of the CPS is described by an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is verified by inputting the HP into KeYmaera. The advantage of the approach is that it models the CPS intuitively and verifies its safety strictly, avoiding state-space explosion
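    For orientation only (this example is not taken from the paper), a textbook-style hybrid program with a safety property, in the differential dynamic logic notation used by KeYmaera-family tools, can be written as:

```latex
% Toy water-level controller (assumed example): the discrete controller picks a
% fill rate f, the plant evolves continuously, and the safety bound must hold
% after any number of controller-plant iterations.
0 \le x \le m \;\rightarrow\;
  \left[ \left( (f := 1 \cup f := -1);\; \{ x' = f \;\&\; 0 \le x \le m \} \right)^{*} \right]
  (0 \le x \le m)
```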

  6. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data ... making the method easy for the railway engineers to use. Furthermore, the method features a 4-step verification and validation approach that can be integrated naturally into different phases of the software development process. This 4-step approach identifies possible errors in generic applications ... or configuration data as early as possible in the software development cycle, and facilitates debugging/troubleshooting if errors are discovered. The proposed method has successfully been applied to case studies of the forthcoming Danish railway interlocking systems that are compatible with the European

  7. Development and verification of a modular program system for calculation of the long-time behavior of pressurized water reactors

    International Nuclear Information System (INIS)

    Woerner, A.

    1984-01-01

    For the fuel management of nuclear power plants, the time-dependent fuel composition and power distribution within the reactor core have to be determined with high accuracy. The calculation of this long-time behaviour of pressurized water reactors, taking into account the thermohydraulic feedback on the above-mentioned distributions, has been integrated in a modular program system. The essential part of this program system is a newly developed nodal diffusion theory program which allows an approximate description of the local variation of cross sections within a node. This computational method, making use of the latest cross section data, permits reactor cycle calculations with high accuracy, as has been demonstrated by the verification of two reactor cycles of the Kernkraftwerk Obrigheim. In addition to this physical verification, a benchmark problem comparison has been accomplished to prove the efficiency of the modular program system. (orig./HP)

  8. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required-integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each was then assessed for the identified components of knowledge-based and expert systems, as well as the system as a whole

  9. Experimental validation of the TOPAS Monte Carlo system for passive scattering proton therapy

    International Nuclear Information System (INIS)

    Testa, M.; Schümann, J.; Lu, H.-M.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2013-01-01

    Purpose: TOPAS (TOol for PArticle Simulation) is a particle simulation code recently developed with the specific aim of making Monte Carlo simulations user-friendly for research and clinical physicists in the particle therapy community. The authors present a thorough and extensive experimental validation of Monte Carlo simulations performed with TOPAS in a variety of setups relevant for proton therapy applications. The set of validation measurements performed in this work represents an overall end-to-end testing strategy recommended for all clinical centers planning to rely on TOPAS for quality assurance or patient dose calculation and, more generally, for all institutions using passive-scattering proton therapy systems. Methods: The authors systematically compared TOPAS simulations with measurements that are performed routinely within the quality assurance (QA) program at our institution, as well as with experiments specifically designed for this validation study. First, the authors compared TOPAS simulations with measurements of depth-dose curves for spread-out Bragg peak (SOBP) fields. Second, absolute dosimetry simulations were benchmarked against measured machine output factors (OFs). Third, the authors simulated and measured 2D dose profiles and analyzed the differences in terms of field flatness and symmetry and usable field size. Fourth, the authors designed a simple experiment using a half-beam shifter to assess the effects of multiple Coulomb scattering, beam divergence, and inverse-square attenuation on lateral and longitudinal dose profiles measured and simulated in a water phantom. Fifth, TOPAS' capability to simulate time-dependent beam delivery was benchmarked against dose rate functions (i.e., dose per unit time vs time) measured at different depths inside an SOBP field. Sixth, simulations of the charge deposited by protons fully stopping in two different types of multilayer Faraday cups (MLFCs) were compared with measurements to benchmark the
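    As an illustration of the first comparison step, a sketch of a normalized depth-dose difference between measured and simulated curves follows; a clinical comparison would typically add distance-to-agreement or gamma analysis, and the data below are synthetic.

```python
# Point-by-point difference between normalized measured/simulated depth-dose curves.
import numpy as np

def pdd_diff(measured, simulated):
    m = 100.0 * measured / measured.max()     # normalize each curve to 100% at its peak
    s = 100.0 * simulated / simulated.max()
    return s - m                              # difference in percent of peak dose

depth = np.linspace(0.0, 15.0, 31)            # cm; stand-in SOBP-like shapes below
meas = np.exp(-((depth - 10.0) / 4.0) ** 2)
sim = meas * (1.0 + 0.01 * np.random.default_rng(1).normal(size=depth.size))
print(float(np.abs(pdd_diff(meas, sim)).max()), "% max deviation")
```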

  10. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    Science.gov (United States)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  11. Monte Carlo simulations of a novel coherent scatter materials discrimination system

    Science.gov (United States)

    Hassan, Laila; Starr-Baier, Sean; MacDonald, C. A.; Petruccelli, Jonathan C.

    2017-05-01

    X-ray coherent scatter imaging has the potential to improve the detection of liquid and powder materials of concern in security screening. While x-ray attenuation is dependent on atomic number, coherent scatter is highly dependent on the characteristic angle of the target material, and thus offers additional discrimination. Conventional coherent scatter analysis requires pixel-by-pixel scanning, and so could be prohibitively slow for security applications. A novel slot-scan system has been developed to provide rapid imaging of the coherent scatter at selected angles of interest, simultaneously with the conventional absorption images. Prior experimental results showed promising capability. In this work, Monte Carlo simulations were performed to assess discrimination capability and provide system optimization. Simulation analysis was performed using the measured ring profiles for an array of powders and liquids, including water, ethanol and peroxide. For example, simulations yielded a signal-to-background ratio (SBR) of 1.63 ± 0.08 for a sample consisting of two 10 mm diameter vials, one containing ethanol (signal) and one water (background). This high SBR value is due to the high angular separation of the coherent scatter between the two liquids. The results indicate that the addition of coherent scatter information to single- or dual-energy attenuation images improves the discrimination of materials of interest.

  12. Monte Carlo based treatment planning systems for Boron Neutron Capture Therapy in Petten, The Netherlands

    Energy Technology Data Exchange (ETDEWEB)

    Nievaart, V A; Daquino, G G; Moss, R L [JRC European Commission, PO Box 2, 1755ZG Petten (Netherlands)

    2007-06-15

    Boron Neutron Capture Therapy (BNCT) is a bimodal form of radiotherapy for the treatment of tumour lesions. Since the cancer cells in the treatment volume are targeted with ¹⁰B, a higher dose is given to these cancer cells due to the ¹⁰B(n,α)⁷Li reaction, in comparison with the surrounding healthy cells. In Petten (The Netherlands), at the High Flux Reactor, a specially tailored neutron beam has been designed and installed. Over 30 patients have been treated with BNCT in 2 clinical protocols: a phase I study for the treatment of glioblastoma multiforme and a phase II study on the treatment of malignant melanoma. Furthermore, activities concerning the extra-corporal treatment of metastasis in the liver (from colorectal cancer) are in progress. The irradiation beam at the HFR contains both neutrons and gammas that, together with the complex geometries of both patient and beam set-up, demand very detailed treatment planning calculations. A well designed Treatment Planning System (TPS) should obey the following general scheme: (1) a pre-processing phase (CT and/or MRI scans to create the geometric solid model, cross-section files for neutrons and/or gammas); (2) calculations (3D radiation transport, estimation of neutron and gamma fluences, macroscopic and microscopic dose); (3) a post-processing phase (displaying of the results, iso-doses and -fluences). Treatment planning in BNCT is performed making use of Monte Carlo codes incorporated in a framework, which also includes the pre- and post-processing phases. In particular, the glioblastoma multiforme protocol used BNCT_rtpe, while the melanoma metastases protocol uses NCTPlan. In addition, an ad hoc Positron Emission Tomography (PET) based treatment planning system (BDTPS) has been implemented in order to integrate the real macroscopic boron distribution obtained from PET scanning. BDTPS is patented and uses MCNP as the calculation engine. The precision obtained by the Monte Carlo

  13. The Integrated Safety Management System Verification Enhancement Review of the Plutonium Finishing Plant (PFP)

    International Nuclear Information System (INIS)

    BRIGGS, C.R.

    2000-01-01

    The primary purpose of the verification enhancement review was for the DOE Richland Operations Office (RL) to verify contractor readiness for the independent DOE Integrated Safety Management System Verification (ISMSV) on the Plutonium Finishing Plant (PFP). Secondary objectives included: (1) to reinforce the engagement of management and to gauge management commitment and accountability; (2) to evaluate the "value added" benefit of direct public involvement; (3) to evaluate the "value added" benefit of direct worker involvement; (4) to evaluate the "value added" benefit of the panel-to-panel review approach; and (5) to evaluate the utility of the review's methodology/adaptability to periodic assessments of ISM status. The review was conducted on December 6-8, 1999, and involved two-hour interviews with five separate panels of individuals with various management and operations responsibilities related to PFP. A semi-structured interview process was employed by a team of five "reviewers" who directed open-ended questions to the panels, focusing on: (1) evidence of management commitment, accountability, and involvement; and (2) consideration and demonstration of stakeholder (including worker) information and involvement opportunities. The purpose of the panel-to-panel dialogue approach was to better spotlight: (1) areas of mutual reinforcement and alignment that could serve as good examples of the management commitment and accountability aspects of ISMS implementation; and (2) areas of potential discrepancy that could provide opportunities for improvement. In summary, the Review Team found major strengths to include: (1) the use of multi-disciplinary project work teams to plan and do work; (2) the availability and broad usage of multiple tools to help with planning and integrating work; (3) senior management presence and accessibility; (4) the institutionalization of worker involvement; (5) encouragement of self-reporting and self

  14. Information Systems Verification and Validation: Three Perspectives: Technical, Systemic, and Philosophical (Participatory Panel "Active Learning Tools"

    Directory of Open Access Journals (Sweden)

    Sushil Acharya

    2016-10-01

    Full Text Available Software quality is a crucial issue in software development. As software has become ubiquitous, software products have become critical. Software quality issues pose a problem for the software industry, as there is generally a lack of knowledge of Software Verification and Validation (V&V) benefits and a shortage of adequately trained V&V practitioners. The fundamental challenge in improving software quality lies in the people and processes that develop and produce software. Industry wants new hires to know software development best practices so as to be able to perform from day 1; this means new hires are expected to know software processes, methods, and tools. This is where academia needs to step in, especially institutions that focus on applied teaching. Academia has to develop the necessary course modules and redesign curricula to provide graduating students the applied knowledge they need to be competitive in the job market. Through a project funded by the National Science Foundation, the author's team has developed 42 delivery hours of Software V&V course modules. This development activity has embraced an academia-industry partnership. These tools have been successfully disseminated to over 24 universities, with many CS, IS and SE programs incorporating the tools in their existing courses and others designing new courses based on these tools. The tools are available free of cost to interested academia and industry.

  15. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Science.gov (United States)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

    The Microgravity Active Vibration Isolation System (MAIS) is a device to reduce on-orbit vibration and to provide a lower gravity level for certain scientific experiments. The MAIS is made up of a stator and a floater; the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration transmitted from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits and a central controller embedding the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module is added for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position sensitive detectors, and computes a Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding contact between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the opportunity to join the DLR 27th parabolic flight campaign and carry out experiments to verify the 6-DOF control technique. The experimental results validate that the 6-DOF motion control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. MAIS has been planned for the Chinese manned spacecraft for many microgravity scientific experiments, and the verification on parabolic flights is very important for its following missions. Additionally, we also tested some additional functions enabled by microgravity electromagnetic suspension, such as automatic catching and locking, and operation in fault mode. The parabolic flights produced much useful data for these experiments.
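    A highly simplified, single-axis rendering of the loop described: acceleration feedback damps vibration, position feedback keeps the floater from touching the stator, and the resulting force is converted to an actuator current. All gains and the force constant are assumptions, not MAIS flight parameters.

```python
# One axis of the 6-DOF loop, reduced to PID-style feedback (illustrative gains).
class AxisController:
    def __init__(self, k_acc=2.0, k_pos=50.0, k_int=0.1, force_constant=4.2):
        self.k_acc, self.k_pos, self.k_int = k_acc, k_pos, k_int
        self.force_constant = force_constant   # assumed Lorentz actuator N/A
        self.integ = 0.0

    def current(self, accel, gap_error, dt):
        self.integ += gap_error * dt           # slow term re-centers the air gap
        force = -(self.k_acc * accel + self.k_pos * gap_error + self.k_int * self.integ)
        return force / self.force_constant     # commanded actuator current (A)

ctl = AxisController()
print(ctl.current(accel=0.02, gap_error=0.005, dt=0.001))   # one control tick
```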

  16. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  17. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  18. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    Science.gov (United States)

    Joseph, Shijo; Herold, Martin; Sunderlin, William D.; Verchot, Louis V.

    2013-09-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed.

  19. Integration of SPICE with TEK LV500 ASIC Design Verification System

    Directory of Open Access Journals (Sweden)

    A. Srivastava

    1996-01-01

    Full Text Available The present work involves integration of the simulation stage of design of a VLSI circuit and its testing stage. The SPICE simulator, the TEK LV500 ASIC Design Verification System, and TekWaves, a test program generator for the LV500, were integrated. A software interface in the ‘C’ language, in a UNIX ‘Solaris 1.x’ environment, has been developed between SPICE and the testing tools (TekWaves and LV500). The function of the software interface developed is multifold. It takes input from either SPICE 2G.6 or SPICE 3e.1. The output generated by the interface software can be given as an input to either TekWaves or LV500. A graphical user interface has also been developed with OpenWindows, using the XView toolkit, on a SUN workstation. As an example, a two-phase clock generator circuit has been considered and the usefulness of the software demonstrated. The interface software could easily be linked with VLSI design tools such as the MAGIC layout editor.

  20. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    International Nuclear Information System (INIS)

    Joseph, Shijo; Sunderlin, William D; Verchot, Louis V; Herold, Martin

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed. (letter)