Development and validation of MCNPX-based Monte Carlo treatment plan verification system.
Jabbari, Iraj; Monadi, Shahram
2015-01-01
A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in Digital Imaging and Communications in Medicine-Radiation Therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by MCTPV was compared with that of the TiGRT planning system. The results showed that the beam configuration and patient information were implemented correctly in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% over all beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results showed that MCTPV can be used very efficiently for additional assessment of complicated plans such as IMRT plans.
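Gamma analysis of the kind reported above (a 3%/3 mm passing rate) can be illustrated with a brute-force sketch on a 1D dose profile. This is not the MapCHECK2 or MCTPV implementation; the profiles, 1 mm grid spacing, and global normalization to the reference maximum are assumptions chosen for illustration.

```python
import math

def gamma_index(ref, evald, spacing, dose_tol=0.03, dist_tol=3.0):
    """Brute-force global gamma for 1D dose profiles.

    ref, evald: dose arrays on the same grid; spacing: mm between points;
    dose_tol: fraction of the reference maximum dose; dist_tol: mm.
    """
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = math.inf
        for j, de in enumerate(evald):
            dist = (i - j) * spacing                     # spatial offset in mm
            ddose = de - dr                              # dose difference
            g2 = (dist / dist_tol) ** 2 + (ddose / (dose_tol * d_max)) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

def passing_rate(gammas):
    """Percentage of points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

For identical profiles every gamma value is zero and the passing rate is 100%; a real 2D/3D implementation searches neighbouring points in all dimensions, typically with interpolation between grid points.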
Monte Carlo based verification of a beam model used in a treatment planning system
Wieslander, E.; Knöös, T.
2008-02-01
Modern treatment planning systems (TPSs) usually separate the dose modelling into a beam modelling phase, describing the beam exiting the accelerator, followed by a subsequent dose calculation in the patient. The aim of this work is to use the Monte Carlo code system EGSnrc to study the modelling of head scatter as well as the transmission through the multi-leaf collimator (MLC) and diaphragms in the beam model used in a commercial TPS (MasterPlan, Nucletron B.V.). An Elekta Precise linear accelerator equipped with an MLC has been modelled in BEAMnrc, based on available information from the vendor regarding the material and geometry of the treatment head. The collimation in the MLC direction consists of leaves, which are complemented by a backup diaphragm. The characteristics of the electron beam, i.e., energy and spot size, impinging on the target have been tuned to match measured data. Phase spaces from simulations of the treatment head are used to extract the scatter from, e.g., the flattening filter and the collimating structures. Similar data for the source models used in the TPS are extracted from the treatment planning system, thus a comprehensive analysis is possible. Simulations in a water phantom, with DOSXYZnrc, are also used to study the modelling of the MLC and the diaphragms by the TPS. The results from this study will be helpful to understand the limitations of the model in the TPS and provide knowledge for further improvements of the TPS source modelling.
Schiapparelli, P; Zefiro, D; Taccini, G
2009-05-01
The aim of this work was to evaluate the performance of the voxel-based Monte Carlo algorithm implemented in the commercial treatment planning system ONCENTRA MASTERPLAN for a 9 MeV electron beam produced by a Varian Clinac 2100 C/D linear accelerator. In order to realize an experimental verification of the computed data, three different groups of tests were planned. The first set was performed in a water phantom to investigate standard fields, custom inserts, and extended treatment distances. The second one concerned standard fields, irregular entrance surfaces, and oblique incidence in a homogeneous PMMA phantom. The last group involved the introduction of inhomogeneities in a PMMA phantom to simulate high- and low-density materials such as bone and lung. Measurements in water were performed by means of cylindrical and plane-parallel ionization chambers, whereas measurements in PMMA were carried out using radiochromic films. Point dose values were compared in terms of percentage difference, whereas the gamma index tool was used to compare computed and measured dose profiles, considering different tolerances according to the test complexity. In the case of transverse scans, the agreement was sought in the plane formed by the intersection of the beam axis and the profile (2D analysis), while for percentage depth dose curves only the beam axis was explored (1D analysis). An excellent agreement was found for point dose evaluation in water (discrepancies smaller than 2%). The comparison between planned and measured dose profiles in homogeneous water and PMMA phantoms also showed good results (agreement within 2%-2 mm). Profile evaluation in phantoms with internal inhomogeneities showed good agreement in the case of the "lung" insert, while in tests concerning a small "bone" inhomogeneity a discrepancy was particularly evident in the dose values on the beam axis. This is due to the inaccurate geometrical description of the phantom that is linked
Molinelli, S.; Mairani, A.; Mirandola, A.; Vilches Freixas, G.; Tessonnier, T.; Giordanengo, S.; Parodi, K.; Ciocca, M.; Orecchia, R.
2013-06-01
During one year of clinical activity at the Italian National Center for Oncological Hadron Therapy 31 patients were treated with actively scanned proton beams. Results of patient-specific quality assurance procedures are presented here which assess the accuracy of a three-dimensional dose verification technique with the simultaneous use of multiple small-volume ionization chambers. To investigate critical cases of major deviations between treatment planning system (TPS) calculated and measured data points, a Monte Carlo (MC) simulation tool was implemented for plan verification in water. Starting from MC results, the impact of dose calculation, dose delivery and measurement set-up uncertainties on plan verification results was analyzed. All resulting patient-specific quality checks were within the acceptance threshold, which was set at 5% for both mean deviation between measured and calculated doses and standard deviation. The mean deviation between TPS dose calculation and measurement was less than ±3% in 86% of the cases. When all three sources of uncertainty were accounted for, simulated data sets showed a high level of agreement, with mean and maximum absolute deviation lower than 2.5% and 5%, respectively.
Monte Carlo simulations to replace film dosimetry in IMRT verification.
Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig
2011-01-01
Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and the MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference) with dose deviations up to 10%. The corresponding values were significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for the two verification procedures is comparable; however, the MC-based procedure is far less labor-intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to displace film dosimetry more and more in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended at least in the IMRT introduction phase.
Verification Account Management System (VAMS)
Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...
Distorted Fingerprint Verification System
Directory of Open Access Journals (Sweden)
Divya KARTHIKAESHWARAN
2011-01-01
Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and the cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.
Monte Carlo calculations supporting patient plan verification in proton therapy
Directory of Open Access Journals (Sweden)
Thiago Viana Miranda Lima
2016-03-01
Full Text Available Patient treatment plan verification covers a substantial amount of the quality assurance (QA) resources; this is especially true for intensity-modulated proton therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed the previously published data (Molinelli et al. 2013), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modelling (treatment planning system (TPS) vs MC), limitations of the dose delivery system, or detector mispositioning was originally explored, but other factors such as the geometric description of the detectors were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with the results of different MC simulations. Some physical effects introduced by this new approach, for example inter-detector interference and the delta-ray threshold, were also studied. The simulations accounting for a detailed geometry are typically superior (statistical difference, p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases, but in other cases position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for therapeutic proton energies and the results from the current delta threshold are
Townson, Reid W
2013-01-01
Due to the increasing complexity of radiotherapy delivery, accurate dose verification has become an essential part of the clinical treatment process. The purpose of this work was to develop an electronic portal image (EPI) based pre-treatment verification technique capable of quickly reconstructing 3D dose distributions from both coplanar and non-coplanar treatments. The dose reconstruction is performed in a spherical water phantom by modulating, based on EPID measurements, pre-calculated Monte Carlo (MC) doselets defined on a spherical coordinate system. This is called the spherical doselet modulation (SDM) method. This technique essentially eliminates the statistical uncertainty of the MC dose calculations by exploiting both azimuthal symmetry in a patient-independent phase-space and symmetry of a virtual spherical water phantom. The symmetry also allows the number of doselets necessary for dose reconstruction to be reduced by a factor of about 250. In this work, 51 doselets were used. The SDM method mitiga...
Standard Verification System (SVS)
Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...
Formal Verification of Continuous Systems
DEFF Research Database (Denmark)
Sloth, Christoffer
2012-01-01
The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables and the verif...
Biometric Technologies and Verification Systems
Vacca, John R
2007-01-01
Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior
SU-E-T-578: MCEBRT, A Monte Carlo Code for External Beam Treatment Plan Verifications
Energy Technology Data Exchange (ETDEWEB)
Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States); Eldib, A [Fox Chase Cancer Center, Philadelphia, PA (United States); Al-Azhar University, Cairo (Egypt)
2014-06-01
Purpose: Present a new Monte Carlo code (MCEBRT) for patient-specific dose calculations in external beam radiotherapy. The code's MLC model is benchmarked and real patient plans are re-calculated using MCEBRT and compared with a commercial TPS. Methods: MCEBRT is based on the GEPTS system (Med. Phys. 29 (2002) 835–846). Phase space data generated for Varian linac photon beams (6 – 15 MV) are used as the source term. MCEBRT uses a realistic MLC model (tongue and groove, rounded ends). Patient CT and DICOM RT files are used to generate a 3D patient phantom and simulate the treatment configuration (gantry, collimator and couch angles; jaw positions; MLC sequences; MUs). MCEBRT dose distributions and DVHs are compared with those from the TPS in absolute dose (Gy). Results: Calculations based on the developed MLC model closely match transmission measurements (pin-point ionization chamber at selected positions and film for lateral dose profiles). See Fig.1. Dose calculations for two clinical cases (whole brain irradiation with opposed beams and a lung case with eight fields) are carried out and the outcomes are compared with the Eclipse AAA algorithm. Good agreement is observed for the brain case (Figs 2-3) except at the surface, where the MCEBRT dose can be higher by 20%. This is due to better modeling of electron contamination by MCEBRT. For the lung case an overall good agreement (91% gamma index passing rate with 3%/3 mm DTA criterion) is observed (Fig.4), but dose in lung can be over-estimated by up to 10% by AAA (Fig.5). CTV and PTV DVHs from the TPS and MCEBRT are nevertheless close (Fig.6). Conclusion: A new Monte Carlo code is developed for plan verification. Contrary to phantom-based QA measurements, MCEBRT simulates the exact patient geometry and tissue composition. MCEBRT can be used as an extra verification layer for plans where surface dose and tissue heterogeneity are an issue.
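The DVH comparison mentioned above can be illustrated with a minimal cumulative-DVH sketch. The voxel doses and bin width below are hypothetical; a real system bins a full 3D dose grid restricted to a structure mask.

```python
def cumulative_dvh(doses, bin_width=0.5):
    """Cumulative DVH: percent of structure volume receiving at least
    each dose level (doses in Gy for the voxels of one structure)."""
    n_bins = int(max(doses) / bin_width) + 1
    levels = [i * bin_width for i in range(n_bins + 1)]
    # fraction of voxels at or above each dose level, as percent volume
    volumes = [100.0 * sum(d >= lv for d in doses) / len(doses) for lv in levels]
    return levels, volumes
```

Two DVHs (e.g. TPS vs MC) computed this way can then be overlaid or differenced bin by bin, which is the kind of comparison the abstract describes.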
Enumeration Verification System (EVS)
Social Security Administration — EVS is a batch application that processes for federal, state, local and foreign government agencies, private companies and internal SSA customers and systems. Each...
Vehicle usage verification system
Scanlon, William G.; McQuiston, Jonathan; Cotton, Simon L.
2012-01-01
A computer-implemented system for verifying vehicle usage comprising a server capable of communication with a plurality of clients across a communications network. Each client is provided in a respective vehicle and with a respective global positioning system (GPS) by which the client can determi
US Agency for International Development — CVS is a system managed by OPM that is designed to be the primary tool for verifying whether or not there is an existing investigation on a person seeking security...
SU-E-T-761: TOMOMC, A Monte Carlo-Based Planning VerificationTool for Helical Tomotherapy
Energy Technology Data Exchange (ETDEWEB)
Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)
2015-06-15
Purpose: Present a new Monte Carlo code (TOMOMC) to calculate 3D dose distributions for patients undergoing helical tomotherapy treatments. TOMOMC performs CT-based dose calculations using the actual dynamic variables of the machine (couch motion, gantry rotation, and MLC sequences). Methods: TOMOMC is based on the GEPTS (Gamma Electron and Positron Transport System) general-purpose Monte Carlo system (Chibani and Li, Med. Phys. 29, 2002, 835). First, beam models for the Hi-Art TomoTherapy machine were developed for the different beam widths (1, 2.5 and 5 cm). The beam model accounts for the exact geometry and composition of the different components of the linac head (target, primary collimator, jaws and MLCs). The beam models were benchmarked by comparing calculated PDDs and lateral/transversal dose profiles with ionization chamber measurements in water. See figures 1–3. The MLC model was tuned in such a way that the tongue-and-groove effect and inter-leaf and intra-leaf transmission are modeled correctly. See figure 4. Results: By simulating the exact patient anatomy and the actual treatment delivery conditions (couch motion, gantry rotation and MLC sinogram), TOMOMC is able to calculate the 3D patient dose distribution, which is in principle more accurate than the one from the treatment planning system (TPS) since it relies on the Monte Carlo method (gold standard). Dose volume parameters based on the Monte Carlo dose distribution can also be compared to those produced by the TPS. The attached figures show isodose lines for a H&N patient calculated by TOMOMC (transverse and sagittal views). Analysis of differences between TOMOMC and the TPS is ongoing work for different anatomic sites. Conclusion: A new Monte Carlo code (TOMOMC) was developed for Tomotherapy patient-specific QA. The next step in this project is implementing GPU computing to speed up the Monte Carlo simulation and make Monte Carlo-based treatment verification a practical solution.
A Correlation-Based Fingerprint Verification System
Bazen, Asker M.; Verwaaijen, Gerben T.B.; Gerez, Sabih H.; Veelenturf, Leo P.J.; Zwaag, van der Berend Jan
2000-01-01
In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates i
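The template correlation step described above can be sketched in one dimension with zero-mean normalized cross-correlation. The patch values below are hypothetical gray levels; a real system correlates 2D template regions over local neighbourhoods of the fingerprint image.

```python
def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-length patches."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    if da == 0 or db == 0:          # constant patch: correlation undefined
        return 0.0
    return num / (da * db)

def best_shift(template, image):
    """Slide the template over a 1D scan line; return (best score, offset)."""
    best = (-2.0, 0)
    for off in range(len(image) - len(template) + 1):
        score = ncc(template, image[off:off + len(template)])
        if score > best[0]:
            best = (score, off)
    return best
```

A score near 1 at some offset indicates a matching gray-scale structure; the paper's system builds its matching decision on such correlation responses rather than on extracted minutiae.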
Directory of Open Access Journals (Sweden)
Abhishek Jain
2012-12-01
Full Text Available In this paper, we present a generic SystemVerilog Universal Verification Methodology (UVM) based reusable verification environment for efficient verification of image signal processing IPs/SoCs. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to first-silicon success. Deploying methodologies that enforce full functional coverage and verification of corner cases through pseudo-random test scenarios is required, as is standardization of the verification flow. Previously, inside the imaging group of ST, a Specman (e)/Verilog based verification environment for IP/subsystem-level verification and a C/C++/Verilog based directed verification environment for SoC-level verification were used for functional verification. Different verification environments were used at the IP level and the SoC level, and different verification/validation methodologies were used for SoC verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a generic SystemVerilog UVM-based reusable verification environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.
Cognitive Bias in Systems Verification
Larson, Steve
2012-01-01
Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at individual level very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.
On Verification Modelling of Embedded Systems
Brinksma, Ed; Mader, Angelika
2004-01-01
Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verificatio
Verification and Performance Analysis for Embedded Systems
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand
2009-01-01
This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.
On the verification of polynomial system solvers
Institute of Scientific and Technical Information of China (English)
Changbo CHEN; Marc MORENO MAZA; Wei PAN; Yuzhen XI
2008-01-01
We discuss the verification of mathematical software solving polynomial systems symbolically by way of triangular decomposition. Standard verification techniques are highly resource consuming and apply only to polynomial systems which are easy to solve. We exhibit a new approach which manipulates constructible sets represented by regular systems. We provide comparative benchmarks of different verification procedures applied to four solvers on a large set of well-known polynomial systems. Our experimental results illustrate the high efficiency of our new approach. In particular, we are able to verify triangular decompositions of polynomial systems which are not easy to solve.
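One ingredient of such verification, checking that candidate points actually satisfy a system by substitution, can be sketched as follows. This covers only the numeric sanity-check direction; verifying a full triangular decomposition (including that no solutions are lost) requires the symbolic machinery the authors describe. The sparse exponent-vector representation is an assumption for illustration.

```python
def eval_poly(terms, point):
    """Evaluate a multivariate polynomial given as a dict mapping
    exponent tuples (e1, e2, ...) to coefficients, at a numeric point."""
    total = 0.0
    for exps, coeff in terms.items():
        term = coeff
        for var, e in zip(point, exps):
            term *= var ** e
        total += term
    return total

def check_solution(system, point, tol=1e-9):
    """A point satisfies the system iff every polynomial vanishes at it
    (up to a numerical tolerance)."""
    return all(abs(eval_poly(p, point)) <= tol for p in system)
```

For example, for the system {x² + y² − 1 = 0, x − y = 0}, the point (√2/2, √2/2) passes this check while (1, 0) fails on the second equation.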
Rosenfeld, Anatoly; Wroe, Andrew; Carolan, Martin; Cornelius, Iwan
2006-01-01
In hadron therapy the spectra of secondary particles can be very broad in type and energy. The most accurate calculations of tissue equivalent (TE) absorbed dose and biological effect can be achieved using Monte Carlo (MC) simulations followed by the application of an appropriate radiobiological model. The verification of MC simulations is therefore an important quality assurance (QA) issue in dose planning. We propose a method of verification for MC dose calculations based on measurements of either the integral absorbed dose or the spectra of deposited energies from single secondary particles in non-TE material detectors embedded in a target of interest (phantom). This method was tested in boron neutron capture therapy and fast neutron therapy beams.
Verification and Validation in Systems Engineering
Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay
2010-01-01
"Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model
Verification and Examination Management of Complex Systems
Directory of Open Access Journals (Sweden)
Stian Ruud
2014-10-01
Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
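The marginal-risk allocation idea can be illustrated with a deliberately simplified greedy sketch. The paper defines its verification-risk function over a logic system topology; here, as a stand-in assumption, total risk is the sum of per-subsystem residual fault probabilities and each examination multiplies a subsystem's residual by a retention factor. All numbers and parameter names are hypothetical.

```python
def allocate_examinations(residual, retain, budget):
    """Greedy examination plan: spend each exam where the marginal
    risk reduction is largest.

    residual: per-subsystem probability that an undetected fault remains
    retain:   fraction of a subsystem's residual risk left after one exam
    budget:   total number of examinations available
    """
    residual = list(residual)
    plan = [0] * len(residual)
    for _ in range(budget):
        # marginal gain of one more exam on subsystem i: r_i * (1 - retain_i)
        gains = [r * (1.0 - q) for r, q in zip(residual, retain)]
        i = gains.index(max(gains))
        plan[i] += 1
        residual[i] *= retain[i]
    return plan, sum(residual)
```

The returned residual sum plays the role of the remaining verification risk; a stop criterion of the kind the paper proposes would terminate the loop once that sum falls below a threshold rather than exhausting a fixed budget.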
Barbeiro, A. R.; Ureba, A.; Baeza, J. A.; Linares, R.; Perucha, M.; Jiménez-Ortega, E.; Velázquez, S.; Mateos, J. C.
2016-01-01
A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved by the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation, and the potential mismatch between calculated control points and the detection grid in the verification process, were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH on the patient CT was discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre
Institute of Scientific and Technical Information of China (English)
武祥; 若夕子; 于涛; 谢金森; 陈昊威
2014-01-01
TRITON couples the multi-group Monte Carlo transport code KENO V.a with the point-burnup code ORIGEN-S. It features adaptability to complex geometries, flexible cross-section processing, and fast calculation speed. Based on the thorium-based fuel cell benchmark of Idaho National Laboratory (INL), a verification of TRITON's burnup calculation was performed, which showed good agreement with the results obtained by INL with the MOCUP code. Furthermore, an analysis of burnup-isotope selection schemes in TRITON showed that, for thorium-based fuel, TRITON gives correct results only when the important nuclides of the Th-U cycle are included. These conclusions support further applications of TRITON.
Probabilistic Model for Dynamic Signature Verification System
Directory of Open Access Journals (Sweden)
Chai Tong Yuen
2011-11-01
This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system is designed to read, analyze and verify signatures from the SUSig online database. First, the testing and reference samples are normalized, re-sampled and smoothed in a pre-processing stage. In the verification stage, the difference between the reference and testing signatures is calculated based on the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reports a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%. Meanwhile, the classification rate of the system is around 97%.
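FRR and FAR are the two standard error rates for such a verifier: the fraction of genuine attempts rejected and the fraction of forgeries accepted at a given decision threshold. A minimal sketch (the scores and threshold below are made up for illustration; the paper's thresholded standard deviation method is not reproduced here):

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """FRR/FAR in percent for a distance-based verifier: a trial is
    accepted when its dissimilarity score falls below the threshold."""
    frr = sum(s >= threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s < threshold for s in impostor_scores) / len(impostor_scores)
    return 100.0 * frr, 100.0 * far

genuine = [0.2, 0.3, 0.8, 0.4]    # distances for true-user attempts
impostor = [0.9, 1.1, 0.5, 1.3]   # distances for forgery attempts
frr, far = far_frr(genuine, impostor, threshold=0.6)
print(frr, far)  # -> 25.0 25.0
```

Sweeping the threshold trades one rate against the other; the operating point reported in the paper corresponds to one such choice.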
Comparing formal verification approaches of interlocking systems
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus
2016-01-01
The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...
DEFF Research Database (Denmark)
Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte;
2008-01-01
In order to assess the present and predict the future distribution system performance using a probabilistic model, verification of the model is crucial. This paper illustrates the error caused by using traditional Monte Carlo (MC) based probabilistic load flow (PLF) when involving tap ... obtained from the developed probabilistic model.
Combined experimental and Monte Carlo verification of brachytherapy plans for vaginal applicators
Sloboda, Ron S.; Wang, Ruqing
1998-12-01
Dose rates in a phantom around a shielded and an unshielded vaginal applicator containing Selectron low-dose-rate sources were determined by experiment and Monte Carlo simulation. Measurements were performed with thermoluminescent dosimeters in a white polystyrene phantom using an experimental protocol geared for precision. Calculations for the same set-up were done using a version of the EGS4 Monte Carlo code system modified for brachytherapy applications into which a new combinatorial geometry package developed by Bielajew was recently incorporated. Measured dose rates agree with Monte Carlo estimates to within 5% (1 SD) for the unshielded applicator, while highlighting some experimental uncertainties for the shielded applicator. Monte Carlo calculations were also done to determine a value for the effective transmission of the shield required for clinical treatment planning, and to estimate the dose rate in water at points in axial and sagittal planes transecting the shielded applicator. Comparison with dose rates generated by the planning system indicates that agreement is better than 5% (1 SD) at most positions. The precision thermoluminescent dosimetry protocol and modified Monte Carlo code are effective complementary tools for brachytherapy applicator dosimetry.
Safety Verification for Probabilistic Hybrid Systems
DEFF Research Database (Denmark)
Zhang, Lijun; She, Zhikun; Ratschan, Stefan;
2010-01-01
The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of classical hybrid systems, we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems without resorting to point...
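The quantity being bounded here, the probability of ever reaching an unsafe set, can be illustrated with a naive simulation estimate. This is emphatically not the paper's abstraction technique, which produces sound bounds rather than statistical estimates; the random-walk model, parameters and threshold below are invented purely for illustration:

```python
import random

def estimate_unsafe_probability(step_sigma=0.1, unsafe_level=1.0,
                                horizon=100, trials=10_000, seed=1):
    """Crude Monte Carlo estimate of the probability that a random-walk
    state trajectory ever crosses the unsafe level within the horizon."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = 0.0  # initial state
        for _ in range(horizon):
            x += rng.gauss(0.0, step_sigma)
            if x >= unsafe_level:
                hits += 1
                break
    return hits / trials

p = estimate_unsafe_probability()
print(p <= 0.5)  # the toy safety threshold of 0.5 holds for this model
```

A simulation estimate like this carries statistical error and gives no formal guarantee, which is precisely why abstraction-based techniques that over-approximate the reach probability are needed.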
Safety Verification for Probabilistic Hybrid Systems
DEFF Research Database (Denmark)
Zhang, Lijun; She, Zhikun; Ratschan, Stefan;
2012-01-01
The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification of classical hybrid systems, we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based...
Range verification methods in particle therapy: underlying physics and Monte Carlo modelling
Directory of Open Access Journals (Sweden)
Aafke Christine Kraan
2015-07-01
Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as a function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in-vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including beta+ emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modelling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects of modelling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.
Range Verification Methods in Particle Therapy: Underlying Physics and Monte Carlo Modeling.
Kraan, Aafke Christine
2015-01-01
Hadron therapy allows for highly conformal dose distributions and better sparing of organs-at-risk, thanks to the characteristic dose deposition as function of depth. However, the quality of hadron therapy treatments is closely connected with the ability to predict and achieve a given beam range in the patient. Currently, uncertainties in particle range lead to the employment of safety margins, at the expense of treatment quality. Much research in particle therapy is therefore aimed at developing methods to verify the particle range in patients. Non-invasive in vivo monitoring of the particle range can be performed by detecting secondary radiation, emitted from the patient as a result of nuclear interactions of charged hadrons with tissue, including β (+) emitters, prompt photons, and charged fragments. The correctness of the dose delivery can be verified by comparing measured and pre-calculated distributions of the secondary particles. The reliability of Monte Carlo (MC) predictions is a key issue. Correctly modeling the production of secondaries is a non-trivial task, because it involves nuclear physics interactions at energies, where no rigorous theories exist to describe them. The goal of this review is to provide a comprehensive overview of various aspects in modeling the physics processes for range verification with secondary particles produced in proton, carbon, and heavier ion irradiation. We discuss electromagnetic and nuclear interactions of charged hadrons in matter, which is followed by a summary of some widely used MC codes in hadron therapy. Then, we describe selected examples of how these codes have been validated and used in three range verification techniques: PET, prompt gamma, and charged particle detection. We include research studies and clinically applied methods. For each of the techniques, we point out advantages and disadvantages, as well as clinical challenges still to be addressed, focusing on MC simulation aspects.
Survey on Offline Finger Print Verification System
Suman, R.; Kaur, R.
2012-01-01
In fingerprint verification, a user's fingerprint is matched against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological (
Packaged low-level waste verification system
Energy Technology Data Exchange (ETDEWEB)
Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)
1995-12-31
The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.
Automated Formal Verification for PLC Control Systems
Fernández Adiego, Borja
2014-01-01
Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building the formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the modelling languages of different verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
Formal verification of industrial control systems
CERN. Geneva
2015-01-01
Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking appears to be an appropriate complementary method. However, it is not common to use model checking in industry yet, as this method needs typically formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif
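At its core, the safety model checking applied to such control systems asks whether any unsafe state is reachable from the initial states. A minimal explicit-state sketch of that question (the toy interlock model below is invented for illustration and has nothing to do with the actual CERN PLC programs or PLCverif internals):

```python
from collections import deque

def check_safety(initial, transitions, unsafe):
    """Explicit-state reachability: breadth-first search over the state
    graph; returns a counterexample path if an unsafe state is reachable,
    otherwise None (the safety property holds)."""
    parent = {s: None for s in initial}
    queue = deque(initial)
    while queue:
        state = queue.popleft()
        if state in unsafe:
            # reconstruct the counterexample trace back to an initial state
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return list(reversed(path))
        for nxt in transitions.get(state, []):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None

# toy interlock: the 'open' gate state must never coincide with 'moving'
transitions = {
    ("closed", "idle"): [("closed", "moving"), ("open", "idle")],
    ("closed", "moving"): [("closed", "idle")],
    ("open", "idle"): [("closed", "idle")],
}
unsafe = {("open", "moving")}
print(check_safety([("closed", "idle")], transitions, unsafe))  # -> None
```

Industrial model checkers add symbolic state representations and temporal logic far beyond this sketch, which is exactly the expertise and computing power the abstract refers to.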
Systems Approach to Arms Control Verification
Energy Technology Data Exchange (ETDEWEB)
Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M
2015-05-15
Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
Parametric Verification of Weighted Systems
DEFF Research Database (Denmark)
Christoffersen, Peter; Hansen, Mikkel; Mariegaard, Anders
2015-01-01
This paper addresses the problem of parametric model checking for weighted transition systems. We consider transition systems labelled with linear equations over a set of parameters, and we use them to provide semantics for a parametric version of weighted CTL where the until and next operators are themselves indexed with linear equations. The parameters change the model-checking problem into a problem of computing a linear system of inequalities that characterizes the parameters that guarantee satisfiability. To address this problem, we use parametric dependency graphs (PDGs) and we propose ... finitely many iterations. To demonstrate the utility of our technique, we have implemented a prototype tool that computes the constraints on parameters for model checking problems.
CTBT integrated verification system evaluation model supplement
Energy Technology Data Exchange (ETDEWEB)
EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.
2000-03-02
Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
Formal System Verification for Trustworthy Embedded Systems
2011-04-19
... running on does not expose covert channels between unconnected objects) and liveness properties under fairness assumptions. Both Spiessens' and Murray's ... security properties. If the assumptions of the verification hold, we have mathematical proof that, among other properties, the seL4 kernel is free of ... KeyKOS [4], EROS [12], and Amoeba [8] do not provide means to explicitly define capability distributions. Capabilities are objects ...
Palmprint Based Verification System Using SURF Features
Srinivas, Badrinath G.; Gupta, Phalguni
This paper describes the design and development of a prototype of robust biometric system for verification. The system uses features extracted using Speeded Up Robust Features (SURF) operator of human hand. The hand image for features is acquired using a low cost scanner. The palmprint region extracted is robust to hand translation and rotation on the scanner. The system is tested on IITK database of 200 images and PolyU database of 7751 images. The system is found to be robust with respect to translation and rotation. It has FAR 0.02%, FRR 0.01% and accuracy of 99.98% and can be a suitable system for civilian applications and high-security environments.
Verification and Validation of Flight Critical Systems Project
National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...
Homma, Yuto; Moriwaki, Hiroyuki; Ohki, Shigeo; Ikeda, Kazumi
2014-06-01
This paper deals with the verification of the three-dimensional triangular-prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe sodium-cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of an initial core and at the beginning and end of cycle of the equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% (relative) in the control rod reactivity, and 1% in the sodium void reactivity.
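A code-to-code comparison like this reduces to differencing the eigenvalues and reactivities reported by the two codes. A trivial sketch (the k-eff values are hypothetical, chosen only so the difference lands inside the 0.0002 Δk agreement quoted above):

```python
def compare_keff(k_ref, k_test):
    """Difference in multiplication factor (Δk) and the relative
    difference in percent between two code results."""
    delta_k = k_test - k_ref
    rel_pct = 100.0 * (k_test - k_ref) / k_ref
    return delta_k, rel_pct

# hypothetical GMVP vs ENSEMBLE-TRIZ eigenvalues, for illustration only
dk, rel = compare_keff(1.00350, 1.00365)
print(round(dk, 5), round(rel, 3))  # -> 0.00015 0.015
```

Reactivity differences (control rod worth, sodium void) are compared the same way, usually after converting k-eff to reactivity via ρ = (k − 1)/k.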
SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM
Directory of Open Access Journals (Sweden)
E. V. Bulgakova
2016-03-01
Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. Due to the development of speech technology, there is now increased interest in expert speaker verification systems that offer high reliability and low labour intensity thanks to the automation of data processing for the expert analysis. System Description. We present a description of a novel system that analyzes the similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed system, based on a fusion of methods, is the weak correlation between the analyzed features, which leads to a decrease in the speaker recognition error rate. The system's advantage is the possibility of rapid analysis of recordings, since the data preprocessing and decision-making processes are automated. We describe the functioning methods as well as the fusion used to combine their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of all the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.
A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2014-01-01
The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extravehicular activity equipment. A new version (3DHZETRN), capable of transporting high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the software system OLTARIS for shield design and validation, and provides a basis for personal computer software capable of space shield analysis and optimization.
Formal development and verification of a distributed railway control system
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth; Peleska, J.
2000-01-01
The authors introduce the concept of a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications ... to decide whether it is safe for a train to move or for a point to be switched.
Verification and validation plan for the SFR system analysis module
Energy Technology Data Exchange (ETDEWEB)
Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States)
2014-12-18
This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.
A Monte Carlo tool for combined photon and proton treatment planning verification
Energy Technology Data Exchange (ETDEWEB)
Seco, J [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Jiang, H [University of Arkansas for Medical Sciences, 4301 W. Markham Street, Little Rock, Arkansas 72202 USA (United States); Herrup, D [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Kooy, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)
2007-06-15
Photons and protons are usually used independently to treat cancer. However, at MGH patients can be treated with both photons and protons since both modalities are available on site. A combined therapy can be advantageous in cancer therapy due to the skin-sparing ability of photons and the sharp Bragg peak fall-off for protons beyond the tumor. In the present work, we demonstrate how to implement a combined 3D MC toolkit for photon and proton (ph-pr) therapy, which can be used for verification of the treatment plan. The commissioning of a MC system for combined ph-pr involves initially the development of a MC model of both the photon and proton treatment heads. The MC dose tool was evaluated on a head and neck patient treated with combined photon and proton beams. The combined ph-pr dose agreed with measurements in a solid water phantom to within 3%/3 mm. Comparison with the commercial planning system's pencil beam prediction agrees within 3% (except for air cavities and bone regions).
CTBT Integrated Verification System Evaluation Model
Energy Technology Data Exchange (ETDEWEB)
Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.
1997-10-01
Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
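IVSEM's actual integration model is not described in this abstract; but under the simplest assumption that the sensor technologies detect an event independently, the integrated probability of detection follows directly from the product of the per-technology miss probabilities:

```python
def integrated_detection_probability(subsystem_probs):
    """Probability that at least one independent sensor technology
    detects the event: 1 minus the product of miss probabilities."""
    miss = 1.0
    for p in subsystem_probs:
        miss *= (1.0 - p)
    return 1.0 - miss

# hypothetical per-technology detection probabilities, illustration only
p = integrated_detection_probability([0.6, 0.5, 0.3])
print(round(p, 3))  # -> 0.86
```

This independence assumption is exactly where "synergy among the technologies" enters: correlated sensitivities (e.g. to event location or yield) would make the real integrated probability differ from this simple combination.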
Compositional verification of real-time systems using Ecdar
DEFF Research Database (Denmark)
David, A.; Larsen, K.G.; Møller, M.H.;
2012-01-01
We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modelling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...
[PIV: a computer-aided portal image verification system].
Fu, Weihua; Zhang, Hongzhi; Wu, Jing
2002-12-01
Portal image verification (PIV) is one of the key actions in the QA procedure for sophisticated, accurate radiotherapy. The purpose of this study was to develop PIV software as a tool for improving the accuracy and visualization of portal field verification and for computing field placement errors. PIV was developed in the Visual C++ integrated environment under the Windows 95 operating system. It improves visualization by providing tools for image processing and multimode image display. Semi-automatic registration methods make verification more accurate than the view-box method. It can provide useful quantitative errors for regular fields. PIV is flexible and accurate. It is an effective tool for portal field verification.
Energy Technology Data Exchange (ETDEWEB)
Descalle, M-A; Chuang, C; Pouliot, J
2002-01-30
Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as reference by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study will investigate the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo simulated images exhibit similar qualities to portal images, the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify set-up errors during treatment.
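A correlation algorithm of the kind mentioned can be sketched as FFT-based cross-correlation: the location of the correlation peak gives the translational offset between the simulated reference image and the portal image. The toy binary image below stands in for real portal data; this is an illustrative sketch, not PEREGRINE's or the study's actual algorithm:

```python
import numpy as np

def detect_shift(reference, portal):
    """Estimate the translational setup error (in pixels) between a
    reference image and a portal image via cross-correlation."""
    # correlate in Fourier space; the peak location encodes the shift
    f = np.fft.fft2(reference) * np.conj(np.fft.fft2(portal))
    corr = np.fft.ifft2(f).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map peak indices to signed displacements (account for wrap-around)
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(int(s) for s in shifts)

# shift a toy image by (3, -2) pixels and recover the displacement
ref = np.zeros((64, 64))
ref[20:30, 25:40] = 1.0                      # a bright rectangular field
moved = np.roll(ref, (3, -2), axis=(0, 1))   # simulated setup error
print(detect_shift(moved, ref))  # -> (3, -2)
```

Converting pixel shifts to millimetres then only requires the imager's pixel spacing and magnification; sub-pixel accuracy would need peak interpolation.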
Integrated safety management system verification: Volume 2
Energy Technology Data Exchange (ETDEWEB)
Christensen, R.F.
1998-08-10
Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalization of an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR, 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System (ISMS). Guidance and expectations have been provided to PNNL by incorporation into the operating contract (Contract DE-AC06-76RLO 1830) and by letter. The contract requires that the contractor submit a description of their ISMS for approval by DOE. PNNL submitted their proposed Safety Management System Description for approval on November 25, 1997. RL tentatively approved acceptance of the description pursuant to a favorable recommendation from this review. The Integrated Safety Management System Verification is a review of the adequacy of the ISMS description in fulfilling the requirements of the DEAR and the DOE Policy. The purpose of this review is to provide the Richland Operations Office Manager with a recommendation for approval of the ISMS description of the Pacific Northwest Laboratory based upon compliance with the requirements of 48 CFR 970.5204(-2 and -78), and to verify the extent and maturity of ISMS implementation within the Laboratory. Further, the review will provide a model for other DOE laboratories managed by the Office of the Assistant Secretary for Energy Research.
Integrated safety management system verification: Volume 1
Energy Technology Data Exchange (ETDEWEB)
Christensen, R.F.
1998-08-12
Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR, 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System. The Manager, Richland Operations Office (RL), initiated a combined Phase 1 and Phase 2 Integrated Safety Management Verification review to confirm that PNNL had successfully submitted a description of their ISMS and had implemented ISMS within the laboratory facilities and processes. A combined review was directed by the Manager, RL, based upon the progress PNNL had made in the implementation of ISM. This report documents the results of the review conducted to verify: (1) that the PNNL integrated safety management system description and enabling documents and processes conform to the guidance provided by the Manager, RL; (2) that corporate policy is implemented by line managers; (3) that PNNL has provided tailored direction to the facility management; and (4) that the Manager, RL, has documented processes that integrate their safety activities and oversight with those of PNNL. The general conduct of the review was consistent with the direction provided by the Under Secretary's Draft Safety Management System Review and Approval Protocol. The purpose of this review was to provide the Manager, RL, with a recommendation as to the adequacy of the ISMS description of the Pacific Northwest Laboratory based upon compliance with the requirements of 48 CFR 970.5204(-2 and -78), and to provide an evaluation of the extent and maturity of ISMS implementation within the Laboratory. Further, this review was intended to provide a model for other DOE laboratories. In an effort to reduce the time and travel costs associated with ISM verification, the team agreed to conduct preliminary training and orientation electronically and by phone. These
Formal Development and Verification of a Distributed Railway Control System
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth; Peleska, Jan
1999-01-01
In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... to move or for a point to be switched....
Formal Development and Verification of a Distributed Railway Control System
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth; Peleska, Jan
1998-01-01
In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... to move or for a point to be switched....
Verification of a timed multitask system with UPPAAL
Bel Mokadem, Houda; Berard, Béatrice; Gourcuff, Vincent; De Smet, Olivier; Roussel, Jean-Marc
2010-01-01
International audience; System and program verification has been a large area of research since the introduction of computers in industrial systems. It is an especially important issue for critical systems, where errors can cause human and financial damages. Programmable Logic Controllers (PLCs) are now widely used in many industrial systems and verification of the corresponding programs has already been studied in various contexts for a few years, for the benefit of users and system designer...
Development of Palmprint Verification System Using Biometrics
Institute of Scientific and Technical Information of China (English)
G. Shobha; M. Krishna; S.C. Sharma
2006-01-01
Palmprint verification using biometrics is one of the emerging technologies, which recognizes a person based on the principal lines, wrinkles and ridges on the surface of the palm. These line structures are stable and remain unchanged throughout the life of an individual. More importantly, no two palmprints from different individuals are the same, and normally people do not feel uneasy about having their palmprint images taken for testing. Therefore palmprint recognition offers a promising future for medium-security access control systems. In this paper, a new approach for personal authentication using hand images is discussed. Gray-scale palm images are captured using a digital camera at a resolution of 640×480. Each of these gray-scale images is aligned and then used to extract palmprint and hand geometry features. These features are then used for authenticating users. The image acquisition setup used here is inherently simple; it does not employ any special illumination, nor does it use any pegs that might cause inconvenience to users. Experimental results show that the designed system achieves an acceptable level of performance.
Energy Technology Data Exchange (ETDEWEB)
Ortiz Lora, A.; Miras del Rio, H.; Terron Leon, J. A.
2013-07-01
Following the recommendations of the IAEA, and as a further check, a Monte Carlo simulation of each of the plates available at the hospital has been performed. The objective of the work is the verification of the calibration certificates, and to establish criteria of action for their acceptance. (Author)
Standard Verification System Lite (SVS Lite)
Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...
An Efficient Automatic Attendance System using Fingerprint Verification Technique
Chitresh Saraswat,; Amit Kumar
2010-01-01
The main aim of this paper is to develop an accurate, fast and very efficient automatic attendance system using fingerprint verification. We propose a system in which fingerprint verification is performed using minutiae extraction, and which automates the whole process of taking attendance. Doing this manually is laborious and troublesome work that wastes a lot of time, and managing and maintaining the records over a period of time is also a burdensome task. For t...
Verification and Validation Issues in Systems of Systems
Directory of Open Access Journals (Sweden)
Eric Honour
2013-11-01
Full Text Available The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.
Modular Verification of Interactive Systems with an Application to Biology
Directory of Open Access Journals (Sweden)
P. Milazzo
2011-01-01
Full Text Available We propose sync-programs, an automata-based formalism for the description of biological systems, and a modular verification technique for such a formalism that allows properties expressed in the universal fragment of CTL to be verified on suitably chosen fragments of models, rather than on whole models. As an application we show the modelling of the lac operon regulation process and the modular verification of some properties. Verification of properties is performed by using the NuSMV model checker and we show that by applying our modular verification technique we can verify properties in shorter times than those necessary to verify the same properties in the whole model.
Energy Technology Data Exchange (ETDEWEB)
Petoukhova, A L; Van Wingerden, K; Wiggenraad, R G J; Van de Vaart, P J M; Van Egmond, J; Franken, E M; Van Santvoort, J P C, E-mail: a.petoukhova@mchaaglanden.n [Radiotherapy Centre West, PO Box 432, NL-2501 CK, The Hague (Netherlands)
2010-08-21
This study presents data for verification of the iPlan RT Monte Carlo (MC) dose algorithm (BrainLAB, Feldkirchen, Germany). MC calculations were compared with pencil beam (PB) calculations and verification measurements in phantoms with lung-equivalent material, air cavities or bone-equivalent material to mimic head and neck and thorax and in an Alderson anthropomorphic phantom. Dosimetric accuracy of MC for the micro-multileaf collimator (MLC) simulation was tested in a homogeneous phantom. All measurements were performed using an ionization chamber and Kodak EDR2 films with Novalis 6 MV photon beams. Dose distributions measured with film and calculated with MC in the homogeneous phantom are in excellent agreement for oval, C and squiggle-shaped fields and for a clinical IMRT plan. For a field with completely closed MLC, MC is much closer to the experimental result than the PB calculations. For fields larger than the dimensions of the inhomogeneities the MC calculations show excellent agreement (within 3%/1 mm) with the experimental data. MC calculations in the anthropomorphic phantom show good agreement with measurements for conformal beam plans and reasonable agreement for dynamic conformal arc and IMRT plans. For 6 head and neck and 15 lung patients a comparison of the MC plan with the PB plan was performed. Our results demonstrate that MC is able to accurately predict the dose in the presence of inhomogeneities typical for head and neck and thorax regions with reasonable calculation times (5-20 min). Lateral electron transport was well reproduced in MC calculations. We are planning to implement MC calculations for head and neck and lung cancer patients.
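The dose-comparison criteria quoted in the abstract above (e.g. 3%/1 mm) refer to gamma-index analysis. As a hedged illustration, not the code used in the study, a minimal 1D gamma calculation with global normalization might look like this (function names and default tolerances are choices made here, not the paper's):

```python
import math

def gamma_index_1d(ref, eval_dose, spacing, dose_tol=0.03, dist_tol=1.0):
    """Gamma value at each reference point for two 1D dose profiles.

    dose_tol is the dose-difference criterion as a fraction of the reference
    maximum (global normalization); dist_tol is the distance-to-agreement in
    mm; spacing is the distance between neighbouring samples in mm.
    """
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(eval_dose):
            dose_term = (de - dr) / (dose_tol * d_max)
            dist_term = (j - i) * spacing / dist_tol
            best = min(best, math.hypot(dose_term, dist_term))
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    """Percentage of points with gamma <= 1 (the usual pass criterion)."""
    return 100.0 * sum(1 for g in gammas if g <= 1.0) / len(gammas)
```

Identical profiles yield gamma = 0 everywhere and a 100% passing rate; a 3%/3 mm criterion corresponds to dose_tol=0.03, dist_tol=3.0.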
Locke, C
2008-01-01
The Monte Carlo (MC) method provides the most accurate dose calculations to date in heterogeneous media and complex geometries, and this spawns increasing interest in incorporating MC calculations into the treatment planning quality assurance process. This process involves MC dose calculations for the treatment plans produced clinically. To perform these calculations, a number of treatment plan parameters specifying radiation beam and patient geometries need to be transferred to MC codes such as BEAMnrc and DOSXYZnrc. Extracting these parameters from DICOM files is not a trivial task, one that has previously been performed mostly using Matlab-based software. This paper describes the DICOM tags that contain information required for MC modeling of conformal and IMRT plans, and reports the development of an in-house DICOM interface through a library (named Vega) of platform-independent, object-oriented C++ code. The Vega library is small and succinct, offering just the fundamental functions for reading/modifying/writing DICOM files in a ...
Verification of Embedded Memory Systems using Efficient Memory Modeling
Ganai, Malay K; Ashar, Pranav
2011-01-01
We describe verification techniques for embedded memory systems using efficient memory modeling (EMM), without explicitly modeling each memory bit. We extend our previously proposed approach of EMM in Bounded Model Checking (BMC) for a single read/write port, single memory system to more commonly occurring systems with multiple memories having multiple read and write ports. More importantly, we augment EMM to provide correctness proofs, in addition to finding real bugs as before. The novelties of our verification approach are in (a) combining EMM with proof-based abstraction that preserves the correctness of a property up to a certain analysis depth of SAT-based BMC, and (b) modeling arbitrary initial memory state precisely, thereby providing inductive proofs using SAT-based BMC for embedded memory systems. Similar to the previous approach, we construct a verification model by eliminating memory arrays, but retaining the memory interface signals with their control logic and adding constraints on tho...
Verification of Transformer Restricted Earth Fault Protection by using the Monte Carlo Method
Directory of Open Access Journals (Sweden)
KRSTIVOJEVIC, J. P.
2015-08-01
Full Text Available The results of a comprehensive investigation of the influence of current transformer (CT) saturation on restricted earth fault (REF) protection during power transformer magnetization inrush are presented. Since the inrush current during switch-on of an unloaded power transformer is stochastic, its values are obtained by: (i) laboratory measurements and (ii) calculations based on input data obtained by Monte Carlo (MC) simulation. To make a detailed assessment of the current transformer performance, the uncertain input data for the CT model were obtained by applying the MC method. In this way, different levels of remanent flux in the CT core are taken into consideration. Using the generated CT secondary currents, the algorithm for REF protection based on phase comparison in the time domain was tested. On the basis of the obtained results, a method of adjusting the triggering threshold in order to ensure safe operation during transients, and thereby improve the algorithm's security, has been proposed. The obtained results indicate that power transformer REF protection would be enhanced by using the proposed adjustment of the triggering threshold in the algorithm based on phase comparison in the time domain.
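The MC sampling role described in the abstract above, drawing uncertain CT input data such as remanent flux from assumed distributions, can be sketched as follows. Every numeric range and the saturation model here are hypothetical, invented for illustration, and not taken from the paper:

```python
import random

def mc_saturation_rate(trials=10000, seed=1):
    """Estimate how often an assumed CT core saturates during inrush.

    Illustrative model only: remanent flux and the inrush flux swing are drawn
    uniformly from assumed ranges, and saturation is declared when their sum
    exceeds a hypothetical saturation level.
    """
    rng = random.Random(seed)
    flux_sat = 1.6  # hypothetical saturation flux density (T)
    saturated = 0
    for _ in range(trials):
        remanence = rng.uniform(0.0, 0.8)     # assumed remanent flux (T)
        inrush_swing = rng.uniform(0.5, 1.2)  # assumed inrush flux swing (T)
        if remanence + inrush_swing > flux_sat:
            saturated += 1
    return saturated / trials
```

Repeating the estimate over many seeds gives the spread of saturation behavior that the protection algorithm must remain secure against.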
Energy Technology Data Exchange (ETDEWEB)
Fragoso, Margarida; Wen Ning; Kumar, Sanath; Liu Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J, E-mail: ichetty1@hfhs.or [Henry Ford Health System, Detroit, MI (United States)
2010-08-21
Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC
Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release
DEFF Research Database (Denmark)
Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan
2015-01-01
In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...
Verification tests for a solar-heating system
1980-01-01
Report describes method of verification of solar space heating and hot-water systems using similarity comparison, mathematical analysis, inspections, and tests. Systems, subsystems, and components were tested for performance, durability, safety, and other factors. Tables and graphs complement test materials.
Energy Technology Data Exchange (ETDEWEB)
Pavon, Ester Carrasco [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Sanchez-Doblado, Francisco [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Leal, Antonio [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Capote, Roberto [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Lagares, Juan Ignacio [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Perucha, Maria [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain); Arrans, Rafael [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain)
2003-09-07
Total skin electron therapy (TSET) is a complex technique which requires non-standard measurements and dosimetric procedures. This paper investigates an essential first step towards TSET Monte Carlo (MC) verification. The non-standard 6 MeV 40 × 40 cm² electron beam at a source to surface distance (SSD) of 100 cm, as well as its horizontal projection behind a polymethylmethacrylate (PMMA) screen to SSD = 380 cm, were evaluated. The EGS4 OMEGA-BEAM code package running on a home-made 47-PC Linux cluster was used for the MC simulations. Percentage depth-dose curves and profiles were calculated and measured experimentally for the 40 × 40 cm² field at both SSD = 100 cm and the patient surface SSD = 380 cm. The output factor (OF) between the reference 40 × 40 cm² open field and its horizontal projection as the TSET beam at SSD = 380 cm was also measured for comparison with MC results. The accuracy of the simulated beam was validated by the good agreement, to within 2%, between the measured relative dose distributions, including the beam characteristic parameters (R50, R80, R100, Rp, E0), and the MC calculated results. The energy spectrum, fluence and angular distribution at different stages of the beam (at SSD = 100 cm, at SSD = 364.2 cm, behind the PMMA beam spoiler screen and at the treatment surface SSD = 380 cm) were derived from MC simulations. Results showed a final decrease in mean energy of almost 56% from the exit window to the treatment surface. A broader angular distribution (the FWHM of the angular distribution increased from 13° at SSD = 100 cm to more than 30° at the treatment surface) was fully attributable to the PMMA beam spoiler screen. OF calculations and measurements agreed to less than 1%. The effect of changing the electron energy cut-off from 0.7 MeV to 0.521 MeV, and of air density fluctuations in the bunker, which could affect the MC results, were shown to have a negligible impact on the beam fluence distributions. Results
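Beam characteristic parameters such as R50 and R80, mentioned in the abstract above, are read off the percentage depth-dose curve. A minimal sketch of that extraction (linear interpolation on the falling edge of the curve; the data in the comments are illustrative only, not from the paper):

```python
def depth_at_fraction(depths, doses, fraction=0.5):
    """Depth beyond the dose maximum where dose falls to `fraction` of max.

    Uses linear interpolation between the two samples that bracket the target
    level on the falling edge; returns None if the level is never reached.
    fraction=0.5 gives R50, fraction=0.8 gives R80, and so on.
    """
    d_max = max(doses)
    i_max = doses.index(d_max)
    target = fraction * d_max
    for i in range(i_max, len(doses) - 1):
        if doses[i] >= target >= doses[i + 1]:
            d0, d1 = doses[i], doses[i + 1]
            z0, z1 = depths[i], depths[i + 1]
            # interpolate linearly between the bracketing samples
            return z0 + (d0 - target) * (z1 - z0) / (d0 - d1)
    return None
```

For example, for depths [0, 1, 2, 3, 4] cm with doses [50, 100, 80, 40, 10], the 50% level lies between 2 cm (80%) and 3 cm (40%), giving R50 = 2.75 cm.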
System verification and validation: a fundamental systems engineering task
Ansorge, Wolfgang R.
2004-09-01
Systems Engineering (SE) is the discipline in a project management team which transfers the user's operational needs and justifications for an Extremely Large Telescope (ELT), or any other telescope, into a set of validated required system performance characteristics; subsequently transfers these validated required system performance characteristics into a validated system configuration; and eventually delivers the assembled, integrated telescope system with verified performance characteristics, providing "objective evidence that the particular requirements for the specified intended use are fulfilled". The latter is the ISO 8402 definition of "validation". This presentation describes the verification and validation processes of an ELT project and outlines the key role Systems Engineering plays in these processes throughout all project phases. If these processes are implemented correctly into the project execution, are started at the proper time (namely at the very beginning of the project), and all capabilities of experienced system engineers are used, the project costs and the life-cycle costs of the telescope system can be reduced by between 25% and 50%. The intention of this article is to motivate and encourage project managers of astronomical telescopes and scientific instruments to involve the entire spectrum of Systems Engineering capabilities, performed by trained and experienced system engineers, for the benefit of the project, by explaining the importance of Systems Engineering in the AIV and validation processes.
Towards Verification of Constituent Systems through Automated Proof
DEFF Research Database (Denmark)
Couto, Luis Diogo Monteiro Duarte; Foster, Simon; Payne, R
2014-01-01
to specify contractual obligations on the constituent systems of a SoS. To support verification of these obligations we have developed a proof obligation generator and theorem prover plugin for Symphony. The latter uses the Isabelle/HOL theorem prover to automatically discharge the proof obligations arising...
Practical mask inspection system with printability and pattern priority verification
Tsuchiya, Hideo; Ozaki, Fumio; Takahara, Kenichi; Inoue, Takafumi; Kikuiri, Nobutaka
2011-05-01
Through four years of study in the Association of Super-Advanced Electronics Technologies (ASET) on reducing mask manufacturing Turn Around Time (TAT) and cost, we have established a technology to improve the efficiency of the review process by applying a printability verification function that utilizes computational lithography simulations to analyze defects detected by a high-resolution mask inspection system. With the advent of Source-Mask Optimization (SMO) and other technologies that extend the life of existing optical lithography, it is becoming extremely difficult to judge a defect only by the shape of a mask pattern while avoiding pseudo-defects. Thus, printability verification is indispensable for filtering out nuisance defects from high-resolution mask inspection results. When using computational lithography simulations to verify printability with high precision, the image captured by the inspection system must be prepared with extensive care. However, for practical applications, this preparation process needs to be simplified. In addition, utilizing Mask Data Rank (MDR) to vary the defect detection sensitivity according to the patterns is also useful for simultaneously inspecting minute patterns and avoiding pseudo-defects. Combining these two technologies, we believe practical mask inspection for next generation lithography is achievable. We have been improving the estimation accuracy of the printability verification function through discussion with several customers and evaluation of their masks. In this report, we describe the progress of these practical mask verification functions developed through customers' evaluations.
Pixel Based Off-line Signature Verification System
Directory of Open Access Journals (Sweden)
Anik Barua
2015-01-01
Full Text Available The verification of handwritten signatures is one of the oldest and most popular authentication methods around the world. As technology has improved, ways of comparing and analyzing signatures have become more and more sophisticated. Since the early seventies, people have explored how computers can fully take over the task of signature verification and have tried different methods; however, none of them is satisfactory enough, and they are time consuming too. Our proposed pixel-based offline signature verification system is therefore one of the fastest and easiest ways we have found to authenticate a handwritten signature. For signature acquisition, we used a scanner. We then divided the signature image into a 2D array and calculated the hexadecimal RGB value of each pixel. After that, we calculated the total percentage of matching. If the percentage of matching is more than 90%, the signature is considered valid; otherwise it is invalid. We experimented on more than 35 signatures, and the results are quite impressive. We have made the whole system web based so that a signature can be verified from anywhere. The average execution time for signature verification is only 0.00003545 seconds.
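The matching step described in the abstract above, comparing per-pixel RGB values of two equally sized signature images and thresholding the match percentage at 90%, can be sketched as follows. This is a simplified illustration; the paper's exact comparison and alignment steps are not reproduced here:

```python
def match_percentage(img_a, img_b):
    """Percentage of pixels with identical RGB values in two equally sized
    images, each represented as a 2D array of hexadecimal RGB strings."""
    total = 0
    same = 0
    for row_a, row_b in zip(img_a, img_b):
        for px_a, px_b in zip(row_a, row_b):
            total += 1
            same += (px_a == px_b)
    return 100.0 * same / total

def is_valid(img_a, img_b, threshold=90.0):
    """Accept the signature when the match percentage clears the threshold."""
    return match_percentage(img_a, img_b) >= threshold
```

In practice an exact-pixel comparison is brittle against scanning noise, which is one reason feature-based methods dominate; the sketch only mirrors the scheme the abstract describes.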
Verification and validation guidelines for high integrity systems. Volume 1
Energy Technology Data Exchange (ETDEWEB)
Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)
1995-03-01
High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.
Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor, Rev. 1.0
Energy Technology Data Exchange (ETDEWEB)
Lee, Kyung Hoon; Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun
2005-12-15
Nuclear design for the SMART reactor is performed using the transport lattice code HELIOS and the core analysis code MASTER. The HELIOS code, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior, and is used to generate few-group constants. The MASTER code is a nodal diffusion code developed by KAERI, and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system through comparison of the calculated results with measured ones. Therefore, the uncertainties in the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation involves the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code was itself verified against the Kurchatov critical experiments, which are similar to the SMART reactor, and the HELIOS/MASTER code package has been verified by Monte Carlo calculations for the SMART research reactor.
Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor
Energy Technology Data Exchange (ETDEWEB)
Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun
2005-07-15
Nuclear design for the SMART reactor is performed using the transport lattice code HELIOS and the core analysis code MASTER. The HELIOS code, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior, and is used to generate few-group constants. The MASTER code is a nodal diffusion code developed by KAERI, and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system through comparison of the calculated results with measured ones. Therefore, the uncertainties in the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation involves the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code was itself verified against the Kurchatov critical experiments, which are similar to the SMART reactor, and the HELIOS/MASTER code package has been verified by Monte Carlo calculations for the SMART research reactor.
Morse Monte Carlo Radiation Transport Code System
Energy Technology Data Exchange (ETDEWEB)
Emmett, M.B.
1975-02-01
The report contains descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
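As a toy illustration of the Monte Carlo transport idea behind codes like MORSE (this is not MORSE itself, just the core sampling principle): draw exponential free paths for non-scattering photons and count those that cross a slab, which should converge to the analytic attenuation exp(-mu*t):

```python
import math
import random

def mc_transmission(mu, thickness, histories=200000, seed=7):
    """Monte Carlo estimate of uncollided transmission through a slab.

    mu is the attenuation coefficient (1/cm) and thickness is in cm. Photons
    are absorbed at their first interaction (no scattering modeled), so the
    analytic answer is exp(-mu * thickness).
    """
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(histories):
        # inverse-transform sampling of the exponential free-path distribution;
        # 1 - U keeps the argument of log strictly positive
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:
            transmitted += 1
    return transmitted / histories
```

With mu = 0.2/cm and a 5 cm slab the estimate converges to exp(-1) ≈ 0.368; real transport codes add scattering, energy dependence and variance reduction on top of this skeleton.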
Applications of quantum Monte Carlo methods in condensed systems
Kolorenc, Jindrich
2010-01-01
The quantum Monte Carlo methods represent a powerful and broadly applicable computational tool for finding very accurate solutions of the stationary Schroedinger equation for atoms, molecules, solids and a variety of model systems. The algorithms are intrinsically parallel and are able to take full advantage of the present-day high-performance computing systems. This review article concentrates on the fixed-node/fixed-phase diffusion Monte Carlo method with emphasis on its applications to electronic structure of solids and other extended many-particle systems.
Compositional Verification of Multi-Station Interlocking Systems
DEFF Research Database (Denmark)
Macedo, Hugo Daniel dos Santos; Fantechi, Alessandro; Haxthausen, Anne Elisabeth
2016-01-01
pose a big challenge to current verification methodologies, due to the explosion of state space size as soon as large, if not medium sized, multi-station systems have to be controlled. For these reasons, verification techniques that exploit locality principles related to the topological layout...... of the controlled system to split in different ways the state space have been investigated. In particular, compositional approaches divide the controlled track network in regions that can be verified separately, once proper assumptions are considered on the way the pieces are glued together. Basing on a successful...... method to verify the size of rather large networks, we propose a compositional approach that is particularly suitable to address multi-station interlocking systems which control a whole line composed of stations linked by mainline tracks. Indeed, it turns out that for such networks, and for the adopted...
Orion GN&C Fault Management System Verification: Scope And Methodology
Brown, Denise; Weiler, David; Flanary, Ronald
2016-01-01
In order to ensure long-term ability to meet mission goals and to provide for the safety of the public, ground personnel, and any crew members, nearly all spacecraft include a fault management (FM) system. For a manned vehicle such as Orion, the safety of the crew is of paramount importance. The goal of the Orion Guidance, Navigation and Control (GN&C) fault management system is to detect, isolate, and respond to faults before they can result in harm to the human crew or loss of the spacecraft. Verification of fault management/fault protection capability is challenging due to the large number of possible faults in a complex spacecraft, the inherent unpredictability of faults, the complexity of interactions among the various spacecraft components, and the inability to easily quantify human reactions to failure scenarios. The Orion GN&C Fault Detection, Isolation, and Recovery (FDIR) team has developed a methodology for bounding the scope of FM system verification while ensuring sufficient coverage of the failure space and providing high confidence that the fault management system meets all safety requirements. The methodology utilizes a swarm search algorithm to identify failure cases that can result in catastrophic loss of the crew or the vehicle and rare event sequential Monte Carlo to verify safety and FDIR performance requirements.
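Rare-event verification of the kind described above hinges on variance reduction: plain sampling would need an enormous number of runs to observe a catastrophic failure even once. As a minimal illustration of the underlying idea (not the Orion team's actual sequential Monte Carlo, and with purely illustrative parameters), the sketch below estimates a small Gaussian tail probability by importance sampling with an exponentially tilted proposal:

```python
import math
import random

def rare_event_is(threshold, n, seed=0):
    """Estimate P(X > threshold) for X ~ N(0, 1) by sampling from the
    shifted proposal N(threshold, 1) and reweighting each sample by the
    likelihood ratio phi(x) / phi(x - threshold)
    = exp(-threshold * x + threshold**2 / 2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)
        if x > threshold:
            total += math.exp(-threshold * x + threshold ** 2 / 2.0)
    return total / n

est = rare_event_is(4.0, 100_000)  # true tail probability is about 3.2e-5
```

With the proposal centered on the threshold, nearly half the samples land in the region of interest, so a few thousand draws suffice where naive sampling would need tens of millions.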
Description and verification of switched control systems
Institute of Scientific and Technical Information of China (English)
贺风华; 姚郁; 赵霞; 张猛
2003-01-01
A modeling framework for switched control systems (SCS) has been constructed using the theory of hybrid control systems; it describes system behavior more effectively and makes it easier to simulate the closed-loop SCS in the MATLAB environment. In addition, a hybrid automaton model is established to analyze and verify switched control systems. The proposed method is illustrated by an example of a switched inverted-pendulum control system.
Meaningful timescales from Monte Carlo simulations of molecular systems
Costa, Liborio I
2016-01-01
A new Markov Chain Monte Carlo method for simulating the dynamics of molecular systems with atomistic detail is introduced. In contrast to traditional Kinetic Monte Carlo approaches, where the state of the system is associated with minima in the energy landscape, in the proposed method, the state of the system is associated with the set of paths traveled by the atoms and the transition probabilities for an atom to be displaced are proportional to the corresponding velocities. In this way, the number of possible state-to-state transitions is reduced to a discrete set, and a direct link between the Monte Carlo time step and true physical time is naturally established. The resulting rejection-free algorithm is validated against event-driven molecular dynamics: the equilibrium and non-equilibrium dynamics of hard disks converge to the exact results with decreasing displacement size.
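The direct link between Monte Carlo steps and physical time is the hallmark of rejection-free kinetic schemes. A generic rejection-free step in the classic Gillespie style (a sketch of the general technique, not the author's path-based algorithm; the two event rates are hypothetical) looks like this:

```python
import math
import random

def kmc_trajectory(rates, steps, seed=0):
    """Rejection-free kinetic Monte Carlo: every step fires exactly one
    event, chosen with probability proportional to its rate, and physical
    time advances by an exponential increment with mean 1/sum(rates)."""
    rng = random.Random(seed)
    total_rate = sum(rates)
    t, chosen = 0.0, []
    for _ in range(steps):
        r = rng.random() * total_rate            # pick event i with prob rates[i]/total
        acc = 0.0
        for i, k in enumerate(rates):
            acc += k
            if r < acc:
                chosen.append(i)
                break
        t += -math.log(1.0 - rng.random()) / total_rate   # physical time increment
    return t, chosen

t, events = kmc_trajectory([1.0, 3.0], steps=1000)  # event 1 fires ~3x as often
```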
Efficiency of Monte Carlo sampling in chaotic systems.
Leitão, Jorge C; Lopes, J M Viana Parente; Altmann, Eduardo G
2014-11-01
In this paper we investigate how the complexity of chaotic phase spaces affects the efficiency of importance-sampling Monte Carlo simulations. We focus on flat-histogram simulations of the distribution of finite-time Lyapunov exponents in a simple chaotic system and obtain analytically that the computational effort (i) scales polynomially with the finite time, a tremendous improvement over the exponential scaling obtained in uniform-sampling simulations, and (ii) that the polynomial scaling is suboptimal, a phenomenon known as critical slowing down. We show that critical slowing down appears because of the limited possibilities for issuing a local proposal in the Monte Carlo procedure when it is applied to chaotic systems. These results show how generic properties of chaotic systems limit the efficiency of Monte Carlo simulations.
Base isolation system and verificational experiment of base isolated building
Energy Technology Data Exchange (ETDEWEB)
Takeuchi, Mikio; Harada, Osamu; Aoyagi, Sakae; Matsuda, Taiji
1987-05-15
With the objective of rationalizing earthquake-resistant design and the economical design based thereupon, many base isolation systems have been proposed, and their research, development and application have advanced in recent years. In order to disseminate such a system, it is necessary to accumulate data from vibration tests and earthquake observations and to verify the reliability of the system. From this viewpoint, the Central Research Institute of Electric Power Industry and Okumura Corporation performed the following experiments on a base-isolated building: 1) static load application experiments, 2) shaking experiments, 3) free vibration experiments, 4) regular slight-vibration observations and 5) earthquake response observations (continuing). This article reports the outline of the base isolation system and the base-isolated building concerned, as well as the results of verification experiments 1) through 3) above. From the results of these verification experiments, the basic vibration characteristics of the base isolation system, consisting of laminated rubber and a plastic damper, were revealed and its functions were verified. In particular, during the free vibration experiments, an initial displacement of up to 10 cm was applied between the foundation and the structure; this displacement corresponds to the response amplitude in an earthquake of seismic intensity 6. It is planned to continue the verification further. (18 figs, 3 tabs, 3 photos, 6 refs)
Determining MTF of digital detector system with Monte Carlo simulation
Jeong, Eun Seon; Lee, Hyung Won; Nam, Sang Hee
2005-04-01
We have designed a detector based on a-Se (amorphous selenium) and simulated it with the Monte Carlo method. We will apply cascaded linear system theory to determine the MTF of the whole detector system. For direct comparison with experiment, we have simulated a 139 µm pixel pitch and used a simulated X-ray tube spectrum.
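In such a cascaded analysis, the presampled MTF is commonly obtained as the normalized Fourier transform of the line spread function (LSF). A hedged numpy sketch, assuming a hypothetical Gaussian LSF sampled at the 139 µm pixel pitch mentioned in the record:

```python
import numpy as np

def mtf_from_lsf(lsf, pitch_mm):
    """Presampled MTF: magnitude of the Fourier transform of the line
    spread function, normalized to unity at zero frequency."""
    spectrum = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(len(lsf), d=pitch_mm)   # cycles/mm
    return freqs, spectrum / spectrum[0]

pitch = 0.139                                 # mm, pixel pitch from the record
x = (np.arange(256) - 128) * pitch
lsf = np.exp(-0.5 * (x / 0.2) ** 2)           # hypothetical Gaussian LSF, sigma = 0.2 mm
freqs, mtf = mtf_from_lsf(lsf, pitch)
```

Because the LSF is nonnegative, the MTF peaks at 1 at zero frequency and decays monotonically for this Gaussian case.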
Multi-microcomputer system for Monte-Carlo calculations
Berg, B; Krasemann, H
1981-01-01
The authors propose a microcomputer system that allows parallel processing for Monte Carlo calculations in lattice gauge theories, simulations of high-energy physics experiments, and many other fields of current interest. The master-n-slave multiprocessor system is based on the Motorola MC 6800 microprocessor. One attraction of this processor is that it allows up to 16 MB of random access memory.
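The master-n-slave idea, in which independent slave processors each run their own Monte Carlo stream and the master combines the partial results, carries over directly to modern hardware. A small illustrative sketch (a pi estimate rather than the lattice gauge application) using Python's multiprocessing:

```python
import random
from multiprocessing import Pool

def count_hits(args):
    """Slave task: count points that fall inside the unit quarter circle."""
    n, seed = args
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 < 1.0)

def parallel_pi(n_per_worker, workers):
    """Master: farm independent streams out to worker processes and
    combine the partial hit counts into a single estimate of pi."""
    tasks = [(n_per_worker, seed) for seed in range(workers)]
    with Pool(workers) as pool:
        hits = sum(pool.map(count_hits, tasks))
    return 4.0 * hits / (n_per_worker * workers)

if __name__ == "__main__":
    print(parallel_pi(100_000, 4))
```

Giving each worker its own seed keeps the streams independent, which is exactly what makes Monte Carlo embarrassingly parallel.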
Automatic and Hierarchical Verification for Concurrent Systems
Institute of Scientific and Technical Information of China (English)
赵旭东; 冯玉琳
1990-01-01
Proving correctness of concurrent systems is quite difficult because of the high level of nondeterminism, especially in large and complex ones. AMC is a model checking system for verifying asynchronous concurrent systems using branching-time temporal logic. This paper introduces the techniques of the modelling approach, especially how to construct models of large concurrent systems using the concept of hierarchy, which has proved effective and practical in verifying large systems without a large growth in cost.
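Branching-time model checking of the kind AMC performs reduces core operators to fixpoint computations over the transition relation. A minimal sketch of checking the CTL property EF(goal) on an explicit-state system (the states and transitions are illustrative, and this is not AMC's actual algorithm):

```python
def ef(transitions, goal):
    """CTL model checking of EF(goal): the set of states from which some
    path eventually reaches a goal state, computed as a backward fixpoint."""
    reach = set(goal)
    changed = True
    while changed:
        changed = False
        for src, dst in transitions:
            if dst in reach and src not in reach:
                reach.add(src)      # src has a successor satisfying EF(goal)
                changed = True
    return reach

# hypothetical 4-state system: state 3 is a sink that never reaches the goal
transitions = {(0, 1), (1, 2), (2, 2), (3, 3)}
holds_in = ef(transitions, {2})     # {0, 1, 2}
```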
Simulation of Cone Beam CT System Based on Monte Carlo Method
Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing
2014-01-01
Adaptive Radiation Therapy (ART) was developed based on Image-Guided Radiation Therapy (IGRT) and is the trend in photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established based on a Monte Carlo program and validated against measurement. The BEAMnrc program was adopted to model the kV X-ray tube. Both ISOURCE-13 and ISOURCE-24 were chosen to simulate the path of the beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles under 1 cm of water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed within 1% down to a depth of 10 cm. More than 85% of the points in the calculated lateral dose profiles were within 2%. The validated CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.
Applicability of Quasi-Monte Carlo for lattice systems
Ammon, Andreas; Jansen, Karl; Leovey, Hernan; Griewank, Andreas; Müller-Preussker, Michael
2013-01-01
This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like $N^{-1/2}$, where $N$ is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to $N^{-1}$, or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.
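The $N^{-1/2}$ versus $N^{-1}$ scaling can be seen on a toy integral: at the same sample count, a low-discrepancy (Halton) point set typically lands much closer to the exact value than pseudo-random points. A self-contained sketch using a smooth 2D integrand (not the lattice observables of the paper):

```python
import random

def halton(i, base):
    """i-th element of the van der Corput sequence in the given base;
    pairing bases 2 and 3 gives a 2D Halton low-discrepancy point set."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def integrate(points):
    """Sample average of f(x, y) = x * y; the exact integral over the
    unit square is 1/4."""
    return sum(x * y for x, y in points) / len(points)

n = 4096
rng = random.Random(0)
plain = integrate([(rng.random(), rng.random()) for _ in range(n)])
quasi = integrate([(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)])
```

For smooth integrands like this one, the Halton estimate converges roughly like $N^{-1}$ (up to log factors), while the pseudo-random estimate converges like $N^{-1/2}$.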
Specification and Verification of Secure Concurrent and Distributed Software Systems
1992-02-01
Directory of Open Access Journals (Sweden)
Luis Vazquez Quino
2015-09-01
Full Text Available Purpose: With intensity modulated radiation therapy (IMRT), the physician can prescribe, design and deliver optimized treatment plans that target the tumor and spare adjacent critical structures. The increased conformity of such plans often comes at the expense of adding significant complexity to the delivery of the treatment. With volumetrically modulated arc therapy (VMAT), in addition to the modulation of the intensity of the radiation beam, other mechanical parameters such as gantry speed and dose rate are varied during treatment delivery. It is therefore imperative that we develop comprehensive and accurate methods to validate such complex delivery techniques prior to the commencement of the patient’s treatment. Methods: In this study, a Monte Carlo simulation was performed for the high definition multileaf collimator (HD-MLC) of a Varian Novalis TX linac. Our simulation is based on the MCSIM code and provides a comprehensive model of the linac head. After validating the model in reference geometries, treatment plans for different anatomical sites were simulated and compared against the treatment planning system (TPS) dose calculations. All simulations were performed in a cylindrical water phantom, as opposed to the patient anatomy, to remove any complexities associated with density effects. Finally, a comparison through gamma analysis of dose planes between the simulation, the TPS and the measurements from the Matrixx array (IBA) was conducted to verify the accuracy of our model against both the measurements and the TPS. Results: Gamma analysis of ten IMRT and ten VMAT cases for different anatomical sites was performed, using a 3%/3 mm passing criterion. The average passing rates were 97.5% and 94.3% for the IMRT and the VMAT plans respectively when comparing the MCSIM and TPS dose calculations. Conclusion: In the present work a Monte Carlo model of a Novalis TX linac which has been tested and benchmarked to produce phase-space files for the
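The 3%/3 mm gamma analysis used in records like this one combines a dose-difference criterion with a distance-to-agreement search. A simplified 1D version can be sketched as follows (global normalization, brute-force search; clinical tools operate on 2D/3D dose grids with interpolation, and the two profiles here are hypothetical):

```python
import math

def gamma_index_1d(ref, evl, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified global gamma analysis in 1D (3%/3 mm by default): for
    each reference point, search every evaluated point for the minimum
    combined dose-difference / distance-to-agreement metric."""
    max_dose = max(ref)
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_evl in enumerate(evl):
            dist = abs(i - j) * spacing_mm
            ddose = (d_evl - d_ref) / (dose_tol * max_dose)
            best = min(best, math.sqrt((dist / dist_tol_mm) ** 2 + ddose ** 2))
        gammas.append(best)
    return gammas

ref = [0.2, 0.5, 1.0, 0.5, 0.2]          # hypothetical reference profile
evl = [0.21, 0.5, 0.99, 0.52, 0.2]       # hypothetical measured profile
gammas = gamma_index_1d(ref, evl, spacing_mm=1.0)
passing = sum(g <= 1.0 for g in gammas) / len(gammas)   # fraction with gamma <= 1
```

A point passes when its gamma value is at most 1, i.e. some evaluated point agrees within the combined dose and distance tolerance.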
Implementation of Monte Carlo Simulations for the Gamma Knife System
Energy Technology Data Exchange (ETDEWEB)
Xiong, W [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Huang, D [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Lee, L [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Feng, J [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Morris, K [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Calugaru, E [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Burman, C [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Li, J [Fox Chase Cancer Center, 333 Cottman Ave., Philadelphia, PA 17111 (United States); Ma, C-M [Fox Chase Cancer Center, 333 Cottman Ave., Philadelphia, PA 17111 (United States)
2007-06-15
Currently the Gamma Knife system is accompanied by a treatment planning system, Leksell GammaPlan (LGP), which is a standard, computer-based treatment planning system for Gamma Knife radiosurgery. In LGP, the dose calculation algorithm does not consider the scatter dose contributions or the inhomogeneity effect due to the skull and air cavities. To improve the dose calculation accuracy, Monte Carlo simulations have been implemented for the Gamma Knife planning system. In this work, the 201 Cobalt-60 sources in the Gamma Knife unit are considered to have the same activity. Each Cobalt-60 source is contained in a cylindrical stainless steel capsule. The particle phase-space information is stored in four beam data files, which are collected on the inner sides of the 4 treatment helmets, after the Cobalt beam passes through the stationary and helmet collimators. Patient geometries are rebuilt from patient CT data. Twenty-two patients are included in the Monte Carlo simulation for this study. The dose is calculated using Monte Carlo in both homogeneous and inhomogeneous geometries with identical beam parameters. To investigate the attenuation effect of the skull bone, the dose in a 16 cm diameter spherical QA phantom is measured with and without a 1.5 mm lead covering and also simulated using Monte Carlo. The dose ratios with and without the 1.5 mm lead covering are 89.8% based on measurements and 89.2% according to Monte Carlo for an 18 mm collimator helmet. For patient geometries, the Monte Carlo results show that although the relative isodose lines remain almost the same with and without inhomogeneity corrections, the difference in the absolute dose is clinically significant. The average inhomogeneity correction is (3.9 ± 0.9)% for the 22 patients investigated. These results suggest that the inhomogeneity effect should be considered in the dose calculation for Gamma Knife treatment planning.
Functional verification of dynamically reconfigurable FPGA-based systems
Gong, Lingkan
2015-01-01
This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...
Systems analysis-independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Verification test report on a solar heating and hot water system
1978-01-01
Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiency and the various methods used, such as similarity, analysis, inspection and test, that are applicable to satisfying the verification requirements.
Formal Verification of Quasi-Synchronous Systems
2015-07-01
was set. For this reason, guards in a hierarchical state machine should not depend on actions performed by their parent states during the same... Satisfiability Modulo Theories (SMT) based model checkers and model checkers for timed automata to provide system engineers with immediate feedback on the... Tool Environment (OSATE). System properties can then be verified using either the Assume Guarantee Reasoning Environment (AGREE) with the Kind model
Airworthiness Compliance Verification Method Based on Simulation of Complex System
Institute of Scientific and Technical Information of China (English)
XU Haojun; LIU Dongliang; XUE Yuan; ZHOU Li; MIN Guilong
2012-01-01
A study is conducted on a new airworthiness compliance verification method based on simulation of the pilot-aircraft-environment complex system. Verification scenarios are established by the “block diagram” method based on airworthiness criteria. A pilot-aircraft-environment complex model is set up, and a virtual flight testing method based on connecting MATLAB/Simulink with FlightGear is proposed. Special research is conducted on the modeling of pilot manipulation stochastic parameters and of manipulation in critical situations. Unfavorable flight factors of a certain scenario are analyzed, and reliability modeling of important systems is researched. A distribution function of small-probability events and the theory of risk probability measurement are studied. A nonlinear function is used to depict the relationship between the cumulative probability and the extremum of the critical parameter. A synthetic evaluation model is set up, a modified genetic algorithm (MGA) is applied to ascertain the distribution parameter in the model, and a more reasonable result is obtained. A clause about vehicle control functions (VCFs) verification in MIL-HDBK-516B is selected as an example to validate the practicability of the method.
Systems analysis - independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)
1996-10-01
The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
Effective quantum Monte Carlo algorithm for modeling strongly correlated systems
Kashurnikov, V. A.; Krasavin, A. V.
2007-01-01
A new effective Monte Carlo algorithm based on principles of continuous time is presented. It allows calculating, in an arbitrary discrete basis, thermodynamic quantities and the linear response of mixed boson-fermion, spin-boson, and other strongly correlated systems that admit no analytic description.
Verification of Mixed-Signal Systems with Affine Arithmetic Assertions
Directory of Open Access Journals (Sweden)
Carna Radojicic
2013-01-01
Full Text Available Embedded systems include an increasing share of analog/mixed-signal components that are tightly interwoven with the functionality of digital HW/SW systems. A challenge for verification is that even small deviations in analog components can lead to significant changes in system properties. In this paper we propose the combination of range-based, semisymbolic simulation with assertion checking. We show that this approach combines the advantages, but also some of the limitations, of multirun simulations and formal techniques. The efficiency of the proposed method is demonstrated by several examples.
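Affine arithmetic, a common basis for range-based semisymbolic simulation, tracks correlations between ranges through shared noise symbols, so that for example x - x collapses to zero instead of widening as in plain interval arithmetic. A minimal sketch restricted to linear operations (the class and method names are illustrative, not taken from the paper):

```python
class Affine:
    """Minimal affine form: center + sum(coeff_i * e_i) with e_i in [-1, 1].
    Shared noise symbols keep correlated quantities correlated."""
    def __init__(self, center, terms=None):
        self.center = center
        self.terms = dict(terms or {})       # noise symbol -> coefficient

    def __add__(self, other):
        terms = dict(self.terms)
        for sym, coeff in other.terms.items():
            terms[sym] = terms.get(sym, 0.0) + coeff
        return Affine(self.center + other.center, terms)

    def scale(self, c):
        return Affine(c * self.center, {s: c * v for s, v in self.terms.items()})

    def bounds(self):
        radius = sum(abs(v) for v in self.terms.values())
        return self.center - radius, self.center + radius

x = Affine(1.0, {"e1": 0.1})       # x in [0.9, 1.1]
y = x.scale(-1.0)                  # y = -x, sharing noise symbol "e1"
lo, hi = (x + y).bounds()          # cancels exactly to [0, 0]
```

Nonlinear operations require introducing fresh noise symbols for the approximation error, which is where full affine-arithmetic libraries diverge from this sketch.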
On the Symbolic Verification of Timed Systems
DEFF Research Database (Denmark)
Moeller, Jesper; Lichtenberg, Jacob; Andersen, Henrik Reif
1999-01-01
This paper describes how to analyze a timed system symbolically. That is, given a symbolic representation of a set of (timed) states (as an expression), we describe how to determine an expression that represents the set of states that can be reached either by firing a discrete transition or by advancing time. These operations are used to determine the set of reachable states symbolically. We also show how to symbolically determine the set of states that can reach a given set of states (i.e., a backwards step), thus making it possible to verify TCTL-formulae symbolically. The analysis is fully symbolic in the sense that both the discrete and the continuous part of the state space are represented symbolically. Furthermore, both the synchronous and asynchronous concurrent composition of timed systems can be performed symbolically. The symbolic representations are given as formulae expressed
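The forward step described in this record (successors obtained either by firing a discrete transition or by advancing time) can be illustrated on an explicit-state toy model with a bounded integer clock. Real tools represent these state sets symbolically, e.g. as expressions or difference-bound matrices, rather than enumerating them; the system below is hypothetical:

```python
def reachable(init, discrete, clock_max):
    """Forward reachability for a toy timed system. States are
    (location, clock) pairs; each step either advances time by one unit
    (up to clock_max) or fires an enabled discrete transition, which
    resets the clock."""
    seen = set(init)
    frontier = set(init)
    while frontier:
        nxt = set()
        for loc, clk in frontier:
            if clk < clock_max:
                nxt.add((loc, clk + 1))            # time step
            for src, guard, dst in discrete:
                if src == loc and clk >= guard:
                    nxt.add((dst, 0))              # discrete step, clock reset
        frontier = nxt - seen
        seen |= frontier
    return seen

# hypothetical model: "idle" may switch to "busy" only after 2 time units
trans = [("idle", 2, "busy")]
states = reachable({("idle", 0)}, trans, clock_max=3)
```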
Internet-based dimensional verification system for reverse engineering processes
Energy Technology Data Exchange (ETDEWEB)
Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)
2008-07-15
This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies
An Integrated Design and Verification Methodology for Reconfigurable Multimedia Systems
Borgatti, M; Rossi, U; Lambert, J -L; Moussa, I; Fummi, F; Pravadelli, G
2011-01-01
Recently, many multimedia applications have been emerging on portable appliances. They require both the flexibility of upgradeable devices (traditionally software based) and a powerful computing engine (typically hardware). In this context, programmable HW and dynamic reconfiguration allow novel approaches to the migration of algorithms from SW to HW. Thus, within the framework of the Symbad project, we propose an industrial design flow for reconfigurable SoCs. The goal of Symbad is to develop a system-level design platform for hardware and software SoC systems including formal and semi-formal verification techniques.
Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release
DEFF Research Database (Denmark)
Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan
2014-01-01
In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....
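Bounded model checking, as used in this record, searches for a property violation within a fixed number of steps. Industrial BMC encodes the unrolled transition relation into SAT/SMT formulas; the explicit-state sketch below conveys the idea on a hypothetical two-counter system:

```python
def bmc(init, step, bad, k):
    """Bounded model checking by explicit unrolling: explore every
    execution of length <= k and return a counterexample trace violating
    the safety property 'never bad', or None if none exists within k."""
    frontier = [(s, [s]) for s in init]
    for _ in range(k):
        nxt = []
        for state, trace in frontier:
            for succ in step(state):
                if bad(succ):
                    return trace + [succ]          # counterexample found
                nxt.append((succ, trace + [succ]))
        frontier = nxt
    return None

# hypothetical system: each step increments one of two counters
def step(state):
    x, y = state
    return [(x + 1, y), (x, y + 1)]

cex = bmc([(0, 0)], step, lambda s: s == (2, 1), k=3)
```

Explicit unrolling blows up exponentially in k, which is precisely why real BMC tools delegate the search to a SAT or SMT solver.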
Formal Modeling and Verification of Interlocking Systems Featuring Sequential Release
DEFF Research Database (Denmark)
Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan
2015-01-01
In this paper, we present a method and an associated tool suite for formal verification of the new ETCS level 2 based Danish railway interlocking systems. We have made a generic and reconfigurable model of the system behavior and generic high-level safety properties. This model accommodates seque...... SMT based bounded model checking (BMC) and inductive reasoning, we are able to verify the properties for model instances corresponding to railway networks of industrial size. Experiments also show that BMC is efficient for finding bugs in the railway interlocking designs....
LHC Beam Loss Monitoring System Verification Applications
Dehning, B; Zamantzas, C; Jackson, S
2011-01-01
The LHC Beam Loss Monitoring (BLM) system is one of the most complex instrumentation systems deployed in the LHC. In addition to protecting the collider, the system also needs to provide a means of diagnosing machine faults and deliver a feedback of losses to the control room as well as to several systems for their setup and analysis. It has to transmit and process signals from almost 4’000 monitors, and has nearly 3 million configurable parameters. The system was designed with reliability and availability in mind. The specified operation and the fail-safety standards must be guaranteed for the system to perform its function in preventing superconductive magnet destruction caused by particle flux. Maintaining the expected reliability requires extensive testing and verification. In this paper we report our most recent addit...
Rule Systems for Runtime Verification: A Short Tutorial
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
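The flavor of such rule-based trace analysis can be conveyed by a tiny monitor in which each rule arms, upon a triggering event, a forbidden event together with a releasing event. This is only a sketch of the general idea, not RuleR's or LogScope's actual rule language:

```python
def monitor(trace, rules):
    """Tiny rule-based trace monitor: each rule maps a triggering event to
    a (forbidden, releasing) pair; once triggered, the forbidden event must
    not occur before the releasing event. Returns the violations found as
    (event, armed_at, violated_at) tuples."""
    violations = []
    armed = []   # (forbidden, releasing, position where the rule was armed)
    for pos, event in enumerate(trace):
        for forb, rel, where in list(armed):
            if event == forb:
                violations.append((forb, where, pos))
            if event == rel:
                armed.remove((forb, rel, where))
        for trigger, (forbidden, release) in rules.items():
            if event == trigger:
                armed.append((forbidden, release, pos))
    return violations

# hypothetical property: after "open", no second "open" until "close"
rules = {"open": ("open", "close")}
v = monitor(["open", "write", "open", "close"], rules)   # one violation
```

Full systems parameterize rules with data and allow rules to spawn other rules, which this sketch deliberately omits.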
Source Code Verification for Embedded Systems using Prolog
Directory of Open Access Journals (Sweden)
Frank Flederer
2017-01-01
System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures; on the other hand, Prolog's non-determinism and backtracking ease testing different variations of the program flow without much effort. A rule-based approach with Prolog allows the verification goals to be characterized in a concise and declarative way. In this paper, we describe our approach to verifying the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence tree, which are then further tested against verification goals. The different program-flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study in which we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain-specific language (DSL) in Prolog to express the verification goals.
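The call-graph derivation step has a compact analogue in any language with an AST library. A minimal sketch in Python (using Python's own `ast` module on a toy source string, purely for illustration; the paper itself works on C++ ASTs exported to Prolog facts):

```python
import ast

def call_graph(source):
    """Map each top-level function to the set of function names it calls,
    by walking the abstract syntax tree."""
    graph = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            graph[node.name] = {
                sub.func.id
                for sub in ast.walk(node)
                if isinstance(sub, ast.Call) and isinstance(sub.func, ast.Name)
            }
    return graph

# Toy source text with hypothetical function names:
src = """
def init():
    acquire()
    work()
    release()

def work():
    compute()
"""
print(sorted(call_graph(src)["init"]))   # ['acquire', 'release', 'work']
```

A rule engine (Prolog in the paper's case) can then test the resulting graph against goals such as "every `acquire` is followed by a `release`".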
Energy Technology Data Exchange (ETDEWEB)
Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
1998-12-31
In this paper a prototype Requirement Tracking and Verification System (RTVS) for a distributed control system (DCS) was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking, and verification of the software requirements listed in the documentation of the DCS. The DCS software design procedures and document interfaces were analyzed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)
Image-based fingerprint verification system using LabVIEW
Directory of Open Access Journals (Sweden)
Sunil K. Singla
2008-09-01
Biometric-based identification/verification systems provide a solution to the security concerns of the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image-based). Minutiae-based algorithms depend upon the local discontinuities in the ridge flow pattern and are used when template size is important, while image-based matching algorithms use both the micro and macro features of a fingerprint and are used if fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox, version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database: datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being a perfect match).
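The pseudo-random sub-sampling idea can be sketched independently of LabVIEW. A minimal Python stand-in (toy images as flat lists; the pixel count, tolerance, and threshold below are illustrative assumptions, not the paper's parameters):

```python
import random

def subsample_match(img_a, img_b, n_pixels=64, threshold=0.9, seed=7):
    """Cheap pre-filter in the spirit of the learning phase: compare two
    equal-length grayscale images (flat lists of 0-255 ints) only at a
    pseudo-random subset of pixel positions.  The fixed seed makes the
    sub-sampling repeatable, so enrolment and verification inspect the
    same positions."""
    rng = random.Random(seed)
    positions = rng.sample(range(len(img_a)), n_pixels)
    close = sum(abs(img_a[p] - img_b[p]) <= 8 for p in positions)
    return close / n_pixels >= threshold

rng = random.Random(0)
img = [rng.randrange(256) for _ in range(100 * 100)]
print(subsample_match(img, list(img)))                      # identical: True
print(subsample_match(img, [(v + 64) % 256 for v in img]))  # shifted: False
```

Only candidates that pass this cheap test would proceed to a full image comparison, which is how sub-sampling reduces the number of comparisons in the matching stage.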
Developing a Verification and Training Phantom for Gynecological Brachytherapy System
Directory of Open Access Journals (Sweden)
Mahbobeh Nazarnejad
2012-03-01
Introduction: Dosimetric accuracy is a major issue in the quality assurance (QA) program for treatment planning systems (TPS). An important contribution to this process is a proper dosimetry method to guarantee the accuracy of the dose delivered to the tumor. In brachytherapy (BT) of gynecological (Gyn) cancer it is usual to insert a combination of tandem and ovoid applicators with a complicated geometry, which makes their dosimetric verification difficult and important. Therefore, evaluation and verification of the dose distribution is necessary for accurate dose delivery to the patients. Materials and Methods: A solid phantom was made from Perspex slabs as a tool for intracavitary brachytherapy dosimetric QA. Film dosimetry (EDR2) was performed for a combination of ovoid and tandem applicators introduced by the Flexitron brachytherapy system. Treatment planning was also done with the Flexiplan 3D-TPS to irradiate films sandwiched between the phantom slabs. Isodose curves obtained from the treatment planning system and from the films were compared with each other in 2D and 3D. Results: The brachytherapy solid phantom was constructed from slabs, into which tandems and ovoids loaded with a radioactive Ir-192 source could subsequently be inserted. The relative error was 3-8.6% and the average relative error was 5.08% in the comparison of the film and TPS isodose curves. Conclusion: Our results showed that the difference between TPS and the measurements is well within the acceptable boundaries and below the action level according to AAPM TG-45. Our findings showed that this phantom, after minor corrections, can be used as a method of choice for inter-comparison analysis of TPS and to fill the existing gap in accurate QA programs for intracavitary brachytherapy. The constructed phantom also proved to be a valuable tool for verification of accurate dose delivery to patients, as well as for training brachytherapy residents and physics students.
Design for Verification: Using Design Patterns to Build Reliable Systems
Mehlitz, Peter C.; Penix, John; Koga, Dennis (Technical Monitor)
2003-01-01
Components so far have been mainly used in commercial software development to reduce time to market. While some effort has been spent on formal aspects of components, most of this was done in the context of programming language or operating system framework integration. As a consequence, increased reliability of composed systems is mainly regarded as a side effect of a more rigid testing of pre-fabricated components. In contrast to this, Design for Verification (D4V) puts the focus on component specific property guarantees, which are used to design systems with high reliability requirements. D4V components are domain specific design pattern instances with well-defined property guarantees and usage rules, which are suitable for automatic verification. The guaranteed properties are explicitly used to select components according to key system requirements. The D4V hypothesis is that the same general architecture and design principles leading to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the limitations of conventional reliability assurance measures, such as too large a state space or too many execution paths.
Verification and Validation of the Coastal Modeling System. Report 3: CMS-Flow: Hydrodynamics
2011-12-01
ERDC/CHL TR-11-10, December 2011. Verification and Validation of the Coastal Modeling System. Report 3, CMS-Flow: Hydrodynamics. Alejandro Sánchez, Weiming Wu... ...of four reports toward the Verification and Validation (V&V) of the Coastal Modeling System (CMS). The details of the V&V study specific to the...
Exact Verification of Hybrid Systems Based on Bilinear SOS Representation
Yang, Zhengfeng; Lin, Wang
2012-01-01
In this paper, we address the problem of safety verification of nonlinear hybrid systems and stability analysis of nonlinear autonomous systems. A hybrid symbolic-numeric method is presented to efficiently compute exact inequality invariants of hybrid systems and exact estimates of regions of attraction of autonomous systems. Numerical invariants of a hybrid system, or an estimate of a region of attraction, can be obtained by solving a bilinear SOS program via the PENBMI solver or an iterative method; modified Newton refinement and rational vector recovery techniques are then applied to obtain exact polynomial invariants and estimates of regions of attraction with rational coefficients. Experiments on some benchmarks are given to illustrate the efficiency of our algorithm.
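The rational vector recovery step has a compact illustration with Python's standard `fractions` module (a simplified stand-in for the paper's technique; the denominator bound below is an illustrative assumption):

```python
from fractions import Fraction

def rationalize(vec, max_den=100):
    """Rational vector recovery sketch: replace each floating-point
    coefficient (as produced by a numerical SOS solve) with the closest
    rational whose denominator is at most `max_den`."""
    return [Fraction(x).limit_denominator(max_den) for x in vec]

print(rationalize([0.33333334, 0.49999999, -1.25]))
# [Fraction(1, 3), Fraction(1, 2), Fraction(-5, 4)]
```

The recovered rational coefficients can then be substituted back into the candidate invariant and checked exactly, which is what turns a numerical certificate into an exact one.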
Verification of Opacity and Diagnosability for Pushdown Systems
Directory of Open Access Journals (Sweden)
Koichi Kobayashi
2013-01-01
In control theory of discrete event systems (DESs), one of the challenging topics is the extension of the theory of finite-state DESs to that of infinite-state DESs. In this paper, we discuss verification of opacity and diagnosability for infinite-state DESs modeled by pushdown automata (called here pushdown systems). First, we discuss opacity of pushdown systems and prove that opacity of pushdown systems is in general undecidable; in addition, a decidable class is clarified. Next, for diagnosability, we prove that under a certain assumption, which differs from the assumption in the existing result, diagnosability of pushdown systems is decidable. Furthermore, a necessary condition and a sufficient condition using finite-state approximations are derived. Finally, as one of the applications, we consider data integration using XML (Extensible Markup Language). The obtained result is useful for developing control theory of infinite-state DESs.
Verification of Interdomain Routing System Based on Formal Methods
Institute of Scientific and Technical Information of China (English)
ZANG Zhiyuan; LUO Guiming; YIN Chongyuan
2009-01-01
In networks, the stable path problem (SPP) usually results in oscillations in interdomain systems and may cause systems to become unstable. With the rapid development of internet technology, the occurrence of SPPs in interdomain systems has quite recently become a significant focus of research. A framework for checking SPPs is presented in this paper, with verification of an interdomain routing system using formal methods and the NuSMV software. Sufficient conditions and necessary conditions for determining SPP occurrence are presented, with proof of the method's effectiveness. Linear temporal logic was used to model an interdomain routing system and its properties were analyzed. An example is included to demonstrate the method's reliability.
Formal Development and Verification of Railway Control Systems
DEFF Research Database (Denmark)
Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan
This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems. 1 Project background and state of the art: Over the next 10 years all Danish railway signalling systems are going to be completely replaced with modern, computer-based railway control systems based on the European standard ERTMS/ETCS [3, 4] by the Danish Signalling Programme [1]. The purpose of these systems is to control the railway traffic such that unsafe situations, like train collisions, are avoided. Central parts of these new systems consist of safety-critical software whose functional correctness is one of the key requisites for reliable operation of the traffic and in particular for the safety of passengers. Until now the development of railway control software has typically been...
Verification of Duration Systems Using an Approximation Approach
Institute of Scientific and Technical Information of China (English)
Riadh Robbana
2003-01-01
We consider the verification problem of invariance properties for timed systems modeled by (extended) timed graphs with duration variables. This problem is in the general case undecidable. Nevertheless, we give in this paper a technique extending a given system into another one containing the initial computations as well as additional ones. We then define a digitization technique allowing translation from the continuous case to the discrete one. Using this digitization, we show that to each real computation in the initial system corresponds a discrete computation in the extended system. Then, we show that the extended system corresponds to a very close approximation of the initial one, consequently allowing a good analysis of invariance properties of the initial system.
Fixed-Node Diffusion Monte Carlo of Lithium Systems
Rasch, Kevin
2015-01-01
We study lithium systems over a range of numbers of atoms, e.g., the atomic anion, the dimer, a metallic cluster, and the body-centered cubic crystal, by the diffusion Monte Carlo method. The calculations include both core and valence electrons in order to avoid any possible impact of pseudopotentials. The focus of the study is the fixed-node errors, and for that purpose we test several orbital sets in order to provide the most accurate nodal hypersurfaces. We compare our results to other high-accuracy calculations wherever available and to experimental results so as to quantify the fixed-node errors. The results for these Li systems show that fixed-node quantum Monte Carlo achieves remarkably accurate total energies and recovers 97-99% of the correlation energy.
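The 97-99% figure uses the standard definition of recovered correlation energy, the fraction of E_exact − E_HF captured by the fixed-node energy. A quick numerical check (the energy values below are hypothetical placeholders, not results from the paper):

```python
def correlation_energy_recovered(e_hf, e_fn_dmc, e_exact):
    """Fraction of the correlation energy (E_exact - E_HF) that a
    fixed-node DMC total energy E_FN recovers; energies in hartree."""
    return (e_fn_dmc - e_hf) / (e_exact - e_hf)

# Hypothetical illustrative values for a small Li system:
frac = correlation_energy_recovered(e_hf=-14.5724, e_fn_dmc=-14.6665, e_exact=-14.6674)
print(f"{100 * frac:.1f}% of the correlation energy recovered")
```

The residual 1-3% is precisely the fixed-node error the study sets out to quantify.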
Design and performance verification of a passive propellant management system
Hess, D. A.; Regnier, W. W.
1978-01-01
This paper describes the design and verification testing of a reusable passive propellant management system. The system was designed to acquire propellant in low- or zero-g environments and also retain this propellant under high axially directed accelerations that may be experienced during launch and orbit-to-orbit transfer. The system design requirements were established to satisfy generally the requirements for a large number of potential NASA and military applications, such as orbit-to-orbit shuttles and satellite vehicles. The resulting concept was a multicompartmented tank with independent surface tension acquisition channels in each compartment. The tank was designed to provide a minimum expulsion efficiency of 98 percent when subjected to the simultaneous conditions of acceleration, vibration, and outflow. The system design has the unique capability to demonstrate low-g performance in a 1-g test environment, and the test program summarized was structured around this capability.
Measurability and Safety Verification for Stochastic Hybrid Systems
DEFF Research Database (Denmark)
Fränzle, Martin; Hahn, Ernst Moritz; Hermanns, Holger;
2011-01-01
Dealing with the interplay of randomness and continuous time is important for the formal verification of many real systems. Considering both facets is especially important for wireless sensor networks, distributed control applications, and many other systems of growing importance. An important ... -time behaviour is given by differential equations, as for usual hybrid systems, but the targets of discrete jumps are chosen by probability distributions. These distributions may be general measures on state sets. Also non-determinism is supported, and the latter is exploited in an abstraction and evaluation method that establishes safe upper bounds on reachability probabilities. To arrive there requires us to solve semantic intricacies as well as practical problems. In particular, we show that measurability of a complete system follows from the measurability of its constituent parts. On the practical side...
Crew Exploration Vehicle (CEV) Potable Water System Verification Description
Peterson, Laurie; DeVera, Jean; Vega, Leticia; Adam, Nik; Steele, John; Gazda, Daniel; Roberts, Michael
2009-01-01
The Crew Exploration Vehicle (CEV), also known as Orion, will ferry a crew of up to six astronauts to the International Space Station (ISS), or a crew of up to four astronauts to the moon. The first launch of CEV is scheduled for approximately 2014. A stored water system on the CEV will supply the crew with potable water for various purposes: drinking and food rehydration, hygiene, medical needs, sublimation, and various contingency situations. The current baseline biocide for the stored water system is ionic silver, similar in composition to the biocide used to maintain quality of the water transferred from the Orbiter to the ISS and stored in Contingency Water Containers (CWCs). In the CEV water system, the ionic silver biocide is expected to be depleted from solution due to ionic silver plating onto the surfaces of the materials within the CEV water system, thus negating its effectiveness as a biocide. Since the biocide depletion is expected to occur within a short amount of time after loading the water into the CEV water tanks at the Kennedy Space Center (KSC), an additional microbial control is a 0.1-micron point-of-use filter that will be used at the outlet of the Potable Water Dispenser (PWD). Because this may be the first time NASA is considering a stored water system for long-term missions that does not maintain a residual biocide, a team of experts in materials compatibility, biofilms and point-of-use filters, surface treatment and coatings, and biocides has been created to pinpoint concerns and perform testing to help alleviate those concerns related to the CEV water system. Results from the test plans laid out in the paper presented to SAE last year (Crew Exploration Vehicle (CEV) Potable Water System Verification Coordination, 2008012083) will be detailed in this paper. Additionally, recommendations for the CEV verification will be described for risk mitigation in meeting the physicochemical and microbiological requirements on the CEV PWS.
Monte Carlo Simulation for the MAGIC-II System
Carmona, E; Moralejo, A; Vitale, V; Sobczynska, D; Haffke, M; Bigongiari, C; Otte, N; Cabras, G; De Maria, M; De Sabata, F
2007-01-01
Within the year 2007, MAGIC will be upgraded to a two telescope system at La Palma. Its main goal is to improve the sensitivity in the stereoscopic/coincident operational mode. At the same time it will lower the analysis threshold of the currently running single MAGIC telescope. Results from the Monte Carlo simulations of this system will be discussed. A comparison of the two telescope system with the performance of one single telescope will be shown in terms of sensitivity, angular resolution and energy resolution.
Subtle Monte Carlo Updates in Dense Molecular Systems
DEFF Research Database (Denmark)
Bottaro, Sandro; Boomsma, Wouter; Johansson, Kristoffer E.;
2012-01-01
Although Markov chain Monte Carlo (MC) simulation is a potentially powerful approach for exploring conformational space, it has been unable to compete with molecular dynamics (MD) in the analysis of high-density structural states, such as the native state of globular proteins. Here, we introduce a kinetic algorithm, CRISP, that greatly enhances the sampling efficiency in all-atom MC simulations of dense systems. The algorithm is based on an exact analytical solution to the classic chain-closure problem, making it possible to express the interdependencies among degrees of freedom in the molecule as correlations in a multivariate Gaussian distribution. We demonstrate that our method reproduces structural variation in proteins with greater efficiency than current state-of-the-art Monte Carlo methods and has real-time simulation performance on par with molecular dynamics simulations. The presented results...
On Sensor Data Verification for Participatory Sensing Systems
Directory of Open Access Journals (Sweden)
Diego Mendez
2013-03-01
In this paper we study the problem of sensor data verification in Participatory Sensing (PS) systems, using an air quality/pollution monitoring application as a validation example. Data verification, in the context of PS, consists of the process of detecting and removing spatial outliers to properly reconstruct the variables of interest. We propose, implement, and test a hybrid neighborhood-aware algorithm for outlier detection that considers the uneven spatial density of the users, the number of malicious users, the level of conspiracy, and the lack of accuracy and malfunctioning sensors. The algorithm utilizes the Delaunay triangulation and Gaussian Mixture Models to build neighborhoods based on the spatial and non-spatial attributes of each location. This neighborhood definition allows us to demonstrate that it is not necessary to apply accurate but computationally expensive estimators to the entire dataset to obtain good results: equally accurate but computationally cheaper methods can be applied to part of the data and obtain good results as well. Our experimental results show that our hybrid algorithm performs as well as the best estimator while reducing the execution time considerably.
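The neighborhood-based outlier test can be sketched with a much cruder construction than the paper's Delaunay/GMM model: a k-nearest-neighbour median filter. All names and thresholds below are illustrative assumptions:

```python
import math
from statistics import median

def spatial_outliers(readings, k=4, tol=3.0):
    """readings: list of (x, y, value) sensor reports.  A reading is
    flagged when its value differs from the median of its k nearest
    neighbours by more than `tol` (absolute units).  This is a crude
    k-NN stand-in for the paper's Delaunay/GMM neighbourhood model."""
    flagged = []
    for i, (x, y, v) in enumerate(readings):
        others = [(math.hypot(x - x2, y - y2), v2)
                  for j, (x2, y2, v2) in enumerate(readings) if j != i]
        others.sort(key=lambda t: t[0])
        neighbour_median = median(v2 for _, v2 in others[:k])
        if abs(v - neighbour_median) > tol:
            flagged.append(i)
    return flagged

# Eight consistent sensors reporting near 50, one wildly different report:
data = [(x, y, 50.0 + 0.1 * x) for x in range(3) for y in range(3)]
data[4] = (1, 1, 90.0)   # malicious or faulty sensor in the middle
print(spatial_outliers(data))  # [4]
```

Flagged readings would then be removed before reconstructing the pollution field, which is the "detect and remove spatial outliers" step described above.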
Energy Technology Data Exchange (ETDEWEB)
Ramirez Ros, J. C.; Jerez Sainz, M. I.; Lobato Munoz, M.; Jodar Lopez, C. A.; Ruiz Lopez, M. A.; Carrasco Rodriguez, J. L.; Pamos Urena, M.
2013-07-01
We evaluated the Monaco Monte Carlo treatment planning system v2.0.3 for dose calculation in non-homogeneous, low-density media (lung-equivalent), as a complement to the verification of its modeling in homogeneous media and prior to the introduction of the SBRT technique. We performed the same tests on Pinnacle v8.0m for the same purpose, and we compare the results obtained with the Monte Carlo algorithm of Monaco and the Collapsed Cone algorithm of Pinnacle. (Author)
Chen, Xiangxian; Wang, Dong; Huang, Hai; Wang, Zheng
2014-07-01
Verification and validation of a railway signalling system is a crucial part of the workflow in railway signalling enterprises. Typically, the verification and validation of this type of safety-critical system is performed by means of on-site tests, which leads to low efficiency and high costs. A novel method for the verification and validation of a railway signalling system is proposed as an application of the enterprise information system (EIS) technique. In this application, the EIS and the simulation test platform are combined, which enhances the coherence and consistency of the information exchange between system development and system verification and improves work efficiency. The simulation and auto-test technology used in the system verification also lowers the labor and financial costs.
Automatic Verification of Railway Interlocking Systems: A Case Study
DEFF Research Database (Denmark)
Petersen, Jakob Lyng
1998-01-01
This paper presents experiences in applying formal verification to a large industrial piece of software. The area of application is railway interlocking systems. We try to prove requirements of the program controlling the Swedish railway station Alingsås by using a decision procedure based on the Stålmarck algorithm. While some requirements are easily proved, others are virtually impossible to manage due to a very large potential state space. We present what has been done in order to get, at least, an idea of whether or not such difficult requirements are fulfilled, and we express thoughts on what is needed in order to be able to successfully verify large real-life systems.
FAST CONVERGENT MONTE CARLO RECEIVER FOR OFDM SYSTEMS
Institute of Scientific and Technical Information of China (English)
Wu Lili; Liao Guisheng; Bao Zheng; Shang Yong
2005-01-01
The paper investigates the problem of the design of an optimal Orthogonal Frequency Division Multiplexing (OFDM) receiver against unknown frequency selective fading. A fast convergent Monte Carlo receiver is proposed. In the proposed method, the Markov Chain Monte Carlo (MCMC) methods are employed for the blind Bayesian detection without channel estimation. Meanwhile, with the exploitation of the characteristics of OFDM systems, two methods are employed to improve the convergence rate and enhance the efficiency of MCMC algorithms. One is the integration of the posterior distribution function with respect to the associated channel parameters, which is involved in the derivation of the objective distribution function; the other is the intra-symbol differential coding for the elimination of the bimodality problem resulting from the presence of unknown fading channels. Moreover, no matrix inversion is needed with the use of the orthogonality property of OFDM modulation and hence the computational load is significantly reduced. Computer simulation results show the effectiveness of the fast convergent Monte Carlo receiver.
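The MCMC machinery underlying such blind Bayesian detectors reduces, in its simplest form, to a Metropolis sampler. A toy 1-D sketch (standard normal target; this is generic MCMC, not the paper's OFDM-specific sampler):

```python
import math
import random

def metropolis(log_target, n_samples, step=2.0, x0=0.0, seed=1):
    """Plain Metropolis sampler: draws from a density known only up to a
    normalising constant via its log, the core ingredient of MCMC-based
    Bayesian detection (here on a toy 1-D target)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        prop = x + rng.uniform(-step, step)
        accept_prob = math.exp(min(0.0, log_target(prop) - log_target(x)))
        if rng.random() < accept_prob:
            x = prop                      # accept the proposal
        samples.append(x)                 # rejected moves repeat x
    return samples

std_normal = lambda x: -0.5 * x * x       # log N(0, 1) up to a constant
chain = metropolis(std_normal, 20000)
print(round(sum(chain) / len(chain), 2))  # sample mean, close to 0
```

The paper's two accelerations (integrating out channel parameters, and differential coding to remove bimodality) both aim to make chains like this mix faster toward the target distribution.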
This report is a product of the U.S. EPA's Environmental Technoloy Verification (ETV) Program and is focused on the Smart Sonics Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...
Bergman, Alanah M; Gete, Ermias; Duzenli, Cheryl; Teke, Tony
2014-05-08
A Monte Carlo (MC) validation of the vendor-supplied Varian TrueBeam 6 MV flattened (6X) phase-space file and the first implementation of the Siebers-Keall MC MLC model as applied to the HD120 MLC (for 6X flat and 6X flattening filter-free (6X FFF) beams) are described. The MC model is validated in the context of VMAT patient-specific quality assurance. The Monte Carlo commissioning process involves: 1) validating the calculated open-field percentage depth doses (PDDs), profiles, and output factors (OFs); 2) adapting the Siebers-Keall MLC model to match the new HD120-MLC geometry and material composition; 3) determining the absolute dose conversion factor for the MC calculation; and 4) validating this entire linac/MLC model in the context of dose calculation verification for clinical VMAT plans. MC PDDs for the 6X beams agree with the measured data to within 2.0% for field sizes ranging from 2 × 2 to 40 × 40 cm2. Measured and MC profiles show agreement in the 50% field width and the 80%-20% penumbra region to within 1.3 mm for all square field sizes. MC OFs for the 2 to 40 cm2 square fields agree with measurement to within 1.6%. Verification of VMAT SABR lung, liver, and vertebra plans demonstrates that measured and MC ion chamber doses agree within 0.6% for the 6X beam and within 2.0% for the 6X FFF beam. A 3D gamma factor analysis demonstrates that for the 6X beam, > 99% of voxels meet the pass criterion (3%/3 mm); for the 6X FFF beam, > 94% of voxels meet this criterion. The TrueBeam accelerator delivering 6X and 6X FFF beams with the HD120 MLC can be modeled in Monte Carlo to provide an independent 3D dose calculation for clinical VMAT plans. This quality assurance tool has been used clinically to verify over 140 6X and 16 6X FFF TrueBeam treatment plans.
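The 3%/3 mm gamma analysis used for plan verification can be sketched in one dimension (a simplified, globally normalised version for illustration; not any vendor's implementation):

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm=1.0, dose_tol=0.03, dist_mm=3.0):
    """Global 1-D gamma analysis (3%/3 mm by default): for every measured
    point, gamma is the minimum over reference points of the combined
    dose-difference/distance-to-agreement metric; a point passes when
    gamma <= 1."""
    d_max = max(ref)                       # global dose normalisation
    passed = 0
    for i, dm in enumerate(meas):
        gamma = min(
            math.sqrt(((i - j) * spacing_mm / dist_mm) ** 2
                      + ((dm - dr) / (dose_tol * d_max)) ** 2)
            for j, dr in enumerate(ref))
        passed += gamma <= 1.0
    return passed / len(meas)

# Gaussian-like reference profile and a measurement 2% hot everywhere:
ref = [100 * math.exp(-((i - 25) / 10.0) ** 2) for i in range(51)]
meas = [d * 1.02 for d in ref]
print(gamma_pass_rate(ref, meas))          # 1.0: within 3%/3 mm everywhere
```

Clinical tools evaluate the same metric in 2D or 3D over measured diode-array or calculated dose grids; the pass-rate percentages quoted above are the fraction of points with gamma ≤ 1.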
Computer aided production planning - SWZ system of order verification
Krenczyk, D.; Skolud, B.
2015-11-01
SWZ (System of order verification) is a computer implementation of a methodology that supports fast decision making on the acceptability of a production order. It determines not the best possible solution but an admissible one: a solution that can be found in acceptable time (a feasible solution) and is acceptable under the existing constraints. The methodology uses constraint propagation techniques and reduces the decision to testing a sequence of arbitrarily selected conditions; fulfilment of all the conditions (their conjunction) guarantees the ability to perform the production order. In the paper, examples of the application of the SWZ system, comprising the planning and control steps, are presented. The obtained results allow the determination of an acceptable production flow in the system, i.e., of the manufacturing system parameters that ensure execution of orders in time under the resource constraints. SWZ also generates dispatching rules as a sequence of processing operations for each production resource, performed periodically during the production flow in the system. Furthermore, an example of SWZ and simulation system integration is shown: SWZ has been enhanced with a module generating files containing the script code of the system model in the internal language of a simulation and visualization system.
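The order-acceptability test reduces to checking a conjunction of conditions in sequence, with an early exit on the first failure. A minimal sketch (the order fields and constraints below are hypothetical, not SWZ's actual condition set):

```python
def order_admissible(order, conditions):
    """Acceptability test in the SWZ spirit: an order is admissible when a
    sequence of necessary conditions all hold (a conjunction, checked with
    short-circuiting).  `conditions` is a list of (name, predicate) pairs;
    returns (ok, first_failed_name)."""
    for name, pred in conditions:
        if not pred(order):
            return False, name
    return True, None

# Hypothetical order and constraints, for illustration only:
order = {"qty": 120, "due_h": 40, "cycle_h_per_unit": 0.3, "capacity_h": 45}
conditions = [
    ("capacity", lambda o: o["qty"] * o["cycle_h_per_unit"] <= o["capacity_h"]),
    ("due date", lambda o: o["qty"] * o["cycle_h_per_unit"] <= o["due_h"]),
]
print(order_admissible(order, conditions))   # (True, None)
```

Short-circuiting is what makes the answer "admissible or not" fast: the first violated condition rejects the order without evaluating the rest.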
ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM
The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
An ultrasonic flaw detector with a remote-control interface is studied and an automatic verification system for it is developed. By using Extensible Markup Language, the system software builds a protocol instruction set and a database of data-analysis methods, which realizes the controllable design and resolves the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error-compensation method is proposed; it performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. The operating results of the automatic verification system confirm the feasibility of the system's hardware and software architecture design and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity of test personnel.
A new approach for the verification of optical systems
Siddique, Umair; Aravantinos, Vincent; Tahar, Sofiène
2013-09-01
Optical systems are increasingly used in microsystems, telecommunication, aerospace, and the laser industry. Due to the complexity and sensitivity of optical systems, their verification poses many challenges to engineers. Traditionally, the analysis of such systems has been carried out by paper-and-pencil proofs and numerical computations. However, these techniques cannot provide perfectly accurate results due to the risk of human error and the inherent approximations of numerical algorithms. In order to overcome these limitations, we propose to use theorem proving (i.e., a computer-based technique that makes it possible to express mathematical statements and reason about them while taking into account all the details of mathematical reasoning) as an alternative to computational and numerical approaches, to improve optical system analysis in a comprehensive framework. In particular, this paper provides a higher-order-logic (a language used to express mathematical theories) formalization of ray optics in the HOL Light theorem prover. Based on the multivariate analysis library of HOL Light, we formalize the notions of light ray and optical system (by defining medium interfaces, mirrors, lenses, etc.), i.e., we express these notions mathematically in the software. This allows us to derive general theorems about the behavior of light in such optical systems. In order to demonstrate the practical effectiveness, we present the stability analysis of a Fabry-Perot resonator.
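The Fabry-Perot stability analysis that closes the abstract has a familiar numerical counterpart: in ray optics a two-mirror resonator is geometrically stable exactly when 0 ≤ g1·g2 ≤ 1, with g_i = 1 − L/R_i. A quick numerical sketch (this is the textbook criterion, not the HOL Light formalization itself):

```python
def fabry_perot_stable(length, r1, r2):
    """Standard geometric stability criterion for a two-mirror resonator:
    0 <= g1*g2 <= 1, where g_i = 1 - L/R_i and R_i are the mirror radii
    of curvature (same length units throughout)."""
    g1, g2 = 1 - length / r1, 1 - length / r2
    return 0 <= g1 * g2 <= 1

print(fabry_perot_stable(1.0, 1.0, 1.0))   # L = R1 = R2: stable (True)
print(fabry_perot_stable(3.0, 1.0, 1.0))   # cavity too long: unstable (False)
```

The point of the theorem-proving approach is to establish such criteria as proved theorems about the formalized ray model, rather than as spot checks like the one above.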
1975-01-01
The findings of investigations on concepts and techniques in automated performance verification are presented. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance-verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
24 CFR 5.233 - Mandated use of HUD's Enterprise Income Verification (EIV) System.
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Mandated use of HUD's Enterprise... and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for... § 5.233 Mandated use of HUD's Enterprise Income Verification (EIV) System. (a) Programs subject...
Gas-Liquid Supersonic Cleaning and Cleaning Verification Spray System
Parrish, Lewis M.
2009-01-01
NASA Kennedy Space Center (KSC) recently entered into a nonexclusive license agreement with Applied Cryogenic Solutions (ACS), Inc. (Galveston, TX) to commercialize its Gas-Liquid Supersonic Cleaning and Cleaning Verification Spray System technology. This technology, developed by KSC, is a critical component of processes being developed and commercialized by ACS to replace current mechanical and chemical cleaning and descaling methods used by numerous industries. Pilot trials on heat exchanger tubing components have shown that the ACS technology provides: superior cleaning in a much shorter period of time; lower energy and labor requirements for cleaning and de-scaling operations; significant reductions in waste volumes by not using water, acidic or basic solutions, organic solvents, or nonvolatile solid abrasives as components in the cleaning process; and improved energy efficiency in post-cleaning heat exchanger operations. The ACS process consists of a spray head containing supersonic converging/diverging nozzles; a source of liquid gas; a novel, proprietary pumping system that permits pumping liquid nitrogen, liquid air, or supercritical carbon dioxide to pressures in the range of 20,000 to 60,000 psi; and various hoses, fittings, valves, and gauges. The size and number of nozzles can be varied so the system can be built in configurations ranging from small hand-held spray heads to large multinozzle cleaners. The system also can be used to verify whether a part has been adequately cleaned.
Interacting multiagent systems kinetic equations and Monte Carlo methods
Pareschi, Lorenzo
2014-01-01
The description of emerging collective phenomena and self-organization in systems composed of large numbers of individuals has gained increasing interest from various research communities in biology, ecology, robotics and control theory, as well as sociology and economics. Applied mathematics is concerned with the construction, analysis and interpretation of mathematical models that can shed light on significant problems of the natural sciences as well as our daily lives. To this set of problems belongs the description of the collective behaviours of complex systems composed of a large enough number of individuals. Examples of such systems are interacting agents in a financial market, potential voters during political elections, or groups of animals with a tendency to flock or herd. Among other possible approaches, this book provides a step-by-step introduction to the mathematical modelling based on a mesoscopic description and the construction of efficient simulation algorithms by Monte Carlo methods. The ar...
Secure stand alone positive personnel identity verification system (SSA-PPIV)
Energy Technology Data Exchange (ETDEWEB)
Merillat, P.D.
1979-03-01
The properties of a secure stand-alone positive personnel identity verification system are detailed. The system is designed to operate without the aid of a central computing facility and the verification function is performed in the absence of security personnel. Security is primarily achieved by means of data encryption on a magnetic stripe badge. Several operational configurations are discussed. Advantages and disadvantages of this system compared to a central computer driven system are detailed.
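The stand-alone badge scheme described above can be illustrated with a modern keyed-hash analogue: the stripe stores an identifier plus a tag computed from a secret provisioned into every reader, so verification needs no central computer. This is a hedged sketch only; the key, function names, and the use of HMAC-SHA256 (rather than the 1979 encryption scheme) are assumptions for illustration.

```python
# Hedged sketch of stand-alone badge verification via a keyed hash.
# The badge stripe stores (user_id, tag); the reader recomputes the tag
# locally from a shared secret, so no central computer is involved.
import hmac
import hashlib

KEY = b"reader-secret"  # assumed to be provisioned into every reader


def issue_badge(user_id):
    """Compute the tag written to the badge stripe at enrollment."""
    tag = hmac.new(KEY, user_id.encode(), hashlib.sha256).hexdigest()
    return user_id, tag


def verify_badge(user_id, tag):
    """Stand-alone check: recompute the tag and compare in constant time."""
    expected = hmac.new(KEY, user_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

A forged badge that pairs a different identity with a copied tag fails verification, since the tag is bound to the identifier.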
Subtle Monte Carlo Updates in Dense Molecular Systems.
Bottaro, Sandro; Boomsma, Wouter; E Johansson, Kristoffer; Andreetta, Christian; Hamelryck, Thomas; Ferkinghoff-Borg, Jesper
2012-02-14
Although Markov chain Monte Carlo (MC) simulation is a potentially powerful approach for exploring conformational space, it has been unable to compete with molecular dynamics (MD) in the analysis of high density structural states, such as the native state of globular proteins. Here, we introduce a kinetic algorithm, CRISP, that greatly enhances the sampling efficiency in all-atom MC simulations of dense systems. The algorithm is based on an exact analytical solution to the classic chain-closure problem, making it possible to express the interdependencies among degrees of freedom in the molecule as correlations in a multivariate Gaussian distribution. We demonstrate that our method reproduces structural variation in proteins with greater efficiency than current state-of-the-art Monte Carlo methods and has real-time simulation performance on par with molecular dynamics simulations. The presented results suggest our method as a valuable tool in the study of molecules in atomic detail, offering a potential alternative to molecular dynamics for probing long time-scale conformational transitions.
Monte Carlo simulations of systems with complex energy landscapes
Wüst, T.; Landau, D. P.; Gervais, C.; Xu, Y.
2009-04-01
Non-traditional Monte Carlo simulations are a powerful approach to the study of systems with complex energy landscapes. After reviewing several of these specialized algorithms we shall describe the behavior of typical systems including spin glasses, lattice proteins, and models for "real" proteins. In the Edwards-Anderson spin glass it is now possible to produce probability distributions in the canonical ensemble and thermodynamic results of high numerical quality. In the hydrophobic-polar (HP) lattice protein model Wang-Landau sampling with an improved move set (pull-moves) produces results of very high quality. These can be compared with the results of other methods of statistical physics. A more realistic membrane protein model for Glycophorin A is also examined. Wang-Landau sampling allows the study of the dimerization process including an elucidation of the nature of the process.
Energy Technology Data Exchange (ETDEWEB)
Koo, Seo Ryong; Son, Han Seong; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)
1998-12-31
In this work, an automatic software verification method for Nuclear Power Plant (NPP) protection systems is developed. This method utilizes Colored Petri Net (CPN) for modeling and the Prototype Verification System (PVS) for mathematical verification. In order to bridge the flow from modeling by CPN to mathematical proof by PVS, a translator has been developed in this work. The combined method has been applied to a protection system function of Wolsong NPP SDS2 (Steam Generator Low Level Trip) and found to be promising for further research and applications. 7 refs., 10 figs. (Author)
Design verification and performance analysis of Serial AXI Links in Broadcom System-on-Chip
Sarai, Simran Kaur
2014-01-01
Design verification is an essential step in the development of any product. Also referred to as qualification testing, design verification ensures that the product as designed is the same as the product as intended. In this project, design verification and performance analysis of Thin Advanced Extensible Interface Links (T-AXI) is conducted on a Broadcom SoC (System on Chip). T-AXI is a Broadcom proprietary bus that interfaces all the subsystems on the System-on-Chip (SoC) to the system me...
Verification of Information Flow in Agent-Based Systems
Sabri, Khair Eddin; Khedri, Ridha; Jaskolka, Jason
Analyzing information flow is beneficial for ensuring the satisfiability of security policies during the exchange of information between the agents of a system. In the literature, models such as Bell-LaPadula model and the Chinese Wall model are proposed to capture and govern the exchange of information among agents. Also, we find several verification techniques for analyzing information flow within programs or multi-agent systems. However, these models and techniques assume the atomicity of the exchanged information, which means that the information cannot be decomposed or combined with other pieces of information. Also, the policies of their models prohibit any transfer of information from a high level agent to a low level agent. In this paper, we propose a technique that relaxes these assumptions. Indeed, the proposed technique allows classifying information into frames and articulating finer granularity policies that involve information, its elements, or its frames. Also, it allows for information manipulation through several operations such as focusing and combining information. Relaxing the atomicity of information assumption permits an analysis that takes into account the ability of an agent to link elements of information in order to evolve its knowledge.
Runtime verification of embedded real-time systems.
Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg
We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability and thus facilitate applications of the framework in both a prototyping and a post-deployment phase of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
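The time-bounded Since operator discussed above can be illustrated with a naive reference evaluator over a finite trace; this sketch gives the semantics only and is not the paper's efficient hardware observer (function and parameter names are invented here).

```python
# Naive discrete-time evaluator for the bounded past-time Since operator.
# phi1 S[a,b] phi2 holds at step t iff there is an earlier step m with
# t - m in [a, b] where phi2 held, and phi1 has held at every step
# strictly after m up to and including t.

def bounded_since(phi1, phi2, a, b):
    """phi1, phi2: lists of booleans, one per time step.
    Returns the truth value of (phi1 S[a,b] phi2) at each step."""
    n = len(phi1)
    out = []
    for t in range(n):
        holds = False
        # candidate anchor steps m with t - m in [a, b]
        for m in range(max(0, t - b), t - a + 1):
            if phi2[m] and all(phi1[k] for k in range(m + 1, t + 1)):
                holds = True
                break
        out.append(holds)
    return out
```

The hardware observers in the paper avoid this quadratic rescan by keeping compact summaries of past anchors; the evaluator above is only a specification to test them against.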
Nizenkov, Paul; Noeding, Peter; Konopka, Martin; Fasoulas, Stefanos
2017-03-01
The in-house direct simulation Monte Carlo solver PICLas, which enables parallel, three-dimensional simulations of rarefied gas flows, is verified and validated. Theoretical aspects of the method and the employed schemes are briefly discussed. Considered cases include simple reservoir simulations and complex re-entry geometries, which were selected from literature and simulated with PICLas. First, the chemistry module is verified using simple numerical and analytical solutions. Second, simulation results of the rarefied gas flow around a 70° blunted-cone, the REX Free-Flyer as well as multiple points of the re-entry trajectory of the Orion capsule are presented in terms of drag and heat flux. A comparison to experimental measurements as well as other numerical results shows an excellent agreement across the different simulation cases. An outlook on future code development and applications is given.
Testing Quantum Devices: Practical Entanglement Verification in Bipartite Optical Systems
Häseler, Hauke; Moroder, Tobias; Lütkenhaus, Norbert
2007-01-01
We present a method to test quantum behavior of quantum information processing devices, such as quantum memories, teleportation devices, channels and quantum key distribution protocols. The test of quantum behavior can be phrased as the verification of effective entanglement. Necessary separability criteria are formulated in terms of a matrix of expectation values in conjunction with the partial transposition map. Our method is designed to reduce the resources for entanglement verification. A...
Energy Technology Data Exchange (ETDEWEB)
Skidmore, M.S., E-mail: mss16@star.le.ac.u [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom); Ambrosi, R.M. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester, LE1 7RH (United Kingdom)
2010-01-01
Characterising a planetary radiation environment is important to: (1) assess the habitability of a planetary body for indigenous life; (2) assess the risks associated with manned exploration missions to a planetary body and (3) predict/interpret the results that remote sensing instrumentation may obtain from a planetary body (e.g. interpret the gamma-ray emissions from a planetary surface produced by radioactive decay or via the interaction of galactic cosmic rays to obtain meaningful estimates of the concentration of certain elements on the surface of a planet). The University of Leicester is developing instrumentation for geophysical applications that include gamma-ray spectroscopy, gamma-ray densitometry and radiometric dating. This paper describes the verification of a Monte-Carlo planetary radiation environment model developed using the MCNPX code. The model is designed to model the radiation environments of Mars and the Moon, but is applicable to other planetary bodies, and will be used to predict the performance of the instrumentation being developed at Leicester. This study demonstrates that the modelled gamma-ray data is in good agreement with gamma-ray data obtained by the gamma-ray spectrometers on 2001 Mars Odyssey and Lunar Prospector, and can be used to accurately model geophysical instrumentation for planetary science applications.
Finger vein verification system based on sparse representation.
Xin, Yang; Liu, Zhi; Zhang, Haixia; Zhang, Hong
2012-09-01
Finger vein verification is a promising biometric pattern for personal identification in terms of security and convenience. The recognition performance of this technology heavily relies on the quality of finger vein images and on the recognition algorithm. To achieve efficient recognition performance, a special finger vein imaging device is developed, and a finger vein recognition method based on sparse representation is proposed. The motivation for the proposed method is that finger vein images exhibit a sparse property. In the proposed system, the regions of interest (ROIs) in the finger vein images are segmented and enhanced. Sparse representation and sparsity preserving projection on ROIs are performed to obtain the features. Finally, the features are measured for recognition. An equal error rate of 0.017% was achieved based on the finger vein image database, which contains images that were captured by using the near-IR imaging device that was developed in this study. The experimental results demonstrate that the proposed method is faster and more robust than previous methods.
Energy Technology Data Exchange (ETDEWEB)
Modenov, A; Bulatov, M; Livke, A; Morkin, A; Razinkov, S; Safronov, S; Elmont, T; Langner, D; MacArthur, D; Mayo, D; Smith, M; Luke, S J
2005-06-10
This report describes the software development for the plutonium attribute verification system--AVNG. A brief synopsis of the technical solution for the measurement system is presented. The main tasks for the software development that is underway are formulated. The development tasks are shown in software structural flowcharts, measurement system state diagram and a description of the software. The current status of the AVNG software development is elucidated.
RRB's SVES Input File - Post Entitlement State Verification and Exchange System (PSSVES)
Social Security Administration — Several PSSVES request files are transmitted to SSA each year for processing in the State Verification and Exchange System (SVES). This is a first step in obtaining...
National Aeronautics and Space Administration — This proposal is for the creation of a system-level software specification and verification tool. This proposal suggests a major leap-forward in usability of...
National Aeronautics and Space Administration — A major challenge of the use of adaptive systems in safety-critical applications is the software life-cycle: requirement engineering through verification and...
Stochastic Verification Theorem of Forward-Backward Controlled System for Viscosity Solutions
Zhang, Liangquan
2010-01-01
In this paper, we investigate the controlled system described by forward-backward stochastic differential equations with the control contained in the drift, diffusion and generator of the BSDE. A new verification theorem is derived within the framework of viscosity solutions without involving any derivatives of the value functions. It is worth pointing out that this theorem has wider applicability than the restrictive classical verification theorems. As a relevant problem, the optimal stochastic feedback controls for the forward-backward system are discussed as well.
Monte Carlo Alpha Iteration Algorithm for a Subcritical System Analysis
Directory of Open Access Journals (Sweden)
Hyung Jin Shim
2015-01-01
The α-k iteration method, which searches for the fundamental-mode alpha-eigenvalue via iterative updates of the fission source distribution, has been successfully used for Monte Carlo (MC) alpha-static calculations of supercritical systems. However, the α-k iteration method for deep subcritical system analysis suffers from a gigantic number of neutron generations or a huge neutron weight, which leads to an abnormal termination of the MC calculations. In order to stably estimate the prompt neutron decay constant (α) of prompt subcritical systems regardless of subcriticality, we propose a new MC alpha-static calculation method named the α iteration algorithm. The new method is derived by directly applying the power method to the α-mode eigenvalue equation, and its calculation stability is achieved by controlling the number of time source neutrons, which are generated in proportion to α divided by neutron speed in MC neutron transport simulations. The effectiveness of the α iteration algorithm is demonstrated for two-group homogeneous problems with varying subcriticality by comparison with analytic solutions. The applicability of the proposed method is evaluated for an experimental benchmark of the thorium-loaded accelerator-driven system.
Verification of the Space Shuttle entry GN&C system
Van Hoften, J. D. A.; Moyles, J. A.
1981-01-01
The certification procedures for the initial Shuttle flight are discussed. Particular attention is paid to the entry guidance, navigation, and control (GNC) verification, comprising tests, analysis, demonstration, inspection, and simulation. Flow diagrams for the verification and operational flight sequences are provided, along with a block diagram of the GNC circuitry interfaces. The development of the test matrix software for the GNC is outlined, noting the constant interplay between software verification and spacecraft reconfiguration to meet simulated performance requirements. Comparison of GNC performance predictions with actual entry flight data showed a good match in all performance areas except for sideslip excursions, bank overshoots, an area of transonic buffet, and an increased lift/drag ratio in the preflare to landing flight phase.
ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue
Directory of Open Access Journals (Sweden)
Kuo-Kun Tseng
2016-01-01
With the development of biometric verification, we propose a new algorithm and a personal mobile sensor card system for ECG verification. The proposed mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application and overcomes the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle the heart rate changes caused by sports activity. To our knowledge, this is the first work to address the issue of sports in ECG verification.
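The mean-interval idea above can be sketched in its simplest form: summarize a recording by the mean interval between R-peaks and accept if it is close to the enrolled value. The paper's actual feature set, multiple-state handling and thresholds are not given here; the function names and tolerance below are illustration assumptions.

```python
# Hedged sketch of a mean-interval style ECG check: a single scalar
# feature (mean R-R interval) compared against an enrolled template.

def mean_rr_interval(peak_times):
    """Mean interval between successive R-peak timestamps (seconds)."""
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    return sum(intervals) / len(intervals)


def verify(enrolled_mean, peak_times, tolerance=0.05):
    """Accept if the new recording's mean R-R interval is within tolerance
    of the enrolled mean (tolerance value is a made-up illustration)."""
    return abs(mean_rr_interval(peak_times) - enrolled_mean) <= tolerance
```

A multiple-state variant, as the abstract suggests, would store several enrolled means (rest, exercise, recovery) and accept if the recording matches any of them.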
The Monte Carlo Simulation Method for System Reliability and Risk Analysis
Zio, Enrico
2013-01-01
Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
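The kind of Monte Carlo reliability estimate the book introduces can be sketched for a toy system: two redundant pumps in parallel, in series with a valve. The component failure probabilities below are made-up illustration values, not taken from the book.

```python
# Crude Monte Carlo reliability estimate for a small series-parallel
# system: the system works if at least one of two pumps works AND the
# valve works. Exact reliability is (1 - 0.1^2) * 0.95 = 0.9405.
import random


def system_works(pump_fail=0.1, valve_fail=0.05, rng=random):
    """One random trial of the system state."""
    pump_ok = (rng.random() > pump_fail) or (rng.random() > pump_fail)
    valve_ok = rng.random() > valve_fail
    return pump_ok and valve_ok


def reliability(n_trials=100_000, seed=1):
    """Fraction of trials in which the system works."""
    rng = random.Random(seed)
    ok = sum(system_works(rng=rng) for _ in range(n_trials))
    return ok / n_trials
```

With 100,000 trials the estimate lands within a fraction of a percent of the analytic value, illustrating the 1/sqrt(N) convergence of crude sampling.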
2010-01-01
Achieving adequate verification and quality-assurance (QA) for radiosurgery treatment of trigeminal-neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of Presage/Optical-CT dosimetry system as a tool for the v...
Energy Technology Data Exchange (ETDEWEB)
Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)
1995-03-01
This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.
A BrachyPhantom for verification of dose calculation of HDR brachytherapy planning system
Energy Technology Data Exchange (ETDEWEB)
Austerlitz, C. [Clinica Diana Campos, Recife, PE 52020-030 (Brazil); Campos, C. A. T. [Pontifícia Universidade Católica do Rio de Janeiro, RJ 22451-900 (Brazil)
2013-11-15
Purpose: To develop a calibration phantom for {sup 192}Ir high dose rate (HDR) brachytherapy units that renders possible the direct measurement of absorbed dose to water and verification of the treatment planning system. Methods: A phantom, herein designated BrachyPhantom, consists of a Solid Water™ cylinder, 8 cm high and 14 cm in diameter, with a cavity in its axis that allows the positioning of an A1SL ionization chamber with its reference measuring point at the midheight of the cylinder's axis. Inside the BrachyPhantom, at a 3-cm radial distance from the chamber's reference measuring point, there is a circular channel connected to a cylindrical-guide cavity that allows the insertion of a 6-French flexible plastic catheter from the BrachyPhantom surface. The PENELOPE Monte Carlo code was used to calculate a factor, P{sub sw}{sup lw}, to correct the reading of the ionization chamber to a full scatter condition in liquid water. The verification of dose calculation of a HDR brachytherapy treatment planning system was performed by inserting a catheter with a dummy source in the phantom channel and scanning it with a CT. The CT scan was then transferred to the HDR computer program, in which a multiple treatment plan was programmed to deliver a total dose of 150 cGy to the ionization chamber. The instrument reading was then converted to absorbed dose to water using the N{sub gas} formalism and the P{sub sw}{sup lw} factor. Likewise, the absorbed dose to water was calculated using the source strength, S{sub k}, values provided by 15 institutions visited in this work. Results: A value of 1.020 (0.09%, k = 2) was found for P{sub sw}{sup lw}. The expanded uncertainty in the absorbed dose assessed with the BrachyPhantom was found to be 2.12% (k = 1). For an associated S{sub k} of 27.8 cGy m{sup 2} h{sup −1}, the total irradiation time to deliver 150 cGy to the ionization chamber point of reference was 161.0 s. The deviation between the absorbed doses to water assessed with
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.
2016-11-29
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations
Novotny, M.A.
2010-02-01
The efficiency of dynamic Monte Carlo algorithms for off-lattice systems composed of particles is studied for the case of a single impurity particle. The theoretical efficiencies of the rejection-free method and of the Monte Carlo with Absorbing Markov Chains method are given. Simulation results are presented to confirm the theoretical efficiencies. © 2010.
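The rejection-free idea studied above can be sketched as a single kinetic Monte Carlo (n-fold way) step: rather than proposing and mostly rejecting moves, select a move with probability proportional to its rate and advance the clock by an exponentially distributed waiting time. This is a generic illustration, not the paper's impurity-particle implementation.

```python
# One rejection-free kinetic Monte Carlo step: every step performs a
# move; the cost of rejections is traded for the cost of summing rates.
import math
import random


def kmc_step(rates, rng):
    """rates: list of positive move rates.
    Returns (index of chosen move, elapsed time dt)."""
    total = sum(rates)
    # pick a move with probability rate_i / total
    r = rng.random() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            break
    # waiting time is exponential with the total rate
    dt = -math.log(1.0 - rng.random()) / total
    return i, dt
```

Over many steps, a move with rate 3 against a move with rate 1 is chosen about 75% of the time, which is exactly the selection bias that replaces the accept/reject loop.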
Event-chain Monte Carlo algorithm for continuous spin systems and its application
Nishikawa, Yoshihiko; Hukushima, Koji
2016-09-01
The event-chain Monte Carlo (ECMC) algorithm is described for hard-sphere systems and general potential systems including interacting particle system and continuous spin systems. Using the ECMC algorithm, large-scale equilibrium Monte Carlo simulations are performed for a three-dimensional chiral helimagnetic model under a magnetic field. It is found that critical behavior of a phase transition changes with increasing the magnetic field.
Highly Efficient Monte-Carlo for Estimating the Unavailability of Markov Dynamic Systems
Institute of Scientific and Technical Information of China (English)
XIAOGang; DENGLi; ZHANGBen-Ai; ZHUJian-Shi
2004-01-01
Monte Carlo simulation has become an important tool for estimating the reliability and availability of dynamic systems, since conventional numerical methods are no longer efficient when the size of the system to solve is large. However, evaluating by simulation the probability of occurrence of very rare events means playing a very large number of histories of the system, which leads to unacceptable computing time. Highly efficient Monte Carlo schemes should therefore be worked out. In this paper, based on the integral equation describing state transitions of a Markov dynamic system, a uniform Monte Carlo method for estimating unavailability is presented. Using a free-flight estimator, direct statistical estimation Monte Carlo is achieved. Using both the free-flight estimator and a biased probability space of sampling, weighted statistical estimation Monte Carlo is also achieved. Five Monte Carlo schemes, including crude simulation, analog simulation, statistical estimation based on crude and analog simulation, and weighted statistical estimation, are used for calculating the unavailability of a repairable Con/3/30:F system. Their efficiencies are compared with each other. The results show that the weighted statistical estimation Monte Carlo has the smallest variance and the highest efficiency in very rare events simulation.
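A crude analog simulation of unavailability, the baseline against which weighted estimators of this kind are compared, can be sketched for a single repairable component with constant failure and repair rates. This is a toy stand-in: the Con/3/30:F system and the variance-reduction schemes of the paper are not reproduced here.

```python
# Analog Monte Carlo estimate of point unavailability at t_mission for a
# single repairable component with failure rate lam and repair rate mu.
# Steady-state unavailability is lam / (lam + mu).
import math
import random


def unavailability(lam, mu, t_mission, n_hist=20_000, seed=7):
    rng = random.Random(seed)
    down = 0
    for _ in range(n_hist):
        t, up = 0.0, True
        while True:
            # exponential holding time in the current state
            rate = lam if up else mu
            t += -math.log(1.0 - rng.random()) / rate
            if t >= t_mission:
                break
            up = not up  # fail or get repaired
        down += not up
    return down / n_hist  # fraction of histories found down at t_mission
```

For lam = 0.1, mu = 1.0 and a mission time long enough to relax, the estimate approaches 0.1/1.1 ≈ 0.091; for truly rare unavailabilities this analog scheme needs enormous numbers of histories, which is the motivation for the weighted estimators.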
The inverse method parametric verification of real-time embedded systems
André , Etienne
2013-01-01
This book introduces state-of-the-art verification techniques for real-time embedded systems, based on the inverse method for parametric timed automata. It reviews popular formalisms for the specification and verification of timed concurrent systems and, in particular, timed automata as well as several extensions such as timed automata equipped with stopwatches, linear hybrid automata and affine hybrid automata.The inverse method is introduced, and its benefits for guaranteeing robustness in real-time systems are shown. Then, it is shown how an iteration of the inverse method can solv
Digital system verification a combined formal methods and simulation framework
Li, Lun
2010-01-01
Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5
Simulating Strongly Correlated Electron Systems with Hybrid Monte Carlo
Institute of Scientific and Technical Information of China (English)
LIU Chuan
2000-01-01
Using the path integral representation, the Hubbard model and the periodic Anderson model on a D-dimensional cubic lattice are transformed into field theories of fermions in D + 1 dimensions. At half-filling these theories possess a positive definite, real symmetric fermion matrix and can be simulated using the hybrid Monte Carlo method.
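The hybrid Monte Carlo method mentioned above has a simple skeleton, illustrated here for a one-dimensional Gaussian target; the fermionic theories in the abstract use the same leapfrog-plus-Metropolis structure with far more expensive force evaluations. Step size and trajectory length below are arbitrary illustration choices.

```python
# Minimal hybrid (Hamiltonian) Monte Carlo for a standard normal target,
# U(q) = q^2 / 2: refresh momentum, integrate with leapfrog, then apply
# a Metropolis accept/reject on the Hamiltonian.
import math
import random


def hmc_sample(n_samples=5000, eps=0.2, n_leap=10, seed=3):
    rng = random.Random(seed)
    u = lambda q: 0.5 * q * q        # potential energy
    grad_u = lambda q: q             # its gradient (the "force")
    q, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)      # momentum refresh
        q_new, p_new = q, p
        # leapfrog integration of Hamilton's equations
        p_new -= 0.5 * eps * grad_u(q_new)
        for _ in range(n_leap - 1):
            q_new += eps * p_new
            p_new -= eps * grad_u(q_new)
        q_new += eps * p_new
        p_new -= 0.5 * eps * grad_u(q_new)
        # accept/reject on the change in total energy
        dh = (u(q_new) + 0.5 * p_new ** 2) - (u(q) + 0.5 * p ** 2)
        if dh < 0 or rng.random() < math.exp(-dh):
            q = q_new
        samples.append(q)
    return samples
```

Because the leapfrog integrator is reversible and volume-preserving, the accept/reject step corrects its discretization error exactly, which is what makes the method usable for the positive definite fermion matrices of the abstract.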
A Cache System Design for CMPs with Built-In Coherence Verification
Directory of Open Access Journals (Sweden)
Mamata Dalui
2016-01-01
This work reports an effective design of cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing a directory based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA referred to as single length cycle 2-attractor cellular automata (TACA) has been planted to detect the inconsistencies in cache line states of processors’ private caches. The TACA module captures the coherence status of the CMPs’ cache system and memorizes any inconsistent recording of the cache line states during the processors’ reference to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then to settle to an attractor state indicating a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs’ processor pool ensures a better efficiency in determining the inconsistencies by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic points to the fact that the overhead of the proposed coherence verification module is much less than that of conventional verification units and is insignificant with respect to the cost involved in the CMPs’ cache system.
Quasi-Monte Carlo methods for lattice systems: a first look
Jansen, K; Nube, A; Griewank, A; Müller-Preussker, M
2013-01-01
We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like 1/Sqrt(N), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to 1/N. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
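The error-scaling contrast described above can be illustrated with a toy one-dimensional integral; the base-2 van der Corput sequence, the integrand x², and the sample count below are illustrative choices only, not the lattice observables treated in the paper:

```python
import random

def van_der_corput(n, base=2):
    # Low-discrepancy sequence: reverse the base-b digits of n about the radix point.
    q, denom = 0.0, 1
    while n:
        denom *= base
        n, rem = divmod(n, base)
        q += rem / denom
    return q

def estimate(f, points):
    return sum(f(x) for x in points) / len(points)

f = lambda x: x * x              # integral over [0, 1] is exactly 1/3
N = 4096

random.seed(0)
mc_pts = [random.random() for _ in range(N)]            # plain Monte Carlo
qmc_pts = [van_der_corput(i) for i in range(1, N + 1)]  # quasi-Monte Carlo

mc_err = abs(estimate(f, mc_pts) - 1 / 3)    # scales like 1/sqrt(N)
qmc_err = abs(estimate(f, qmc_pts) - 1 / 3)  # scales like (log N)/N for smooth f
print(mc_err, qmc_err)
```

With the low-discrepancy point set, the error for this smooth integrand falls off roughly like (log N)/N, versus the 1/sqrt(N) behavior of the pseudorandom estimate.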
2013-03-26
... capacity; (3) provide the prospective user with information related to the Rules of Behavior for system... URBAN DEVELOPMENT Proposed Information Collection for Public Comment: Enterprise Income Verification (EIV) Systems--Access Authorization Form and Rules of Behavior and User Agreement AGENCY: Office of...
Compositional verification of a multi-agent system for one-to-many negotiation
Brazier, F.M.T.; Cornelissen, F.J.; Gustavsson, R.; Jonker, C.M.; Lindeberg, O.; Polak, B.; Treur, J.
2004-01-01
Verification of multi-agent systems hardly occurs in design practice. One of the difficulties is that required properties for a multi-agent system usually refer to multi-agent behaviour which has nontrivial dynamics. To constrain these multi-agent behavioural dynamics, often a form of organisational
Compositional Design and Verification of a Multi-Agent System for One-to-Many Negotiation
Brazier, F.M.T.; Cornelissen, F.; Gustavsson, R.; Jonker, C.M.; Lindeberg, O.; Polak, B.; Treur, J.
1998-01-01
A compositional verification method for multi-agent systems is presented and applied to a multi-agent system for one-to-many negotiation in the domain of load balancing of electricity use. Advantages of the method are that the complexity of the
An Improved Constraint-based system for the verification of security protocols
Corin, Ricardo; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German
2002-01-01
We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs, an
A Formal Approach for the Construction and Verification of Railway Control Systems
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth; Peleska, Jan; Kinder, Sebastian
2011-01-01
This paper describes a complete model-based development and verification approach for railway control systems. For each control system to be generated, the user makes a description of the application-specific parameters in a domain-specific language. This description is automatically transformed ...
Towards a Framework for Modelling and Verification of Relay Interlocking Systems
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth
2010-01-01
This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...
DEFF Research Database (Denmark)
2005-01-01
The aim of the workshop is to provide a forum for researchers interested in the development of mathematical techniques for the analysis and verification of systems with infinitely many states. Topics: techniques for modeling and analysis of infinite-state systems; equivalence-checking and model-...
Srivas, Mandayam; Bickford, Mark
1991-01-01
The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
Cognitive Bias in the Verification and Validation of Space Flight Systems
Larson, Steve
2012-01-01
Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of
Cell-veto Monte Carlo algorithm for long-range systems
Kapfer, Sebastian C.; Krauth, Werner
2016-09-01
We present a rigorous efficient event-chain Monte Carlo algorithm for long-range interacting particle systems. Using a cell-veto scheme within the factorized Metropolis algorithm, we compute each single-particle move with a fixed number of operations. For slowly decaying potentials such as Coulomb interactions, screening line charges allow us to take into account periodic boundary conditions. We discuss the performance of the cell-veto Monte Carlo algorithm for general inverse-power-law potentials, and illustrate how it provides a new outlook on one of the prominent bottlenecks in large-scale atomistic Monte Carlo simulations.
Directory of Open Access Journals (Sweden)
Gusev Alexander
2015-01-01
The analysis of the existing problem of reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All of these simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the reference model for the verification of any EPS simulation tool.
Monte Carlo analysis of a control technique for a tunable white lighting system
DEFF Research Database (Denmark)
Chakrabarti, Maumita; Thorseth, Anders; Jepsen, Jørgen
2017-01-01
A simulated colour control mechanism for a multi-coloured LED lighting system is presented. The system achieves adjustable and stable white light output and allows for system-to-system reproducibility after application of the control mechanism. The control unit works using a pre-calibrated lookup table for an experimentally realized system, with a calibrated tristimulus colour sensor. A Monte Carlo simulation is used to examine the system performance concerning the variation of luminous flux and chromaticity of the light output. The inputs to the Monte Carlo simulation are variations of the LED peak wavelength, the LED rated luminous flux bin, the influence of the operating conditions, ambient temperature, driving current, and the spectral response of the colour sensor. The system performance is investigated by evaluating the outputs from the Monte Carlo simulation. The outputs show...
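A generic version of such a Monte Carlo tolerance analysis can be sketched as follows; the four channel fluxes and the 5% per-channel spread are hypothetical numbers standing in for the LED bin and operating-condition variations named above:

```python
import random

# Illustrative Monte Carlo tolerance analysis (hypothetical numbers):
# four LED channels, each with a nominal luminous flux and a relative
# spread representing binning and operating-condition variation.
random.seed(42)
nominal = {"red": 90.0, "green": 160.0, "blue": 40.0, "white": 210.0}  # lm
rel_sigma = 0.05   # assumed 5 % relative flux spread per channel
N = 10_000

totals = []
for _ in range(N):
    # Draw each channel's flux independently and sum the total light output.
    total = sum(flux * random.gauss(1.0, rel_sigma) for flux in nominal.values())
    totals.append(total)

mean = sum(totals) / N
var = sum((t - mean) ** 2 for t in totals) / (N - 1)
print(f"mean = {mean:.1f} lm, std = {var ** 0.5:.1f} lm")
```

The same scheme extends to chromaticity by propagating each sampled spectrum through colour-matching functions, which is essentially what the paper's simulation does with its calibrated inputs.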
Energy Technology Data Exchange (ETDEWEB)
ERMI, A.M.
2000-09-05
The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report on the results of the software qualification.
Proceedings 11th International Workshop on Automated Specification and Verification of Web Systems
DEFF Research Database (Denmark)
2015-01-01
These proceedings contain the papers presented at the 11th International Workshop on Automated Specification and Verification of Web Systems (WWV 2015), which was held on 23 June 2015 in Oslo, Norway, as a satellite workshop of the 20th International Symposium on Formal Methods (FM 2015). WWV is ...
Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media was conducted on a 0.7 acre drainage basin near downtown Griffin, Georgia. The system consists of an inlet bay, flow spreader, cartridge bay, overflow baffle, and outlet bay, housed in...
Joseph, S.; Herold, M.; Sunderlin, W.D.; Verchot, L.V.
2013-01-01
A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 R
Institute of Scientific and Technical Information of China (English)
Niu Chun-Hui; Zhang Yan; Gu Ben-Yuan
2005-01-01
A new distribution scheme of decryption keys used in optical verification systems is proposed. The encryption procedure is digitally implemented with the use of an iteration algorithm in a computer. Three target images corresponding to three wavelengths are encoded into three sets of phase-only masks (POMs) by a special distributing method. These three sets of POMs are assigned to three authorized users as their personal identification. A lensless optical system is used as the verification system. In the verification procedure, any two of the three authorized users can pass the verification procedure cooperatively, but a single user alone cannot. Numerical simulation shows that the proposed distribution scheme of decryption keys not only improves the security level of the verification system, but also brings convenience and flexibility for authorized users.
Ahmad, Munir; Deng, Jun; Lund, Molly W.; Chen, Zhe; Kimmett, James; Moran, Meena S.; Nath, Ravinder
2009-01-01
The goal of this work is to present a systematic Monte Carlo validation study on the clinical implementation of the enhanced dynamic wedges (EDWs) into the Pinnacle3 (Philips Medical Systems, Fitchburg, WI) treatment planning system (TPS) and QA procedures for patient plan verification treated with EDWs. Modeling of EDW beams in the Pinnacle3 TPS, which employs a collapsed-cone convolution superposition (CCCS) dose model, was based on a combination of measured open-beam data and the 'Golden Segmented Treatment Table' (GSTT) provided by Varian for each photon beam energy. To validate EDW models, dose profiles of 6 and 10 MV photon beams from a Clinac 2100 C/D were measured in virtual water at depths from near-surface to 30 cm for a wide range of field sizes and wedge angles using the Profiler 2 (Sun Nuclear Corporation, Melbourne, FL) diode array system. The EDW output factors (EDWOFs) for square fields from 4 to 20 cm wide were measured in virtual water using a small-volume Farmer-type ionization chamber placed at a depth of 10 cm on the central axis. Furthermore, the 6 and 10 MV photon beams emerging from the treatment head of Clinac 2100 C/D were fully modeled and the central-axis depth doses, the off-axis dose profiles and the output factors in water for open and dynamically wedged fields were calculated using the Monte Carlo (MC) package EGS4. Our results have shown that (1) both the central-axis depth doses and the off-axis dose profiles of various EDWs computed with the CCCS dose model and MC simulations showed good agreement with the measurements to within 2%/2 mm; (2) measured EDWOFs used for monitor-unit calculation in Pinnacle3 TPS agreed well with the CCCS and MC predictions within 2%; (3) all the EDW fields satisfied our validation criteria of 1% relative dose difference and 2 mm distance-to-agreement (DTA) with 99-100% passing rate in routine patient treatment plan verification using MapCheck 2D diode array.
Energy Technology Data Exchange (ETDEWEB)
Wang, Z; Thomas, A; Newton, J; Ibbott, G; Deasy, J; Oldham, M, E-mail: Zhiheng.wang@duke.ed
2010-11-01
Achieving adequate verification and quality-assurance (QA) for radiosurgery treatment of trigeminal neuralgia (TGN) is particularly challenging because of the combination of very small fields, very high doses, and complex irradiation geometries (multiple gantry and couch combinations). TGN treatments have extreme requirements for dosimetry tools and QA techniques, to ensure adequate verification. In this work we evaluate the potential of the Presage/Optical-CT dosimetry system as a tool for the verification of TGN distributions in high resolution and in 3D. A TGN treatment was planned and delivered to a Presage 3D dosimeter positioned inside the Radiological Physics Center (RPC) head and neck IMRT credentialing phantom. A 6-arc treatment plan was created using the iPlan system, and a maximum dose of 80 Gy was delivered with a Varian Trilogy machine. The delivered dose to Presage was determined by optical-CT scanning using the Duke Large field-of-view Optical-CT Scanner (DLOS) in 3D, with an isotropic resolution of 0.7 mm³. DLOS scanning and reconstruction took about 20 minutes. 3D dose comparisons were made with the planning system. Good agreement was observed between the planned and measured 3D dose distributions, and this work provides strong support for the viability of Presage/Optical-CT as a highly useful new approach for verification of this complex technique.
Advances in SVM-Based System Using GMM Super Vectors for Text-Independent Speaker Verification
Institute of Scientific and Technical Information of China (English)
ZHAO Jian; DONG Yuan; ZHAO Xianyu; YANG Hao; LU Liang; WANG Haila
2008-01-01
For text-independent speaker verification, the Gaussian mixture model (GMM) using a universal background model strategy and the GMM using support vector machines are the two most commonly used methodologies. Recently, a new SVM-based speaker verification method using GMM super vectors has been proposed. This paper describes the construction of a new speaker verification system and investigates the use of nuisance attribute projection and test normalization to further enhance performance. Experiments were conducted on the core test of the 2006 NIST speaker recognition evaluation corpus. The experimental results indicate that an SVM-based speaker verification system using GMM super vectors can achieve appealing performance. With the use of nuisance attribute projection and test normalization, the system performance can be significantly improved, with improvements in the equal error rate from 7.78% to 4.92% and detection cost function from 0.0376 to 0.0251.
The Environmental Technology Verification report discusses the technology and performance of the IR PowerWorks 70kW Microturbine System manufactured by Ingersoll-Rand Energy Systems. This system is a 70 kW electrical generator that outputs 480 V AC at 60 Hz and is driven by...
Verification testing of the Aquionics, Inc. bersonInLine® 4250 UV System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills Wastewater Treatment Plant test site in Parsippany, New Jersey. Two full-scale reactors were mounted in series. T...
Verification testing of the Ondeo Degremont, Inc. Aquaray® 40 HO VLS Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Three reactor modules were m...
Verification of Faulty Message Passing Systems with Continuous State Space in PVS
Pilotto, Concetta; White, Jerome
2010-01-01
We present a library of Prototype Verification System (PVS) meta-theories that verifies a class of distributed systems in which agent communication is through message-passing. The theoretic work, outlined in, consists of iterative schemes for solving systems of linear equations, such as message-passing extensions of the Gauss and Gauss-Seidel methods. We briefly review that work and discuss the challenges in formally verifying it.
Energy Technology Data Exchange (ETDEWEB)
CLARK, D.L.
1999-08-09
The purpose of this Corrective Action Plan is to demonstrate the ORP's planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective is to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II Verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I Verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation, committed to be complete by the end of FY 2000.
Synchronous parallel kinetic Monte Carlo Diffusion in Heterogeneous Systems
Energy Technology Data Exchange (ETDEWEB)
Martinez Saez, Enrique [Los Alamos National Laboratory; Hetherly, Jeffery [Los Alamos National Laboratory; Caro, Jose A [Los Alamos National Laboratory
2010-12-06
A new hybrid Molecular Dynamics-kinetic Monte Carlo algorithm has been developed in order to study the basic mechanisms taking place in diffusion in concentrated alloys under the action of chemical and stress fields. Parallel implementation of the k-MC part, based on a recently developed synchronous algorithm [J. Comput. Phys. 227 (2008) 3804-3823] relying on the introduction of a set of null events that synchronize the time across the different subdomains, added to the parallel efficiency of MD, provides the computing power required to evaluate jump rates 'on the fly', incorporating in this way the actual driving force emerging from chemical potential gradients and the actual environment-dependent jump rates. The time gain has been analyzed and the parallel performance reported. The algorithm is tested on simple diffusion problems to verify its accuracy.
Scenario-based verification of real-time systems using UPPAAL
DEFF Research Database (Denmark)
Li, Shuhao; Belaguer, Sandie; David, Alexandre;
2010-01-01
This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a "one...
Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy
Energy Technology Data Exchange (ETDEWEB)
Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)
2012-05-15
Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75 µm wide microbeams spaced by 200-400 µm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at
Application of Photon Transport Monte Carlo Module with GPU-based Parallel System
Energy Technology Data Exchange (ETDEWEB)
Park, Chang Je [Sejong University, Seoul (Korea, Republic of); Shon, Heejeong [Golden Eng. Co. LTD, Seoul (Korea, Republic of); Lee, Donghak [CoCo Link Inc., Seoul (Korea, Republic of)
2015-05-15
In general, it takes a lot of computing time to get reliable results from Monte Carlo simulations, especially in deep penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. At the same time, advanced computing hardware such as GPU (Graphics Processing Units)-based parallel machines is used to obtain better performance of the Monte Carlo simulation. A GPU is much easier to access and to manage than a CPU cluster system, and it has also become less expensive due to advances in computer technology. Hence, many engineering areas have adopted GPU-based massively parallel computation techniques. In this work, a GPU-based photon transport Monte Carlo module was applied. It provides almost a 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of the Monte Carlo module, which requires quick and accurate simulations.
American Society for Testing and Materials. Philadelphia
2008-01-01
1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...
Meaningful timescales from Monte Carlo simulations of particle systems with hard-core interactions
Costa, Liborio I.
2016-12-01
A new Markov Chain Monte Carlo method for simulating the dynamics of particle systems characterized by hard-core interactions is introduced. In contrast to traditional Kinetic Monte Carlo approaches, where the state of the system is associated with minima in the energy landscape, in the proposed method, the state of the system is associated with the set of paths traveled by the atoms and the transition probabilities for an atom to be displaced are proportional to the corresponding velocities. In this way, the number of possible state-to-state transitions is reduced to a discrete set, and a direct link between the Monte Carlo time step and true physical time is naturally established. The resulting rejection-free algorithm is validated against event-driven molecular dynamics: the equilibrium and non-equilibrium dynamics of hard disks converge to the exact results with decreasing displacement size.
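By way of contrast, the conventional rejection-based Metropolis scheme for hard disks, the kind of baseline that rejection-free methods aim to improve upon, can be sketched as below; the box size, particle count, and step size are illustrative:

```python
import random

def overlaps(pos, i, trial, sigma, L):
    # Reject a move if the trial position overlaps any other disk
    # (periodic boundary conditions, box side L, disk diameter sigma).
    for j, (xj, yj) in enumerate(pos):
        if j == i:
            continue
        dx = (trial[0] - xj + L / 2) % L - L / 2
        dy = (trial[1] - yj + L / 2) % L - L / 2
        if dx * dx + dy * dy < sigma * sigma:
            return True
    return False

def metropolis_sweep(pos, sigma, L, delta, rng):
    # One sweep: propose a random displacement for every disk in turn,
    # accepting only moves that create no overlap (hard-core rejection).
    accepted = 0
    for i in range(len(pos)):
        x, y = pos[i]
        trial = ((x + rng.uniform(-delta, delta)) % L,
                 (y + rng.uniform(-delta, delta)) % L)
        if not overlaps(pos, i, trial, sigma, L):
            pos[i] = trial
            accepted += 1
    return accepted

rng = random.Random(1)
L, sigma = 10.0, 1.0
# Dilute start on a grid so there are no initial overlaps.
pos = [(1.0 + 2.5 * (k % 4), 1.0 + 2.5 * (k // 4)) for k in range(8)]
acc = sum(metropolis_sweep(pos, sigma, L, 0.3, rng) for _ in range(100))
print(acc / (100 * len(pos)))  # fraction of accepted moves
```

Each rejected move here costs the same work as an accepted one and carries no time information, which is precisely the inefficiency that path-based, rejection-free formulations like the one in this paper avoid.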
Fluor Hanford Integrated Safety Management System Phase II Verification Vol 1 and Vol 2
Parsons, J E
2000-01-01
The U.S. Department of Energy (DOE) is committed to conducting work efficiently and in a manner that ensures protection of the workers, public, and environment. DOE policy mandates that safety management systems be used to systematically integrate safety into management and work practices at all levels while accomplishing mission goals in an effective and efficient manner. The purpose of the Fluor Hanford (FH) Integrated Safety Management System (ISMS) verification was to determine whether FH's ISM system and processes are sufficiently implemented to accomplish the goal of ''Do work safely.'' The purpose of the DOE, Richland Operations Office (RL) verification was to determine whether RL has established processes that adequately describe RL's role in safety management and if those processes are sufficiently implemented.
Face verification system for Android mobile devices using histogram based features
Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu
2016-07-01
This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by the built-in camera on the Android device, and then face detection is performed using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated as a binary Vector Quantization (VQ) histogram of DCT coefficients in the low-frequency domain, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate the proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
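A plain 8-neighbour LBP histogram (the basic variant, not the Improved LBP of the paper) can be sketched as follows, using a toy 4x4 grayscale image:

```python
def lbp_code(img, y, x):
    # Basic 8-neighbour Local Binary Pattern: threshold the neighbours
    # against the centre pixel and pack the bits clockwise from top-left.
    c = img[y][x]
    nbrs = [img[y-1][x-1], img[y-1][x], img[y-1][x+1], img[y][x+1],
            img[y+1][x+1], img[y+1][x], img[y+1][x-1], img[y][x-1]]
    code = 0
    for bit, v in enumerate(nbrs):
        if v >= c:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    # Histogram of LBP codes over all interior pixels, normalized to sum to 1.
    hist = [0] * 256
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            hist[lbp_code(img, y, x)] += 1
    total = sum(hist)
    return [h / total for h in hist]

img = [[10, 10, 10, 10],
       [10, 50, 50, 10],
       [10, 50, 50, 10],
       [10, 10, 10, 10]]
hist = lbp_histogram(img)
print(sum(hist))  # → 1.0
```

In a verification system, two such normalized histograms (one from the enrolled face, one from the probe) would be compared with a histogram distance and the score thresholded or fused, as the paper does by weighted averaging with the VQ histogram score.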
Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring
Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.
2015-01-01
Project presentation for the Game Changing Program Smart Book Release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.
Unmanned Aerial Systems in the Process of Juridical Verification of Cadastral Border
Rijsdijk, M.; van Hinsbergh, W. H. M.; Witteveen, W.; ten Buuren, G. H. M.; Schakelaar, G. A.; Poppinga, G.; van Persie, M.; Ladiges, R.
2013-08-01
Quite often in the verification of cadastral borders, the owners of the parcels involved are not able to attend at the appointed moment in time. New appointments have to be made in order to complete the verification process, and as a result costs and throughput times often grow beyond what is considered acceptable. To improve the efficiency of the verification process, an experiment was set up that refrains from the conventional terrestrial methods for border verification. The central research question was formulated as "How useful are Unmanned Aerial Systems in the juridical verification process of cadastral borders of ownership at het Kadaster in the Netherlands?". For the experiment, operational evaluations were executed at two different locations. The first operational evaluation took place at the Pyramid of Austerlitz, a flat area with a 30 m high pyramid built by troops of Napoleon, with low civilian attendance. Two subsequent evaluations were situated in a small neighbourhood in the city of Nunspeet, where the cadastral situation had recently changed following the construction of twenty new houses. Initially a mini-UAS of the KLPD was used to collect photo datasets with less than 1 cm spatial resolution. In a later stage the commercial service provider Orbit Gis was hired. During the experiment four different software packages were used for processing the photo datasets into accurate geo-referenced ortho-mosaics. This article describes the experiments in more detail. Attention is paid to the mini-UAS platforms (AscTec Falcon 8, Microdrone MD-4), the cameras used, the photo collection plan, the usage of ground control markers, and the calibration of the cameras. Furthermore, the results and experiences with the different SFM software packages used (Visual SFM/Bundler, PhotoScan, PhotoModeler and the Orbit software) are shared.
Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems
Esmaeil Zadeh Soudjani, S.
2014-01-01
Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality, m
Renewable Energy Certificate (REC) Tracking Systems: Costs & Verification Issues (Presentation)
Energy Technology Data Exchange (ETDEWEB)
Heeter, J.
2013-10-01
This document provides information on REC tracking systems: how they are used in the voluntary REC market, a comparison of REC systems fees and information regarding how they treat environmental attributes.
Formal modeling and verification of fractional order linear systems.
Zhao, Chunna; Shi, Likun; Guan, Yong; Li, Xiaojuan; Shi, Zhiping
2016-05-01
This paper presents a formalization of a fractional order linear system in a higher-order logic (HOL) theorem proving system. Based on the formalization of the Grünwald-Letnikov (GL) definition, we formally specify and verify the linear and superposition properties of fractional order systems. The proof provides a rigor and solid underpinnings for verifying concrete fractional order linear control systems. Our implementation in HOL demonstrates the effectiveness of our approach in practical applications.
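The Grünwald-Letnikov definition on which the formalization is based lends itself to a direct numerical sketch. The code below is an informal illustration of the GL operator and of the linearity property the paper verifies formally; the step size and test functions are arbitrary choices, and this is finite-precision arithmetic rather than a HOL proof.

```python
def gl_weights(alpha, n):
    """Grünwald-Letnikov coefficients w_k = (-1)^k * C(alpha, k),
    generated by the standard recurrence w_k = w_{k-1} * (1 - (alpha+1)/k)."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def gl_derivative(f, alpha, t, h=1e-3):
    """Numerical GL fractional derivative of order alpha at time t:
    D^alpha f(t) ~ h^(-alpha) * sum_{k=0..n} w_k * f(t - k*h)."""
    n = int(t / h)
    w = gl_weights(alpha, n)
    return sum(w[k] * f(t - k * h) for k in range(n + 1)) / h ** alpha
```

Linearity, D^alpha(a*f + b*g) = a*D^alpha(f) + b*D^alpha(g), holds here by construction of the sum; as a sanity check, the half-derivative of f(t) = t at t = 1 approaches the known closed form 2/sqrt(pi).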
Technical and programmatic constraints in dynamic verification of satellite mechanical systems
Stavrinidis, C.; Klein, M.; Brunner, O.; Newerla, A.
1996-01-01
The development and verification of satellite systems covers various programmatic options. In the mechanical systems area, spacecraft test verification options include static, shaker vibration, modal survey, thermoelastic, acoustic, impact and other environmental tests. Development and verification tests influence the provision of satellite hardware, e.g. the structural model, engineering model, flight model, postflight etc., which need to be adopted by projects. In particular, adequate understanding of the satellite dynamic characteristics is essential for flight acceptance by launcher authorities. In general, a satellite shaker vibration test is requested by launcher authorities for expendable launchers. For the latter the launcher/satellite interface is well defined at the launcher clampband/separation device, and the interface is considered conveniently as a single point at the centre of the clampband. Recently the need has been identified to refine the interface idealization in launcher/satellite coupled loads dynamic analysis, particularly in cases where concentrated satellite loads are introduced at the interface, e.g. platform support struts. In the case of shuttle payloads, which are attached directly to the shuttle, shaker vibration at a single interface is not meaningful. Shuttle launcher authorities require identification of the satellite dynamic characteristics, e.g. by modal survey, and structural verification can be demonstrated by analysis, testing or a combination of analysis and testing. In the case of large satellite systems, which cannot be tested due to the limitation of the vibration shaker test facilities, a similar approach can be adapted for expendable launchers. In such an approach the dynamic characteristics of the satellite system will be identified by the modal survey test, and detailed satellite verification/qualification will be accomplished by analysis supported by subsystem and component level tests. Mechanical strength verification
Verification measurements of the Karoo Array timing system: a laser radar based time transfer system
Siebrits, R.; Bauermeister, E.; Gamatham, R.; Adams, G.; Malan, J. A.; Burger, J. P.; Kapp, F.; Gibbon, T.; Kriel, H.; Abbott, T.
2016-02-01
An optical fiber based laser radar time transfer system has been developed for the 64-dish MeerKAT radio interferometer telescope project to provide accurate atomic time to the receivers of the telescope system. This time transfer system is called the Karoo Array Timing System (KATS). Calibration of the time transfer system is essential to ensure that time is accurately transferred to the digitisers that form part of the receivers. Frequency domain reflectometry via vector network analysers is also used to verify measurements taken using time interval counters. This paper details the progress made in the verification measurements of the system, undertaken to ensure that time, accurate to within a few nanoseconds of Coordinated Universal Time (UTC), is available at the point where radio signals from astronomical sources are received. This capability enables world-class transient and timing studies with a compact radio interferometer, which has inherent advantages over large single-dish radio telescopes in observing the transient sky.
Quasi-Monte Carlo methods for lattice systems. A first look
Energy Technology Data Exchange (ETDEWEB)
Jansen, K. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Leovey, H.; Griewank, A. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Mathematik; Nube, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Mueller-Preussker, M. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik
2013-02-15
We investigate the applicability of Quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^(-1/2), where N is the number of observations. By means of Quasi-Monte Carlo methods it is possible to improve this behavior for certain problems up to N^(-1). We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling.
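The N^(-1/2) versus N^(-1) scaling can be illustrated on a toy integral. The sketch below uses a base-2 van der Corput sequence as a stand-in for the QMC constructions actually used in the paper, and a one-dimensional integrand instead of a lattice system; none of the specific choices come from the source.

```python
import random

def van_der_corput(n, base=2):
    """First n points of the base-2 van der Corput low-discrepancy sequence."""
    pts = []
    for i in range(1, n + 1):
        x, denom = 0.0, 1.0
        while i > 0:
            denom *= base
            x += (i % base) / denom
            i //= base
        pts.append(x)
    return pts

def estimate(points, f):
    """Sample mean of f over the given points: an estimate of its integral."""
    return sum(f(x) for x in points) / len(points)

random.seed(0)
N = 4096
f = lambda x: x * x          # integral over [0, 1] is exactly 1/3
mc = estimate([random.random() for _ in range(N)], f)   # plain Monte Carlo
qmc = estimate(van_der_corput(N), f)                    # quasi-Monte Carlo
mc_err, qmc_err = abs(mc - 1 / 3), abs(qmc - 1 / 3)
```

With N = 4096 points, the low-discrepancy estimate of the integral is typically considerably closer to 1/3 than the plain Monte Carlo estimate, reflecting the faster error scaling.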
Energy Technology Data Exchange (ETDEWEB)
Albright, N; Bergstrom, P M; Daly, T P; Descalle, M; Garrett, D; House, R K; Knapp, D K; May, S; Patterson, R W; Siantar, C L; Verhey, L; Walling, R S; Welczorek, D
1999-07-01
PEREGRINE is a 3D Monte Carlo dose calculation system designed to serve as a dose calculation engine for clinical radiation therapy treatment planning systems. Taking advantage of recent advances in low-cost computer hardware, modern multiprocessor architectures and optimized Monte Carlo transport algorithms, PEREGRINE performs mm-resolution Monte Carlo calculations in times that are reasonable for clinical use. PEREGRINE has been developed to simulate radiation therapy for several source types, including photons, electrons, neutrons and protons, for both teletherapy and brachytherapy. However the work described in this paper is limited to linear accelerator-based megavoltage photon therapy. Here we assess the accuracy, reliability, and added value of 3D Monte Carlo transport for photon therapy treatment planning. Comparisons with clinical measurements in homogeneous and heterogeneous phantoms demonstrate PEREGRINE's accuracy. Studies with variable tissue composition demonstrate the importance of material assignment on the overall dose distribution. Detailed analysis of Monte Carlo results provides new information for radiation research by expanding the set of observables.
Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system
Energy Technology Data Exchange (ETDEWEB)
Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon
2000-06-01
MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for the application of the code system to nuclear reactor core analysis and design. The verification was performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. The benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.
Verification and Validation of Neural Networks for Aerospace Systems
Mackall, Dale; Nelson, Stacy; Schumann, Johann
2002-01-01
The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: Overview of Adaptive Systems and V&V Processes/Methods.
Automation Comparison Procedure for Verification of Hybrid Systems.
1997-11-01
Lecture Notes in Computer Science vol. 999, Springer-Verlag (1995). [2] Bowler, O., Grotzky, J., Nielson, M., Nilson, S., and Van Buren, J. ... Remmel, J.B., "Feedback Derivations: Near Optimal Controls for Hybrid Systems", to appear in Hybrid Systems III, Springer Lecture Notes in Computer Science. [6] ... Grossman, R.L., Nerode, A., Ravn, A., and Rischel, H., eds., Hybrid Systems, Lecture Notes in Computer
VERIFICATION OF TORSIONAL OSCILLATING MECHANICAL SYSTEM DYNAMIC CALCULATION RESULTS
2014-01-01
In our department we deal with optimization and tuning of torsional oscillating mechanical systems. When solving these problems we often use the results of dynamic calculation. The goal of this article is to compare values obtained by computation and experiment. For this purpose, a mechanical system built in our laboratory was used. At first, a classical HARDY-type flexible coupling was applied in the system; then we used a pneumatic flexible shaft coupling developed by us...
Construction and Verification of a Simple Smooth Chaotic System
Institute of Scientific and Technical Information of China (English)
[No author listed]
2007-01-01
This article introduces a new chaotic system of three-dimensional quadratic autonomous ordinary differential equations, which can display different attractors with two unstable equilibrium points and four unstable equilibrium points, respectively. The dynamical properties of this system are then studied. Furthermore, by applying the undetermined coefficient method, a heteroclinic orbit of Shil'nikov's type is found in this system, and the convergence of the series expansions of this heteroclinic orbit is proved. Shil'nikov's theorem guarantees that the system has Smale horseshoes and horseshoe chaos.
Energy Technology Data Exchange (ETDEWEB)
Budnikov, D. (Dmitry); Bulatov, M. (Mikhail); Jarikhine, I. (Igor); Lebedev, B. (Boris); Livke, A. (Alexander); Modenov, A.; Morkin, A. (Anton); Razinkov, S. (Sergei); Tsaregorodtsev, D. (Dmitry); Vlokh, A. (Andrey); Yakovleva, S. (Svetlana); Elmont, T. H. (Timothy H.); Langner, D. C. (Diana C.); MacArthur, D. W. (Duncan W.); Mayo, D. R. (Douglas R.); Smith, M. K. (Morag K.); Luke, S. J. (S. John)
2005-01-01
An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and a gamma-spectrometry system based on a high-purity germanium gamma detector (nominal relative efficiency 50% at 1332 keV) and a DSPEC Plus digital gamma-ray spectrometer. The neutron multiplicity counter is a three-ring counter with 164 ³He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.
Energy Technology Data Exchange (ETDEWEB)
Budnikov, D; Bulatov, M; Jarikhine, I; Lebedev, B; Livke, A; Modenov, A; Morkin, A; Razinkov, S; Safronov, S; Tsaregorodtsev, D; Vlokh, A; Yakovleva, S; Elmont, T; Langner, D; MacArthur, D; Mayo, D; Smith, M; Luke, S J
2005-05-27
An attribute verification system (AVNG) with information barriers for mass and isotopics measurements has been designed and its fabrication is nearly completed. The AVNG is being built by scientists at the Russian Federal Nuclear Center-VNIIEF, with support of Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). Such a system could be used to verify the presence of several unclassified attributes of classified material with no classified information release. The system is comprised of a neutron multiplicity counter and a gamma-spectrometry system based on a high-purity germanium gamma detector (nominal relative efficiency 50% at 1332 keV) and a DSPEC Plus digital gamma-ray spectrometer. The neutron multiplicity counter is a three-ring counter with 164 ³He tubes. The system was designed to measure prototype containers 491 mm in diameter and 503 mm high. This paper provides a brief history of the project and documents the progress of this effort with drawings and photographs.
Seismic monitoring: a unified system for research and verifications
Energy Technology Data Exchange (ETDEWEB)
Thigpen, L.
1979-02-06
A system for characterizing either a seismic source or geologic media from observational data was developed. This resulted from an examination of the forward and inverse problems of seismology. The system integrates many seismic monitoring research efforts into a single computational capability. Its main advantage is that it unifies computational and research efforts in seismic monitoring. 173 references, 9 figures, 3 tables.
Verification of Continuous Dynamical Systems by Timed Automata
DEFF Research Database (Denmark)
Sloth, Christoffer; Wisniewski, Rafael
2011-01-01
This paper presents a method for abstracting continuous dynamical systems by timed automata. The abstraction is based on partitioning the state space of a dynamical system using positive invariant sets, which form cells that represent locations of a timed automaton. The abstraction is intended to...
Active Learning of Markov Decision Processes for System Verification
DEFF Research Database (Denmark)
Chen, Yingke; Nielsen, Thomas Dyhre
2012-01-01
of input/output observations. While alleviating the problem of manually constructing a system model, the collection/generation of observed system behaviors can also prove demanding. Consequently we seek to minimize the amount of data required. In this paper we propose an algorithm for learning...
MOVES - A tool for Modeling and Verification of Embedded Systems
DEFF Research Database (Denmark)
Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid;
2007-01-01
We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed...... of processing elements, the size of local memories, and the operating systems (scheduling algorithm)....
Modeling and Verification of Reconfigurable and Energy-Efficient Manufacturing Systems
Directory of Open Access Journals (Sweden)
Jiafeng Zhang
2015-01-01
This paper deals with the formal modeling and verification of reconfigurable and energy-efficient manufacturing systems (REMSs), which are considered reconfigurable discrete event control systems. A REMS not only allows global reconfigurations for switching the system from one configuration to another, but also allows local reconfigurations on components for saving energy when the system is in a particular configuration. In addition, the unreconfigured components of such a system should continue running during any reconfiguration. As a result, during a system reconfiguration, the system may have several possible paths and may fail to meet control requirements if concurrent reconfiguration events and normal events are not controlled. To guarantee the safety and correctness of such complex systems, formal verification is of great importance during the system design stage. This paper extends the formalism of reconfigurable timed net condition/event systems (R-TNCESs) in order to model all possible dynamic behavior in such systems. The designed system, based on extended R-TNCESs, is then verified with the help of the software tool SESA for functional, temporal, and energy-efficiency properties. The approach is illustrated with an automatic assembly system.
Fisher, Marcus S.; Northey, Jeffrey; Stanton, William
2014-01-01
The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IVV) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.
Implementation and verification of a HELIAS module for the systems code PROCESS
Energy Technology Data Exchange (ETDEWEB)
Warmer, F., E-mail: Felix.Warmer@ipp.mpg.de [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany); Beidler, C.D.; Dinklage, A.; Egorov, K.; Feng, Y.; Geiger, J. [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany); Kemp, R.; Knight, P. [Culham Centre for Fusion Energy, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Schauer, F.; Turkin, Y. [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany); Ward, D. [Culham Centre for Fusion Energy, Abingdon, Oxfordshire OX14 3DB (United Kingdom); Wolf, R.; Xanthopoulos, P. [Max Planck Institute for Plasma Physics, D-17491 Greifswald (Germany)
2015-10-15
Highlights: • The implementation of a HELIAS module in the systems code PROCESS is discussed. • Verification w.r.t. W7-X and its performance predictions yields very good agreement. • The generality of the HELIAS models allows, with minor adaptations, the modeling of tokamaks. • Verification with respect to a tokamak DEMO test case shows very good agreement. - Abstract: In order to study design points of next-step fusion devices such as DEMO, comprehensive systems codes are commonly employed. The code package PROCESS is such a tool, widely used for tokamak systems studies. In this work, the implementation and verification of a HELIAS module in PROCESS is addressed. The HELIAS models include: a plasma geometry model based on Fourier coefficients, a basic island divertor model, as well as a coil model which combines scaling aspects based on the Helias 5-B reactor design with analytic inductance and field calculations. The models are verified first with respect to W7-X. Second, the generality of the models is used to represent a tokamak, which is compared against the original tokamak PROCESS models using a DEMO design as the reference case. Both approaches show very good agreement.
VERIFICATION OF TORSIONAL OSCILLATING MECHANICAL SYSTEM DYNAMIC CALCULATION RESULTS
Directory of Open Access Journals (Sweden)
Peter KAŠŠAY
2014-09-01
In our department we deal with optimization and tuning of torsional oscillating mechanical systems. When solving these problems we often use the results of dynamic calculation. The goal of this article is to compare values obtained by computation and experiment. For this purpose, a mechanical system built in our laboratory was used. At first, a classical HARDY-type flexible coupling was applied in the system; then we used a pneumatic flexible shaft coupling developed by us. The main difference of these couplings from conventional flexible couplings is that they can change their dynamic properties during operation by changing the pressure of the gaseous medium in their flexible elements.
Evaluation of the Deployable Seismic Verification System at the Pinedale Seismic Research Facility
Energy Technology Data Exchange (ETDEWEB)
Carr, D.B.
1993-08-01
The intent of this report is to examine the performance of the Deployable Seismic Verification System (DSVS) developed by the Department of Energy (DOE) through its national laboratories to support monitoring of underground nuclear test treaties. A DSVS was installed at the Pinedale Seismic Research Facility (PSRF) near Boulder, Wyoming during 1991 and 1992. This includes a description of the system and the deployment site. System performance was studied by looking at four areas: system noise, seismic response, state of health (SOH) and operational capabilities.
A Verification and Validation Tool for Diagnostic Systems Project
National Aeronautics and Space Administration — Advanced diagnostic systems have the potential to improve safety, increase availability, and reduce maintenance costs in aerospace vehicle and a variety of other...
Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film
Energy Technology Data Exchange (ETDEWEB)
Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)
2013-12-15
Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human-head-shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm, and slice thickness 1 mm. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB code. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates (>90%) for a tolerance criterion of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film.
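The gamma index analysis used in the film studies above can be sketched in one dimension. This is a simplified global gamma computation (nearest-grid-point search, no sub-grid interpolation) over a made-up dose profile; clinical tools evaluate full 2D/3D distributions.

```python
import math

def gamma_index_1d(measured, calculated, spacing_mm,
                   dose_crit=0.03, dta_mm=3.0):
    """1D global gamma analysis: for each measured point, minimize over
    calculated points the combined dose-difference / distance-to-agreement
    metric. dose_crit is a fraction of the maximum calculated dose."""
    d_ref = dose_crit * max(calculated)
    gammas = []
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dc in enumerate(calculated):
            dist = (i - j) * spacing_mm
            g2 = (dist / dta_mm) ** 2 + ((dm - dc) / d_ref) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1 (the usual pass criterion)."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

For the 1%/1 mm criterion reported in the paper one would pass dose_crit=0.01 and dta_mm=1.0; the pass rate is the percentage of points with gamma at most 1.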
Crew Exploration Vehicle Potable Water System Verification Description
Tuan, George; Peterson, Laurie J.; Vega, Leticia M.
2010-01-01
A stored water system on the crew exploration vehicle (CEV) will supply the crew with potable water for: drinking and food rehydration, hygiene, medical needs, sublimation, and various contingency situations. The current baseline biocide for the stored water system is ionic silver, similar in composition to the biocide used to maintain the quality of the water, transferred from the orbiter to the International Space Station, stored in contingency water containers. In the CEV water system, a depletion of the ionic silver biocide is expected due to ionic silver-plating onto the surfaces of materials within the CEV water system, thus negating its effectiveness as a biocide. Because this may be the first time NASA is considering a stored water system for long-term missions that do not maintain a residual biocide, a team of experts in materials compatibility, biofilms and point-of-use filters, surface treatment and coatings, and biocides has been created to pinpoint concerns and perform the testing that will help alleviate concerns related to the CEV water system.
Computer program uses Monte Carlo techniques for statistical system performance analysis
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
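The approach described (propagating the full statistics of component disturbances and misalignments through to overall system performance by simulated random sampling) can be sketched as below. The performance model, the distributions, and the spec limit are all hypothetical placeholders, not taken from the program described.

```python
import random

def system_performance(offsets):
    """Hypothetical figure of merit: the root-sum-square of the
    individual component misalignments."""
    return sum(o * o for o in offsets) ** 0.5

def simulate(n_trials=20000, n_components=4, sigma=0.5, limit=1.5, seed=1):
    """Sample each component's misalignment from its full (Gaussian)
    distribution and estimate the probability the system stays in spec."""
    rng = random.Random(seed)
    in_spec = 0
    for _ in range(n_trials):
        offsets = [rng.gauss(0.0, sigma) for _ in range(n_components)]
        if system_performance(offsets) <= limit:
            in_spec += 1
    return in_spec / n_trials

p = simulate()
```

With these assumed numbers the estimated in-spec probability is close to the analytic value P(chi-square with 4 degrees of freedom <= 9), about 0.94, and simulated random sampling gives an unbiased estimate of it.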
Memory Efficient Data Structures for Explicit Verification of Timed Systems
DEFF Research Database (Denmark)
Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand
2014-01-01
Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of tim...
Formal Verification of the Danish Railway Interlocking Systems
DEFF Research Database (Denmark)
Vu, Linh Hong; Haxthausen, Anne Elisabeth; Peleska, Jan
2014-01-01
in the new Danish interlocking systems. Instantiating the generic model with interlocking configuration data results in a concrete model and high-level safety properties. Using bounded model checking and inductive reasoning, we are able to verify safety properties for model instances corresponding to railway...
Structural Dynamics Verification of Rotorcraft Comprehensive Analysis System (RCAS)
Energy Technology Data Exchange (ETDEWEB)
Bir, G. S.
2005-02-01
The Rotorcraft Comprehensive Analysis System (RCAS) was acquired and evaluated as part of an ongoing effort by the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to provide state-of-the-art wind turbine modeling and analysis technology for Government and industry. RCAS is an interdisciplinary tool offering aeroelastic modeling and analysis options not supported by current codes. RCAS was developed during a 4-year joint effort among the U.S. Army's Aeroflightdynamics Directorate, Advanced Rotorcraft Technology Inc., and the helicopter industry. The code draws heavily from its predecessor 2GCHAS (Second Generation Comprehensive Helicopter Analysis System), which required an additional 14 years to develop. Though developed for the rotorcraft industry, its general-purpose features allow it to model or analyze a general dynamic system. Its key feature is a specialized finite element that can model spinning flexible parts. The code, therefore, appears particularly suited for wind turbines, whose dynamics are dominated by massive flexible spinning rotors. In addition to the simulation capability of the existing codes, RCAS [1-3] offers a range of unique capabilities, including aeroelastic stability analysis, trim, state-space modeling, operating modes, modal reduction, multi-blade coordinate transformation, periodic-system-specific analysis, choice of aerodynamic models, and a controls design/implementation graphical interface.
First verification of generic fidelity recovery in a dynamical system
Pineda, C; Schäfer, R; Seligman, T H; Pineda, Carlos; Prosen, Tomaz; Schaefer, Rudi; Seligman, Thomas H.
2006-01-01
We study the time evolution of fidelity in a dynamical many body system, namely a kicked Ising model, modified to allow for a time reversal invariance breaking. We find good agreement with the random matrix predictions in the realm of strong perturbations. In particular for the time-reversal symmetry breaking case the predicted revival at Heisenberg time is clearly seen.
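Fidelity decay of the kind studied here can be illustrated on a small system. The sketch below uses a dense random-matrix Hamiltonian as a stand-in for the kicked Ising chain, with an arbitrary perturbation strength; it only demonstrates the quantity F(t) = |<psi| U'(t)† U(t) |psi>|², not the paper's model or its random matrix predictions.

```python
import numpy as np

def evolve(H, psi, t):
    """Exact evolution psi(t) = exp(-i H t) psi via eigendecomposition
    of the Hermitian matrix H."""
    vals, vecs = np.linalg.eigh(H)
    return vecs @ (np.exp(-1j * vals * t) * (vecs.conj().T @ psi))

rng = np.random.default_rng(0)
dim = 8
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H = (A + A.conj().T) / 2                     # unperturbed Hamiltonian
V = rng.normal(size=(dim, dim)); V = (V + V.T) / 2   # Hermitian perturbation
eps = 0.05                                   # arbitrary perturbation strength

psi = np.zeros(dim, complex); psi[0] = 1.0   # initial state
fidelity = [abs(np.vdot(evolve(H + eps * V, psi, t), evolve(H, psi, t))) ** 2
            for t in np.linspace(0.0, 5.0, 6)]
```

The fidelity starts at 1 at t = 0 and stays in [0, 1] by the Cauchy-Schwarz inequality; how fast it decays depends on the perturbation strength eps.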
External Verification of SCADA System Embedded Controller Firmware
2012-03-01
HMI: human machine interface; RTU: remote terminal unit; ISA: International Society for Automation; DOE: Department of Energy. ...interface (HMI). Field devices, such as PLCs or remote terminal units (RTUs), automate physical system operation by implementing digital control
Design, development and verification of the HIFI Alignment Camera System
Boslooper, E.C.; Zwan, B.A. van der; Kruizinga, B.; Lansbergen, R.
2005-01-01
This paper presents the TNO share of the development of the HIFI Alignment Camera System (HACS), covering the opto-mechanical and thermal design. The HACS is an Optical Ground Support Equipment (OGSE) that was specifically developed to verify proper alignment of different modules of the HIFI instrument.
Models and formal verification of multiprocessor system-on-chips
DEFF Research Database (Denmark)
Brekling, Aske Wiid; Hansen, Michael Reichhardt; Madsen, Jan
2008-01-01
In this article we develop a model for applications running on multiprocessor platforms. An application is modelled by task graphs and a multiprocessor system is modelled by a number of processing elements, each capable of executing tasks according to a given scheduling discipline. We present a d...... could verify a smart-phone application consisting of 103 tasks executing on 4 processing elements....
Towards the Verification of Safety-critical Autonomous Systems in Dynamic Environments
Directory of Open Access Journals (Sweden)
Adina Aniculaesei
2016-12-01
There is an increasing necessity to deploy autonomous systems in highly heterogeneous, dynamic environments, e.g. service robots in hospitals or autonomous cars on highways. Due to the uncertainty in these environments, the verification results obtained with respect to the system and environment models at design time might not be transferable to the system behavior at run time. For autonomous systems operating in dynamic environments, safety of motion and collision avoidance are critical requirements. With regard to these requirements, Macek et al. [6] define the passive safety property, which requires that no collision can occur while the autonomous system is moving. To verify this property, we adopt a two-phase process which combines static verification methods, used at design time, with dynamic ones, used at run time. In the design phase, we exploit UPPAAL to formalize the autonomous system and its environment as timed automata and the safety property as a TCTL formula, and to verify the correctness of these models with respect to this property. For the runtime phase, we build a monitor to check whether the assumptions made at design time also hold at run time. If the current system observations of the environment do not correspond to the initial system assumptions, the monitor sends feedback to the system and the system enters a passive safe state.
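The runtime-monitoring idea described above reduces to a small core: compare run-time observations against a design-time assumption and flag violations. A toy sketch, with a hypothetical maximum-obstacle-speed assumption (the threshold and names are illustrative, not from the paper):

```python
# Hypothetical design-time assumption: obstacles never move faster than 2 m/s.
ASSUMED_MAX_SPEED = 2.0  # m/s

def monitor(observed_speeds):
    """Return indices of observations that violate the design-time assumption.

    A real monitor would run online and trigger a transition to a passive
    safe state; here we just report which observations break the assumption.
    """
    return [i for i, v in enumerate(observed_speeds) if v > ASSUMED_MAX_SPEED]

# Third observation (2.5 m/s) violates the assumption
violations = monitor([1.2, 1.8, 2.5, 0.9])
```

In a deployed system the non-empty `violations` result would feed back to the controller, prompting the switch to the passive safe state.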
Exact Safety Verification of Hybrid Systems Using Sums-Of-Squares Representation
Lin, Wang; Yang, Zhengfeng; Zeng, Zhenbing
2011-01-01
In this paper we discuss how to generate inductive invariants for safety verification of hybrid systems. A hybrid symbolic-numeric method is presented to compute inequality inductive invariants of the given systems. A numerical invariant of the given system can be obtained by solving a parameterized polynomial optimization problem via sum-of-squares (SOS) relaxation, and a method based on Gauss-Newton refinement and rational vector recovery is deployed to obtain invariants with rational coefficients, which exactly satisfy the conditions of invariants. Several examples are given to illustrate our algorithm.
Simulated coal gas MCFC power plant system verification. Final report
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-07-30
The objective of the main project is to identify the current developmental status of MCFC systems and address those technical issues that need to be resolved to move the technology from its current status to the demonstration stage in the shortest possible time. The specific objectives are separated into five major tasks as follows: Stack research; Power plant development; Test facilities development; Manufacturing facilities development; and Commercialization. This Final Report discusses the M-C Power Corporation effort, which is part of a general program for the development of commercial MCFC systems. This final report covers the entire subject of the Unocal 250-cell stack. Certain project activities have been funded by organizations other than DOE and are included in this report to provide a comprehensive overview of the work accomplished.
A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment
Energy Technology Data Exchange (ETDEWEB)
Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)
2013-09-15
Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
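The gamma analysis used above combines a dose-difference criterion with a distance-to-agreement criterion: a point passes when some nearby reference point agrees within both tolerances. A minimal 1-D sketch with hypothetical Gaussian dose profiles and a brute-force search (a simplification for illustration, not the authors' implementation):

```python
import numpy as np

def gamma_index_1d(ref, meas, positions, dose_tol=0.03, dist_tol=3.0):
    """Brute-force 1-D gamma index with global dose normalization.

    ref, meas: dose arrays sampled on the same grid; positions in mm.
    dose_tol: fractional dose criterion (0.03 means 3% of max reference dose);
    dist_tol: distance-to-agreement criterion in mm.
    """
    norm = dose_tol * ref.max()  # global normalization: 3% of max dose
    gammas = np.empty_like(meas)
    for i, (x_m, d_m) in enumerate(zip(positions, meas)):
        dd = (ref - d_m) / norm            # dose-difference term
        dx = (positions - x_m) / dist_tol  # distance term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas

# Hypothetical profiles: a measurement shifted 1 mm from the reference
x = np.linspace(-50, 50, 101)            # positions in mm, 1 mm spacing
ref = np.exp(-(x / 30.0) ** 2)           # reference dose profile
meas = np.exp(-((x - 1.0) / 30.0) ** 2)  # shifted "measured" profile
g = gamma_index_1d(ref, meas, x)
passing_rate = 100.0 * np.mean(g <= 1.0)  # percent of points with gamma <= 1
```

A 1 mm shift is well inside the 3%/3 mm tolerance, so every point passes; clinical implementations add interpolation, 2-D/3-D search, and dose thresholds on top of this core idea.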
Energy Technology Data Exchange (ETDEWEB)
Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)
1995-03-01
This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.
Directory of Open Access Journals (Sweden)
He Deyu
2016-09-01
Assessing the risks of steering system faults in underwater vehicles is a human-machine-environment (HME) systematic safety field that studies faults in the steering system itself, the driver's human reliability (HR) and various environmental conditions. This paper proposes a fault risk assessment method for an underwater vehicle steering system based on virtual prototyping and Monte Carlo simulation. A virtual steering system prototype was established and validated to rectify a lack of historic fault data. Fault injection and simulation were conducted to acquire fault simulation data. A Monte Carlo simulation was adopted that integrated randomness due to the human operator and environment. Randomness and uncertainty of the human, machine and environment were integrated in the method to obtain a probabilistic risk indicator. To verify the proposed method, a case of stuck rudder fault (SRF) risk assessment was studied. This method may provide a novel solution for fault risk assessment of a vehicle or other general HME system.
Energy Technology Data Exchange (ETDEWEB)
Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.
2013-07-01
The acceptance for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy output of the linear electron accelerator head and performs the dose calculation with an X-ray voxel algorithm (XVMC) based on the Monte Carlo method. (Author)
A system for deduction-based formal verification of workflow-oriented software models
Directory of Open Access Journals (Sweden)
Klimek Radosław
2014-12-01
The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as logical primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
Verification of ARES transport code system with TAKEDA benchmarks
Zhang, Liang; Zhang, Bin; Zhang, Penghe; Chen, Mengteng; Zhao, Jingchang; Zhang, Shun; Chen, Yixue
2015-10-01
Neutron transport modeling and simulation are central to many areas of nuclear technology, including reactor core analysis, radiation shielding and radiation detection. In this paper the series of TAKEDA benchmarks is modeled to verify the criticality calculation capability of ARES, a discrete ordinates neutral particle transport code system. The SALOME platform is coupled with ARES to provide geometry modeling and mesh generation functions. The Koch-Baker-Alcouffe parallel sweep algorithm is applied to accelerate the traditional transport calculation process. The results show that the eigenvalues calculated by ARES are in excellent agreement with the reference values presented in NEACRP-L-330, with a difference of less than 30 pcm except for the first case of model 3. Additionally, ARES provides accurate flux distributions compared to reference values, with a deviation of less than 2% for region-averaged fluxes in all cases. These results confirm the feasibility of the ARES-SALOME coupling and demonstrate that ARES performs well in criticality calculations.
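The eigenvalue deviations quoted above are expressed in pcm (1 pcm = 10⁻⁵). A small helper showing one common convention, Δk/k_ref, with hypothetical eigenvalues rather than the benchmark's actual numbers (note that some groups instead report reactivity differences, (1/k_ref − 1/k_calc)·10⁵):

```python
def diff_pcm(k_calc, k_ref):
    """Eigenvalue deviation in pcm (1 pcm = 1e-5), using the dk/k_ref convention."""
    return (k_calc - k_ref) / k_ref * 1e5

# Hypothetical eigenvalues, for illustration only (not benchmark values):
dev = diff_pcm(1.00020, 1.00000)  # 20 pcm
```

Agreement "within 30 pcm" thus corresponds to |Δk/k| below 3×10⁻⁴, a tight criterion for criticality calculations.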
The CMS Monte Carlo Production System: Development and Design
Energy Technology Data Exchange (ETDEWEB)
Evans, D. [Fermi National Accelerator Laboratory, Batavia, IL (United States)], E-mail: evansde@fnal.gov; Fanfani, A. [Universita degli Studi di Bologna and INFN Sezione di Bologna, Bologna (Italy); Kavka, C. [INFN Sezione di Trieste, Trieste (Italy); Lingen, F. van [California Institute of Technology, Pasadena, CA (United States); Eulisse, G. [Northeastern University, Boston, MA (United States); Bacchi, W.; Codispoti, G. [Universita degli Studi di Bologna and INFN Sezione di Bologna, Bologna (Italy); Mason, D. [Fermi National Accelerator Laboratory, Batavia, IL (United States); De Filippis, N. [INFN Sezione di Bari, Bari (Italy); Hernandez, J.M. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas, Madrid (Spain); Elmer, P. [Princeton University, Princeton, NJ (United States)
2008-03-15
The CMS production system has undergone a major architectural upgrade from its predecessor, with the goal of reducing the operational manpower needed and preparing for the large scale production required by the CMS physics plan. The new production system is a tiered architecture that facilitates robust and distributed production request processing and takes advantage of the multiple Grid and farm resources available to the CMS experiment.
Fuzzy Controllers for a Gantry Crane System with Experimental Verifications
Directory of Open Access Journals (Sweden)
Naif B. Almutairi
2016-01-01
The control problem of gantry cranes has attracted the attention of many researchers because of the various applications of these cranes in the industry. In this paper we propose two fuzzy controllers to control the position of the cart of a gantry crane while suppressing the swing angle of the payload. Firstly, we propose a dual PD fuzzy controller where the parameters of each PD controller change as the cart moves toward its desired position, while maintaining a small swing angle of the payload. This controller uses two fuzzy subsystems. Then, we propose a fuzzy controller which is based on heuristics. The rules of this controller are obtained taking into account the knowledge of an experienced crane operator. This controller is unique in that it uses only one fuzzy system to achieve the control objective. The validity of the designed controllers is tested through extensive MATLAB simulations as well as experimental results on a laboratory gantry crane apparatus. The simulation results as well as the experimental results indicate that the proposed fuzzy controllers work well. Moreover, the simulation and the experimental results demonstrate the robustness of the proposed control schemes against output disturbances as well as against uncertainty in some of the parameters of the crane.
2011-08-01
Failure Modes and Effects Analysis (FMEA) ensures that only the most robust designs are promoted to the final step. Functional verification, performance verification, and FMEA are performed on the most promising candidate designs. Using FFA, a multitude of failure scenarios can be quickly generated, much like the FMEA process that is often performed at the end of the design cycle.
Verification of a probabilistic flood forecasting system for an Alpine Region of northern Italy
Laiolo, P.; Gabellani, S.; Rebora, N.; Rudari, R.; Ferraris, L.; Ratto, S.; Stevenin, H.
2012-04-01
Probabilistic hydrometeorological forecasting chains are increasingly becoming an operational tool used by civil protection centres for issuing flood alerts. One of the most important requests of decision makers is to have reliable systems; for this reason, an accurate verification of their predictive performance becomes essential. The aim of this work is to validate a probabilistic flood forecasting system, Flood-PROOFS. The system has worked in real time since 2008 in an alpine region of northern Italy, Valle d'Aosta. It is used by the regional Civil Protection service to issue warnings and by the local water company to protect its facilities. Flood-PROOFS uses as input the Quantitative Precipitation Forecast (QPF) derived from the Italian limited area model meteorological forecast (COSMO-I7) and forecasts issued by regional expert meteorologists. Furthermore, the system manages and uses both real-time meteorological and satellite data and real-time data on the maneuvers performed by the water company on dams and river devices. The main outputs produced by the computational chain are deterministic and probabilistic discharge forecasts at different cross sections of the considered river network. The validation of the flood prediction system has been conducted over a 25-month period considering different statistical methods such as the Brier score, rank histograms and verification scores. The results highlight good performance of the system as a support system for issuing warnings, but there is a lack of statistics especially for large discharge events.
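The Brier score used in the validation above is the mean squared difference between forecast probabilities and binary outcomes (0 is a perfect score, 1 the worst). A short sketch with hypothetical alert probabilities, not data from the study:

```python
import numpy as np

def brier_score(forecast_probs, outcomes):
    """Mean squared difference between forecast probability and outcome (0/1)."""
    forecast_probs = np.asarray(forecast_probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    return np.mean((forecast_probs - outcomes) ** 2)

# Hypothetical flood-alert probabilities vs. observed flood occurrence (1 = flood)
probs = [0.9, 0.7, 0.2, 0.1, 0.8]
obs = [1, 1, 0, 0, 1]
score = brier_score(probs, obs)  # lower is better
```

Operational verification typically decomposes this score further (reliability, resolution, uncertainty) and complements it with rank histograms, as the abstract mentions.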
Verification of hyperbolicity for attractors of some mechanical systems with chaotic dynamics
Kuznetsov, Sergey P.; Kruglov, Vyacheslav P.
2016-03-01
Computer verification of hyperbolicity is provided based on statistical analysis of the angles of intersection of stable and unstable manifolds for mechanical systems with hyperbolic attractors of Smale-Williams type: (i) a particle sliding on a plane under periodic kicks, (ii) interacting particles moving on two alternately rotating disks, and (iii) a string with parametric excitation of standing-wave patterns by a modulated pump. The examples are of interest as contributing to filling the hyperbolic theory of dynamical systems with physical content.
Monte Carlo analysis of the accelerator-driven system at Kyoto University Research Reactor Institute
Energy Technology Data Exchange (ETDEWEB)
Kim, Won Kyeong; Lee, Deok Jung [Nuclear Engineering Division, Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of); Lee, Hyun Chul [VHTR Technology Development Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Pyeon, Cheol Ho [Nuclear Engineering Science Division, Kyoto University Research Reactor Institute, Osaka (Japan); Shin, Ho Cheol [Core and Fuel Analysis Group, Korea Hydro and Nuclear Power Central Research Institute, Daejeon (Korea, Republic of)
2016-04-15
An accelerator-driven system consists of a subcritical reactor and a controllable external neutron source. The reactor in an accelerator-driven system can sustain fission reactions in a subcritical state using an external neutron source, which is an intrinsic safety feature of the system. The system can provide efficient transmutations of nuclear wastes such as minor actinides and long-lived fission products and generate electricity. Recently at Kyoto University Research Reactor Institute (KURRI; Kyoto, Japan), a series of reactor physics experiments was conducted with the Kyoto University Critical Assembly and a Cockcroft-Walton type accelerator, which generates the external neutron source by deuterium-tritium reactions. In this paper, neutronic analyses of a series of experiments have been re-estimated by using the latest Monte Carlo code and nuclear data libraries. This feasibility study is presented through the comparison of Monte Carlo simulation results with measurements.
Energy Technology Data Exchange (ETDEWEB)
Alves-Foss, J.; Levitt, K.
1991-01-01
In this paper we present a generalization of McCullough's restrictiveness model as the basis for proving security properties about distributed system designs. We mechanize this generalization and an event-based model of computer systems in the HOL (Higher Order Logic) system to prove the composability of the model and several other properties about the model. We then develop a set of generalized classes of system components and show for which families of user views they satisfy the model. Using these classes we develop a collection of general system components that are instantiations of one of these classes and show that the instantiations also satisfy the security property. We then conclude with a sample distributed secure system, based on the Rushby and Randell distributed system design and built using our collection of components, and show how our mechanized verification system can be used to verify such designs. 16 refs., 20 figs.
A Markov Chain Monte Carlo Based Method for System Identification
Energy Technology Data Exchange (ETDEWEB)
Glaser, R E; Lee, C L; Nitao, J J; Hanley, W G
2002-10-22
This paper describes a novel methodology for the identification of mechanical systems and structures from vibration response measurements. It combines prior information, observational data and predictive finite element models to produce configurations and system parameter values that are most consistent with the available data and model. Bayesian inference and a Metropolis simulation algorithm form the basis for this approach. The resulting process enables the estimation of distributions of both individual parameters and system-wide states. Attractive features of this approach include its ability to: (1) provide quantitative measures of the uncertainty of a generated estimate; (2) function effectively when exposed to degraded conditions including: noisy data, incomplete data sets and model misspecification; (3) allow alternative estimates to be produced and compared, and (4) incrementally update initial estimates and analysis as more data becomes available. A series of test cases based on a simple fixed-free cantilever beam is presented. These results demonstrate that the algorithm is able to identify the system, based on the stiffness matrix, given applied force and resultant nodal displacements. Moreover, it effectively identifies locations on the beam where damage (represented by a change in elastic modulus) was specified.
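The Bayesian inference with a Metropolis algorithm described above can be illustrated on a toy identification problem: inferring a spring stiffness from noisy displacement measurements under a flat positive prior. All values and names here are hypothetical; this is a sketch of the general approach, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear system: displacement d = F / k, with measurement noise
k_true, sigma = 2.0, 0.05
forces = np.array([1.0, 2.0, 3.0, 4.0])
disps = forces / k_true + rng.normal(0.0, sigma, forces.size)

def log_post(k):
    """Log posterior: Gaussian likelihood, flat prior on k > 0."""
    if k <= 0:
        return -np.inf
    resid = disps - forces / k
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampler
k, samples = 1.0, []
lp = log_post(k)
for _ in range(20000):
    prop = k + rng.normal(0.0, 0.1)        # propose a nearby stiffness
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
        k, lp = prop, lp_prop
    samples.append(k)

est = np.mean(samples[5000:])  # posterior mean after burn-in
```

The retained samples approximate the posterior distribution of the stiffness, so credible intervals (the "quantitative measures of uncertainty" in the abstract) come for free from the same chain.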
MONTE CARLO METHOD AND APPLICATION IN @RISK SIMULATION SYSTEM
Directory of Open Access Journals (Sweden)
Gabriela Ižaríková
2015-12-01
The article presents an example of using the @Risk simulation software, designed for simulation in Microsoft Excel spreadsheets, and demonstrates its use as a universal method of solving problems. Simulation means experimenting with computer models based on a real production process in order to optimize the production processes or the system. A simulation model allows performing a number of experiments, analysing them, evaluating, optimizing and afterwards applying the results to the real system. A simulation model in general represents the modelled system by using mathematical formulations and logical relations. In the model it is possible to distinguish controlled inputs (for instance, investment costs) and random inputs (for instance, demand), which the model transforms into outputs (for instance, the mean value of profit). In a simulation experiment, the controlled inputs are chosen at the beginning and the random (stochastic) inputs are generated randomly. Simulations belong to the quantitative tools which can be used as a support for decision making.
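The controlled-input / random-input / output decomposition described above can be reproduced without @Risk using plain Monte Carlo sampling. A sketch with hypothetical cost, margin, and demand-distribution parameters (all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Controlled inputs (chosen by the decision maker): cost and unit margin
investment = 1000.0
unit_margin = 2.5

# Random input: demand, modeled here as a normal distribution
n_trials = 100_000
demand = rng.normal(loc=500.0, scale=80.0, size=n_trials)

# Output: profit in each simulated trial (demand cannot be negative)
profit = unit_margin * np.clip(demand, 0.0, None) - investment

mean_profit = profit.mean()     # expected profit over all trials
p_loss = np.mean(profit < 0.0)  # estimated probability of a loss
```

Varying the controlled inputs and re-running the simulation is exactly the experimentation loop the abstract describes: each run turns assumed input distributions into an output distribution that supports the decision.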
Monte Carlo Simulation of Magnetic System in the Tsallis Statistics
1999-01-01
We apply the Broad Histogram Method to an Ising system in the context of the recently reformulated Generalized Thermostatistics, and we claim it to be a very efficient simulation tool for this non-extensive statistics. Results are obtained for the nearest-neighbour version of the Ising model for a range of values of the $q$ parameter of Generalized Thermostatistics. We found evidence that the 2D Ising model does not undergo phase transitions at finite temperatures except for the extensive ...
Verification of Relational Data-Centric Dynamic Systems with External Services
Hariri, Babak Bagheri; De Giacomo, Giuseppe; Deutsch, Alin; Montali, Marco
2012-01-01
Data-centric dynamic systems are systems where both the process controlling the dynamics and the manipulation of data are equally central. In this paper we study verification of (first-order) mu-calculus variants over relational data-centric dynamic systems, where data are represented by a full-fledged relational database, and the process is described in terms of atomic actions that evolve the database. The execution of such actions may involve calls to external services, providing fresh data inserted into the system. As a result such systems are typically infinite-state. We show that verification is undecidable in general, and we isolate notable cases, where decidability is achieved. Specifically we start by considering service calls that return values deterministically (depending only on passed parameters). We show that in a mu-calculus variant that preserves knowledge of objects appeared along a run we get decidability under the assumption that the fresh data introduced along a run are bounded, though they...
A comprehensive and real-time fingerprint verification system for embedded devices
Yeung, Hoi Wo; Moon, Yiu Sang; Chen, Jiansheng; Chan, Fai; Ng, Yuk Man; Chung, Hin Shun; Pun, Kwok Ho
2005-03-01
This paper describes an embedded multi-user login system based on fingerprint recognition. The system, built using the Sitsang development board and embedded Linux, implements all fingerprint acquisition, preprocessing, minutia extraction, matching, identification, user registration, and template encryption on the board. By careful analysis of the accuracy requirement as well as the arithmetic precision to be used, we optimized the algorithms so that the whole system can work in real time in the embedded environment based on the Intel(R) PXA255 processor. The fingerprint verification, which is the core part of the system, was fully tested on a fingerprint database consisting of 1149 fingerprint images. The result shows that we can achieve an accuracy of more than 95%. Field testing with 20 registered users has further proved the reliability of our system. The core part of our system, the embedded fingerprint authentication, can also be applied in many different embedded applications concerning security problems.
Energy Technology Data Exchange (ETDEWEB)
Abramov, B. M. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Alekseev, P. N. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Borodin, Yu. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Bulychjov, S. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Dukhovskoy, I. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Krutenkova, A. P. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Martemianov, M. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Matsyuk, M. A. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Turdakina, E. N. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Khanov, A. I. [Inst. of Theoretical and Experimental Physics (ITEP), Moscow (Russian Federation); Mashnik, Stepan Georgievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-03
Momentum spectra of hydrogen isotopes have been measured at 3.5° from ^{12}C fragmentation on a Be target. Momentum spectra cover both the region of fragmentation maximum and the cumulative region. Differential cross sections span five orders of magnitude. The data are compared to predictions of four Monte Carlo codes: QMD, LAQGSM, BC, and INCL++. There are large differences between the data and predictions of some models in the high momentum region. The INCL++ code gives the best and almost perfect description of the data.
Space Shuttle production verification motor 1 (PV-1) field joint protection system, volume 7
Wilkinson, J. P.
1990-01-01
The performance of the field joint protection system (FJPS) of the Space Shuttle Production Verification Motor 1 (PV-1) is evaluated by postfire hardware inspection. Compliance with the specifications is shown for the FJPS assembly and components. The simplified FJPS and field joint heaters performed nominally, maintaining all joint seal temperatures within the required range. One anomaly was noted on the igniter-to-case joint heater during postfire inspection. The heater buckled off the surface in two areas, resulting in two hot spots on the heater and darkened heater insulation. The condition did not affect heater performance during ignition countdown and all igniter seals were maintained within required temperature limits.
System Design and In-orbit Verification of the HJ-1-C SAR Satellite
Directory of Open Access Journals (Sweden)
Zhang Run-ning
2014-06-01
HJ-1-C is a SAR satellite in the Chinese Environment and Natural Disaster Monitoring constellation, and works together with the optical satellites HJ-1-A/B for monitoring the environment and natural disasters. In this paper, the system design and characteristics of the first Chinese civil SAR satellite are described. In addition, the interface relation between the SAR payload and the platform is studied. Meanwhile, the data transmission capability, attitude, power, and temperature control that support SAR imaging are reviewed. Finally, the corresponding in-orbit verification results are presented.
Design of the software development and verification system (SWDVS) for shuttle NASA study task 35
Drane, L. W.; Mccoy, B. J.; Silver, L. W.
1973-01-01
An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.
Gan, Zecheng
2013-01-01
Computer simulation with Monte Carlo is an important tool to investigate the function and equilibrium properties of many biological and soft matter systems dissolved in solvents. The appropriate treatment of long-range electrostatic interaction is essential for these charged systems, but remains a challenging problem for large-scale simulations. We have developed an efficient Barnes-Hut treecode algorithm for electrostatic evaluation in Monte Carlo simulations of Coulomb many-body systems. The algorithm is based on a divide-and-conquer strategy and fast update of the octree data structure in each trial move through a local adjustment procedure. We test the accuracy of the tree algorithm, and apply it to computer simulations of the electric double layer near a spherical interface. It has been shown that the computational cost of the Monte Carlo method with treecode acceleration scales as $\log N$ in each move. For a typical system with ten thousand particles, by using the new algorithm, the speed has b...
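The evaluation step of such a treecode can be sketched as follows. This is a monopole-only toy version with an assumed opening angle THETA = 0.5; it is not the authors' implementation, and it omits the paper's key contribution (the local octree update per trial move):

```python
import math
import random

THETA = 0.5  # opening angle: smaller is more accurate, larger is faster

class Cell:
    """One node of the octree; stores the monopole (total charge) and the
    charge-weighted centre of the particles it contains."""
    def __init__(self, center, half, particles):
        self.half = half
        self.q = sum(q for _, q in particles)
        if self.q != 0.0:
            self.com = tuple(sum(p[i] * q for p, q in particles) / self.q
                             for i in range(3))
        else:
            self.com = center
        self.children = []
        if len(particles) > 1 and half > 1e-9:
            h = half / 2.0
            octants = {}
            for p, q in particles:
                key = tuple(p[i] >= center[i] for i in range(3))
                octants.setdefault(key, []).append((p, q))
            for key, group in octants.items():
                cc = tuple(center[i] + (h if key[i] else -h) for i in range(3))
                self.children.append(Cell(cc, h, group))

def potential(cell, x):
    """Approximate Coulomb potential at x (Gaussian units, monopole only)."""
    d = math.dist(x, cell.com)
    # far away (or a single particle): treat the whole cell as a point charge
    if d > 0.0 and (not cell.children or (2.0 * cell.half) / d < THETA):
        return cell.q / d
    return sum(potential(c, x) for c in cell.children)

random.seed(7)
parts = [((random.random(), random.random(), random.random()), 1.0)
         for _ in range(200)]
root = Cell((0.5, 0.5, 0.5), 0.5, parts)
tree_phi = potential(root, (3.0, 3.0, 3.0))
direct_phi = sum(q / math.dist((3.0, 3.0, 3.0), p) for p, q in parts)
```

For well-separated targets the monopole term alone is already accurate; production treecodes add higher multipole moments plus the local octree update that keeps the per-move cost at O(log N).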
Coarse-grained stochastic processes and Monte Carlo simulations in lattice systems
Katsoulakis, M A; Vlachos, D G
2003-01-01
In this paper we present a new class of coarse-grained stochastic processes and Monte Carlo simulations, derived directly from microscopic lattice systems and describing mesoscopic length scales. As our primary example, we mainly focus on a microscopic spin-flip model for the adsorption and desorption of molecules between a surface adjacent to a gas phase, although a similar analysis carries over to other processes. The new model can capture large scale structures, while retaining microscopic information on intermolecular forces and particle fluctuations. The requirement of detailed balance is utilized as a systematic design principle to guarantee correct noise fluctuations for the coarse-grained model. We carry out a rigorous asymptotic analysis of the new system using techniques from large deviations and present detailed numerical comparisons of coarse-grained and microscopic Monte Carlo simulations. The coarse-grained stochastic algorithms provide large computational savings without increasing programming ...
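The microscopic starting point named in the abstract (a spin-flip lattice-gas model of adsorption/desorption) can be sketched with ordinary Metropolis dynamics, which satisfy the detailed-balance requirement the authors use as a design principle. Parameters and lattice size here are illustrative, and the coarse-graining step itself is not reproduced:

```python
import math
import random

def run_lattice_gas(L, beta, mu, J, sweeps, seed):
    """Metropolis adsorption/desorption on an L x L lattice gas:
    H = -J * sum_<ij> n_i n_j - mu * sum_i n_i, with n_i in {0, 1}.
    Single-site flips with Metropolis acceptance satisfy detailed balance.
    Returns the mean coverage over the second half of the run."""
    rng = random.Random(seed)
    n = [[0] * L for _ in range(L)]
    cov_acc, samples = 0.0, 0
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            s = (n[(i + 1) % L][j] + n[(i - 1) % L][j]
                 + n[i][(j + 1) % L] + n[i][(j - 1) % L])
            # energy change for flipping n_ij -> 1 - n_ij
            d_e = (1 - 2 * n[i][j]) * (-J * s - mu)
            if d_e <= 0.0 or rng.random() < math.exp(-beta * d_e):
                n[i][j] = 1 - n[i][j]
        if sweep >= sweeps // 2:
            cov_acc += sum(map(sum, n)) / (L * L)
            samples += 1
    return cov_acc / samples

# non-interacting check (J = 0): exact coverage is 1 / (1 + exp(-beta * mu))
coverage = run_lattice_gas(L=16, beta=1.0, mu=2.0, J=0.0, sweeps=400, seed=3)
```

The J = 0 case has a closed-form coverage, which makes it a convenient correctness check before turning on interactions.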
Evaluation of the material assignment method used by a Monte Carlo treatment planning system.
Isambert, A; Brualla, L; Lefkopoulos, D
2009-12-01
An evaluation of the conversion process from Hounsfield units (HU) to material composition in computerised tomography (CT) images, employed by the Monte Carlo based treatment planning system ISOgray (DOSIsoft), is presented. A boundary in the HU for the material conversion between "air" and "lung" materials was determined based on a study using 22 patients. The dosimetric consequence of the new boundary was quantitatively evaluated for a lung patient plan.
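A HU-to-material conversion of the kind evaluated here reduces to thresholding the CT number, with the air/lung boundary as the tunable parameter the authors studied. The threshold values below are illustrative, not ISOgray's actual table:

```python
def material_from_hu(hu, air_lung_boundary=-950):
    """Map a CT number (in Hounsfield units) to a simulation material.

    The air/lung boundary is the parameter studied in the paper; all
    threshold values here are illustrative assumptions."""
    if hu < air_lung_boundary:
        return "air"
    if hu < -200:
        return "lung"
    if hu < 200:
        return "soft tissue"
    return "bone"
```

Shifting `air_lung_boundary` reclassifies voxels near the boundary, which is exactly the dosimetric sensitivity the paper quantifies for a lung patient plan.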
Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit
Sartori, John
2005-01-01
The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rule out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Description Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction, which allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.
The gyroscope testbed: A verification of the gravity probe B suspension system
Brumley, Robert Willard
The verification of precision control systems for use in space-based applications can be extremely challenging. Often, the presence of the 1-g field substantively changes the control problem, making it impossible to test directly on the Earth. This talk discusses a new approach to testing and verification of the gyroscope suspension system for the Gravity Probe B (GP-B) experimental test of General Relativity. The verification approach involves the creation of a new testbed that has the same input-output characteristics and dynamics as a GP-B gyroscope. This involves real physical hardware that moves like a real gyroscope, allowing the suspension system's performance to be measured directly without the need to break any internal connections or bypass internal subsystems. The user is free to define any set of disturbances, from 1-g ground levitation to a 10^-8 g science mission. The testbed has two main subsystems. The mechanical subsystem is comprised of six parallel plate capacitors whose spacing is controlled by precision actuators. These actuators are the physical interface to the suspension system and create the electrode-rotor capacitances present in a real gyroscope. The closed-loop positioning noise of the system is approximately 10 pm/√Hz, enabling the commanding of position variations a fraction of the size of a single atom of silicon. The control subsystem has a DSP-based high-speed nonlinear controller that forces the actuators to follow the dynamics of a gyroscope. The device has been shown to faithfully represent a gyroscope in 1-g levitation, and a robustness analysis has been performed to prove that it correctly tests the stability of the on-orbit system. The testbed is then used to directly measure suspension system performance in a variety of on-orbit scenarios. Gyroscope levitation in 10^-8 g conditions is demonstrated. The robustness of gyroscope levitation to transient disturbances such as micrometeorite impacts on the space vehicle and transitions
Institute of Scientific and Technical Information of China (English)
范文玎; 孙光耀; 张彬航; 陈锐; 郝丽娟
2016-01-01
Burnup calculation plays an important role in reactor design and analysis. Compared with traditional point-burnup algorithms, the Chebyshev rational approximation method (CRAM) offers both high computation speed and high accuracy. Based on the Super Monte Carlo Simulation Program for Nuclear and Radiation Process (SuperMC), a preliminary study and verification of Monte Carlo burnup calculation was carried out using CRAM and a bucket-sort energy lookup method. A fuel-rod burnup problem and the IAEA-ADS (International Atomic Energy Agency-Accelerator Driven Systems) international benchmark preliminarily verified the correctness of the burnup calculation method, and the IAEA-ADS tests showed that, compared with the unified energy grid method, the bucket-sort energy lookup reduces memory overhead while preserving calculation efficiency. Background: Burnup calculation is the key point of reactor design and analysis. It is significant to calculate the burnup situation and isotopic atom density accurately while a reactor is being designed. Purpose: Based on the Monte Carlo particle simulation code SuperMC, this paper aimed to conduct a preliminary study and verification of Monte Carlo burnup calculations. Methods: For accuracy, this paper adopted the Chebyshev rational approximation method (CRAM) as the point-burnup algorithm. Moreover, instead of the union energy grids method, this paper adopted an energy searching method based on a bucket sort algorithm, which reduced the memory overhead on the condition that the calculation efficiency is ensured. Results: By calculating the fuel rod burnup problem and the IAEA-ADS international benchmark, the simulation results were basically consistent with Serpent and other countries' results, respectively. In addition, the bucket sort energy searching method reduced about 95% storage space compared with the union energy grids method for IAEA
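The bucket-sort energy lookup idea can be sketched as follows: a uniform coarse binning over the sorted union grid is precomputed once, so each cross-section lookup searches only one small bin instead of bisecting the full grid. Names and grid sizes are illustrative, not SuperMC's code:

```python
import bisect
import random

def build_buckets(grid, n_buckets):
    """For a uniform coarse binning of [grid[0], grid[-1]], precompute the
    index of the first fine-grid point falling in each coarse bin."""
    lo, width = grid[0], (grid[-1] - grid[0]) / n_buckets
    idx = [bisect.bisect_left(grid, lo + k * width) for k in range(n_buckets + 1)]
    return idx, lo, width

def bucket_lookup(grid, idx, lo, width, e):
    """Return i with grid[i] <= e < grid[i+1], searching only one coarse bin
    instead of bisecting the whole union grid."""
    k = min(int((e - lo) / width), len(idx) - 2)
    return bisect.bisect_right(grid, e, idx[k], idx[k + 1]) - 1

random.seed(5)
grid = sorted(random.uniform(1e-5, 20.0) for _ in range(500))  # union energy grid
idx, lo, width = build_buckets(grid, 64)
queries = [random.uniform(grid[0], grid[-1] * 0.999999) for _ in range(200)]
```

The memory trade-off in the paper comes from storing one small index array per coarse bin instead of replicating every nuclide's data on a single unified fine grid.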
Energy Technology Data Exchange (ETDEWEB)
SHOOP, D.S.
1999-09-10
The Department of Energy policy (DOE P 450.4) is that safety is integrated into all aspects of the management and operations of its facilities. In simple and straightforward terms, the Department will "Do work safely." The purpose of this River Protection Project (RPP) Integrated Safety Management System (ISMS) Phase II Verification was to determine whether ISMS programs and processes are implemented within RPP to accomplish the goal of "Do work safely." The goal of an implemented ISMS is to have a single integrated system that includes Environment, Safety, and Health (ES&H) requirements in the work planning and execution processes to ensure the protection of the worker, public, environment, and federal property over the RPP life cycle. The ISMS is comprised of the (1) described functions, components, processes, and interfaces (system map or blueprint) and (2) personnel who are executing those assigned roles and responsibilities to manage and control the ISMS. Therefore, this review evaluated both the "paper" and "people" aspects of the ISMS to ensure that the system is implemented within RPP. Richland Operations Office (RL) conducted an ISMS Phase I Verification of the TWRS from September 28-October 9, 1998. The resulting verification report recommended that TWRS-RL and the contractor proceed with Phase II of ISMS verification given that the concerns identified from the Phase I verification review are incorporated into the Phase II implementation plan.
Quantitative Safety: Linking Proof-Based Verification with Model Checking for Probabilistic Systems
Ndukwu, Ukachukwu
2009-01-01
This paper presents a novel approach for augmenting proof-based verification with performance-style analysis of the kind employed in state-of-the-art model checking tools for probabilistic systems. Quantitative safety properties, usually specified as probabilistic system invariants and modeled in proof-based environments, are evaluated using bounded model checking techniques. Our specific contributions include the statement of a theorem that is central to model checking safety properties of proof-based systems, the establishment of a procedure, and its full implementation in a prototype system (YAGA) which readily transforms a probabilistic model specified in a proof-based environment to its equivalent verifiable PRISM model equipped with reward structures. The reward structures capture the exact interpretation of the probabilistic invariants and can reveal succinct information about the model during experimental investigations. Finally, we demonstrate the novelty of the technique on a probabilistic library cas...
Truth in Complex Adaptive Systems Models Should BE Based on Proof by Constructive Verification
Shipworth, David
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. `Emergent' properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Requirement Assurance: A Verification Process
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Proceedings 7th International Workshop on Automated Specification and Verification of Web Systems
Kovacs, Laura; Tiezzi, Francesco
2011-01-01
This volume contains the final and revised versions of the papers presented at the 7th International Workshop on Automated Specification and Verification of Web Systems (WWV 2011). The workshop was held in Reykjavik, Iceland, on June 9, 2011, as part of DisCoTec 2011. The aim of the WWV workshop series is to provide an interdisciplinary forum to facilitate the cross-fertilization and the advancement of hybrid methods that exploit concepts and tools drawn from rule-based programming, software engineering, formal methods and Web-oriented research. Nowadays, indeed, many companies and institutions have turned their Web sites into interactive, completely automated, Web-based applications for, e.g., e-business, e-learning, e-government, and e-health. The increased complexity and the explosive growth of Web systems have made their design and implementation a challenging task. Systematic, formal approaches to their specification and verification make it possible to address the problems of this specific domain by means o...
Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method
Directory of Open Access Journals (Sweden)
Shaoyun Ge
2014-01-01
In this paper we treat the reliability assessment problem of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: one simulation program for low penetration and one for high penetration. The load shedding strategy and the simulation process are introduced in detail for each FMEA process. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
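A minimal sequential Monte Carlo reliability sketch in the spirit of the paper: series components alternate exponential up-times and repair times, and the load point is down while any component is down. DG modelling and load shedding are omitted, and all names and rates are illustrative:

```python
import random

def unavailability(components, horizon, seed):
    """Sequential Monte Carlo for a series-connected feeder section.

    components: list of (failure_rate, repair_rate) pairs (per year);
    returns the fraction of the horizon during which the load point is down."""
    rng = random.Random(seed)
    outages = []
    for lam, mu in components:
        t = 0.0
        while True:
            t += rng.expovariate(lam)          # time to next failure
            if t >= horizon:
                break
            d = rng.expovariate(mu)            # repair duration
            outages.append((t, min(t + d, horizon)))
            t += d
    # merge overlapping outages so shared downtime is not double-counted
    outages.sort()
    down, end = 0.0, 0.0
    for s, e in outages:
        if e > end:
            down += e - max(s, end)
            end = e
    return down / horizon

# single-component check: analytic unavailability is lam / (lam + mu) = 0.1
u_est = unavailability([(1.0, 9.0)], horizon=5000.0, seed=11)
```

The single-component case has a closed-form answer, which makes a convenient sanity check before layering on network topology, DG islanding, and load shedding.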
Comparing Subspace Methods for Closed Loop Subspace System Identification by Monte Carlo Simulations
Directory of Open Access Journals (Sweden)
David Di Ruscio
2009-10-01
A novel and promising bootstrap subspace system identification algorithm for both open and closed loop systems is presented. An outline of the SSARX algorithm by Jansson (2003) is given and a modified SSARX algorithm is presented. Some methods which are consistent for closed loop subspace system identification presented in the literature are discussed and compared to a recently published subspace algorithm which works for both open and closed loop data, i.e., the DSR_e algorithm, as well as the bootstrap method. Experimental comparisons are performed by Monte Carlo simulations.
Velazquez, L.; Castro-Palacio, J. C.
2013-07-01
Recently, Velazquez and Curilef proposed a methodology to extend Monte Carlo algorithms based on a canonical ensemble which aims to overcome slow sampling problems associated with temperature-driven discontinuous phase transitions. We show in this work that Monte Carlo algorithms extended with this methodology also exhibit a remarkable efficiency near a critical point. Our study is performed for the particular case of a two-dimensional four-state Potts model on a square lattice with periodic boundary conditions. This analysis reveals that the extended version of Metropolis importance sampling is more efficient than the usual Swendsen-Wang and Wolff cluster algorithms. These results demonstrate the effectiveness of this methodology to improve the efficiency of MC simulations of systems that undergo any type of temperature-driven phase transition.
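The baseline the authors compare against, Metropolis importance sampling of the two-dimensional four-state Potts model with periodic boundaries, can be sketched directly. Lattice size, seed, and sweep counts below are illustrative:

```python
import math
import random

def potts_energy(s, L, J=1.0):
    """H = -J * sum over nearest-neighbour bonds of delta(s_i, s_j)."""
    e = 0.0
    for i in range(L):
        for j in range(L):
            if s[i][j] == s[(i + 1) % L][j]:
                e -= J
            if s[i][j] == s[i][(j + 1) % L]:
                e -= J
    return e

def metropolis_sweep(s, L, beta, rng, q=4, J=1.0):
    """One Metropolis sweep: propose a new state at a random site and
    accept with probability min(1, exp(-beta * dE))."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        old, new = s[i][j], rng.randrange(q)
        if new == old:
            continue
        d_e = 0.0
        for ni, nj in (((i + 1) % L, j), ((i - 1) % L, j),
                       (i, (j + 1) % L), (i, (j - 1) % L)):
            d_e += J * ((s[ni][nj] == old) - (s[ni][nj] == new))
        if d_e <= 0.0 or rng.random() < math.exp(-beta * d_e):
            s[i][j] = new

rng = random.Random(1)
L = 8
spins = [[rng.randrange(4) for _ in range(L)] for _ in range(L)]
e_start = potts_energy(spins, L)
for _ in range(300):
    metropolis_sweep(spins, L, beta=2.0, rng=rng)  # well below the critical point
e_end = potts_energy(spins, L)
```

Near the q = 4 critical point this single-flip dynamics slows down dramatically, which is precisely the regime where the extended-ensemble methodology of the paper, and cluster algorithms such as Swendsen-Wang and Wolff, are compared.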
Takahashi, F; Endo, A
2007-01-01
A system utilising radiation transport codes has been developed to derive accurate dose distributions in a human body for radiological accidents. A suitable model is essential for a numerical analysis. Therefore, two tools were developed to set up a 'problem-dependent' input file, defining a radiation source and an exposed person, to simulate the radiation transport in an accident with the Monte Carlo calculation codes MCNP and MCNPX. For both tools, the necessary resources are defined through a dialogue procedure on an ordinary personal computer. The tools prepare human body and source models described in the input file format of the employed Monte Carlo codes. The tools were validated for dose assessment in comparison with a past criticality accident and a hypothesized exposure.
The grout/glass performance assessment code system (GPACS) with verification and benchmarking
Energy Technology Data Exchange (ETDEWEB)
Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.
1994-12-01
GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) glass performance assessment and many other applications, including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.
Vibratory response modeling and verification of a high precision optical positioning system.
Energy Technology Data Exchange (ETDEWEB)
Barraza, J.; Kuzay, T.; Royston, T. J.; Shu, D.
1999-06-18
A generic vibratory-response modeling program has been developed as a tool for designing high-precision optical positioning systems. Based on multibody dynamics theory, the system is modeled as rigid-body structures connected by linear elastic elements, such as complex actuators and bearings. The full dynamic properties of each element are determined experimentally or theoretically, then integrated into the program as inertial and stiffness matrices. Utilizing this program, the theoretical and experimental verification of the vibratory behavior of a double-multilayer monochromator support and positioning system is presented. Results of parametric design studies that investigate the influence of support floor dynamics and highlight important design issues are also presented. Overall, good matches between theory and experiment demonstrate the effectiveness of the program as a dynamic modeling tool.
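The multibody idea of assembling stiffness and inertia matrices and extracting vibratory modes can be illustrated on the smallest non-trivial case, a two-mass chain, where det(K - w^2 M) = 0 reduces to a quadratic in w^2. The matrices and values are illustrative, not the monochromator model:

```python
import math

def two_dof_frequencies(m1, m2, k1, k2):
    """Natural frequencies of a two-mass chain: wall -k1- m1 -k2- m2.

    Solves det(K - w^2 M) = 0 with
    K = [[k1 + k2, -k2], [-k2, k2]] and M = diag(m1, m2), which expands to
    m1*m2*lam^2 - (m2*(k1 + k2) + m1*k2)*lam + k1*k2 = 0 in lam = w^2."""
    a = m1 * m2
    b = -(m2 * (k1 + k2) + m1 * k2)
    c = k1 * k2
    disc = math.sqrt(b * b - 4.0 * a * c)
    lams = sorted([(-b - disc) / (2.0 * a), (-b + disc) / (2.0 * a)])
    return [math.sqrt(lam) for lam in lams]

# equal masses and springs: lam = (3 -/+ sqrt(5))/2, so w = 0.618..., 1.618...
w = two_dof_frequencies(1.0, 1.0, 1.0, 1.0)
```

A real model like the one in the paper assembles experimentally measured element stiffness and inertia matrices into much larger K and M and solves the same generalized eigenproblem numerically.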
Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G
2014-08-01
In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification has been assessed using a laser tracking device. The accuracy in calibration and image based setup verification relying on the in room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and the patient verification system motion was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the optical tracking system (OTS) were found consistent with the expectations, with peak values below 0.3 mm. Quality assurance tests, daily performed before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
So, S.J.; Kim, J.; Kim, J.H.
2004-01-01
This paper presents a neural network based verification method in an HMM-based online Korean handwriting recognition system. It penalizes unreasonable grapheme hypotheses and complements global and structural information to the HMM-based recognition system, which is intrinsically based on local inf
The Environmental Technology Verification report discusses the technology and performance of the Static Pac System, Phase II, natural gas reciprocating compressor rod packing manufactured by the C. Lee Cook Division, Dover Corporation. The Static Pac System is designed to seal th...
Energy Technology Data Exchange (ETDEWEB)
L. M. Dittmer
2007-04-26
The 1607-F3 waste site is the former location of the sanitary sewer system that supported the 182-F Pump Station, the 183-F Water Treatment Plant, and the 151-F Substation. The sanitary sewer system included a septic tank, drain field, and associated pipeline, all in use between 1944 and 1965. In accordance with this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
An independent system for real-time dynamic multileaf collimation trajectory verification using EPID
Fuangrod, Todsaporn; Woodruff, Henry C.; Rowshanfarzad, Pejman; O'Connor, Daryl J.; Middleton, Richard H.; Greer, Peter B.
2014-01-01
A new tool has been developed to verify the trajectory of dynamic multileaf collimators (MLCs) used in advanced radiotherapy techniques, using only the image frames measured by the electronic portal imaging device (EPID). The prescribed leaf positions are resampled to a higher resolution in a pre-processing stage to improve the verification precision. Measured MLC positions are extracted from the EPID frames using a template matching method. A cosine similarity metric is then applied to synchronise measured and planned leaf positions for comparison. Three additional comparison functions were incorporated to ensure robust synchronisation. The MLC leaf trajectory error detection was simulated for both intensity modulated radiation therapy (IMRT) (prostate) and volumetric modulated arc therapy (VMAT) (head-and-neck) deliveries with anthropomorphic phantoms in the beam. The overall accuracy for MLC positions automatically extracted from EPID image frames was approximately 0.5 mm. The MLC leaf trajectory verification system can detect leaf position errors during IMRT and VMAT with a tolerance of 3.5 mm within 1 s.
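The cosine-similarity synchronisation step can be sketched as a sliding-window match between measured frames and the resampled plan; the signals and offset below are synthetic illustrations, not the authors' data:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length position sequences."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu > 0.0 and nv > 0.0 else 0.0

def synchronise(planned, measured):
    """Slide the measured frames along the resampled plan and return the
    offset whose window has the highest cosine similarity."""
    best_k, best = 0, -2.0
    for k in range(len(planned) - len(measured) + 1):
        c = cosine(measured, planned[k:k + len(measured)])
        if c > best:
            best, best_k = c, k
    return best_k

plan = [math.sin(0.9 * t) + 0.05 * t for t in range(40)]  # one leaf's planned trajectory
frames = plan[7:19]                                        # EPID-derived positions
offset = synchronise(plan, frames)
```

Once measured and planned sequences are aligned at the best offset, per-frame leaf position differences can be compared against a tolerance, mirroring the paper's 3.5 mm error detection.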
Wieslander, Elinore; Knöös, Tommy
2003-10-01
An increasing number of patients receiving radiation therapy have metallic implants such as hip prostheses. Therefore, beams are normally set up to avoid irradiation through the implant; however, this cannot always be accomplished. In such situations, knowledge of the accuracy of the used treatment planning system (TPS) is required. Two algorithms, the pencil beam (PB) and the collapsed cone (CC), are implemented in the studied TPS. Comparisons are made with Monte Carlo simulations for 6 and 18 MV. The studied materials are steel, CoCrMo, Orthinox® (a stainless steel alloy and registered trademark of Stryker Corporation), TiAlV and Ti. Monte Carlo simulated depth dose curves and dose profiles are compared to CC and PB calculated data. The CC algorithm shows overall a better agreement with Monte Carlo than the PB algorithm. Thus, it is recommended to use the CC algorithm to get the most accurate dose calculation both for the planning target volume and for tissues adjacent to the implants when beams are set up to pass through implants.
DEFF Research Database (Denmark)
Vu, Linh Hong
This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data. The proposed method is based on a combination of formal methods and domain-specific approaches. While formal methods offer mathematically rigorous specification, verification and validation, domain-specific approaches encapsulate the use of formal methods with familiar concepts and notions of the domain, hence making the method easy for the railway engineers to use. Furthermore, the method features a 4-step verification and validation approach that can be integrated naturally into different phases of the software development process. This 4-step approach identifies possible errors in generic applications...
Watanabe, Hiroshi; Yukawa, Satoshi; Novotny, M A; Ito, Nobuyasu
2006-08-01
We construct asymptotic arguments for the relative efficiency of rejection-free Monte Carlo (RFMC) methods compared to the standard MC method. We find that the efficiency is proportional to exp(const·β) in the Ising, √β in the classical XY, and β in the classical Heisenberg spin systems with inverse temperature β, regardless of the dimension. The efficiency in hard particle systems is also obtained, and found to be proportional to (ρ_cp − ρ)^(−d), with the closest packing density ρ_cp, density ρ, and dimension d of the systems. We construct and implement a rejection-free Monte Carlo method for the hard-disk system. The RFMC has a greater computational efficiency at high densities, and the density dependence of the efficiency is as predicted by our arguments.
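A rejection-free (n-fold way) sketch for the simplest spin case, a periodic 1D Ising chain, shows the two ingredients the efficiency arguments rest on: every step flips a spin chosen with probability proportional to its Metropolis rate, and averages are weighted by the residence time 1/Σp. The hard-disk variant of the paper is not reproduced, and all parameters are illustrative:

```python
import math
import random

def rfmc_ising_energy(N, beta, steps, seed, J=1.0):
    """Rejection-free (n-fold way) MC for a periodic 1D Ising chain.

    Returns the time-weighted mean energy per spin; the exact infinite-chain
    value is -J * tanh(beta * J)."""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(N)]
    e = -J * sum(s[i] * s[(i + 1) % N] for i in range(N))

    def d_e(i):
        # energy change if spin i flips (periodic boundaries)
        return 2.0 * J * s[i] * (s[i - 1] + s[(i + 1) % N])

    t_tot, e_acc = 0.0, 0.0
    for step in range(steps):
        rates = [min(1.0, math.exp(-beta * d_e(i))) for i in range(N)]
        total = sum(rates)
        if step >= steps // 3:            # discard a burn-in third
            t_tot += 1.0 / total          # residence time before the next flip
            e_acc += e / total
        r = rng.random() * total          # tower sampling of the flipped spin
        for i, p in enumerate(rates):
            r -= p
            if r <= 0.0:
                break
        e += d_e(i)                       # evaluate before flipping
        s[i] = -s[i]
    return e_acc / t_tot / N

energy_per_spin = rfmc_ising_energy(N=32, beta=0.5, steps=6000, seed=2)
```

At low temperatures almost every standard-MC proposal is rejected, so choosing directly among the accepted moves and stretching the clock by the residence time is where the exp(const·β) efficiency gain in the Ising case comes from.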
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; D'Errico, Francesco
2016-09-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications.
Directory of Open Access Journals (Sweden)
José Meseguer
2010-09-01
Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.
Verification of the CFD simulation system SAUNA for complex aircraft configurations
Shaw, Jonathon A.; Peace, Andrew J.; May, Nicholas E.; Pocock, Mark F.
1994-04-01
This paper is concerned with the verification for complex aircraft configurations of an advanced CFD simulation system known by the acronym SAUNA. A brief description of the complete system is given, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the addressed configuration. The majority of the paper focuses on the application of SAUNA to a variety of configurations from the military aircraft, civil aircraft and missile areas. Mesh generation issues are discussed for each geometry and experimental data are used to assess the accuracy of the inviscid (Euler) model used. It is shown that flexibility and accuracy are combined in an efficient manner, thus demonstrating the value of SAUNA in aerodynamic design.
Ölveczky, Peter Csaba (DOI: 10.4204/EPTCS.36.8)
2010-01-01
Distributed embedded systems (DESs) are no longer the exception; they are the rule in many application areas such as avionics, the automotive industry, traffic systems, sensor networks, and medical devices. Formal DES specification and verification is challenging due to state space explosion and the need to support real-time features. This paper reports on an extensive industry-based case study involving a DES product family for a pedestrian and car 4-way traffic intersection in which autonomous devices communicate by asynchronous message passing without a centralized controller. All the safety requirements and a liveness requirement informally specified in the requirements document have been formally verified using Real-Time Maude and its model checking features.
Energy Technology Data Exchange (ETDEWEB)
MCGREW, D.L.
1999-09-28
This Requirements Verification Report (RVR) for the Project W-314 "AN Farm to 200E Waste Transfer System" package provides documented verification of design compliance with all applicable Project Development Specification (PDS) requirements. Additional PDS requirements verification will be performed during the project's procurement, construction, and testing phases, and the RVR will be updated to reflect this information as appropriate.
Energy Technology Data Exchange (ETDEWEB)
Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.
2013-07-01
The use of Monte Carlo (MC) methods has been shown to improve the accuracy of dose calculation compared with the analytical algorithms installed in commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, called CARMEN, is based on full simulation of both the beam transport through the accelerator head and through the patient, and is designed for efficient operation in terms of both the accuracy of the estimate and the required computation times. (Author)
Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui
2011-05-01
During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with current MEP systems. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved on the forecast errors of the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to its MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that careful attention must be paid to physical perturbations.
CAD-based Monte Carlo Program for Integrated Simulation of Nuclear System SuperMC
Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin
2014-06-01
Monte Carlo (MC) methods have distinct advantages in simulating complicated nuclear systems and are envisioned as a routine method for nuclear design and analysis in the future. High-fidelity MC simulation coupled with multi-physics simulation has a significant impact on the safety, economy and sustainability of nuclear systems. However, great challenges to current MC methods and codes prevent their application in real engineering projects. SuperMC is a CAD-based Monte Carlo program for integrated simulation of nuclear systems developed by the FDS Team, China, making use of hybrid MC-deterministic methods and advanced computer technologies. The design aims, architecture and main methodology of SuperMC are presented in this paper. SuperMC 2.1, the latest version for neutron, photon and coupled neutron-photon transport calculation, has been developed and validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. SuperMC is still evolving toward a general and routine tool for nuclear systems.
Finite Size Effect in Path Integral Monte Carlo Simulations of 4He Systems
Institute of Scientific and Technical Information of China (English)
ZHAO Xing-Wen; CHENG Xin-Lu
2008-01-01
Path integral Monte Carlo (PIMC) simulations are a powerful computational method to study interacting quantum systems at finite temperatures. In this work, PIMC has been applied to study the finite size effect in simulated systems of 4He. We determine the energy as a function of temperature at saturated-vapor-pressure (SVP) conditions in the temperature range T ∈ [1.0 K, 4.0 K], and the equation of state (EOS) in the ground state, for systems consisting of 32, 64 and 128 4He atoms, respectively. We find that the energy at SVP is influenced significantly by the size of the simulated system in the temperature range T ∈ [2.1 K, 3.0 K], and that the larger the system, the better the agreement with the experimental values, while the EOS appears to be insensitive to system size.
A ROBUST GA/KNN BASED HYPOTHESIS VERIFICATION SYSTEM FOR VEHICLE DETECTION
Directory of Open Access Journals (Sweden)
Nima Khairdoost
2015-03-01
Vehicle detection is an important issue in driver assistance systems and self-guided vehicles, and it includes two stages: hypothesis generation and verification. In the first stage, potential vehicles are hypothesized, and in the second stage, all hypotheses are verified. The focus of this work is on the second stage. We extract Pyramid Histograms of Oriented Gradients (PHOG) features from a traffic image as candidate feature vectors for vehicle detection. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are applied to these PHOG feature vectors in parallel as dimension reduction and feature selection tools. After feature fusion, we use a Genetic Algorithm (GA) and cosine similarity-based K Nearest Neighbor (KNN) classification to improve the performance and generalization of the features. Our tests show a good classification accuracy of more than 97% on realistic on-road vehicle images.
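The cosine-similarity KNN vote used in the verification stage can be sketched as follows (an illustrative Python fragment, not the authors' code; the tiny 2-D feature vectors and labels stand in for the fused PHOG/PCA/LDA features, and the GA-based feature weighting is omitted):

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

def knn_cosine(train, labels, x, k=3):
    """Vote among the k training vectors most cosine-similar to x."""
    ranked = sorted(range(len(train)),
                    key=lambda i: cosine(train[i], x), reverse=True)
    votes = [labels[i] for i in ranked[:k]]
    return max(set(votes), key=votes.count)
```

In the paper, the GA additionally evolves feature weights to maximize this classifier's accuracy; here the similarity is unweighted.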
Formal Verification of a Secure Model for Building E-Learning Systems
Directory of Open Access Journals (Sweden)
Farhan M Al Obisat
2016-06-01
The Internet is the common medium that connects the parties in E-learning (instructors and students), who are assumed to be far away from each other. Both wired and wireless networks are used in this learning environment to facilitate mobile access to educational systems. This learning environment requires secure connections and data exchange. An E-learning model was implemented and evaluated by conducting student experiments. Before the approach is deployed in the real world, a formal verification of the model was completed, which shows that no unreachability case exists. The model in this paper, which concentrates on the security of e-content, was successfully validated using the SPIN model checker, and no errors were found.
A physical zero-knowledge object comparison system for nuclear warhead verification
Philippe, Sébastien; Glaser, Alexander; d'Errico, Francesco
2016-01-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. In addition to the use of such a technique as part of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information, we provide definitive evidence that measuring sensitive data is not required to perform comparisons of physical properties.
Tank waste remediation system FSAR hazard identification/facility configuration verification report
Energy Technology Data Exchange (ETDEWEB)
Mendoza, D.P., Westinghouse Hanford
1996-05-01
This document provides the results of the Tank Waste Remediation System Final Safety Analysis Report (TWRS FSAR) hazards identification/facility configuration activities undertaken from March 7, 1996 to May 31, 1996. The purpose of this activity was to provide an independent overview of the TWRS facility-specific hazards and configurations that were used in support of the TWRS FSAR hazards and accident analysis development. It was based on a review of existing published documentation and field inspections. The objective of the verification effort was to provide a 'snapshot' in time of the existing TWRS facility hazards and configurations, which will be used to support hazards and accident analysis activities.
Event-chain Monte Carlo algorithms for hard-sphere systems.
Bernard, Etienne P; Krauth, Werner; Wilson, David B
2009-11-01
In this paper we present the event-chain algorithms, which are fast Markov-chain Monte Carlo methods for hard spheres and related systems. In a single move of these rejection-free methods, an arbitrarily long chain of particles is displaced, and long-range coherent motion can be induced. Numerical simulations show that event-chain algorithms clearly outperform the conventional Metropolis method. Irreversible versions of the algorithms, which violate detailed balance, improve the speed of the method even further. We also compare our method with recent implementations of the molecular-dynamics algorithm.
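The rejection-free event-chain move is easiest to see in one dimension: a displacement budget is carried along a chain of hard rods on a ring, and each collision passes the remaining budget ("lifts") to the struck particle. The sketch below is an illustrative Python reduction under that simplification, not the authors' implementation:

```python
import random

def event_chain_1d(x, sigma, L, ell, rng):
    """One event-chain move for hard rods of length sigma on a ring of length L.
    A displacement budget ell is carried rightward; when the moving rod reaches
    its neighbour, the remaining budget passes (lifts) to that neighbour."""
    x = sorted(x)
    n = len(x)
    i = rng.randrange(n)
    budget = ell
    while budget > 0.0:
        j = (i + 1) % n
        # free space ahead of rod i (clamped against float rounding)
        gap = max(0.0, (x[j] - x[i]) % L - sigma)
        step = min(budget, gap)
        x[i] = (x[i] + step) % L
        budget -= step
        if budget > 0.0:          # collision: lift to the neighbour
            i = j
    return sorted(x)

def overlaps(x, sigma, L):
    """Check the hard-rod constraint on the ring (small float tolerance)."""
    xs = sorted(x)
    n = len(xs)
    return any((xs[(k + 1) % n] - xs[k]) % L < sigma - 1e-12 for k in range(n))
```

Because rods stop exactly at contact and hand the budget on, no trial move is ever rejected, which is the source of the speedup the paper reports over Metropolis sampling.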
Algorithm and application of Monte Carlo simulation for multi-dispersive copolymerization system
Institute of Scientific and Technical Information of China (English)
凌君; 沈之荃; 陈万里
2002-01-01
A Monte Carlo algorithm has been established for multi-dispersive copolymerization systems, based on experimental data of copolymer molecular weight and dispersion via GPC measurement. The program simulates the insertion of every monomer unit and records the structure and microscopic sequence of every chain of various lengths. It has been applied successfully to the ring-opening copolymerization of 2,2-dimethyltrimethylene carbonate (DTC) with ε-caprolactone (ε-CL). The simulation coincides with the experimental results and provides microscopic data on triad fractions, lengths of homopolymer segments, etc., which are difficult to obtain by experiment. The algorithm also provides a uniform framework for copolymerization studies under other, more complicated mechanisms.
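The per-unit insertion step that such a simulation repeats can be sketched with the standard terminal model, where the add-probability depends only on the chain's terminal unit through the reactivity ratios. This is illustrative Python with generic parameters, not the paper's DTC/ε-CL kinetics:

```python
import random
from collections import Counter

def grow_chain(length, f1, r1, r2, rng):
    """Terminal-model chain growth: r1 = k11/k12, r2 = k22/k21,
    f1 = mole fraction of monomer 1 in the feed (assumed constant)."""
    p11 = r1 * f1 / (r1 * f1 + (1 - f1))          # P(add 1 | chain ends in 1)
    p22 = r2 * (1 - f1) / (r2 * (1 - f1) + f1)    # P(add 2 | chain ends in 2)
    chain = [1 if rng.random() < f1 else 2]
    while len(chain) < length:
        if chain[-1] == 1:
            chain.append(1 if rng.random() < p11 else 2)
        else:
            chain.append(2 if rng.random() < p22 else 1)
    return chain

def triad_fractions(chains):
    """Fractions of the eight monomer triads over all recorded chains."""
    counts, total = Counter(), 0
    for c in chains:
        for triad in zip(c, c[1:], c[2:]):
            counts[triad] += 1
            total += 1
    return {t: n / total for t, n in counts.items()}
```

With r1 = r2 = 1 and an equimolar feed the units are independent coin flips, so each triad fraction tends to 1/8, a convenient sanity check for the bookkeeping.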
2010-07-01
... sampling system at the same time. If you use any analog or real-time digital filters during emission testing, you must operate those filters in the same manner during this verification. (2) Equipment setup... same time. In designing your experimental setup, avoid pressure pulsations due to stopping the...
The Environmental Technology Verification report discusses the technology and performance of the Plug Power SU1 Fuel Cell System manufactured by Plug Power. The SU1 is a proton exchange membrane fuel cell that requires hydrogen (H2) as fuel. H2 is generally not available, so the ...
Energy Technology Data Exchange (ETDEWEB)
Kent Norris
2010-02-01
The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE System Test Plan is to assess the approach to be taken for the intended testing activities associated with the SAPHIRE software product. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already in production.
The Environmental Technology Verification report discusses the technology and performance of Seal Assist System (SAS) for natural gas reciprocating compressor rod packing manufactured by A&A Environmental Seals, Inc. The SAS uses a secondary containment gland to collect natural g...
Blind receiver for OFDM systems via sequential Monte Carlo in factor graphs
Institute of Scientific and Technical Information of China (English)
CHEN Rong; ZHANG Hai-bin; XU You-yun; LIU Xin-zhao
2007-01-01
Estimation and detection algorithms for orthogonal frequency division multiplexing (OFDM) systems can be developed based on the sum-product algorithm, which operates by message passing in factor graphs. In this paper, we apply sampling (Monte Carlo) methods to factor graphs, so that the integrals in the sum-product algorithm can be approximated by sums, which reduces complexity. A blind receiver for OFDM systems can be derived via sequential Monte Carlo (SMC) in factor graphs; the previous SMC blind receiver can be regarded as a special case of the sum-product algorithm with sampling. The previous SMC blind receiver for OFDM systems needs to generate samples of the channel vector, assuming the channel has an a priori Gaussian distribution. In the newly built blind receiver, we generate samples of the virtual pilots instead of the channel vector, with the channel vector easily computed from the virtual pilots. As the size of the virtual-pilot space is much smaller than that of the channel vector space, only a small number of samples is necessary, and blind detection becomes much simpler. Furthermore, only one pilot tone is needed to resolve phase ambiguity, and differential encoding is no longer used. Finally, computer simulations demonstrate that the proposed receiver performs well while providing a significant complexity reduction.
Validation of MTF measurement for CBCT system using Monte Carlo simulations
Hao, Ting; Gao, Feng; Zhao, Huijuan; Zhou, Zhongxing
2016-03-01
To evaluate the spatial resolution performance of a cone beam computed tomography (CBCT) system, accurate measurement of the modulation transfer function (MTF) is required. This accuracy depends on the MTF measurement method and the CBCT reconstruction algorithm. In this work, the accuracy of MTF measurement of a CBCT system using a wire phantom is validated by Monte Carlo simulation. The Monte Carlo simulation software BEAMnrc/EGSnrc was employed to model X-ray radiation beams and transport. Tungsten wires were simulated with different diameters and radial distances from the axis of rotation. We adopted the filtered back projection technique to reconstruct images from a 360° acquisition. The MTFs for four reconstruction kernels were measured from the corresponding reconstructed wire images; the Ram-Lak kernel increased the MTF relative to the cosine, Hamming and Hann kernels. The results demonstrated that the MTF degrades radially from the axis of rotation. This study suggests that an increase in the MTF of a CBCT system is possible by optimizing scanning settings and reconstruction parameters.
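The underlying MTF computation is the normalized magnitude of the Fourier transform of a line spread function (LSF). A minimal sketch with a naive DFT and a synthetic Gaussian LSF follows (illustrative Python; the wire-phantom geometry and reconstruction kernels of the study are not modeled):

```python
import math

def mtf_from_lsf(lsf, dx):
    """MTF as the normalized DFT magnitude of a sampled line spread function.
    A naive O(n^2) DFT is fine for short profiles; dx is the pixel spacing."""
    n = len(lsf)
    mtf = []
    for k in range(n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
        im = -sum(v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
        mtf.append(math.hypot(re, im))
    freqs = [k / (n * dx) for k in range(n // 2 + 1)]
    m0 = mtf[0]
    return freqs, [m / m0 for m in mtf]
```

For a Gaussian LSF of width sigma, the analytic MTF is exp(-2*pi^2*sigma^2*f^2), which makes a convenient validation target for the numerical pipeline.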
Automated Verification of Memory Consistencies of DSM System on Unified Framework
Directory of Open Access Journals (Sweden)
Kumar, Pankaj; Kumar, Durgesh
2012-12-01
The consistency model of a DSM system specifies the ordering constraints on concurrent memory accesses by multiple processors, and hence has a fundamental impact on DSM systems' programming convenience and implementation efficiency. We have proposed a structural model for automated verification of the memory consistencies of a DSM system. DSM allows processes to assume a globally shared virtual memory even though they execute on nodes that do not physically share memory. The DSM software provides the abstraction of a globally shared memory in which each processor can access any data item without the programmer having to worry about where the data is or how to obtain its value. In contrast, in the native programming model on networks of workstations (message passing), the programmer must decide when a processor needs to communicate, with whom to communicate, and what data to send. On a DSM system the programmer can focus on algorithmic development rather than on managing partitioned data sets and communicating values. The programming interfaces to DSM systems may differ in a variety of respects. The memory model refers to how updates to distributed shared memory are reflected to the processes in the system. The most intuitive model of distributed shared memory is that a read should always return the last value written; unfortunately, the notion of the last value written is not well defined in a distributed system.
Xu, Haiyang; Wang, Ping
2016-01-01
In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for modeling and verification of time properties. Building on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. Under the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also propose a generating algorithm for temporal logic formulae, which automatically extracts real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.
Energy Technology Data Exchange (ETDEWEB)
Pokhrel, D; Badkul, R; Jiang, H; Estes, C; Kumar, P; Wang, F [UniversityKansas Medical Center, Kansas City, KS (United States)
2014-06-01
Purpose: Lung SBRT uses hypo-fractionated doses in small non-IMRT fields with tissue-heterogeneity-corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MUs obtained from the iPlan XVMC algorithm against a spreadsheet-based hand calculation using the most commonly used simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions for PTV V100% = 95% were studied. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1 to 106.5 cc (average 48.6 cc). MC SBRT plans were generated using a combination of non-coplanar conformal arcs/beams using the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for Novalis-TX, consisting of micro-MLCs and a 6 MV SRS (1000 MU/min) beam. These plans were re-computed using the heterogeneity-corrected pencil-beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): individual correction. For an independent second check, MC MUs were verified using TMR-based hand calculation, which systematically underestimated MC MUs by ∼5%, and an average ICF was obtained: average correction. Also, the first 10 MC plans were verified with an ion-chamber measurement using a homogeneous phantom. Results: For both beams/arcs, the mean PB-hete dose was systematically overestimated by 5.5±2.6% and the mean hand-calculated MUs systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC to -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After the average 5% correction, hand-calculated MUs matched XVMC to 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. A small dependence on tumor volume (TV)/field size (FS) was also observed. The ion-chamber measurement was within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to
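A TMR-based MU hand calculation of the kind referred to above generally takes the textbook point-dose form MU = D / (k · Scp · TMR · OAR · ISF). The sketch below is a generic illustration with hypothetical factor values, not the authors' spreadsheet:

```python
def tmr_mu(dose_cgy, output_cgy_per_mu, s_cp, tmr, off_axis=1.0, inverse_sq=1.0):
    """Point-dose MU estimate:
    MU = D / (k * Scp * TMR * OAR * ISF), where
    k   = reference output (cGy/MU), Scp = total scatter factor,
    TMR = tissue-maximum ratio at depth/field size,
    OAR = off-axis ratio, ISF = inverse-square factor.
    All factor values in use here are hypothetical."""
    return dose_cgy / (output_cgy_per_mu * s_cp * tmr * off_axis * inverse_sq)
```

Because a TMR table is measured in near water-equivalent conditions, such a check carries no lung heterogeneity correction, which is consistent with the systematic ∼5% offset against XVMC reported above.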
Sample Duplication Method for Monte Carlo Simulation of Large Reaction-Diffusion System
Institute of Scientific and Technical Information of China (English)
张红东; 陆建明; 杨玉良
1994-01-01
The sample duplication method for the Monte Carlo simulation of large reaction-diffusion systems is proposed in this paper. It is proved that the sample duplication method effectively raises the efficiency and statistical precision of the simulation without changing the kinetic behaviour of the reaction-diffusion system or the critical condition for the bifurcation of the steady states. The method has been applied to the simulation of the spatial and time dissipative structure of the Brusselator under the Dirichlet boundary condition. The results presented in this paper show that the sample duplication method provides a very efficient way to solve the master equation of large reaction-diffusion systems. For the two-dimensional case, the computation time is reduced by at least two orders of magnitude compared to the algorithm reported in the literature.
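The master-equation simulation that the sample duplication method accelerates is, at its core, a stochastic simulation (Gillespie) loop. A minimal single-species sketch follows (illustrative Python with an arbitrary birth-death system, not the Brusselator of the paper, and without the duplication trick itself):

```python
import math
import random

def gillespie_birth_death(b, d, x0, t_end, rng):
    """Standard Gillespie SSA for the reactions 0 -> X (rate b) and
    X -> 0 (rate d*X). Returns the time-averaged copy number on [0, t_end]."""
    t, x = 0.0, x0
    area = 0.0
    while t < t_end:
        total = b + d * x                      # total propensity
        dt = -math.log(rng.random()) / total   # exponential waiting time
        dt = min(dt, t_end - t)
        area += x * dt                         # accumulate time average
        t += dt
        if t >= t_end:
            break
        if rng.random() < b / total:           # choose which reaction fired
            x += 1
        else:
            x -= 1
    return area / t_end
```

The stationary distribution of this system is Poisson with mean b/d, which gives a closed-form check on the sampler.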
A Quantum Monte Carlo Study of mono(benzene)TM and bis(benzene)TM Systems
Bennett, M Chandler; Mitas, Lubos
2016-01-01
We present a study of mono(benzene)TM and bis(benzene)TM systems, where TM={Mo,W}. We calculate the binding energies by quantum Monte Carlo (QMC) approaches and compare the results with other methods and available experiments. The orbitals for the determinantal part of each trial wave function were generated from several types of DFT functionals in order to optimize for fixed-node errors. We estimate and compare the size of the fixed-node errors for both the Mo and W systems with regard to the electron density and degree of localization in these systems. For the W systems we provide benchmarking results of the binding energies, given that experimental data is not available.
A quantum Monte Carlo study of mono(benzene) TM and bis(benzene) TM systems
Bennett, M. Chandler; Kulahlioglu, A. H.; Mitas, L.
2017-01-01
We present a study of mono(benzene) TM and bis(benzene) TM systems, where TM = {Mo, W}. We calculate the binding energies by quantum Monte Carlo (QMC) approaches and compare the results with other methods and available experiments. The orbitals for the determinantal part of each trial wave function were generated from several types of DFT functionals in order to optimize for fixed-node errors. We estimate and compare the size of the fixed-node errors for both the Mo and W systems with regard to the electron density and degree of localization in these systems. For the W systems we provide benchmarking results of the binding energies, given that experimental data is not available.
Principle of Line Configuration and Monte-Carlo Simulation for Shared Multi-Channel System
Institute of Scientific and Technical Information of China (English)
MIAO Changyun; DAI Jufeng; BAI Zhihui
2005-01-01
Based on the steady-state solution of a finite-state birth and death process, the principle of line configuration for a shared multi-channel system is analyzed. Equations for the call congestion ratio and the channel utilization ratio are deduced, and a visualized data analysis is presented. The analysis indicates that, when calculated with the proposed equations, the overestimate of the call congestion ratio and channel utilization ratio can be rectified, and thereby the cost of channels can be reduced by 20% in a small system. With MATLAB programming, line configuration methods are provided. In order to show the dynamic running of the system intuitively, and to analyze and improve it, the system is simulated using an M/M/n/n/m queuing model and the Monte-Carlo method. The simulation also validates the correctness of the theoretical analysis and of the optimized configuration method.
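The steady-state birth-death solution for an M/M/n/n/m system is the classical Engset model, and a Monte Carlo simulation of the same chain can cross-check the closed-form congestion ratios. The sketch below is illustrative Python with arbitrary example parameters, rather than the authors' MATLAB program:

```python
import math
import random
from math import comb

def time_congestion(m, n, a):
    """P(all n channels busy): Engset time congestion for m sources,
    offered traffic a = lambda/mu per idle source."""
    w = [comb(m, k) * a ** k for k in range(n + 1)]
    return w[n] / sum(w)

def call_congestion(m, n, a):
    """Engset call congestion: blocking probability seen by arriving calls."""
    w = [comb(m - 1, k) * a ** k for k in range(n + 1)]
    return w[n] / sum(w)

def mc_call_congestion(m, n, a, t_end, rng):
    """CTMC simulation of the same birth-death chain (mu = 1):
    each idle source attempts a call at rate a; attempts in the full
    state k = n are counted as blocked."""
    t, k = 0.0, 0
    blocked = attempts = 0
    while t < t_end:
        birth, death = (m - k) * a, float(k)
        total = birth + death
        t += -math.log(rng.random()) / total
        if rng.random() < birth / total:
            attempts += 1
            if k == n:
                blocked += 1
            else:
                k += 1
        else:
            k -= 1
    return blocked / attempts
```

Note that with finitely many sources the call congestion is strictly below the time congestion, which is exactly the overestimate the paper's equations correct.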
Full modelling of the MOSAIC animal PET system based on the GATE Monte Carlo simulation code
Merheb, C.; Petegnief, Y.; Talbot, J. N.
2007-02-01
Positron emission tomography (PET) systems dedicated to animal imaging are now widely used for biological studies. The scanner performance strongly depends on the design and the characteristics of the system. Many parameters must be optimized like the dimensions and type of crystals, geometry and field-of-view (FOV), sampling, electronics, lightguide, shielding, etc. Monte Carlo modelling is a powerful tool to study the effect of each of these parameters on the basis of realistic simulated data. Performance assessment in terms of spatial resolution, count rates, scatter fraction and sensitivity is an important prerequisite before the model can be used instead of real data for a reliable description of the system response function or for optimization of reconstruction algorithms. The aim of this study is to model the performance of the Philips Mosaic™ animal PET system using a comprehensive PET simulation code in order to understand and describe the origin of important factors that influence image quality. We use GATE, a Monte Carlo simulation toolkit for a realistic description of the ring PET model, the detectors, shielding, cap, electronic processing and dead times. We incorporate new features to adjust signal processing to the Anger logic underlying the Mosaic™ system. Special attention was paid to dead time and energy spectra descriptions. Sorting of simulated events in a list mode format similar to the system outputs was developed to compare experimental and simulated sensitivity and scatter fractions for different energy thresholds using various models of phantoms describing rat and mouse geometries. Count rates were compared for both cylindrical homogeneous phantoms. Simulated spatial resolution was fitted to experimental data for 18F point sources at different locations within the FOV with an analytical blurring function for electronic processing effects. Simulated and measured sensitivities differed by less than 3%, while scatter fractions agreed
The COSMO-LEPS mesoscale ensemble system: validation of the methodology and verification
Directory of Open Access Journals (Sweden)
C. Marsigli
2005-01-01
The limited-area ensemble prediction system COSMO-LEPS has been running every day at ECMWF since November 2002. A number of runs of the non-hydrostatic limited-area model Lokal Modell (LM) are available every day, nested on members of the ECMWF global ensemble. The limited-area ensemble forecasts range up to 120 h, and LM-based probabilistic products are disseminated to several national and regional weather services. Some changes to the operational suite have recently been made on the basis of a statistical analysis of the methodology. The analysis is presented in this paper, showing the benefit of increasing the number of ensemble members. The system has been designed to provide probabilistic support at the mesoscale, focusing attention on extreme precipitation events. In this paper, the performance of COSMO-LEPS in forecasting precipitation is presented. An objective verification in terms of probabilistic indices is made, using a dense network of observations covering part of the COSMO domain. The system is compared with the ECMWF EPS, showing an improvement of the limited-area high-resolution system with respect to the global ensemble system in the forecast of high precipitation values. The impact of using different schemes for the parametrisation of convection in the limited-area model is also assessed, showing that this has a minor impact compared with running the model with different initial and boundary conditions.
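A common probabilistic verification index for ensemble precipitation forecasts of this kind is the Brier score, together with its skill score against the climatological base rate (an illustrative Python sketch; the paper does not specify which indices it uses):

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against binary outcomes
    (e.g. precipitation above a threshold: 1 if observed, else 0)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to always forecasting the climatological base rate:
    1 is perfect, 0 is no better than climatology, negative is worse."""
    base = sum(outcomes) / len(outcomes)
    ref = brier_score([base] * len(outcomes), outcomes)
    return 1.0 - brier_score(probs, outcomes) / ref
```

Computed per precipitation threshold over the observation network, such scores allow the kind of limited-area versus global-ensemble comparison described above.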
Energy Technology Data Exchange (ETDEWEB)
Nord, B.; et al.
2015-12-09
We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey (DES) data. Through visual inspection of data from the Science Verification (SV) season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-Object Spectrograph (GMOS) at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph (IMACS) at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: Three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 were either not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy cluster-scale lenses. The lensed sources range in redshift z ~ 0.80-3.2, and in i-band surface brightness i_{SB} ~ 23-25 mag/sq.-arcsec. (2" aperture). For each of the six systems, we estimate the Einstein radius and the enclosed mass, which have ranges ~ 5.0 - 8.6" and ~ 7.5 x 10^{12} - 6.4 x 10^{13} solar masses, respectively.
Architecture and critical technologies of seismic information system in CTBT verification
Institute of Scientific and Technical Information of China (English)
ZHENG Xue-feng; SHEN Jun-yi; JIN Ping; ZHENG Jiang-ling; SUN Peng; ZHANG Hui-min; WANG Tong-dong
2006-01-01
Seismic monitoring is one of the most important approaches for ground-based nuclear explosion monitoring. In order to improve the monitoring capability for low-magnitude seismic events, a seismic information system was developed using geographic information system and database technologies. This paper describes the design and critical technologies of the Seismic Information System in CTBT Verification, developed on the ArcGIS and ORACLE platforms. It is a combination of the database storage framework, an application programming interface, and graphic application software that lets users meet their monitoring objectives. By combining the ArcSDE geodatabase, the ORACLE RDBMS, and COM-based ArcObjects development, multi-source data are seamlessly integrated while most ORACLE functions, such as consistency, concurrent access, and security mechanisms, are preserved. For easy access to the information system we developed two different mechanisms. The first is a menu-driven internal system that runs on NT platforms. The second access mechanism is based on a LAN and is easily accessible from any web browser.
Energy Technology Data Exchange (ETDEWEB)
Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu, E-mail: rluciane@ird.gov.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Energia Nuclear; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W., E-mail: karla@ird.gov.b [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes (LNMRI). Lab. de Neutrons
2009-07-01
The Neutrons Laboratory is developing a portable test system for verifying the working condition of neutron area monitors. This device will allow users to verify that the calibration of their instruments is maintained at the installations where they are used, avoiding the use of equipment whose response to neutron beams is inadequate.
Energy Technology Data Exchange (ETDEWEB)
Serena, P. A. [Instituto de Ciencias de Materiales de Madrid, Madrid (Spain); Costa-Kraemer, J. L. [Instituto de Microelectronica de Madrid, Madrid (Spain)
2001-03-01
A Monte Carlo algorithm suitable for studying systems described by an anisotropic Heisenberg Hamiltonian is presented. This technique has been tested successfully on 3D and 2D systems, illustrating how magnetic properties depend on the dimensionality and the coordination number. We have found that the magnetic properties of constrictions differ from those appearing in bulk. In particular, spin fluctuations are considerably larger than those calculated for bulk materials. In addition, domain walls are strongly modified when a constriction is present, with a decrease of the domain-wall width. This decrease is explained in terms of previous theoretical work.
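The kind of algorithm the abstract describes can be sketched as a single-spin Metropolis update for a classical Heisenberg model with uniaxial anisotropy. This is a generic illustration, not the authors' code; the lattice size, coupling `J`, anisotropy `D`, and temperature are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper): small 2D lattice, exchange
# coupling J, uniaxial anisotropy D along z, inverse temperature beta.
L, J, D, beta = 8, 1.0, 0.2, 2.0
spins = rng.normal(size=(L, L, 3))
spins /= np.linalg.norm(spins, axis=2, keepdims=True)  # unit vectors

def site_energy(s, i, j):
    """Local energy: -J * S_ij . (sum of nearest neighbors) - D * (S_z)^2."""
    nn = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
    return -J * float(np.dot(s[i, j], nn)) - D * s[i, j, 2] ** 2

def sweep(s):
    """One Metropolis sweep: propose a fresh random direction at each site."""
    accepted = 0
    for i in range(L):
        for j in range(L):
            old, e_old = s[i, j].copy(), site_energy(s, i, j)
            trial = rng.normal(size=3)
            s[i, j] = trial / np.linalg.norm(trial)
            dE = site_energy(s, i, j) - e_old
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                accepted += 1          # keep the trial direction
            else:
                s[i, j] = old          # revert the move
    return accepted / L ** 2

acc = float(np.mean([sweep(spins) for _ in range(10)]))  # acceptance fraction
```

Spin fluctuations near a constriction would then be measured by accumulating moments of the local magnetization over many such sweeps.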
Techno-economic and Monte Carlo probabilistic analysis of microalgae biofuel production system.
Batan, Liaw Y; Graff, Gregory D; Bradley, Thomas H
2016-11-01
This study characterizes the technical and economic feasibility of an enclosed-photobioreactor microalgae system with an annual production of 37.85 million liters (10 million gallons) of biofuel. The analysis characterizes and breaks down the capital investment, the operating costs, and the production cost per unit of algal diesel. The economic modelling shows total production costs for algal raw oil and diesel of $3.46 and $3.69 per liter, respectively. Additionally, the effects of co-product credits and their impact on the economic performance of the algae-to-biofuel system are discussed. The Monte Carlo methodology is used to address price and cost projections and to simulate scenarios with probabilities of financial performance and profit for the analyzed model. Different markets for the allocation of co-products significantly shift the economic viability of the algal biofuel system.
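A Monte Carlo probabilistic cost analysis of this kind draws each uncertain input from a distribution and reads off the probability of a favorable outcome. The sketch below is a minimal illustration with made-up distributions; none of the numbers come from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Illustrative input distributions only (hypothetical, not the paper's data):
capex_per_l = rng.triangular(1.5, 2.0, 2.8, N)   # $/L annualized capital cost
opex_per_l = rng.normal(1.4, 0.2, N)             # $/L operating cost
coproduct_credit = rng.uniform(0.1, 0.6, N)      # $/L credit from co-products
diesel_price = rng.normal(1.0, 0.15, N)          # $/L market price

# Unit production cost after co-product credits, and margin per liter.
unit_cost = capex_per_l + opex_per_l - coproduct_credit
profit_per_l = diesel_price - unit_cost

p_profitable = float((profit_per_l > 0).mean())  # probability of positive margin
cost_p50 = float(np.percentile(unit_cost, 50))   # median production cost, $/L
```

Scenario analysis (e.g. different co-product markets) amounts to swapping the `coproduct_credit` distribution and re-running the simulation.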
Energy Technology Data Exchange (ETDEWEB)
Zucca Aparcio, D.; Perez Moreno, J. M.; Fernandez Leton, P.; Garcia Ruiz-Zorrila, J.
2016-10-01
The commissioning procedures of a Monte Carlo (MC) treatment planning system for photon beams from a dedicated stereotactic body radiosurgery (SBRT) unit are reported in this document. The MC code available in the evaluated treatment planning system (BrainLAB iPlan RT Dose) is XVMC, which is based on virtual source models that simulate the primary and scattered radiation, as well as the electronic contamination, using Gaussian components whose modelling requires measurements of dose profiles, percentage depth dose curves and output factors, performed both in water and in air. The dosimetric accuracy of the particle transport simulation has been analyzed by validating the calculations in homogeneous and heterogeneous media against measurements made under the same conditions as the dose calculation, and by checking the stochastic behaviour of the Monte Carlo calculations when different statistical variances are used. Likewise, it has been verified how the planning system performs the conversion from dose-to-medium to dose-to-water, applying the water-to-medium stopping power ratio, in the presence of heterogeneities where this phenomenon is relevant, such as high-density media (cortical bone). (Author)
Nord, B; Lin, H; Diehl, H T; Helsby, J; Kuropatkin, N; Amara, A; Collett, T; Allam, S; Caminha, G; De Bom, C; Desai, S; Dúmet-Montoya, H; Pereira, M Elidaiana da S; Finley, D A; Flaugher, B; Furlanetto, C; Gaitsch, H; Gill, M; Merritt, K W; More, A; Tucker, D; Rykoff, E S; Rozo, E; Abdalla, F B; Agnello, A; Auger, M; Brunner, R J; Kind, M Carrasco; Castander, F J; Cunha, C E; da Costa, L N; Foley, R; Gerdes, D W; Glazebrook, K; Gschwend, J; Hartley, W; Kessler, R; Lagattuta, D; Lewis, G; Maia, M A G; Makler, M; Menanteau, F; Niernberg, A; Scolnic, D; Vieira, J D; Gramillano, R; Abbott, T M C; Banerji, M; Benoit-Lévy, A; Brooks, D; Burke, D L; Capozzi, D; Rosell, A Carnero; Carretero, J; D'Andrea, C B; Dietrich, J P; Doel, P; Evrard, A E; Frieman, J; Gaztanaga, E; Gruen, D; Honscheid, K; James, D J; Kuehn, K; Li, T S; Lima, M; Marshall, J L; Martini, P; Melchior, P; Miquel, R; Neilsen, E; Nichol, R C; Ogando, R; Plazas, A A; Romer, A K; Sako, M; Sanchez, E; Scarpine, V; Schubnell, M; Sevilla-Noarbe, I; Smith, R C; Soares-Santos, M; Sobreira, F; Suchyta, E; Swanson, M E C; Tarle, G; Thaler, J; Walker, A R; Wester, W; Zhang, Y
2015-01-01
We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey (DES) data. Through visual inspection of data from the Science Verification (SV) season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-Object Spectrograph (GMOS) at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph (IMACS) at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: Three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 were either not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy cluster-scale lenses. The lensed sources range in redshift z ~ 0.80-3.2, and in i-band surface brightness i_{SB} ~ 23-25 mag/sq.-arcsec. (2" aperture). For each of the six systems, we estimate the Einstein radius and the enclosed mass, which have ranges ~ 5.0 - 8.6" and ~ 7.5 x 10^{12} - 6.4 x 10^{13} solar masses, respectively.
Blajer, W.; Dziewiecki, K.; Kołodziejczyk, K.; Mazur, Z.
2011-05-01
Underactuated systems have fewer control inputs m than degrees of freedom n (m &lt; n). Determining an input strategy that forces such a system to complete a set of m specified motion tasks is challenging, and the existence of an explicit solution is conditioned on differential flatness of the problem. A flatness-based solution means that all 2n states and m control inputs can be expressed algebraically in terms of the m specified outputs and their time derivatives up to a certain order, which in practice is attainable only for simple systems. In this contribution the problem is posed in a more practical way as a set of index-three differential-algebraic equations, and the solution is obtained numerically. The formulation is then illustrated by a two-degree-of-freedom underactuated system composed of two rotating discs connected by a torsional spring, in which the pre-specified motion of one of the discs is actuated by the torque applied to the other disc (n = 2 and m = 1). Experimental verification of the inverse simulation control methodology is reported.
ESTERR-PRO: A Setup Verification Software System Using Electronic Portal Imaging
Directory of Open Access Journals (Sweden)
Pantelis A. Asvestas
2007-01-01
The purpose of the paper is to present and evaluate the performance of a new software-based registration system for patient setup verification during radiotherapy, using electronic portal images. Setup errors can be estimated with the proposed system by means of two alternative registration methods. (a) The portal image of the current fraction of the treatment is registered directly with the reference image (a digitally reconstructed radiograph (DRR) or simulator image) using a modified manual technique. (b) The portal image of the current fraction is registered with the portal image of the first fraction (the reference portal image) by applying a nearly automated technique based on self-organizing maps, the reference portal having already been registered with a DRR or a simulator image. The proposed system was tested on phantom data and on data from six patients. The root mean square error (RMSE) of the setup estimates was 0.8±0.3 (mean value ± standard deviation) for the phantom data and 0.3±0.3 for the patient data, respectively, for the two methodologies. Furthermore, statistical analysis by means of the Wilcoxon nonparametric signed test showed that the results obtained by the two methods did not differ significantly (P value &gt;0.05).
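The paired comparison described above can be illustrated with a small self-contained example. The values below are hypothetical stand-ins for paired setup-error estimates, and the exact sign test is used as a simple dependency-free nonparametric check in the same spirit as the Wilcoxon test the paper applies.

```python
import numpy as np
from math import comb

# Hypothetical paired setup-error estimates (mm) from the two registration
# methods on the same cases; these numbers are illustrative, not the paper's.
method_a = np.array([0.9, 0.6, 1.1, 0.4, 0.8, 0.7, 1.0, 0.5])
method_b = np.array([0.8, 0.7, 1.0, 0.5, 0.9, 0.6, 1.1, 0.4])

rmse_a = float(np.sqrt(np.mean(method_a ** 2)))  # RMSE of method (a) estimates

# Exact two-sided sign test on the paired differences.
diff = method_a - method_b
n = int((diff != 0).sum())               # number of non-tied pairs
n_pos = int((diff > 0).sum())            # pairs where method (a) is larger
k = min(n_pos, n - n_pos)
p_value = min(1.0, 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n)
significant = p_value < 0.05             # False: no significant difference
```

With four positive and four negative differences the test is maximally non-significant, mirroring the paper's conclusion that the two methods agree.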
Belcastro, Christine M.
2010-01-01
Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss of vehicle control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V and V) and ultimate certification. The V and V of complex integrated systems poses major nontrivial technical challenges particularly for safety-critical operation under highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V and V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.
Monte Carlo filters for identification of nonlinear structural dynamical systems
Indian Academy of Sciences (India)
C S Manohar; D Roy
2006-08-01
The problem of identification of parameters of nonlinear structures using dynamic state estimation techniques is considered. The process equations are derived based on principles of mechanics and are augmented by mathematical models that relate a set of noisy observations to state variables of the system. The set of structural parameters to be identified is declared as an additional set of state variables. Both the process equation and the measurement equations are taken to be nonlinear in the state variables and contaminated by additive and (or) multiplicative Gaussian white noise processes. The problem of determining the posterior probability density function of the state variables conditioned on all available information is considered. The utility of three recursive Monte Carlo simulation-based filters, namely, a probability density function-based Monte Carlo filter, a Bayesian bootstrap filter and a filter based on sequential importance sampling, to solve this problem is explored. The state equations are discretized using certain variations of stochastic Taylor expansions enabling the incorporation of a class of non-smooth functions within the process equations. Illustrative examples on identification of the nonlinear stiffness parameter of a Duffing oscillator and the friction parameter in a Coulomb oscillator are presented.
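A bootstrap filter for this kind of parameter identification can be sketched in a few lines: the unknown stiffness is appended to the state vector, particles are propagated through the discretized dynamics, weighted by the observation likelihood, and resampled. All constants below are illustrative placeholders, not values from the paper, and the parameter random walk is a common artificial-dynamics device rather than the authors' exact discretization.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bootstrap-filter sketch for identifying the stiffness k of a Duffing
# oscillator x'' + c x' + k x + k3 x^3 = 0 from noisy displacement data.
dt, c, k_true, k3, steps = 0.01, 0.2, 9.0, 2.0, 300
sig_obs, sig_k = 0.02, 0.05

# Synthetic "truth" trajectory and noisy observations (explicit Euler).
x, v, obs = 1.0, 0.0, []
for _ in range(steps):
    x, v = x + dt * v, v + dt * (-c * v - k_true * x - k3 * x ** 3)
    obs.append(x + rng.normal(0.0, sig_obs))

# Particles carry (x, v, k); the unknown k is an augmented state that
# evolves by a slow random walk (artificial dynamics).
N = 2000
px, pv = np.full(N, 1.0), np.zeros(N)
pk = rng.uniform(2.0, 20.0, N)            # prior over the stiffness
for y in obs:
    px, pv = px + dt * pv, pv + dt * (-c * pv - pk * px - k3 * px ** 3)
    pk = pk + rng.normal(0.0, sig_k, N)
    logw = -0.5 * ((y - px) / sig_obs) ** 2   # Gaussian log-likelihood
    w = np.exp(logw - logw.max())             # stabilized weights
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)          # multinomial resampling
    px, pv, pk = px[idx], pv[idx], pk[idx]

k_est = float(pk.mean())  # posterior-mean stiffness estimate
```

The same skeleton extends to the Coulomb-friction case by swapping the restoring-force term in the propagation step.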
Quantum Monte Carlo of atomic and molecular systems with heavy elements
Mitas, Lubos; Kulahlioglu, Adem; Melton, Cody; Bennett, Chandler
2015-03-01
We carry out quantum Monte Carlo calculations of atomic and molecular systems with several heavy atoms such as Mo, W and Bi. In particular, we compare the correlation energies with those of their lighter counterparts in the same column of the periodic table in order to reveal trends with regard to the atomic number Z. One observation is that the correlation energy for the isoelectronic valence space/states decreases mildly with increasing Z. A similar observation applies to the fixed-node errors, thus supporting our recent finding that the fixed-node error increases with electronic density for the same (or similar) complexity of the wave function and bonding. In addition, for Bi systems we study the impact of spin-orbit coupling on the electronic structure, in particular on binding, correlation and excitation energies.
Space applications of the MITS electron-photon Monte Carlo transport code system
Energy Technology Data Exchange (ETDEWEB)
Kensek, R.P.; Lorence, L.J.; Halbleib, J.A. [Sandia National Labs., Albuquerque, NM (United States); Morel, J.E. [Los Alamos National Lab., NM (United States)
1996-07-01
The MITS multigroup/continuous-energy electron-photon Monte Carlo transport code system has matured to the point that it is capable of addressing more realistic three-dimensional adjoint applications. It is first employed to efficiently predict point doses as a function of source energy for simple three-dimensional experimental geometries exposed to simulated uniform isotropic planar sources of monoenergetic electrons up to 4.0 MeV. Results are in very good agreement with experimental data. It is then used to efficiently simulate dose to a detector in a subsystem of a GPS satellite due to its natural electron environment, employing a relatively complex model of the satellite. The capability for survivability analysis of space systems is demonstrated, and results are obtained with and without variance reduction.
Miming the cancer-immune system competition by kinetic Monte Carlo simulations
Bianca, Carlo; Lemarchand, Annie
2016-10-01
In order to mimic the interactions between cancer and the immune system at cell scale, we propose a minimal model of cell interactions that is similar to a chemical mechanism including autocatalytic steps. The cells are supposed to bear a quantity called activity that may increase during the interactions. The fluctuations of cell activity are controlled by a so-called thermostat. We develop a kinetic Monte Carlo algorithm to simulate the cell interactions and thermalization of cell activity. The model is able to reproduce the well-known behavior of tumors treated by immunotherapy: the first apparent elimination of the tumor by the immune system is followed by a long equilibrium period and the final escape of cancer from immunosurveillance.
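A kinetic Monte Carlo simulation of competing cell populations can be illustrated with a Gillespie-type algorithm over a minimal reaction scheme. The reactions and rate constants below are generic stand-ins chosen for the demo; the paper's actual model (cell activity with a thermostat) is richer than this sketch.

```python
import numpy as np

rng = np.random.default_rng(7)

# Gillespie-type kinetic Monte Carlo for a minimal cancer (C) / immune (I)
# scheme with an autocatalytic growth step. Rates are illustrative only.
a, b, d = 0.3, 0.002, 0.05       # growth, kill, immune-death rate constants
C, I, t = 50, 100, 0.0
history = [(t, C, I)]
while t < 50.0 and C > 0 and len(history) < 20000:
    rates = np.array([a * C,      # C -> 2C    (autocatalytic tumor growth)
                      b * C * I,  # C + I -> I (immune kill)
                      d * I])     # I -> 0     (immune cell death)
    total = rates.sum()
    if total == 0.0:
        break
    t += rng.exponential(1.0 / total)        # waiting time to next event
    event = rng.choice(3, p=rates / total)   # which reaction fires
    if event == 0:
        C += 1
    elif event == 1:
        C -= 1
    else:
        I -= 1
    history.append((t, C, I))
```

Depending on the rate constants, trajectories of this scheme show the elimination, equilibrium, or escape phases mentioned in the abstract.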
Saha, Krishnendu; Straus, Kenneth J; Chen, Yu; Glick, Stephen J
2014-08-28
To maximize sensitivity, it is desirable that ring positron emission tomography (PET) systems dedicated to imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo simulation software was used to accurately model the system matrix for a breast PET system. To increase the count statistics in the system matrix computation and to reduce the storage space of the system elements, only a subset of matrix elements was calculated and the rest were estimated using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at a 45% reduced noise level and a 1.5- to 3-fold improvement in resolution performance when compared with MLEM reconstruction using a simple line-integral model. The GATE-based system matrix reconstruction technique promises to improve resolution and noise performance and to reduce image distortion at the FOV periphery compared with line-integral-based system matrix reconstruction.
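The MLEM update that such a system matrix plugs into is compact. The sketch below uses a small dense random matrix purely for illustration; in the paper the matrix would be the Monte Carlo (GATE)-derived, block-circulant matrix over polar voxels, and the dimensions here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy MLEM reconstruction with an explicit system matrix A (detectors x voxels).
n_det, n_vox = 60, 20
A = rng.random((n_det, n_vox))             # placeholder system matrix
x_true = rng.random(n_vox) + 0.1           # ground-truth "image"
y = rng.poisson(A @ x_true * 50) / 50.0    # Poisson-noisy projections

x = np.ones(n_vox)                         # flat nonnegative initial image
sens = A.sum(axis=0)                       # sensitivity image (column sums)
for _ in range(50):
    ratio = y / np.maximum(A @ x, 1e-12)   # measured / forward-projected
    x *= (A.T @ ratio) / sens              # multiplicative MLEM update

rel_err = float(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The multiplicative form guarantees nonnegative voxel values, and a more accurate `A` (Monte Carlo rather than line-integral) improves the forward projection and hence the recovered resolution.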
2015-03-13
From MetroII to Metronomy: Designing a Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems. A dissertation by Liangpeng Guo, submitted in partial... ABSTRACT: As the design complexity of cyber-physical systems continues to grow, modeling the system at higher abstraction levels with formal models of...
Energy Technology Data Exchange (ETDEWEB)
Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)
1995-03-01
By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle: requirements, design, and implementation; the last category was subdivided into static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning the ease of use of the methods and four concerning their power to detect defects. Based on these factors, two measures were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefit and effectiveness. Finally, the applicability of each method was assessed for the identified components of knowledge-based and expert systems, as well as for the system as a whole.
EPA has created the Environmental Technology Verification program to provide high-quality, peer-reviewed data on technology performance. These data are expected to accelerate the acceptance and use of improved environmental protection technologies. The Greenhouse Gas Technology C...
National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...
Jonathan L. Case; Kumar, Sujay V.; Srikishen, Jayanthi; Jedlovec, Gary J.
2010-01-01
One of the most challenging weather forecast problems in the southeastern U.S. is daily summertime pulse-type convection. During the summer, atmospheric flow and forcing are generally weak in this region; thus, convection typically initiates in response to local forcing along sea/lake breezes, and other discontinuities often related to horizontal gradients in surface heating rates. Numerical simulations of pulse convection usually have low skill, even in local predictions at high resolution, due to the inherent chaotic nature of these precipitation systems. Forecast errors can arise from assumptions within parameterization schemes, model resolution limitations, and uncertainties in both the initial state of the atmosphere and land surface variables such as soil moisture and temperature. For this study, it is hypothesized that high-resolution, consistent representations of surface properties such as soil moisture, soil temperature, and sea surface temperature (SST) are necessary to better simulate the interactions between the surface and atmosphere, and ultimately improve predictions of summertime pulse convection. This paper describes a sensitivity experiment using the Weather Research and Forecasting (WRF) model. Interpolated land and ocean surface fields from a large-scale model are replaced with high-resolution datasets provided by unique NASA assets in an experimental simulation: the Land Information System (LIS) and Moderate Resolution Imaging Spectroradiometer (MODIS) SSTs. The LIS is run in an offline mode for several years at the same grid resolution as the WRF model to provide compatible land surface initial conditions in an equilibrium state. The MODIS SSTs provide detailed analyses of SSTs over the oceans and large lakes compared to current operational products. The WRF model runs initialized with the LIS+MODIS datasets result in a reduction in the overprediction of rainfall areas; however, the skill is almost equally as low in both experiments using
Directory of Open Access Journals (Sweden)
Tuo Ming Fu
2016-01-01
The safety of a cyber-physical system (CPS) depends on its behavior, and safety is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described in an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed into an HP based on this definition. The safety of the CPS is then verified by feeding the HP to KeYmaera. The advantage of the approach is that it models a CPS intuitively and verifies its safety strictly while avoiding state-space explosion.
An automatic dose verification system for adaptive radiotherapy for helical tomotherapy
Mo, Xiaohu; Chen, Mingli; Parnell, Donald; Olivera, Gustavo; Galmarini, Daniel; Lu, Weiguo
2014-03-01
verification system that quantifies treatment doses, and provides necessary information for adaptive planning without impeding clinical workflows.
Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat
2011-01-01
The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal concerns, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify them into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but built on it, that provide a way for SoC teams to stay productive and profitable.
System Level Numerical Analysis of a Monte Carlo Simulation of the E. Coli Chemotaxis
Siettos, Constantinos I
2010-01-01
Over the past few years it has been demonstrated that "coarse timesteppers" establish a link between traditional numerical analysis and microscopic/ stochastic simulation. The underlying assumption of the associated lift-run-restrict-estimate procedure is that macroscopic models exist and close in terms of a few governing moments of microscopically evolving distributions, but they are unavailable in closed form. This leads to a system identification based computational approach that sidesteps the necessity of deriving explicit closures. Two-level codes are constructed; the outer code performs macroscopic, continuum level numerical tasks, while the inner code estimates -through appropriately initialized bursts of microscopic simulation- the quantities required for continuum numerics. Such quantities include residuals, time derivatives, and the action of coarse slow Jacobians. We demonstrate how these coarse timesteppers can be applied to perform equation-free computations of a kinetic Monte Carlo simulation of...
Kinetic Monte Carlo simulation of dopant-defect systems under submicrosecond laser thermal processes
Energy Technology Data Exchange (ETDEWEB)
Fisicaro, G.; Pelaz, Lourdes; Lopez, P.; Italia, M.; Huet, K.; Venturini, J.; La Magna, A. [CNR IMM, Z.I. VIII Strada 5, I -95121 Catania (Italy); Department of Electronics, University of Valladolid, 47011 Valladolid (Spain); CNR IMM, Z.I. VIII Strada 5, I -95121 Catania (Italy); Excico 13-21 Quai des Gresillons, 92230 Gennevilliers (France); CNR IMM, Z.I. VIII Strada 5, I -95121 Catania (Italy)
2012-11-06
An innovative kinetic Monte Carlo (KMC) code has been developed which governs the post-implant kinetics of the defect system under the extremely far-from-equilibrium conditions caused by laser irradiation close to the liquid-solid interface. It considers defect diffusion, annihilation and clustering. The code properly implements, consistently with the stochastic formalism, the fast-varying local event rates related to the evolution of the thermal field T(r,t). This feature of our numerical method represents an important advancement with respect to current state-of-the-art KMC codes. The reduction of the implantation damage and its reorganization into defect aggregates are studied as a function of the process conditions. The phosphorus activation efficiency, experimentally determined under similar conditions, has been related to the emerging damage scenario.
Validation and simulation of a regulated survey system through Monte Carlo techniques
Directory of Open Access Journals (Sweden)
Asier Lacasta Soto
2015-07-01
Channel flow covers long distances and exhibits variable temporal behaviour. It is usually regulated by hydraulic elements such as lateral gates to provide a correct water supply. The dynamics of this kind of flow is governed by a system of partial differential equations named the shallow water model, which has to be complemented with a simplified formulation for the gates. The full set of equations forms a non-linear system that can only be solved numerically. Here, an explicit upwind finite-volume numerical scheme able to handle all types of flow regimes is used. The formulation of the hydraulic structures (lateral gates) introduces parameters with some uncertainty. Hence, these parameters are calibrated with a Monte Carlo algorithm, obtaining the coefficients associated with each gate. The calibration is then checked using real cases provided by the monitoring equipment of the Pina de Ebro channel, located in Zaragoza.
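Monte Carlo calibration of a gate coefficient can be reduced to a tiny sketch: sample candidate coefficients, score each against measured discharges, and keep the best. The gate law, geometry, and "measurements" below are hypothetical illustrations, not the paper's model or data.

```python
import numpy as np

rng = np.random.default_rng(11)
g = 9.81  # gravitational acceleration, m/s^2

def gate_flow(cd, width, opening, dh):
    """Illustrative lateral-gate discharge law: Q = cd * width * opening * sqrt(2 g dh)."""
    return cd * width * opening * np.sqrt(2.0 * g * dh)

# Synthetic "measurements" for one gate; cd_true is chosen for the demo only.
width, opening = 2.0, 0.5                        # gate geometry, m
dh = np.array([0.2, 0.35, 0.5, 0.8, 1.1])        # head differences, m
cd_true = 0.61
q_meas = gate_flow(cd_true, width, opening, dh) * (1 + rng.normal(0, 0.02, dh.size))

# Monte Carlo calibration: sample candidate coefficients, keep the best fit.
candidates = rng.uniform(0.3, 1.0, 5000)
errors = [float(np.mean((gate_flow(cd, width, opening, dh) - q_meas) ** 2))
          for cd in candidates]
cd_best = float(candidates[int(np.argmin(errors))])
```

In a real channel the scoring step would run the shallow-water solver with each candidate coefficient instead of the closed-form gate law.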
Algorithm and application of Monte Carlo simulation for multi-dispersive copolymerization system
Institute of Scientific and Technical Information of China (English)
凌君; 沈之荃; 陈万里
2002-01-01
A Monte Carlo algorithm has been established for a multi-dispersive copolymerization system, based on experimental data on copolymer molecular weight and dispersion obtained via GPC measurement. The program simulates the insertion of every monomer unit and records the structure and microscopic sequence of every chain at various lengths. It has been applied successfully to the ring-opening copolymerization of 2,2-dimethyltrimethylene carbonate (DTC) with δ-caprolactone (δ-CL). The simulation coincides with the experimental results and provides microscopic data on triad fractions, lengths of homopolymer segments, etc., which are difficult to obtain by experiment. The algorithm also provides a uniform framework for copolymerization studies under other complicated mechanisms.
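Monomer-by-monomer chain growth of this kind can be sketched with a terminal-model Monte Carlo, where the next unit is drawn from propagation probabilities set by reactivity ratios. The ratios and feed fraction below are illustrative placeholders, not values fitted to the DTC/δ-CL system.

```python
import numpy as np

rng = np.random.default_rng(5)

# Terminal-model chain growth: reactivity ratios r1, r2 and feed fraction f1
# are hypothetical, chosen only to demonstrate the bookkeeping.
r1, r2, f1 = 0.8, 1.3, 0.5
f2 = 1.0 - f1

def grow_chain(length):
    """Simulate one chain; each unit is monomer 1 or 2, chosen from the
    propagation probability conditioned on the terminal unit."""
    seq = [1 if rng.random() < f1 else 2]
    for _ in range(length - 1):
        if seq[-1] == 1:
            p1 = r1 * f1 / (r1 * f1 + f2)   # P(add 1 | terminal 1)
        else:
            p1 = f1 / (f1 + r2 * f2)        # P(add 1 | terminal 2)
        seq.append(1 if rng.random() < p1 else 2)
    return seq

n_chains, chain_len = 200, 200
chains = [grow_chain(chain_len) for _ in range(n_chains)]

# Composition and a triad fraction, accumulated from the stored sequences.
frac1 = sum(s.count(1) for s in chains) / (n_chains * chain_len)
n111 = sum(sum(1 for i in range(len(s) - 2) if s[i:i + 3] == [1, 1, 1])
           for s in chains)
f111 = n111 / (n_chains * (chain_len - 2))
```

Homopolymer segment-length statistics follow the same way, by scanning each stored sequence for runs of identical units.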
Monte Carlo Studies for the Calibration System of the GERDA Experiment
Baudis, Laura; Froborg, Francis; Tarka, Michal
2013-01-01
The GERmanium Detector Array, GERDA, searches for neutrinoless double beta decay in Ge-76 using bare high-purity germanium detectors submerged in liquid argon. For the calibration of these detectors gamma emitting sources have to be lowered from their parking position on top of the cryostat over more than five meters down to the germanium crystals. With the help of Monte Carlo simulations, the relevant parameters of the calibration system were determined. It was found that three Th-228 sources with an activity of 20 kBq each at two different vertical positions will be necessary to reach sufficient statistics in all detectors in less than four hours of calibration time. These sources will contribute to the background of the experiment with a total of (1.07 +/- 0.04(stat) +0.13 -0.19(sys)) 10^{-4} cts/(keV kg yr) when shielded from below with 6 cm of tantalum in the parking position.
Monte Carlo simulation of glandular dose in a dedicated breast CT system
Institute of Scientific and Technical Information of China (English)
TANG Xiao; WEI Long; ZHAO Wei; WANG Yan-Fang; SHU Hang; SUN Cui-Li; WEI Cun-Feng; CAO Da-Quan; QUE Jie-Min; SHI Rong-Jian
2012-01-01
A dedicated breast CT system (DBCT) is a new method for breast cancer detection proposed in recent years. In this paper, the glandular dose in the DBCT is simulated using the Monte Carlo method. The phantom shape is a half ellipsoid, and a series of phantoms with different sizes, shapes and compositions was constructed. In order to optimize the spectra, monoenergetic X-ray beams of 5-80 keV were used in the simulation. The dose distribution within a breast phantom was studied: a higher-energy beam generated a more uniform distribution, and the outer parts received more dose than the inner parts. For polyenergetic spectra, four spectra with Al filters of different thicknesses were simulated, and the polyenergetic glandular dose was calculated as a spectrally weighted combination of the monoenergetic doses.
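The spectral-weighting step in the abstract is just a normalized weighted sum over the monoenergetic results. The energy grid, dose table, and spectrum below are made-up placeholders, not the paper's simulated data.

```python
import numpy as np

# Polyenergetic glandular dose as a spectrum-weighted sum of monoenergetic
# dose values. All numbers are illustrative placeholders.
energies = np.arange(10, 45, 5)                                # keV grid
mono_dose = np.array([0.2, 0.45, 0.8, 1.0, 1.1, 1.15, 1.18])   # dose per photon (a.u.)
spectrum = np.array([0.0, 0.05, 0.2, 0.3, 0.25, 0.15, 0.05])   # relative fluence

weights = spectrum / spectrum.sum()          # normalize the spectrum
poly_dose = float((weights * mono_dose).sum())  # weighted combination
```

Changing the Al filter thickness in the simulation amounts to swapping in a different `spectrum` array and re-evaluating the same sum.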
A Parallel Monte Carlo Code for Simulating Collisional N-body Systems
Pattabiraman, Bharath; Liao, Wei-Keng; Choudhary, Alok; Kalogera, Vassiliki; Memik, Gokhan; Rasio, Frederic A
2012-01-01
We present a new parallel code for computing the dynamical evolution of collisional N-body systems with up to N~10^7 particles. Our code is based on the Hénon Monte Carlo method for solving the Fokker-Planck equation, and makes assumptions of spherical symmetry and dynamical equilibrium. The principal algorithmic developments involve optimizing data structures and introducing a parallel random number generation scheme, as well as a parallel sorting algorithm, required to find nearest neighbors for interactions and to compute the gravitational potential. The new algorithms we introduce, along with our choice of decomposition scheme, minimize communication costs and ensure optimal distribution of data and workload among the processing units. The implementation uses the Message Passing Interface (MPI) library for communication, which makes it portable to many different supercomputing architectures. We validate the code by calculating the evolution of clusters with initial Plummer distribution functi...
Continuous verification using multimodal biometrics.
Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep
2007-04-01
Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used, face and fingerprint, but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system.
Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.
2015-09-01
This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modelling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
Energy Technology Data Exchange (ETDEWEB)
L. M. Dittmer
2007-12-03
The 1607-F4 waste site is the former location of the sanitary sewer system that serviced the former 115-F Gas Recirculation Building. The system included a septic tank, drain field, and associated pipeline that were in use from 1944 to 1965. The 1607-F4 waste site received unknown amounts of sanitary sewage from the 115-F Gas Recirculation Building and may have contained hazardous and radioactive contamination. Based on this evaluation, the verification sampling results support a reclassification of this site to Interim Closed Out. The results of verification sampling demonstrated that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also showed that residual contaminant concentrations are protective of groundwater and the Columbia River.
Joseph, Shijo; Herold, Martin; Sunderlin, William D.; Verchot, Louis V.
2013-09-01
A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed.
HMM based Offline Signature Verification system using ContourletTransform and Textural features
Directory of Open Access Journals (Sweden)
K N PUSHPALATHA
2014-07-01
Full Text Available Handwritten signatures occupy a special place in the identification of an individual, and their verification is a challenging task because of the possible variations in the directions and shapes of the constituent strokes of written samples. In this paper we investigated an offline verification system based on the fusion of the contourlet transform and directional features, with a Hidden Markov Model (HMM) as classifier. The signature images of both the query and the database are preprocessed for noise removal, and a two-level contourlet transform is applied to obtain a feature vector. Textural features are computed and concatenated with the contourlet coefficients to form the final feature vector. The classification results are computed using the HTK tool with an HMM classifier. Experimental results on the GPDS-960 database images are reported in terms of the False Rejection Rate (FRR), False Acceptance Rate (FAR) and Total Success Rate (TSR). The results show that the FRR and FAR values are improved compared with the existing algorithm.
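The FRR, FAR and TSR metrics named in this abstract are defined over sets of genuine and forgery match scores at a decision threshold. The helper below is a generic illustration of those definitions; it does not reproduce the paper's HMM scoring or the GPDS-960 evaluation protocol, and the function name and score convention (higher = more similar) are assumptions.

```python
def far_frr_tsr(genuine_scores, forgery_scores, threshold):
    """Compute False Acceptance Rate, False Rejection Rate, and Total
    Success Rate at a given score threshold.

    Generic textbook definitions, shown for illustration only."""
    fa = sum(s >= threshold for s in forgery_scores)   # forgeries accepted
    fr = sum(s < threshold for s in genuine_scores)    # genuines rejected
    far = fa / len(forgery_scores)
    frr = fr / len(genuine_scores)
    n = len(genuine_scores) + len(forgery_scores)
    tsr = 1.0 - (fa + fr) / n                          # overall correctness
    return far, frr, tsr
```

Sweeping the threshold trades FAR against FRR, which is how systems like this one are typically tuned.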
Verification of the computational dosimetry system in JAERI (JCDS) for boron neutron capture therapy
Kumada, H.; Yamamoto, K.; Matsumura, A.; Yamamoto, T.; Nakagawa, Y.; Nakai, K.; Kageji, T.
2004-08-01
Clinical trials for boron neutron capture therapy (BNCT) using the medical irradiation facility installed in Japan Research Reactor No. 4 (JRR-4) at the Japan Atomic Energy Research Institute (JAERI) have been performed since 1999. To carry out the BNCT procedure based on proper treatment planning and its precise implementation, the JAERI computational dosimetry system (JCDS), which is applicable to dose planning, has been developed at JAERI. The aim of this study was to verify the performance of JCDS. Experimental data from a cylindrical water phantom were compared with calculation results using JCDS. Measurement data obtained from IOBNCT cases at JRR-4 were also compared with retrospective evaluation data from JCDS. In the phantom comparisons, the calculations and the measurements of thermal neutron flux and gamma-ray dose were in good agreement, except at the surface of the phantom. Against the measurements of clinical cases, the discrepancy of the JCDS calculations was approximately 10%. These basic and clinical verifications demonstrated that JCDS has sufficient performance for BNCT dosimetry. Further investigations are recommended for precise dose distribution and a faster calculation environment.
Directory of Open Access Journals (Sweden)
Kärenlampi Sirpa O
2007-02-01
Full Text Available Abstract Background Strawberry (Fragaria × ananassa) is an economically important soft fruit crop with a polyploid genome, which complicates the breeding of new cultivars. For certain traits, genetic engineering offers a potential alternative to traditional breeding. However, many strawberry varieties are quite recalcitrant to Agrobacterium-mediated transformation, and a method allowing easy handling of large amounts of starting material is needed. The genotyping of putative transformants is also challenging, since the isolation of DNA for Southern analysis is difficult due to the high amount of phenolic compounds and polysaccharides that complicate efficient extraction of digestible DNA. There is thus a need for a screening method that is sensitive and unambiguous in identifying the different transformation events. Results Hygromycin-resistant strawberries were developed in temporary immersion bioreactors by Agrobacterium-mediated gene transfer. Putative transformants were screened by TAIL-PCR to verify T-DNA integration and to distinguish between the individual transformation events. Several different types of border sequence arrangements were detected. Conclusion This study demonstrates that the temporary immersion bioreactor system is well suited to the regeneration of transgenic strawberry plants as a labour-efficient technique. The small amount of DNA required by TAIL-PCR is easily recovered even from a small transformant, which allows rapid verification of T-DNA integration and detection of separate gene transfer events. These techniques combined clearly facilitate the generation of transgenic strawberries but should be applicable to other plants as well.
Design Concepts of Re-verification System on the MACSTOR/KN-400
Energy Technology Data Exchange (ETDEWEB)
Ahn, Gil Hoon; Park, Il Jin; Shin, Dong Hoon; Koh, Moon Sung [Korea Institute of Nuclear nonproliferation And Control, Daejeon (Korea, Republic of)
2009-05-15
The MACSTOR/KN-400 module is based on the MACSTOR-200 design but has twice the capacity and thus twice the number of storage cylinders. In all, the new module contains 40 dry fuel storage cylinders, each of which houses 10 spent fuel baskets. The storage cylinders are arranged in 4 rows of 10, with 24 located close to the periphery of the module and 16 located internally at some distance from the peripheral walls. Re-verification is an IAEA safeguards requirement to measure the gamma dose rate and spectrum of each irradiated fuel basket once the storage cylinders are filled with spent fuel. This is required to monitor the presence of spent fuel in the storage cylinders. To achieve this on the existing MACSTOR-200, a re-verification tube, running inside the module walls, is provided for each storage cylinder. The gamma profile is read by lowering a detector inside the tube so that it can be registered at the level of each basket. For the 24 peripheral storage cylinders this method of measurement is retained on the MACSTOR/KN-400 module. However, an alternate method is required for the 16 internal dry fuel storage cylinders, since they are located some distance from the module walls and are thus surrounded by storage cylinders. The focus of this paper is to describe a new re-verification system that can be used to measure the gamma profile of each cylinder.
Development of An Automatic Verification Program for Thermal-hydraulic System Codes
Energy Technology Data Exchange (ETDEWEB)
Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)
2012-05-15
As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and the human resources and to prevent wasted time, non-regression tests need to be automated. As a tool to develop the automatic verification program, we have used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).
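The core of a non-regression test like the one this abstract describes is a comparison of current code outputs against stored baseline results within a tolerance. The sketch below shows that idea in a few lines; the actual tool in the paper is an Excel/VBA application, so the JSON file format, function name, and tolerance here are assumptions of this example.

```python
import json
import math

def non_regression_check(baseline_path, current_results, rel_tol=1e-6):
    """Compare current code outputs against stored baseline values.

    Minimal sketch of the non-regression idea; returns the list of
    failing cases as (case, baseline_value, current_value) tuples."""
    with open(baseline_path) as f:
        baseline = json.load(f)  # {case_name: baseline numeric result}
    failures = []
    for case, ref in baseline.items():
        cur = current_results.get(case)
        if cur is None or not math.isclose(cur, ref, rel_tol=rel_tol):
            failures.append((case, ref, cur))
    return failures  # an empty list means no regression was detected
```

Run after every code change, an empty failure list confirms that existing functionality is preserved, which is exactly the repetitive check the paper automates.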
Energy Technology Data Exchange (ETDEWEB)
Blazy-Aubignac, L
2007-09-15
The treatment planning system (TPS) occupies a key position in the radiotherapy department: it performs the predictive calculation of the dose distribution and the treatment duration. Traditionally, the quality control of the calculated dose distributions relies on comparing them with dose distributions measured on the treatment machine. This thesis proposes to replace these dosimetric measurements with reference dosimetric calculations obtained with the PENELOPE Monte Carlo code. Monte Carlo simulations offer a broad choice of test configurations and make it possible to envisage a quality control of the dosimetric aspects of a TPS without monopolizing the treatment machines. This quality control, based on Monte Carlo simulations, has been tested on a clinical TPS and has made it possible to simplify the TPS quality procedures. This more thorough, more precise and simpler quality control could be generalized to every radiotherapy centre. (N.C.)
Energy Technology Data Exchange (ETDEWEB)
Muller, A.
1994-11-01
It is shown that elementary Quantum Mechanics, applied to the K⁰ K̄⁰ system, predicts peculiar long-range EPR correlations. Possible experimental verifications are discussed, and a concrete experiment with anti-proton annihilations at rest is proposed. A pedestrian approach to local models shows that K⁰ K̄⁰ experimentation could provide arguments to the local realism versus quantum theory controversy. (author). 17 refs., 23 figs.
Energy Technology Data Exchange (ETDEWEB)
H. B. HUNT; D. J. ROSENKRANTZ; ET AL
2001-03-01
We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems II, that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (i) reductions Of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (ii) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (iii) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPS, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly-specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for
Energy Technology Data Exchange (ETDEWEB)
Hunt, H. B. (Harry B.); Rosenkrantz, D. J. (Daniel J.); Barrett, C. L. (Christopher L.); Marathe, M. V. (Madhav V.); Ravi, S. S. (Sekharipuram S.)
2001-01-01
We identify several simple but powerful concepts, techniques, and results; and we use them to characterize the complexities of a number of basic problems II, that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (1) reductions Of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (2) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (3) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include: all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for
Energy Technology Data Exchange (ETDEWEB)
Valentine, T.; Perez, R. [Oak Ridge National Lab., TN (United States); Rugama, Y.; Munoz-Cobo, J.L. [Poly. Tech. Univ. of Valencia (Spain). Chemical and Nuclear Engineering Dept.
2001-07-01
The design of reactivity monitoring systems for accelerator-driven systems must be investigated to ensure that such systems remain subcritical during operation. The Monte Carlo codes LAHET and MCNP-DSP were combined together to facilitate the design of reactivity monitoring systems. The coupling of LAHET and MCNP-DSP provides a tool that can be used to simulate a variety of subcritical measurements such as the pulsed neutron, Rossi-α, or noise analysis measurements. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Valentine, T.E.; Rugama, Y.; Munoz-Cobo, J.; Perez, R.
2000-10-23
The design of reactivity monitoring systems for accelerator-driven systems must be investigated to ensure that such systems remain subcritical during operation. The Monte Carlo codes LAHET and MCNP-DSP were combined together to facilitate the design of reactivity monitoring systems. The coupling of LAHET and MCNP-DSP provides a tool that can be used to simulate a variety of subcritical measurements such as the pulsed neutron, Rossi-α, or noise analysis measurements.
Energy Technology Data Exchange (ETDEWEB)
Mendonca, Pedro Henrique; Costa, Marcelo M. da; Dahlke, Diogo B.; Ikeda, Minoru [LACTEC - Instituto de Tecnologia para o Desenvolvimento, Curitiba, PR (Brazil)], Emails: pedro.henrique@lactec.org.br, arinos@lactec.org.br, diogo@lactec.org.br, minoru@lactec.org.br, Celso.melo@copel.com; Carvalho, Joao Claudio D. de [ELETRONORTE, Belem, PR (Brazil)], E-mail: marcelo.melo@eln.gov.br; Teixeira Junior, Jose Arinos [ELETROSUL, Florianopolis, SC (Brazil)], E-mail: jclaudio@eletrosul.gov.br; Melo, Celso F. [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)], E-mail: Celso.melo@copel.com
2009-07-01
This work presents an alternative approach to performing the calibration of conventional current transformers in the field, using a verification system composed of an optical current transformer as a reference standard, suitable for installation on extra-high-voltage buses.
Implementation of the probability table method in a continuous-energy Monte Carlo code system
Energy Technology Data Exchange (ETDEWEB)
Sutton, T.M.; Brown, F.B. [Lockheed Martin Corp., Schenectady, NY (United States)
1998-10-01
RACER is a particle-transport Monte Carlo code that utilizes a continuous-energy treatment for neutrons and neutron cross section data. Until recently, neutron cross sections in the unresolved resonance range (URR) have been treated in RACER using smooth, dilute-average representations. This paper describes how RACER has been modified to use probability tables to treat cross sections in the URR, and the computer codes that have been developed to compute the tables from the unresolved resonance parameters contained in ENDF/B data files. A companion paper presents results of Monte Carlo calculations that demonstrate the effect of the use of probability tables versus the use of dilute-average cross sections for the URR. The next section provides a brief review of the probability table method as implemented in the RACER system. The production of the probability tables for use by RACER takes place in two steps. The first step is the generation of probability tables from the nuclear parameters contained in the ENDF/B data files. This step, and the code written to perform it, are described in Section 3. The tables produced are at energy points determined by the ENDF/B parameters and/or accuracy considerations. The tables actually used in the RACER calculations are obtained in the second step from those produced in the first. These tables are generated at energy points specific to the RACER calculation. Section 4 describes this step and the code written to implement it, as well as modifications made to RACER to enable it to use the tables. Finally, some results and conclusions are presented in Section 5.
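The probability table method reviewed in this abstract represents the unresolved-resonance-range cross section at a given energy as a set of probability bands, from which a value is sampled each time a neutron's cross section is needed. The sketch below shows that sampling step schematically; the table layout, band values, and function name are assumptions of this example, not RACER's actual data structures.

```python
import bisect
import random

def sample_cross_section(prob_table, rng=random):
    """Sample a cross-section value from a probability table.

    prob_table: list of (cumulative_probability, cross_section) bands
    with cumulative probabilities increasing to 1.0. Schematic
    illustration of the probability-table idea only."""
    xi = rng.random()                       # uniform variate in [0, 1)
    cdf = [p for p, _ in prob_table]
    i = bisect.bisect_left(cdf, xi)         # find the band containing xi
    return prob_table[min(i, len(prob_table) - 1)][1]

# Hypothetical three-band table (values in barns, made up for illustration):
table = [(0.2, 3.1), (0.7, 12.4), (1.0, 45.0)]
```

Sampling from such tables, rather than using a dilute-average value, preserves the self-shielding effect of the unresolved resonances in the Monte Carlo tally statistics.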
Apel, Sven; Wendler, Philipp; von Rhein, Alexander; Beyer, Dirk
2011-01-01
A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions, situations in which the combination of features leads to emergent and possibly critical behavior, are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feat...
Energy Technology Data Exchange (ETDEWEB)
Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)
2016-05-15
Highlights: • An example on life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in and ensure the reliability of FPGA-based nuclear safety systems, disciplined life-cycle processes for specification and design implementation, together with verification and validation (V&V), are needed. A specific example of how to conduct the life cycle development process and V&V on an FPGA-based core heat removal (CHR) protection system for a CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for the PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and to provide input data to the under-test FPGA-based CHR protection system and a verified C-code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would serve as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.
Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...
Institute of Scientific and Technical Information of China (English)
Jiang Wei; Xiang Haige
2004-01-01
This paper addresses the issues of channel estimation in a Multiple-Input/Multiple-Output (MIMO) system. A Markov chain Monte Carlo (MCMC) method is employed to jointly estimate the Channel State Information (CSI) and the transmitted signals. The derived algorithms work well under circumstances of low Signal-to-Noise Ratio (SNR). Simulation results are presented to demonstrate their effectiveness.
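The MCMC idea in this abstract can be shown on a deliberately tiny problem: estimating a single scalar channel coefficient h from observations y = h·x + noise with a Metropolis random walk. This is a toy illustration only; the paper treats full MIMO channel matrices and joint symbol detection, and the flat prior, step size, and function name here are assumptions of this sketch.

```python
import math
import random
import statistics

def mh_channel_estimate(x, y, noise_var=0.1, n_iter=4000, step=0.1, seed=1):
    """Metropolis sampler for a scalar channel h in y = h*x + noise.

    Toy sketch of MCMC channel estimation with a flat prior on h;
    returns the posterior-mean estimate from the second half of the chain."""
    rng = random.Random(seed)

    def log_like(h):
        # Gaussian log-likelihood of the observations, up to a constant
        return -sum((yi - h * xi) ** 2 for xi, yi in zip(x, y)) / (2.0 * noise_var)

    h = 0.0
    samples = []
    for _ in range(n_iter):
        cand = h + rng.gauss(0.0, step)          # random-walk proposal
        dl = log_like(cand) - log_like(h)
        if dl >= 0 or rng.random() < math.exp(dl):
            h = cand                              # Metropolis accept step
        samples.append(h)
    return statistics.mean(samples[n_iter // 2:])  # discard burn-in half
```

The same accept/reject machinery, extended to matrices and discrete symbol values, is what lets MCMC receivers estimate CSI and data jointly at low SNR.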
ROESSEL, ROBERT A., JR.
The first section of this book covers the historical and cultural background of the San Carlos Apache Indians, as well as an historical sketch of the development of their formal educational system. The second section is devoted to the problems of teachers of the Indian children in Globe and San Carlos, Arizona. It is divided into three parts--(1)…
Determination of phase equilibria in confined systems by open pore cell Monte Carlo method.
Miyahara, Minoru T; Tanaka, Hideki
2013-02-28
We present a modification of the molecular dynamics simulation method with a unit pore cell with imaginary gas phase [M. Miyahara, T. Yoshioka, and M. Okazaki, J. Chem. Phys. 106, 8124 (1997)] designed for determination of phase equilibria in nanopores. This new method is based on a Monte Carlo technique and it combines the pore cell, opened to the imaginary gas phase (open pore cell), with a gas cell to measure the equilibrium chemical potential of the confined system. The most striking feature of our new method is that the confined system is steadily led to a thermodynamically stable state by forming concave menisci in the open pore cell. This feature of the open pore cell makes it possible to obtain the equilibrium chemical potential with only a single simulation run, unlike existing simulation methods, which need a number of additional runs. We apply the method to evaluate the equilibrium chemical potentials of confined nitrogen in carbon slit pores and silica cylindrical pores at 77 K, and show that the results are in good agreement with those obtained by two conventional thermodynamic integration methods. Moreover, we also show that the proposed method can be particularly useful for determining vapor-liquid and vapor-solid coexistence curves and the triple point of the confined system.
Directory of Open Access Journals (Sweden)
Nadeem AKHTAR
2014-12-01
Full Text Available This paper presents an approach based on the analysis, design, and formal verification of a multi-agent based university Information Management System (IMS). The university IMS accesses information, creates reports and facilitates teachers as well as students. An orchestrator agent manages the coordination between all agents. It also manages the database connectivity for the whole system. The proposed IMS is based on the BDI agent architecture, which models the system in terms of beliefs, desires, and intentions. The correctness properties of safety and liveness are specified in first-order predicate logic.
Mokhov, Serguei A
2009-01-01
This paper introduces a novel concept of self-forensics to complement the standard autonomic self-CHOP properties of self-managed systems, to be specified in the Forensic Lucid language. We argue that self-forensics, with the forensics taken out of the cybercrime domain, is applicable to "self-dissection" for the purpose of verification of autonomous software and hardware systems of flight-critical systems for automated incident and anomaly analysis and event reconstruction by the engineering teams in a variety of incident scenarios during design and testing as well as actual flight data.
Directory of Open Access Journals (Sweden)
Antonio da Silva
2014-01-01
Full Text Available This paper presents the design of a SystemC transaction level modelling wrapping library that can be used for the assertion of system properties, protocol compliance, or fault injection. The library uses C++ virtual table hooks as a dynamic binary instrumentation technique to inline wrappers in the TLM2 transaction path. This technique can be applied after the elaboration phase and needs neither source code modifications nor recompilation of the top level SystemC modules. The proposed technique has been successfully applied to the robustness verification of the on-board boot software of the Instrument Control Unit of the Solar Orbiter’s Energetic Particle Detector.
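The instrumentation idea in this abstract, interposing wrappers in a transaction path without modifying or recompiling the target modules, has a simple dynamic-language analogue. The sketch below wraps a method on a live object so that checks (or fault injection) run before and after each call; it is only an analogue of the technique, since the real library hooks C++ virtual tables in SystemC/TLM2 modules, and the names here are hypothetical.

```python
import functools

def wrap_transport(obj, method_name, pre=None, post=None):
    """Dynamically interpose a wrapper around a transaction method.

    Python analogue of vtable-hook instrumentation (illustrative only):
    no source changes to the target class are needed."""
    original = getattr(obj, method_name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        if pre:
            pre(args, kwargs)      # e.g. assert protocol properties
        result = original(*args, **kwargs)
        if post:
            post(result)           # e.g. check, log, or corrupt the response
        return result

    setattr(obj, method_name, wrapper)
```

Because the wrapper is installed on the instance after construction, it mirrors the paper's ability to instrument after the elaboration phase while leaving the top-level modules untouched.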
Energy Technology Data Exchange (ETDEWEB)
Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)
1995-03-01
This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.
Energy Technology Data Exchange (ETDEWEB)
Zhang, Y; Yang, J; Liu, H [Cangzhou People' s Hospital, Cangzhou, Hebei (China); Liu, D [The Fourth Hospital of Hebei Medical University, Shijiazhuang, Hebei (China)
2014-06-01
Purpose: The purpose of this work is to compare the verification results of three solutions (2D and 3D ionization chamber array measurements and Monte Carlo simulation); the results will help make a clinical decision as to how to perform our cervical IMRT verification. Methods: Seven cervical cases were planned with Pinnacle 8.0m to meet the clinical acceptance criteria. The plans were recalculated in the Matrixx and Delta4 phantoms with the actual plan parameters. The plans were also recalculated by Monte Carlo using the leaf sequences and MUs of the individual plans of every patient, in both the Matrixx and Delta4 phantoms. All Matrixx and Delta4 phantom plans were delivered and measured. The dose distribution of the isocenter slice, the dose profiles, and the gamma maps of every beam were used to evaluate the agreement. Dose-volume histograms were also compared. Results: The dose distribution of the isocenter slice and the dose profiles from the Pinnacle calculation were in agreement with the Monte Carlo simulation and with the Matrixx and Delta4 measurements. A 95.2%/91.3% gamma pass ratio was obtained between the Matrixx/Delta4 measurements and the Pinnacle distributions within the 3 mm/3% gamma criterion. A 96.4%/95.6% gamma pass ratio was obtained between the Matrixx/Delta4 measurements and the Monte Carlo simulation within the 2 mm/2% gamma criterion, and an almost 100% pass ratio within the 3 mm/3% criterion. The DVH plots showed slight differences between Pinnacle and the Delta4 measurement, as well as between Pinnacle and the Monte Carlo simulation, but excellent agreement between the Delta4 measurement and the Monte Carlo simulation. Conclusion: It was shown that Matrixx/Delta4 and Monte Carlo simulation can be used very efficiently to verify cervical IMRT delivery. In terms of gamma value, the pass ratio of Matrixx was a little higher; however, Delta4 revealed more problem fields. The primary advantage of Delta4 is that it can measure true 3D dose distributions, while Monte Carlo can simulate dose in the patient CT images rather than only in a phantom.
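The 3 mm/3% and 2 mm/2% pass ratios quoted above come from the gamma index, which combines a dose-difference test with a distance-to-agreement test. A minimal 1D sketch (not the authors' implementation; the dose arrays and global normalization below are illustrative):

```python
import numpy as np

def gamma_index(ref, ev, spacing_mm, dta_mm=3.0, dd_percent=3.0):
    """Simplified 1D global gamma index: for each reference point, find the
    minimum combined dose-difference / distance-to-agreement metric over the
    evaluated distribution. gamma <= 1 means the point passes."""
    norm = ref.max()                       # global normalization dose
    x = np.arange(len(ref)) * spacing_mm   # positions on a uniform grid
    gammas = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((ev - di) / (dd_percent / 100.0 * norm)) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

# toy profiles: a small peak, evaluated against a slightly perturbed copy
ref = np.array([0.0, 0.5, 1.0, 0.5, 0.0])
ev = np.array([0.0, 0.52, 1.01, 0.49, 0.0])
g = gamma_index(ref, ev, spacing_mm=1.0)
pass_rate = (g <= 1.0).mean() * 100.0      # percent of points with gamma <= 1
```

A 2D or 3D version iterates the same minimization over a volume; clinical tools additionally interpolate the evaluated dose between grid points.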
Energy Technology Data Exchange (ETDEWEB)
Granero, D.; Blasco, J. M.; Sanchis, E.; Gonzalez, V.; Martin, J. D.; Ballester, F.; Sanchis, E.
2013-07-01
The purpose of this work is to test the response of a system composed of 21 scintillating radiation fibres and its electronics as a proof of the validity of the system. To this end, the test system was irradiated with a Sr-90 verification source. In addition, Monte Carlo simulations of the system were performed and their results compared with those obtained experimentally. Moreover, an approximation to the behaviour of a hodoscope composed of 100 scintillating fibres, arranged transversely to one another, was studied for proton therapy by carrying out different Monte Carlo simulations. (Author)
Monte Carlo simulations of morphological transitions in PbTe/CdTe immiscible material systems
Mińkowski, Marcin; Załuska-Kotur, Magdalena A.; Turski, Łukasz A.; Karczewski, Grzegorz
2016-09-01
The crystal growth of the immiscible PbTe/CdTe multilayer system is analyzed as an example of a self-organizing process. The immiscibility of the constituents leads to the observed morphological transformations, such as the anisotropy-driven formation of quantum dots and nanowires, and to phase separation at the highest temperatures. The proposed model combines bulk and surface diffusion with an anisotropic mobility of the material components. We analyze its properties by kinetic Monte Carlo simulations and show that it is able to reproduce all of the structures observed experimentally during PbTe/CdTe growth. We show that all of the dynamical processes studied play an important role in the creation of zero-, one-, two-, and, finally, three-dimensional structures. The shape of the grown structures is different for relatively thick multilayers, when bulk diffusion cooperates with the anisotropic mobility, as compared to annealed structures, for which only isotropic bulk diffusion governs the process. Finally, it is different again for thin multilayers, where surface diffusion is the most decisive factor. We compare our results with the experimentally grown systems and show that the proposed model explains the diversity of observed structures.
Kinetic Monte Carlo and cellular particle dynamics simulations of multicellular systems
Flenner, Elijah; Janosi, Lorant; Barz, Bogdan; Neagu, Adrian; Forgacs, Gabor; Kosztin, Ioan
2012-03-01
Computer modeling of multicellular systems has been a valuable tool for interpreting and guiding in vitro experiments relevant to embryonic morphogenesis, tumor growth, angiogenesis and, lately, structure formation following the printing of cell aggregates as bioink particles. Here we formulate two computer simulation methods: (1) a kinetic Monte Carlo (KMC) and (2) a cellular particle dynamics (CPD) method, which are capable of describing and predicting the shape evolution in time of three-dimensional multicellular systems during their biomechanical relaxation. Our work is motivated by the need of developing quantitative methods for optimizing postprinting structure formation in bioprinting-assisted tissue engineering. The KMC and CPD model parameters are determined and calibrated by using an original computational-theoretical-experimental framework applied to the fusion of two spherical cell aggregates. The two methods are used to predict the (1) formation of a toroidal structure through fusion of spherical aggregates and (2) cell sorting within an aggregate formed by two types of cells with different adhesivities.
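The kinetic Monte Carlo method named above advances a system event by event: each possible event fires with probability proportional to its rate, and the simulation clock advances by an exponentially distributed waiting time set by the total rate. A generic rejection-free (Gillespie/BKL-style) step, as a sketch with made-up rates, unrelated to the authors' cellular models:

```python
import math
import random

def kmc_step(rates, rng):
    """One rejection-free kinetic Monte Carlo step: choose an event with
    probability proportional to its rate, then advance the clock by an
    exponential waiting time with mean 1/total_rate."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    chosen = len(rates) - 1            # fallback guards float round-off
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            chosen = i
            break
    dt = -math.log(1.0 - rng.random()) / total   # 1 - u avoids log(0)
    return chosen, dt

rng = random.Random(0)
t, picks = 0.0, []
for _ in range(1000):
    i, dt = kmc_step([1.0, 3.0], rng)  # second event is three times as likely
    picks.append(i)
    t += dt
frac_fast = picks.count(1) / len(picks)  # should approach 3/4
```

Real KMC codes differ mainly in bookkeeping: after each event only the affected rates are updated, often with tree or binned searches instead of the linear scan above.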
USING PERFLUOROCARBON TRACERS FOR VERIFICATION OF CAP AND COVER SYSTEMS PERFORMANCE.
Energy Technology Data Exchange (ETDEWEB)
HEISER,J.; SULLIVAN,T.
2001-11-01
The Department of Energy (DOE) Environmental Management (EM) office has committed itself to an accelerated cleanup of its national facilities. The goal is to have much of the DOE legacy waste sites remediated by 2006. This includes closure of several sites (e.g., Rocky Flats and Fernald). With the increased focus on accelerated cleanup, there has been considerable concern about long-term stewardship issues in general, and verification and long-term monitoring (LTM) of caps and covers, in particular. Cap and cover systems (covers) are vital remedial options that will be extensively used in meeting these 2006 cleanup goals. Every buried waste site within the DOE complex will require some form of cover system. These covers are expected to last from 100 to 1000 years or more. The stakeholders can be expected to focus on system durability and sustained performance. DOE EM has set up a national committee of experts to develop a long-term capping (LTC) guidance document. Covers are subject to subsidence, erosion, desiccation, animal intrusion, plant root infiltration, etc., all of which will affect the overall performance of the cover. Very little is available in terms of long-term monitoring other than downstream groundwater or surface water monitoring. By its very nature, this can only indicate that failure of the cover system has already occurred and contaminants have been transported away from the site. This is unacceptable. Methods that indicate early cover failure (prior to contaminant release) or predict approaching cover failure are needed. The LTC committee has identified predictive monitoring technologies as a high priority need for DOE, both for new covers as well as existing covers. The same committee identified a Brookhaven National Laboratory (BNL) technology as one approach that may be capable of meeting the requirements for LTM. The Environmental Research and Technology Division (ERTD) at BNL developed a novel methodology for verifying and monitoring
Nievaart, V. A.; Daquino, G. G.; Moss, R. L.
2007-06-01
Boron Neutron Capture Therapy (BNCT) is a bimodal form of radiotherapy for the treatment of tumour lesions. Since the cancer cells in the treatment volume are targeted with 10B, a higher dose is given to these cancer cells due to the 10B(n,α)7Li reaction, in comparison with the surrounding healthy cells. In Petten (The Netherlands), at the High Flux Reactor, a specially tailored neutron beam has been designed and installed. Over 30 patients have been treated with BNCT in 2 clinical protocols: a phase I study for the treatment of glioblastoma multiforme and a phase II study on the treatment of malignant melanoma. Furthermore, activities concerning the extracorporeal treatment of metastases in the liver (from colorectal cancer) are in progress. The irradiation beam at the HFR contains both neutrons and gammas, which, together with the complex geometries of both patient and beam set-up, demand very detailed treatment planning calculations. A well designed Treatment Planning System (TPS) should obey the following general scheme: (1) a pre-processing phase (CT and/or MRI scans to create the geometric solid model, cross-section files for neutrons and/or gammas); (2) calculations (3D radiation transport, estimation of neutron and gamma fluences, macroscopic and microscopic dose); (3) a post-processing phase (display of the results, iso-doses and iso-fluences). Treatment planning in BNCT is performed using Monte Carlo codes incorporated in a framework that also includes the pre- and post-processing phases. In particular, the glioblastoma multiforme protocol used BNCT_rtpe, while the melanoma metastases protocol uses NCTPlan. In addition, an ad hoc Positron Emission Tomography (PET) based treatment planning system (BDTPS) has been implemented in order to integrate the real macroscopic boron distribution obtained from PET scanning. BDTPS is patented and uses MCNP as the calculation engine. The precision obtained by the Monte Carlo based TPSs exploited at Petten
Experimental verification of the steady-state behavior of a beam system with discontinuous support
Vorst, E.L.B. van de; Assinck, F.H.; Kraker, A. de; Fey, R.H.B.; Campen, D.H. van
1996-01-01
This article deals with the experimental verification of the long-term behavior of a periodically excited linear beam supported by a one-sided spring. Numerical analysis of the beam showed subharmonic, quasi-periodic, and chaotic behavior. Further, three different routes leading to chaos were found.
A Particle System for Safety Verification of Free Flight in Air Traffic
Blom, H.A.P.; Krystul, J.; Bakker, G.J.
2006-01-01
Under free flight, an aircrew has both the freedom to select their trajectory and the responsibility of resolving conflicts with other aircraft. The general belief is that free flight can be made safe under low traffic conditions. Increasing traffic, however, raises safety verification issues. This
Verification of COMDES-II Systems Using UPPAAL with Model Transformation
DEFF Research Database (Denmark)
Xu, Ke; Pettersson, Paul; Sierszecki, Krzysztof
2008-01-01
in a timed multitasking environment, modal continuous operation combining reactive control behavior with continuous data processing, etc., by following the principle of separation-of-concerns. In the paper we present a transformational approach to the formal verification of both timing and reactive behaviors...
Verification of Large State/Event Systems using Compositionality and Dependency Analysis
DEFF Research Database (Denmark)
Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik
2001-01-01
possible automated verification of large industrial designs with the use of only modest resources (less than 5 minutes on a standard PC for a model with 1421 concurrent machines). The results of the paper are being implemented in the next version of the commercial tool visualSTATETM....
The Environmental Technology Verification report discusses the technology and performance of the AFP30 air filter for dust and bioaerosol filtration manufactured by Airflow Products. The pressure drop across the filter was 62 Pa clean and 247 Pa dust loaded. The filtration effici...
Introduction to the Special Issue on Specification Analysis and Verification of Reactive Systems
Delzanno, Giorgio; Etalle, Sandro; Gabbrielli, Maurizio
2006-01-01
This special issue is inspired by the homonymous ICLP workshops that took place during ICLP 2001 and ICLP 2002. Extending and shifting slightly from the scope of their predecessors (on verification and logic languages) held in the context of previous editions of ICLP, the aim of the SAVE workshops w
1981-04-30
approach during R ,ET development is required during the verification effort. The approach used for verifying the MOM 3FER to prepare for TD . X was to...the value resident in TRACK NR is equal to the value resident in TRACK NR ’N. The portion of the VMH requirement described above requires tnat the
Energy Technology Data Exchange (ETDEWEB)
Andrews, A. (Argonne National Lab., IL (USA). Energy Systems Div.); Formento, J.W.; Hill, L.G.; Riemer, C.A. (Argonne National Lab., IL (USA). Environmental Assessment and Information Sciences Div.)
1990-01-01
Argonne National Laboratory (ANL) studied the role of verification, validation, and testing (VV&T) in the Department of Veterans Affairs (VA) automated data processing (ADP) system development life cycle (SDLC). In this study, ANL reviewed and compared standard VV&T practices in the private and government sectors with those in the VA. The methodology included extensive interviews with, and surveys of, users, analysts, and staff in the Systems Development Division (SDD) and Systems Verification and Testing Division (SV&TD) of the VA, as well as representatives of private and government organizations, and a review of ADP standards. The study revealed that VA's approach to VV&T already incorporates some industry practices -- in particular, the use of an independent organization that relies on the acceptability of test results to validate a software system. Argonne recommends that the role of SV&TD be limited to validation and acceptance testing (defined as formal testing conducted independently to determine whether a software system satisfies its acceptance criteria). It also recommends that the role of the SDD be expanded to include verification testing (defined as formal testing or re-evaluation conducted by the developer to determine whether a software development satisfies design criteria). Integrated systems testing should be performed by Operations in a production-like environment under stressful situations to assess how trouble-free and acceptable the software is to the end user. A separate, independent quality-assurance group should be responsible for ADP auditing and for helping to establish policies for managing software configurations, and should report directly to the VA central office. Finally, and of no less importance, an in-house training program and procedures manual should be instituted for the entire SDLC for all involved staff; it should incorporate or reference ADP standards.
On-the-fly nuclear data processing methods for Monte Carlo simulations of fast spectrum systems
Energy Technology Data Exchange (ETDEWEB)
Walsh, Jon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-08-31
The presentation summarizes work performed over summer 2015 related to Monte Carlo simulations. A flexible probability table interpolation scheme has been implemented and tested with results comparing favorably to the continuous phase-space on-the-fly approach.
Systems, methods and apparatus for verification of knowledge-based systems
Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.
Architecture Support for Runtime Integration and Verification of Component-based Systems of Systems
Gonzalez, A.; Piel, E.; Gross, H.G.
2008-01-01
Preprint of paper published in: ASE 2008 - 23rd IEEE/ACM International Conference on Automated Software Engineering, 15-19 September 2008; doi:10.1109/ASEW.2008.4686292 Systems-of-Systems (SoS) represent a novel kind of system, for which runtime evolution is a key requirement, as components join an
Scemama, Anthony; Caffarel, Michel; Oseret, Emmanuel; Jalby, William
2013-04-30
Various strategies to efficiently implement quantum Monte Carlo (QMC) simulations for large chemical systems are presented. These include: (i) the introduction of an efficient algorithm to calculate the computationally expensive Slater matrices. This novel scheme is based on the use of the highly localized character of atomic Gaussian basis functions (not the molecular orbitals as usually done), (ii) the possibility of keeping the memory footprint minimal, (iii) the important enhancement of single-core performance when efficient optimization tools are used, and (iv) the definition of a universal, dynamic, fault-tolerant, and load-balanced framework adapted to all kinds of computational platforms (massively parallel machines, clusters, or distributed grids). These strategies have been implemented in the QMC=Chem code developed at Toulouse and illustrated with numerical applications on small peptides of increasing sizes (158, 434, 1056, and 1731 electrons). Using 10,000-80,000 computing cores of the Curie machine (GENCI-TGCC-CEA, France), QMC=Chem has been shown to be capable of running at the petascale level, thus demonstrating that for this machine a large part of the peak performance can be achieved. Implementation of large-scale QMC simulations for future exascale platforms with a comparable level of efficiency is expected to be feasible.
A spectral analysis of the domain decomposed Monte Carlo method for linear systems
Energy Technology Data Exchange (ETDEWEB)
Slattery, Stuart R., E-mail: slatterysr@ornl.gov [Oak Ridge National Laboratory, 1 Bethel Valley Road, Oak Ridge, TN 37831 (United States); Evans, Thomas M., E-mail: evanstm@ornl.gov [Oak Ridge National Laboratory, 1 Bethel Valley Road, Oak Ridge, TN 37831 (United States); Wilson, Paul P.H., E-mail: wilsonp@engr.wisc.edu [University of Wisconsin - Madison, 1500 Engineering Dr., Madison, WI 53706 (United States)
2015-12-15
The domain decomposed behavior of the adjoint Neumann-Ulam Monte Carlo method for solving linear systems is analyzed using the spectral properties of the linear operator. Relationships for the average length of the adjoint random walks, a measure of convergence speed and serial performance, are made with respect to the eigenvalues of the linear operator. In addition, relationships for the effective optical thickness of a domain in the decomposition are presented based on the spectral analysis and diffusion theory. Using the effective optical thickness, the Wigner rational approximation and the mean chord approximation are applied to estimate the leakage fraction of random walks from a domain in the decomposition as a measure of parallel performance and potential communication costs. The one-speed, two-dimensional neutron diffusion equation is used as a model problem in numerical experiments to test the models for symmetric operators with spectral qualities similar to light water reactor problems. In general, the derived approximations show good agreement with random walk lengths and leakage fractions computed by the numerical experiments.
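The Neumann-Ulam approach analyzed above estimates components of the solution of x = Hx + b with random walks whose transition probabilities are built from H, and the walk-length and leakage relationships in the abstract describe exactly these walks. A minimal forward-walk sketch on a toy 2x2 system (not the paper's adjoint variant and with no domain decomposition):

```python
import random

def mc_solve_component(H, b, i0, n_walks, rng):
    """Estimate component i0 of the solution of x = H x + b by forward
    Neumann-Ulam random walks: jump i -> j with probability |H[i][j]| and
    absorb with the leftover probability. Requires absolute row sums of H
    below 1, i.e. a convergent Neumann series."""
    total = 0.0
    for _ in range(n_walks):
        i, w, est = i0, 1.0, b[i0]
        while True:
            r = rng.random()
            acc, nxt = 0.0, None
            for j, h in enumerate(H[i]):
                acc += abs(h)
                if r < acc:
                    nxt = j
                    break
            if nxt is None:                      # absorbed: walk ends
                break
            w *= 1.0 if H[i][nxt] > 0 else -1.0  # weight carries the sign
            i = nxt
            est += w * b[i]                      # tally the source at each visit
        total += est
    return total / n_walks

# x0 = 1 + 0.4*x1 and x1 = 1 + 0.3*x0  =>  x0 = 1.4/0.88 ~ 1.5909
H = [[0.0, 0.4], [0.3, 0.0]]
b = [1.0, 1.0]
x0_est = mc_solve_component(H, b, 0, 20000, random.Random(1))
```

The estimator is unbiased because the expected tally at each step reproduces one term of the Neumann series sum over H^k b; average walk length grows as the spectral radius of |H| approaches one, which is the convergence-speed relationship the paper quantifies.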
Lavalle, Catia; Rigol, Marcos; Muramatsu, Alejandro
2005-08-01
The cover picture of the current issue, taken from the Feature Article [1], depicts the evolution of the local density (a) and its quantum fluctuations (b) in trapped fermions on one-dimensional optical lattices. As the number of fermions in the trap is increased, figure (a) shows the formation of a Mott-insulating plateau (local density equal to one), whereas the quantum fluctuations - see figure (b) - are strongly suppressed, but nonzero. For a larger number of fermions new insulating plateaus appear (this time with local density equal to two), but with no density fluctuations. Regions with non-constant density are metallic and exhibit large quantum fluctuations of the density. The first author Catia Lavalle is a Postdoc at the University of Stuttgart. She works in the field of strongly correlated quantum systems by means of quantum Monte Carlo (QMC) methods. While working on her PhD thesis at the University of Stuttgart, she developed a new QMC technique that allows one to study dynamical properties of the t-J model.
Zhang, Zhigang; Duan, Zhenhao
2002-10-01
A new technique of temperature scaling combined with conventional Gibbs ensemble Monte Carlo simulation was used to study liquid-vapor phase equilibria of the methane-ethane (CH4-C2H6) system. With this efficient method, a new set of united-atom Lennard-Jones potential parameters for pure C2H6 was found to be more accurate than those of previous models in predicting phase equilibria. Using the optimized potentials for liquid simulations (OPLS) potential for CH4 and the potential of this study for C2H6, together with a simple mixing rule, we simulated the equilibrium compositions and densities of CH4-C2H6 mixtures with accuracy close to experiments. The simulated data supplement the experiments and may cover a larger temperature-pressure-composition space than the experiments. Compared with some well-established equations of state, such as the Peng-Robinson equation of state (PR-EOS), the simulated results are found to be closer to experiments, at least in some temperature and pressure ranges.
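The "simple mixing rule" for unlike CH4-C2H6 pairs is typically of the Lorentz-Berthelot form: arithmetic mean for the size parameter sigma, geometric mean for the well depth epsilon. A sketch with illustrative united-atom parameters (not the paper's fitted set):

```python
import math

def lj(r, sigma, epsilon):
    """12-6 Lennard-Jones pair energy."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def lorentz_berthelot(sig1, eps1, sig2, eps2):
    """Mixing rule for unlike pairs: arithmetic sigma, geometric epsilon."""
    return 0.5 * (sig1 + sig2), math.sqrt(eps1 * eps2)

# Illustrative united-atom parameters in Angstrom / Kelvin (hypothetical,
# not the optimized values reported in the abstract):
sig_ch4, eps_ch4 = 3.73, 148.0
sig_c2h6, eps_c2h6 = 3.95, 98.0

sig_mix, eps_mix = lorentz_berthelot(sig_ch4, eps_ch4, sig_c2h6, eps_c2h6)
# The LJ minimum sits at r = 2^(1/6) * sigma with depth -epsilon:
u_min = lj(2 ** (1 / 6) * sig_mix, sig_mix, eps_mix)
```

In a Gibbs ensemble simulation these mixed parameters feed the cross-interaction energies used in the particle-transfer and volume-exchange acceptance rules.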
Metropolis Methods for Quantum Monte Carlo Simulations
Ceperley, D. M.
2003-01-01
Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e., diffusion Monte Carlo with rejection), multilevel sampling in path-integral Monte Carlo, the sampling of permutations, ...
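All of the variants listed share the same Metropolis core: propose a symmetric move and accept it with probability min(1, p(x')/p(x)). A generic sketch, here sampling a standard normal distribution rather than a quantum many-body wave function:

```python
import math
import random

def metropolis_chain(log_prob, x0, step, n, rng):
    """Generic Metropolis sampler with a symmetric uniform proposal.
    Accepts a move from x to x' with probability min(1, p(x')/p(x))."""
    x, lp = x0, log_prob(x0)
    samples = []
    for _ in range(n):
        xp = x + (2.0 * rng.random() - 1.0) * step   # symmetric proposal
        lpp = log_prob(xp)
        # working in log space avoids overflow; >= branch skips exp when uphill
        if lpp >= lp or rng.random() < math.exp(lpp - lp):
            x, lp = xp, lpp
        samples.append(x)                            # rejected moves repeat x
    return samples

rng = random.Random(42)
# target: standard normal, log p(x) = -x^2/2 (up to a constant)
chain = metropolis_chain(lambda x: -0.5 * x * x, 0.0, 2.0, 50000, rng)
mean = sum(chain) / len(chain)
var = sum(c * c for c in chain) / len(chain) - mean * mean
```

Variational Monte Carlo replaces the toy `log_prob` with log |psi_T(R)|^2 over particle configurations R; the acceptance rule is unchanged.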
Broers, H.; Hemken, H.; Luhmann, T.; Ritschl, P.
For the total replacement of the knee joint, the precise reconstruction of the mechanical axis is significantly determined by the alignment of the cutting tool with respect to the rotation centre of the femur head. Navigation-supported operation techniques allow for the precise three-dimensional location of the hip centre by kinematic analysis. Recent results permit the reconstruction of the femur axis to better than 0.7°. Conventional verification methods, such as the post-operative recording of the complete leg, are therefore not suitable due to their limited system accuracy of about 2°. As the femur head cannot be accessed directly during the operation, an X-ray method has been used to verify alignment. The paper presents a method, and the results achieved, for the calibration of a C-arm system by introducing photogrammetric parameters. Since the method is used during the operation, boundary conditions such as minimally invasive surgical intervention and sterility have been considered for practical application to patients.
Validation of a Monte Carlo simulation of the Philips Allegro/GEMINI PET systems using GATE
Energy Technology Data Exchange (ETDEWEB)
Lamare, F; Turzo, A; Bizais, Y; Rest, C Cheze Le; Visvikis, D [U650 INSERM, Laboratoire du Traitement de l' information medicale (LaTIM), CHU Morvan, Universite de Bretagne Occidentale, Brest, 29609 (France)
2006-02-21
A newly developed simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop a Monte Carlo simulation of a fully three-dimensional (3D) clinical PET scanner. The Philips Allegro/GEMINI PET systems were simulated in order to (a) allow a detailed study of the parameters affecting the system's performance under various imaging conditions, (b) study the optimization and quantitative accuracy of emission acquisition protocols for dynamic and static imaging, and (c) further validate the potential of GATE for the simulation of clinical PET systems. A model of the detection system and its geometry was developed. The accuracy of the developed detection model was tested through the comparison of simulated and measured results obtained with the Allegro/GEMINI systems for a number of NEMA NU2-2001 performance protocols including spatial resolution, sensitivity and scatter fraction. In addition, an approximate model of the system's dead time at the level of detected single events and coincidences was developed in an attempt to simulate the count rate related performance characteristics of the scanner. The developed dead-time model was assessed under different imaging conditions using the count rate loss and noise equivalent count rates performance protocols of standard and modified NEMA NU2-2001 (whole body imaging conditions) and NEMA NU2-1994 (brain imaging conditions) comparing simulated with experimental measurements obtained with the Allegro/GEMINI PET systems. Finally, a reconstructed image quality protocol was used to assess the overall performance of the developed model. An agreement of <3% was obtained in scatter fraction, with a difference between 4% and 10% in the true and random coincidence count rates respectively, throughout a range of activity concentrations and under various imaging conditions, resulting in <8% differences between simulated and measured noise equivalent count rates performance. Finally, the image
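Two of the NEMA figures of merit validated above, the scatter fraction and the noise equivalent count (NEC) rate, reduce to simple ratios of the true, scattered, and random coincidence rates. A sketch using one common NEC convention (NEMA variants may weight the randoms term by an extra factor; the rates below are illustrative, not the Allegro/GEMINI measurements):

```python
def scatter_fraction(scatters, trues):
    """Scatter fraction SF = S / (S + T)."""
    return scatters / (scatters + trues)

def noise_equivalent_count_rate(trues, randoms, scatters):
    """One common NEC convention: NEC = T^2 / (T + S + R).
    NEMA variants may multiply R by a randoms-smoothing factor k."""
    return trues ** 2 / (trues + scatters + randoms)

# illustrative coincidence rates in kcps (hypothetical numbers)
sf = scatter_fraction(20.0, 80.0)                      # 20% scatter
nec = noise_equivalent_count_rate(100.0, 10.0, 20.0)   # T=100, R=10, S=20
```

Comparing simulated and measured curves of these quantities versus activity concentration is exactly how the dead-time model in the abstract was assessed.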
Directory of Open Access Journals (Sweden)
Biniam Yohannes Tesfamicael
2014-03-01
Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose to the rectum in proton therapy of prostate cancer. Method: The Geant4 Monte Carlo toolkit was used to simulate proton therapy of prostate cancer, with an endorectal balloon and a set of scintillating fibers for immobilization and dosimetry measurements, respectively. Results: A linear response of the fibers to the delivered dose was observed to within less than 2%. The results show that fibers close to the prostate recorded a higher dose, with the closest fiber recording about one-third of the dose to the target. A 1/r^2 decrease (r is defined as the center-to-center distance between the prostate and the fibers) was observed going toward the frontal and distal regions. A very low dose was recorded by the fibers beneath the balloon, a clear indication that the overall volume of the rectal wall exposed to a higher dose is relatively minimized. Further analysis showed a relatively linear relationship between the dose to the target and the dose to the top fibers (17 in total), with a slope of -0.07 ± 0.07 at a large number of events per degree of rotation of the modulator wheel (i.e., dose). Conclusion: Thin (1 mm × 1 mm), long (1 m) scintillating fibers were found to be ideal for real-time in-vivo dose measurement to the rectum during proton therapy of prostate cancer. The linear response of the fibers to the delivered dose makes them good candidates as dosimeters. With thorough calibration and the ability to define a good correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real-time dose verification to the target. Cite this article as: Tesfamicael BY, Avery S, Gueye P, Lyons D, Mahesh M. Scintillating fiber based in-vivo dose monitoring system to the rectum in proton therapy of prostate cancer: A Geant4 Monte Carlo
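The 1/r^2 falloff reported for the fiber doses can be used to scale a point dose between two distances from the source region; a trivial sketch with illustrative numbers (not the paper's measurements):

```python
def inverse_square_scale(dose_ref, r_ref, r):
    """Scale a point dose by the 1/r^2 falloff described in the abstract,
    where r is the center-to-center distance from the prostate to a fiber."""
    return dose_ref * (r_ref / r) ** 2

# doubling the distance quarters the dose (illustrative values)
d = inverse_square_scale(100.0, 5.0, 10.0)
```

Such a scaling is only a first approximation: attenuation and scatter in tissue make the real falloff deviate from pure inverse-square behavior.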
Open verification methodology cookbook
Glasser, Mark
2009-01-01
Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novic
A Quantitative Approach to the Formal Verification of Real-Time Systems.
1996-09-01
my parents Lia and Daniel and to my sister Daniela, for the help and support throughout my whole life. Even though they have not been present during...transient overload, scheduling of aperiodic tasks and priority granularity in communication scheduling [49]. For this reason, static scheduling algorithms...Quantitative temporal reasoning. In Lecture Notes in Computer Science, Computer-Aided Verification. Springer-Verlag, 1990. [34] J. Fernandez, H
The Overview of System Maintainability Verification
Institute of Scientific and Technical Information of China (English)
钱潜; 单志伟; 刘福胜
2015-01-01
Based on an analysis of the conceptual model of system maintainability verification, this article discusses the state of research at home and abroad, analyzes the main current maintainability verification methods and maintainability sample acquisition methods, and focuses on the application of virtual simulation technology in maintainability verification; it identifies the shortcomings of research at the present stage and emphasizes the value of more in-depth study.
Monte Carlo Simulations of Random Frustrated Systems on Graphics Processing Units
Feng, Sheng; Fang, Ye; Hall, Sean; Papke, Ariane; Thomasson, Cade; Tam, Ka-Ming; Moreno, Juana; Jarrell, Mark
2012-02-01
We study the implementation of classical Monte Carlo simulation for random frustrated models using the multithreaded computing environment provided by the Compute Unified Device Architecture (CUDA) on modern Graphics Processing Units (GPUs) with hundreds of cores and high memory bandwidth. The key to optimizing GPU computing performance is the proper handling of the data structures. Utilizing multi-spin coding, we obtain an efficient GPU implementation of the parallel tempering Monte Carlo simulation for the Edwards-Anderson spin glass model. In typical simulations, we find a speed-up of over two thousand times relative to the single-threaded CPU implementation.
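In parallel tempering, neighbouring replicas at inverse temperatures beta_i periodically attempt to exchange configurations, accepted with probability min(1, exp((beta_i - beta_j)(E_i - E_j))); this detailed-balance rule is what any multi-spin-coded GPU kernel must preserve. A minimal CPU-side sketch (not the authors' CUDA code; the replica energies are illustrative):

```python
import math
import random

def attempt_swap(betas, energies, i, rng):
    """Parallel tempering swap between neighbouring replicas i and i+1,
    accepted with probability min(1, exp((b_i - b_{i+1}) * (E_i - E_{i+1})));
    on acceptance the configurations (represented here only by their
    energies) are exchanged."""
    d = (betas[i] - betas[i + 1]) * (energies[i] - energies[i + 1])
    if d >= 0 or rng.random() < math.exp(d):
        energies[i], energies[i + 1] = energies[i + 1], energies[i]
        return True
    return False

rng = random.Random(0)
betas = [1.0, 0.5]                 # cold replica first

energies = [2.0, 1.0]              # cold replica has the HIGHER energy:
ok = attempt_swap(betas, energies, 0, rng)       # d = 0.5 > 0, always accepted

energies2 = [-100.0, 0.0]          # cold replica far below: d = -50
rej = not attempt_swap(betas, energies2, 0, rng)  # exp(-50), essentially never
```

In a full simulation each replica also runs ordinary Metropolis sweeps at its own temperature between swap attempts; the swaps let cold replicas escape local minima via the hot ones.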
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.
1984-01-01
The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.
Nuclear disarmament verification
Energy Technology Data Exchange (ETDEWEB)
DeVolpi, A.
1993-12-31
Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.
The Monte Carlo Method and the Evaluation of Retrieval System Performance.
Burgin, Robert
1999-01-01
Introduces the Monte Carlo method, which is shown to represent an attractive alternative to the hypergeometric model for identifying the levels at which random retrieval performance is exceeded in retrieval test collections and for overcoming some of the limitations of the hypergeometric model. Practical matters to consider when employing the Monte…
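The Monte Carlo alternative to the hypergeometric model can be sketched as follows: shuffle the collection many times, record precision at a cutoff under each random ranking, and take a high percentile as the level a real system must exceed. This is a hypothetical illustration, not the author's code.

```python
import random

def random_precision_at_k(n_docs, n_rel, k, trials, seed=0):
    # Monte Carlo distribution of precision@k under purely random ranking:
    # n_rel relevant documents hidden in a collection of n_docs
    rng = random.Random(seed)
    docs = [1] * n_rel + [0] * (n_docs - n_rel)
    results = []
    for _ in range(trials):
        rng.shuffle(docs)
        results.append(sum(docs[:k]) / k)
    return results

def percentile(values, q):
    # Empirical q-quantile of the simulated precision values
    ordered = sorted(values)
    return ordered[min(len(ordered) - 1, int(q * len(ordered)))]
```

A retrieval run whose precision@k exceeds, say, `percentile(values, 0.95)` performs better than random at roughly the 5% level.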
Directory of Open Access Journals (Sweden)
Lutsenko Y. V.
2015-11-01
Full Text Available In this article, in accordance with the methodology of Automated System-Cognitive analysis (ASC-analysis), we examine the implementation of the 3rd stage of ASC-analysis: synthesis and verification of forecasting models of the development of diversified agro-industrial corporations. In this step, we synthesize and verify 3 statistical and 7 system-cognitive models: ABS (matrix of absolute frequencies); PRC1 and PRC2 (matrices of conditional and unconditional distributions); INF1 and INF2 (private criterion: the amount of knowledge according to A. Kharkevich); INF3 (private criterion: the chi-square test, i.e., the difference between the actual and the theoretically expected absolute frequencies); INF4 and INF5 (private criterion: ROI, Return On Investment); INF6 and INF7 (private criterion: the difference between conditional and unconditional probabilities, i.e., the coefficient of relationship). The reliability of the created models was assessed with a proposed metric similar to the known F-test, but one that does not assume a normal distribution, linearity of the modeled object, or independence and additivity of the acting factors. The accuracy of the obtained models was high enough to address the subsequent problems of identification, forecasting, and decision making, as well as study of the modeled object through its model, which are scheduled for consideration in future articles.
Prytkova, Vera; Heyden, Matthias; Khago, Domarin; Freites, J Alfredo; Butts, Carter T; Martin, Rachel W; Tobias, Douglas J
2016-08-25
We present a novel multi-conformation Monte Carlo simulation method that enables the modeling of protein-protein interactions and aggregation in crowded protein solutions. This approach is relevant to a molecular-scale description of realistic biological environments, including the cytoplasm and the extracellular matrix, which are characterized by high concentrations of biomolecular solutes (e.g., 300-400 mg/mL for proteins and nucleic acids in the cytoplasm of Escherichia coli). Simulation of such environments necessitates the inclusion of a large number of protein molecules. Therefore, computationally inexpensive methods, such as rigid-body Brownian dynamics (BD) or Monte Carlo simulations, can be particularly useful. However, as we demonstrate herein, the rigid-body representation typically employed in simulations of many-protein systems gives rise to certain artifacts in protein-protein interactions. Our approach allows us to incorporate molecular flexibility in Monte Carlo simulations at low computational cost, thereby eliminating ambiguities arising from structure selection in rigid-body simulations. We benchmark and validate the methodology using simulations of hen egg white lysozyme in solution, a well-studied system for which extensive experimental data, including osmotic second virial coefficients, small-angle scattering structure factors, and multiple structures determined by X-ray and neutron crystallography and solution NMR, as well as rigid-body BD simulation results, are available for comparison.
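The multi-conformation idea can be sketched as a Monte Carlo move set in which a trial step is either a rigid-body displacement or a swap to a different pre-computed conformer, both filtered by the usual Metropolis test. This is an illustrative toy (scalar energies, 3D positions only), not the authors' method or code.

```python
import math
import random

def mc_step(state, conformers, energy_fn, beta, rng, p_conf=0.1):
    # state: (conformer_index, position). With probability p_conf the
    # trial move swaps in another pre-computed conformer; otherwise it
    # is a small rigid-body translation.
    idx, pos = state
    if rng.random() < p_conf:
        new_state = (rng.randrange(len(conformers)), pos)
    else:
        new_pos = tuple(x + rng.uniform(-0.1, 0.1) for x in pos)
        new_state = (idx, new_pos)
    dE = energy_fn(new_state, conformers) - energy_fn(state, conformers)
    if dE <= 0 or rng.random() < math.exp(-beta * dE):
        return new_state  # accepted
    return state          # rejected: keep the old state
```

The conformer-swap move is what lets the simulation sample molecular flexibility at the cost of a table lookup rather than an internal-coordinate move.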
Directory of Open Access Journals (Sweden)
M. Kotbi
2013-03-01
Full Text Available The choice of an appropriate interaction model is among the major disadvantages of conventional methods such as Molecular Dynamics (MD) and Monte Carlo (MC) simulations. On the other hand, the so-called Reverse Monte Carlo (RMC) method, based on experimental data, can be applied without any interatomic and/or intermolecular interactions. The RMC results are, however, accompanied by artificial satellite peaks. To remedy this problem, we use an extension of the RMC algorithm that introduces an energy penalty term into the acceptance criteria. This method is referred to as the Hybrid Reverse Monte Carlo (HRMC) method. The idea of this paper is to test the validity of a combined Coulomb and Lennard-Jones potential model in the fluoride glass system BaMnMF7 (M = Fe, V) using the HRMC method. The results show good agreement between experimental and calculated characteristics, as well as a meaningful improvement in the partial pair distribution functions (PDFs). We suggest that this model should be used in calculating the structural properties and in describing the average correlations between components of fluoride glass or a similar system. We also suggest that HRMC could be useful as a tool for testing interaction potential models, as well as for conventional applications.
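The HRMC acceptance rule combines the standard RMC chi-square misfit against experimental data with a weighted Boltzmann energy penalty. A minimal sketch, with illustrative names and a toy chi-square helper (not the paper's implementation):

```python
import math
import random

def chi2(model, data, sigma):
    # Misfit between model-derived and experimental curves (e.g. PDFs)
    return sum(((m - d) / s) ** 2 for m, d, s in zip(model, data, sigma))

def hrmc_accept(chi2_old, chi2_new, e_old, e_new, weight, beta, rng):
    # Hybrid RMC: one acceptance test combining the RMC data term
    # (delta chi-square / 2) with an energy penalty (weight * beta * dE)
    delta = 0.5 * (chi2_new - chi2_old) + weight * beta * (e_new - e_old)
    return delta <= 0 or rng.random() < math.exp(-delta)
```

Setting `weight = 0` recovers plain RMC; increasing it pulls configurations toward the potential model, which is what suppresses the artificial satellite peaks.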
Energy Technology Data Exchange (ETDEWEB)
Li, JS; Fan, J; Ma, C-M [Fox Chase Cancer Center, Philadelphia, PA (United States)
2015-06-15
Purpose: To improve treatment efficiency and enable full-body treatment, a robotic radiosurgery system has been equipped with a multileaf collimator (MLC) to extend its accuracy and precision to radiation therapy. The goal of this work is to model the MLC and include it in the Monte Carlo patient dose calculation. Methods: The radiation source and the MLC were carefully modeled to account for the effects of source size, collimator scattering, leaf transmission, and leaf end shape. A source model was built based on the output factors, percentage depth dose curves, and lateral dose profiles measured in a water phantom. MLC leaf shape, leaf end design, and the leaf tilt used to minimize interleaf leakage, together with their effects on beam fluence and energy spectrum, were all considered in the calculation. Transmission/leakage was added to the fluence based on the transmission factors of the leaf and the leaf end. The transmitted photon energy was tuned to account for beam hardening effects. The results calculated with the Monte Carlo implementation were compared with measurements in a homogeneous water phantom and in inhomogeneous phantoms with slab lung or bone material for 4 square fields and 9 irregularly shaped fields. Results: The calculated output factors agree with the measured ones to within 1% for different field sizes. The calculated dose distributions in the phantoms show good agreement with measurements made using a diode detector and film. The dose difference is within 2% inside the field and the distance to agreement is within 2 mm in the penumbra region. The gamma passing rate is more than 95% with 2%/2 mm criteria for all the test cases. Conclusion: Implementation of Monte Carlo dose calculation for an MLC-equipped robotic radiosurgery system was completed successfully. The accuracy of Monte Carlo dose calculation with the MLC is clinically acceptable. This work was supported by Accuray Inc.
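The gamma passing rates quoted in these records combine a dose-difference test with a distance-to-agreement (DTA) test. A minimal 1D sketch of a global gamma index is shown below; clinical tools work on 2D/3D dose grids with normalization and interpolation conventions not reproduced here, and all names are illustrative.

```python
import math

def gamma_index_1d(ref, ev, spacing, dose_tol, dist_tol):
    # ref, ev: dose samples on the same uniform grid (spacing in mm)
    # dose_tol: absolute dose tolerance; dist_tol: DTA criterion in mm
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_ev in enumerate(ev):
            dd = (d_ev - d_ref) / dose_tol           # normalized dose diff
            dx = (j - i) * spacing / dist_tol        # normalized distance
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    # Fraction of reference points with gamma <= 1 (the "passing rate")
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

With a 2%/2 mm criterion, `dose_tol` would be 2% of the normalization dose and `dist_tol` 2.0 mm.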
Quantitative Verification in Practice
Haverkort, Boudewijn R.; Katoen, Joost-Pieter; Larsen, Kim G.
2010-01-01
Soon after the birth of model checking, the first theoretical achievements have been reported on the automated verification of quantitative system aspects such as discrete probabilities and continuous time. These theories have been extended in various dimensions, such as continuous probabilities
Quantum Monte Carlo simulation
Wang, Yazhen
2011-01-01
Contemporary scientific studies often rely on the understanding of complex quantum systems via computer simulation. This paper initiates the statistical study of quantum simulation and proposes a Monte Carlo method for estimating analytically intractable quantities. We derive the bias and variance for the proposed Monte Carlo quantum simulation estimator and establish the asymptotic theory for the estimator. The theory is used to design a computational scheme for minimizing the mean square er...
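The bias-and-variance analysis described here applies to the generic Monte Carlo estimator: average an observable over independent samples and report the sample standard error. A minimal sketch with illustrative names (not the paper's quantum-specific estimator):

```python
import math
import random

def mc_estimate(f, sampler, n, seed=0):
    # Monte Carlo estimate of E[f(X)] with X ~ sampler, plus the
    # estimated standard error sqrt(sample_variance / n)
    rng = random.Random(seed)
    vals = [f(sampler(rng)) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)  # unbiased variance
    return mean, math.sqrt(var / n)
```

For an unbiased estimator the mean square error is just the variance term, which shrinks as 1/n; the paper's contribution is characterizing the additional bias introduced by quantum simulation.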
A simulation study of a C-shaped in-beam PET system for dose verification in carbon ion therapy
Jung An, Su; Beak, Cheol-Ha; Lee, Kisung; Hyun Chung, Yong
2013-01-01
The application of hadrons such as carbon ions is being developed for the treatment of cancer. The effectiveness of such a technique is due to the eligibility of charged particles in delivering most of their energy near the end of the range, called the Bragg peak. However, accurate verification of dose delivery is required since misalignment of the hadron beam can cause serious damage to normal tissue. PET scanners can be utilized to track the carbon beam to the tumor by imaging the trail of the hadron-induced positron emitters in the irradiated volume. In this study, we designed and evaluated (through Monte Carlo simulations) an in-beam PET scanner for monitoring patient dose in carbon beam therapy. A C-shaped PET and a partial-ring PET were designed to avoid interference between the PET detectors and the therapeutic carbon beam delivery. Their performance was compared with that of a full-ring PET scanner. The C-shaped, partial-ring, and full-ring scanners consisted of 14, 12, and 16 detector modules, respectively, with a 30.2 cm inner diameter for brain imaging. Each detector module was composed of a 13×13 array of 4.0 mm×4.0 mm×20.0 mm LYSO crystals and four round 25.4 mm diameter PMTs. To estimate the production yield of positron emitters such as 10C, 11C, and 15O, a cylindrical PMMA phantom (diameter, 20 cm; thickness, 20 cm) was irradiated with 170, 290, and 350 AMeV 12C beams using the GATE code. Phantom images of the three types of scanner were evaluated by comparing the longitudinal profile of the positron emitters, measured along the carbon beam as it passed a simulated positron emitter distribution. The results demonstrated that the development of a C-shaped PET scanner to characterize carbon dose distribution for therapy planning is feasible.
Energy Technology Data Exchange (ETDEWEB)
CARTER, R.P.
2000-04-04
DOE Policy 450.4 mandates that safety be integrated into all aspects of the management and operations of its facilities. The goal of an institutionalized Integrated Safety Management System (ISMS) is to have a single integrated system that includes Environment, Safety, and Health requirements in the work planning and execution processes to ensure the protection of the worker, the public, the environment, and federal property over the life cycle of the Environmental Restoration (ER) Project. The purpose of this Environmental Restoration Contractor (ERC) ISMS Phase MI Verification was to determine whether ISMS programs and processes were institutionalized within the ER Project, whether these programs and processes were implemented, and whether the system had promoted the development of a safety-conscious work culture.
Energy Technology Data Exchange (ETDEWEB)
Riemer, C.A.
1990-05-01
Staff of the Environmental Assessment and Information Sciences Division of Argonne National Laboratory (ANL) studied the role played by the organizational participants in the Department of Veterans Affairs (VA) who conduct verification, validation, and testing (VV&T) activities at various stages in the automated data processing (ADP) system development life cycle (SDLC). A case-study methodology was used to assess the effectiveness of VV&T activities (tasks) and products (inputs and outputs). The case selected for the study was a project designed to interface the compensation and pension (C&P) benefits systems with the centralized accounts receivable system (CARS). Argonne developed an organizational SDLC VV&T model and checklists to help collect information from C&P/CARS participants on VV&T procedures and activities, and these were then evaluated against VV&T standards.
Shift Verification and Validation
Energy Technology Data Exchange (ETDEWEB)
Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2016-09-07
This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
HDL to verification logic translator
Gambles, J. W.; Windley, P. J.
1992-01-01
The ever-increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designers' confidence in the correctness of higher-level behavioral models.
Catlı, Serap; Tanır, Güneş
2013-01-01
The present study aimed to investigate the effects of titanium, titanium alloy, and stainless steel hip prostheses on dose distribution based on the Monte Carlo simulation method, as well as the accuracy of the Eclipse treatment planning system (TPS) at 6 and 18 MV photon energies. In the present study the pencil beam convolution (PBC) method implemented in the Eclipse TPS was compared to the Monte Carlo method and ionization chamber measurements. The present findings show that if high-Z material is used in a prosthesis, large dose changes can occur due to scattering. The variance in dose observed in the present study was dependent on material type, density, and atomic number, as well as photon energy; as photon energy increased, backscattering decreased. The dose perturbation effect of hip prostheses was significant and could not be predicted accurately by the PBC method. The findings show that for accurate dose calculation the Monte Carlo-based TPS should be used in patients with hip prostheses.
Analytical, experimental, and Monte Carlo system response matrix for pinhole SPECT reconstruction
Energy Technology Data Exchange (ETDEWEB)
Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es [Fundación Ramón Domínguez, Medicina Nuclear, CHUS, Spain and Grupo de Imaxe Molecular, IDIS, Santiago de Compostela 15706 (Spain); Pino, Francisco [Unitat de Biofísica, Facultat de Medicina, Universitat de Barcelona, Spain and Servei de Física Médica i Protecció Radiológica, Institut Catalá d'Oncologia, Barcelona 08036 (Spain); Silva-Rodríguez, Jesús [Fundación Ramón Domínguez, Medicina Nuclear, CHUS, Santiago de Compostela 15706 (Spain); Pavía, Javier [Servei de Medicina Nuclear, Hospital Clínic, Barcelona (Spain); Institut d'Investigacions Biomèdiques August Pí i Sunyer (IDIBAPS) (Spain); CIBER en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona 08036 (Spain); Ros, Doménec [Unitat de Biofísica, Facultat de Medicina, Casanova 143 (Spain); Institut d'Investigacions Biomèdiques August Pí i Sunyer (IDIBAPS) (Spain); CIBER en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona 08036 (Spain); Ruibal, Álvaro [Servicio Medicina Nuclear, CHUS (Spain); Grupo de Imaxe Molecular, Facultade de Medicina (USC), IDIS, Santiago de Compostela 15706 (Spain); Fundación Tejerina, Madrid (Spain); and others
2014-03-15
Purpose: To assess the performance of two approaches to the system response matrix (SRM) calculation in pinhole single photon emission computed tomography (SPECT) reconstruction. Methods: Evaluation was performed using experimental data from a low magnification pinhole SPECT system that consisted of a rotating flat detector with a monolithic scintillator crystal. The SRM was computed following two approaches, which were based on Monte Carlo simulations (MC-SRM) and analytical techniques in combination with an experimental characterization (AE-SRM). The spatial response of the system, obtained by using the two approaches, was compared with experimental data. The effect of the MC-SRM and AE-SRM approaches on the reconstructed image was assessed in terms of image contrast, signal-to-noise ratio, image quality, and spatial resolution. To this end, acquisitions were carried out using a hot cylinder phantom (consisting of five fillable rods with diameters of 5, 4, 3, 2, and 1 mm and a uniform cylindrical chamber) and a custom-made Derenzo phantom, with center-to-center distances between adjacent rods of 1.5, 2.0, and 3.0 mm. Results: Good agreement was found for the spatial response of the system between measured data and results derived from MC-SRM and AE-SRM. Only minor differences for point sources at distances smaller than the radius of rotation and large incidence angles were found. Assessment of the effect on the reconstructed image showed a similar contrast for both approaches, with values higher than 0.9 for rod diameters greater than 1 mm and higher than 0.8 for rod diameter of 1 mm. The comparison in terms of image quality showed that all rods in the different sections of a custom-made Derenzo phantom could be distinguished. The spatial resolution (FWHM) was 0.7 mm at iteration 100 using both approaches. The SNR was lower for reconstructed images using MC-SRM than for those reconstructed using AE-SRM, indicating that AE-SRM deals better with the
Evaluation of a commercial electron treatment planning system based on Monte Carlo techniques (eMC).
Pemler, Peter; Besserer, Jürgen; Schneider, Uwe; Neuenschwander, Hans
2006-01-01
A commercial electron beam treatment planning system based on a Monte Carlo algorithm (Varian Eclipse, eMC V7.2.35) was evaluated. Measured dose distributions were used for comparison with dose distributions predicted by eMC calculations. Tests were carried out for various applicators and field sizes, irregularly shaped cutouts, and an inhomogeneity phantom for energies between 6 MeV and 22 MeV. Monitor units were calculated for all applicator/energy combinations and field sizes down to 3 cm diameter and source-to-surface distances of 100 cm and 110 cm. A mass-density-to-Hounsfield-units calibration was performed to compare dose distributions calculated with a default and an individual calibration. The relationship between the calculation parameters of the eMC and the resulting dose distribution was studied in detail. Finally, the algorithm was also applied to a clinical case (boost treatment of the breast) to reveal possible problems in the implementation. For standard geometries there was good agreement between measurements and calculations, except for profiles at low energy (6 MeV) and high energies (18 MeV, 22 MeV), in which cases the algorithm overestimated the dose off-axis in the high-dose region. For energies of 12 MeV and higher there were oscillations in the plateau region of the corresponding depth dose curves calculated with a grid size of 1 mm. With irregular cutouts, an overestimation of the dose was observed for small slits and low energies (4% for 6 MeV), as well as for asymmetric cases and extended source-to-surface distances (12% for SSD = 120 cm). While all monitor unit calculations for SSD = 100 cm were within 3% of measurements, there were large deviations for small cutouts and source-to-surface distances larger than 100 cm (7% for a 3 cm diameter cutout and a source-to-surface distance of 110 cm).
SU-E-T-442: Geometric Calibration and Verification of a GammaPod Breast SBRT System
Energy Technology Data Exchange (ETDEWEB)
Yu, C [Univ Maryland School of Medicine, Baltimore, MD (United States); Xcision Medical Systems, Columbia, MD (United States); Niu, Y; Maton, P; Hoban, P [Xcision Medical Systems, Columbia, MD (United States); Mutaf, Y [Univ Maryland School of Medicine, Baltimore, MD (United States)
2015-06-15
Purpose: The first GammaPod™ unit for prone stereotactic treatment of early stage breast cancer has recently been installed and calibrated. Thirty-six rotating circular Co-60 beams focus dose at an isocenter that traverses throughout a breast target via continuous motion of the treatment table. The breast is immobilized and localized using a vacuum-assisted stereotactic cup system that is fixed to the table during treatment. Here we report on system calibration and on verification of geometric and dosimetric accuracy. Methods: Spatial calibration involves setting the origin of each table translational axis within the treatment control system such that the relationship between beam isocenter and table geometry is consistent with that assumed by the treatment planning system. A polyethylene QA breast phantom inserted into an aperture in the patient couch is used for calibration and verification. The comparison is performed via fiducial-based registration of measured single-isocenter dose profiles (radiochromic film) with kernel dose profiles. With the table calibrations applied, measured relative dose distributions were compared with TPS calculations for single-isocenter and dynamic (many-isocenter) treatment plans. Further, table motion accuracy and linearity were tested via comparison of planned control points with independent encoder readouts. Results: After table calibration, comparison of measured and calculated single-isocenter dose profiles shows agreement to within 0.5 mm for each axis. Gamma analysis of measured vs calculated profiles with 3%/2 mm criteria yields passing rates of >99% and >98% for single-isocenter and dynamic plans, respectively. This also validates the relative dose distributions produced by the TPS. Measured table motion accuracy was within 0.05 mm for all translational axes. Conclusion: GammaPod table coordinate calibration is a straightforward process that yields very good agreement between planned and measured relative dose distributions.
Parallel J-W Monte Carlo Simulations of Thermal Phase Changes in Finite-size Systems
Radev, R
2002-01-01
The thermodynamic properties of 59 TeF6 clusters that undergo temperature-driven phase transitions have been calculated with a canonical J-walking Monte Carlo technique. A parallel code for simulations has been developed and optimized on SUN3500 and CRAY-T3E computers. The Lindemann criterion shows that the clusters transform from liquid to solid and then from one solid structure to another in the temperature region 60-130 K.
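The Lindemann criterion mentioned above diagnoses melting from the relative fluctuation of interatomic distances across sampled configurations. A minimal sketch of the Lindemann index (illustrative names; not the authors' code):

```python
import math
from itertools import combinations

def lindemann_index(frames):
    # frames: list of sampled snapshots; each snapshot is a list of
    # (x, y, z) coordinate tuples for the same set of particles.
    # Returns the mean relative fluctuation of pair distances.
    n = len(frames[0])
    total = 0.0
    for i, j in combinations(range(n), 2):
        dists = [math.dist(f[i], f[j]) for f in frames]
        mean = sum(dists) / len(dists)
        var = sum((d - mean) ** 2 for d in dists) / len(dists)
        total += math.sqrt(var) / mean
    return 2.0 * total / (n * (n - 1))
```

A rigid (solid-like) cluster gives an index near zero; a jump to roughly 0.1 or above across a temperature scan is the conventional signature of melting.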
Multilevel Monte Carlo methods for computing failure probability of porous media flow systems
Fagerlund, F.; Hellman, F.; Målqvist, A.; Niemi, A.
2016-08-01
We study improvements of the standard and multilevel Monte Carlo methods for point evaluation of the cumulative distribution function (failure probability), applied to porous media two-phase flow simulations with uncertain permeability. To illustrate the methods, we study an injection scenario in which we take the sweep efficiency of the injected phase as the quantity of interest and seek the probability that this quantity is smaller than a critical value. In the sampling procedure, we use computable error bounds on the sweep efficiency functional to identify small subsets of realizations to solve to the highest accuracy by means of what we call selective refinement. We quantify the performance gains possible by using selective refinement in combination with both the standard and multilevel Monte Carlo methods. We also identify issues in the practical implementation of the methods. We conclude that significant savings in computational cost are possible for failure probability estimation in a realistic setting using the selective refinement technique, in combination with both standard and multilevel Monte Carlo.
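The multilevel Monte Carlo estimator referred to here telescopes the expectation across discretization levels, spending many cheap samples on coarse levels and few expensive samples on fine ones. A minimal sketch of the plain MLMC estimator (illustrative names; the selective-refinement variant is not shown):

```python
import random

def mlmc(sampler, levels, n_per_level, seed=0):
    # sampler(level, rng) must return a *coupled* pair (P_l, P_{l-1})
    # computed from the same random input, with P_{-1} defined as 0.
    # The estimator sums the level-wise correction averages:
    #   E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]
    rng = random.Random(seed)
    estimate = 0.0
    for level, n in zip(levels, n_per_level):
        total = 0.0
        for _ in range(n):
            fine, coarse = sampler(level, rng)
            total += fine - coarse
        estimate += total / n
    return estimate
```

Because the corrections shrink with level, most samples can be taken where they are cheapest, which is the source of the cost savings the abstract quantifies.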
Poyneer, Lisa A; Macintosh, Bruce; Palmer, David W; Perrin, Marshall D; Sadakuni, Naru; Savransky, Dmitry; Bauman, Brian; Cardwell, Andrew; Chilcote, Jeffrey K; Dillon, Daren; Gavel, Donald; Goodsell, Stephen J; Hartung, Markus; Hibon, Pascale; Rantakyro, Fredrik T; Thomas, Sandrine; Veran, Jean-Pierre
2014-01-01
The Gemini Planet Imager instrument's adaptive optics (AO) subsystem was designed specifically to facilitate high-contrast imaging. It features several new technologies, including computationally efficient wavefront reconstruction with the Fourier transform, modal gain optimization every 8 seconds, and the spatially filtered wavefront sensor. It also uses a Linear-Quadratic-Gaussian (LQG) controller (aka Kalman filter) for both pointing and focus. We present on-sky performance results from verification and commissioning runs from December 2013 through May 2014. The efficient reconstruction and modal gain optimization are working as designed. The LQG controllers effectively notch out vibrations. The spatial filter can remove aliases, but we typically use it oversized by about 60% due to stability problems.
Directory of Open Access Journals (Sweden)
L. Foresti
2015-07-01
Full Text Available The Short-Term Ensemble Prediction System (STEPS) is implemented in real time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation, and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides, in real time, 20-member ensemble precipitation nowcasts at 1 km and 5 min resolution up to 2 h lead time, using a composite of 4 C-band radars as input. In the context of the PLURISK project, STEPS forecasts were generated for use as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while those of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90% of the forecast errors.
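Reliability of probabilistic forecasts, as assessed above, is conventionally checked by binning forecast probabilities and comparing each bin's midpoint with the observed event frequency. A minimal sketch of such a reliability table (illustrative names; not the STEPS-BE verification code):

```python
def reliability_table(prob_forecasts, observed, bins=10):
    # prob_forecasts: ensemble-derived exceedance probabilities in [0, 1]
    # observed: 0/1 outcomes of the event (e.g. rain rate > threshold)
    # Returns (bin midpoint, observed frequency, count) per non-empty bin;
    # a reliable forecast has observed frequency close to the midpoint.
    table = []
    for b in range(bins):
        lo, hi = b / bins, (b + 1) / bins
        sel = [o for p, o in zip(prob_forecasts, observed)
               if lo <= p < hi or (b == bins - 1 and p == 1.0)]
        if sel:
            table.append(((lo + hi) / 2, sum(sel) / len(sel), len(sel)))
    return table
```

Under-dispersive ensembles show up here as observed frequencies systematically further from 0 and 1 than the forecast probabilities claim.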
A user's manual for MASH 1.0: A Monte Carlo Adjoint Shielding Code System
Energy Technology Data Exchange (ETDEWEB)
Johnson, J.O. [ed.
1992-03-01
The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. MASH is the successor to the Vehicle Code System (VCS) initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem (input data and selected output edits) for each code.
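The "folding" step described above is, at its core, an inner product: the forward fluence on the coupling surface is weighted by the adjoint dose importance and summed over surface elements and energy groups. A minimal sketch with illustrative names (not MASH's coupling code):

```python
def fold_dose(fluence, importance, areas):
    # fluence[g][s]: group-g fluence on surface element s (forward calc)
    # importance[g][s]: adjoint "dose importance" for the same element
    # areas[s]: area weight of each surface element
    # Returns the detector dose response: sum_g sum_s phi * imp * dA
    dose = 0.0
    for phi_g, imp_g in zip(fluence, importance):
        for phi, imp, area in zip(phi_g, imp_g, areas):
            dose += phi * imp * area
    return dose
```

Because the adjoint importance is independent of the external source, one Monte Carlo run can be folded against many forward fluences (different source positions or orientations) cheaply.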
A User's Manual for MASH V1.5 - A Monte Carlo Adjoint Shielding Code System
Energy Technology Data Exchange (ETDEWEB)
C. O. Slater; J. M. Barnes; J. O. Johnson; J.D. Drischler
1998-10-01
The Monte Carlo Adjoint Shielding Code System, MASH, calculates neutron and gamma-ray environments and radiation protection factors for armored military vehicles, structures, trenches, and other shielding configurations by coupling a forward discrete ordinates air-over-ground transport calculation with an adjoint Monte Carlo treatment of the shielding geometry. Efficiency and optimum use of computer time are emphasized. The code system includes the GRTUNCL and DORT codes for air-over-ground transport calculations, the MORSE code with the GIFT5 combinatorial geometry package for adjoint shielding calculations, and several peripheral codes that perform the required data preparations, transformations, and coupling functions. The current version, MASH v1.5, is the successor to the original MASH v1.0 code system initially developed at Oak Ridge National Laboratory (ORNL). The discrete ordinates calculation determines the fluence on a coupling surface surrounding the shielding geometry due to an external neutron/gamma-ray source. The Monte Carlo calculation determines the effectiveness of the fluence at that surface in causing a response in a detector within the shielding geometry, i.e., the "dose importance" of the coupling surface fluence. A coupling code folds the fluence together with the dose importance, giving the desired dose response. The coupling code can determine the dose response as a function of the shielding geometry orientation relative to the source, distance from the source, and energy response of the detector. This user's manual includes a short description of each code, the input required to execute the code along with some helpful input data notes, and a representative sample problem.
Energy Technology Data Exchange (ETDEWEB)
Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)
1995-03-01
This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.
Verification testing of the Watts Premier M-Series M-15,000 RO Treatment System was conducted over a 31-day period from April 26, 2004, through May 26, 2004. This test was conducted at the Coachella Valley Water District (CVWD) Well 7802 in Thermal, California. The source water...
Verification testing of the US Filter 3M10C membrane system was conducted over a 44-day test period at the Aqua 2000 Research Center in Chula Vista, California. The test period extended from July 24, 2002 to September 5, 2002. The source water was a blend of Colorado River and ...
DEFF Research Database (Denmark)
Tycho, Andreas; Jørgensen, Thomas Martini; Andersen, Peter E.
2002-01-01
A Monte Carlo (MC) method for modeling optical coherence tomography (OCT) measurements of a diffusely reflecting discontinuity embedded in a scattering medium is presented. For the first time to the authors' knowledge it is shown analytically that the applicability of an MC approach to this optical geometry is firmly justified, because, as we show, in the conjugate image plane the field reflected from the sample is delta-correlated, from which it follows that the heterodyne signal is calculated from the intensity distribution only. This is not a trivial result because, in general, the light […] focused beam, and it is shown that in free space the full three-dimensional intensity distribution of a Gaussian beam is obtained. The OCT signal and the intensity distribution in a scattering medium have been obtained for several geometries with the suggested MC method; when this model and a recently […]
Roxby, P; Kron, T; Foroudi, F; Haworth, A; Fox, C; Mullen, A; Cramb, J
2009-10-01
Cone-beam computed tomography (CBCT) is a three-dimensional imaging modality that has recently become available on linear accelerators for radiotherapy patient position verification. It was the aim of the present study to implement simple strategies for reduction of the dose delivered in a commercial CBCT system. The dose delivered in a CBCT procedure (Varian, half-fan acquisition, 650 projections, 125 kVp) was assessed using a cylindrical Perspex phantom (diameter, 32 cm) with a calibrated Farmer type ionisation chamber. A copper filter (thickness, 0.15 mm) was introduced increasing the half value layer of the beam from 5.5 mm Al to 8 mm Al. Image quality and noise were assessed using an image quality phantom (CatPhan) while the exposure settings per projection were varied from 25 ms/80 mA to 2 ms/2 mA per projection. Using the copper filter reduced the dose to the phantom from approximately 45 mGy to 30 mGy at standard settings (centre/periphery weighting 1/3 to 2/3). Multiple CBCT images were acquired for six patients with pelvic malignancies to compare CBCTs with and without a copper filter. Although the reconstructed image is somewhat noisier with the filter, it features similar contrast in the centre of the patient and was often preferred by the radiation oncologist because of greater image uniformity. The X-ray shutters were adjusted to the minimum size required to obtain the desired image volume for a given patient diameter. The simple methods described here reduce the effective dose to patients undergoing daily CBCT and are easy to implement, and initial evidence suggests that they do not affect the ability to identify soft tissue for the purpose of treatment verification.
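The phantom dose quoted above uses the standard weighting of chamber readings (centre 1/3, periphery 2/3). A sketch of that combination, with illustrative readings chosen only so the results land near the reported ~45 mGy (open beam) and ~30 mGy (0.15 mm Cu filter) values; these are not the paper's raw chamber data:

```python
def weighted_phantom_dose(d_center_mgy, d_periphery_mgy):
    """Weighted dose from ionisation-chamber readings:
    centre weighted 1/3, periphery weighted 2/3."""
    return d_center_mgy / 3.0 + 2.0 * d_periphery_mgy / 3.0

# Illustrative (hypothetical) centre/periphery readings in mGy:
dose_open = weighted_phantom_dose(35.0, 50.0)  # ~45 mGy, no filter
dose_cu = weighted_phantom_dose(24.0, 33.0)    # ~30 mGy, with Cu filter
```

The same weighting is used for CTDI_w in fan-beam CT dosimetry, which is why it is a natural reporting convention for CBCT phantom measurements.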
Nord, B.; Buckley-Geer, E.; Lin, H.; Diehl, H. T.; Helsby, J.; Kuropatkin, N.; Amara, A.; Collett, T.; Allam, S.; Caminha, G. B.; De Bom, C.; Desai, S.; Dúmet-Montoya, H.; Pereira, M. Elidaiana da S.; Finley, D. A.; Flaugher, B.; Furlanetto, C.; Gaitsch, H.; Gill, M.; Merritt, K. W.; More, A.; Tucker, D.; Saro, A.; Rykoff, E. S.; Rozo, E.; Birrer, S.; Abdalla, F. B.; Agnello, A.; Auger, M.; Brunner, R. J.; Carrasco Kind, M.; Castander, F. J.; Cunha, C. E.; da Costa, L. N.; Foley, R. J.; Gerdes, D. W.; Glazebrook, K.; Gschwend, J.; Hartley, W.; Kessler, R.; Lagattuta, D.; Lewis, G.; Maia, M. A. G.; Makler, M.; Menanteau, F.; Niernberg, A.; Scolnic, D.; Vieira, J. D.; Gramillano, R.; Abbott, T. M. C.; Banerji, M.; Benoit-Lévy, A.; Brooks, D.; Burke, D. L.; Capozzi, D.; Carnero Rosell, A.; Carretero, J.; D'Andrea, C. B.; Dietrich, J. P.; Doel, P.; Evrard, A. E.; Frieman, J.; Gaztanaga, E.; Gruen, D.; Honscheid, K.; James, D. J.; Kuehn, K.; Li, T. S.; Lima, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Miquel, R.; Neilsen, E.; Nichol, R. C.; Ogando, R.; Plazas, A. A.; Romer, A. K.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Walker, A. R.; Wester, W.; Zhang, Y.; DES Collaboration
2016-08-01
We report the observation and confirmation of the first group- and cluster-scale strong gravitational lensing systems found in Dark Energy Survey data. Through visual inspection of data from the Science Verification season, we identified 53 candidate systems. We then obtained spectroscopic follow-up of 21 candidates using the Gemini Multi-object Spectrograph at the Gemini South telescope and the Inamori-Magellan Areal Camera and Spectrograph at the Magellan/Baade telescope. With this follow-up, we confirmed six candidates as gravitational lenses: three of the systems are newly discovered, and the remaining three were previously known. Of the 21 observed candidates, the remaining 15 either were not detected in spectroscopic observations, were observed and did not exhibit continuum emission (or spectral features), or were ruled out as lensing systems. The confirmed sample consists of one group-scale and five galaxy-cluster-scale lenses. The lensed sources range in redshift z ∼ 0.80–3.2 and in i-band surface brightness i_SB ∼ 23–25 mag arcsec⁻² (2″ aperture). For each of the six systems, we estimate the Einstein radius θ_E and the enclosed mass M_enc, which have ranges θ_E ∼ 5″–9″ and M_enc ∼ 8 × 10¹² to 6 × 10¹³ M_⊙, respectively. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.
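For a circularly symmetric lens, the mass enclosed within the Einstein radius follows from M_enc = (c²/4G) θ_E² D_l D_s / D_ls. A sketch of this standard relation with hypothetical angular-diameter distances (the paper's actual distance estimates are not reproduced here):

```python
import math

C = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
MPC = 3.086e22    # one megaparsec in metres
M_SUN = 1.989e30  # solar mass, kg

def einstein_mass(theta_arcsec, d_l_mpc, d_s_mpc, d_ls_mpc):
    """Enclosed mass from the Einstein radius for a circularly
    symmetric lens: M = (c^2 / 4G) * theta^2 * D_l D_s / D_ls."""
    theta = theta_arcsec * math.pi / (180.0 * 3600.0)  # arcsec -> rad
    d_eff = (d_l_mpc * d_s_mpc / d_ls_mpc) * MPC
    return C**2 / (4.0 * G) * theta**2 * d_eff

# theta_E = 7" with assumed distances D_l = 1000 Mpc, D_s = 1600 Mpc,
# D_ls = 1000 Mpc gives a mass of order 10^13 M_sun, inside the
# 8e12 - 6e13 M_sun range reported in the abstract.
m_sun = einstein_mass(7.0, 1000.0, 1600.0, 1000.0) / M_SUN
```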
Energy Technology Data Exchange (ETDEWEB)
Ayala Lazaro, R.; Garcia Hernandez, M. J.; Gomez Cores, S.; Jimenez Rojas, R.; Sendon del Rio, J. R.; Polo Cezon, R.; Gomez Calvar, R.
2013-07-01
The use of electronic portal imaging devices (EPIDs) is regarded as a fast, effective and cost-free way to verify static or dynamic IMRT treatments. Commissioning an EPID as a verification tool, however, can be quite complicated. We present an easy way of setting up such a system using the method of Lee et al. and the Elekta MONACO treatment planning system. (Author)
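Comparisons of this kind (EPID or diode-array measurement against the planned dose) are normally scored with a gamma index, which combines a dose-difference and a distance-to-agreement tolerance. A brute-force 1D sketch with the common 3%/3 mm criterion and global normalisation; this illustrates the metric itself, not any vendor's implementation:

```python
import numpy as np

def gamma_index_1d(ref, evl, dx, dose_tol=0.03, dist_tol=3.0):
    """Brute-force 1D gamma: for each reference point, the minimum
    combined dose/distance distance to any evaluated point.
    dose_tol is a fraction of the global maximum; dist_tol is in mm."""
    ref = np.asarray(ref, float)
    evl = np.asarray(evl, float)
    d_max = ref.max()
    x = np.arange(len(ref)) * dx  # positions in mm
    gam = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dist2 = ((x - xi) / dist_tol) ** 2
        dose2 = ((evl - di) / (dose_tol * d_max)) ** 2
        gam[i] = np.sqrt((dist2 + dose2).min())
    return gam

ref = np.array([0.0, 50.0, 100.0, 50.0, 0.0])   # planned profile (toy)
evl = np.array([0.0, 51.0, 99.0, 52.0, 0.0])    # measured profile (toy)
gam = gamma_index_1d(ref, evl, dx=1.0)
passing = (gam <= 1.0).mean() * 100.0  # percent of points with gamma <= 1
```

A point passes when gamma ≤ 1; the "gamma passing rate" reported for IMRT QA is simply the percentage of passing points.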
Lerner, Sorin; Kundu, Sudipta
2011-01-01
Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based
Fuchs, M.; Ireta, J.; Scheffler, M.; Filippi, C.
2006-03-01
Dispersion (van der Waals) forces are important in many molecular phenomena such as self-assembly of molecular crystals or peptide folding. Calculating this nonlocal correlation effect requires accurate electronic structure methods. Standard density-functional theory with generalized gradient functionals (GGA-DFT) fails unless empirical corrections are added that still need extensive validation. Quantum-chemical methods like MP2 and coupled cluster are more accurate, yet limited to rather small systems by their unfavorable computational scaling. Diffusion Monte Carlo (DMC) can provide accurate molecular total energies and remains feasible also for larger systems. Here we apply the fixed-node DMC method to (bio-)molecular model systems where dispersion forces are significant: (dimethyl-)formamide and benzene dimers, and adenine-thymine DNA base pairs. Our DMC binding energies agree well with data from coupled cluster (CCSD(T)), in particular for stacked geometries where GGA-DFT fails qualitatively and MP2 predicts too strong binding.
Kamibayashi, Yuki; Miura, Shinichi
2016-08-01
In the present study, variational path integral molecular dynamics and associated hybrid Monte Carlo (HMC) methods have been developed on the basis of a fourth-order approximation of a density operator. To reveal the dependence of physical quantities on the various parameters, we analytically solve one-dimensional harmonic oscillators by the variational path integral; as a byproduct, we obtain the analytical expression of the discretized density matrix using the fourth-order approximation for the oscillators. We then apply our methods to realistic systems like a water molecule and a para-hydrogen cluster. In the HMC, we adopt a two-level description to avoid the time-consuming Hessian evaluation. For the systems examined in this paper, the HMC method is found to be about three times more efficient than the molecular dynamics method if appropriate HMC parameters are adopted; the advantage of the HMC method is suggested to be more evident for systems described by many-body interactions.
Energy Technology Data Exchange (ETDEWEB)
Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)
1995-03-01
This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.
A Monte Carlo method for critical systems in infinite volume: the planar Ising model
Herdeiro, Victor
2016-01-01
In this paper we propose a Monte Carlo method for generating finite-domain marginals of critical distributions of statistical models in infinite volume. The algorithm corrects the problem of the long-range effects of boundaries associated to generating critical distributions on finite lattices. It uses the advantage of scale invariance combined with ideas of the renormalization group in order to construct a type of "holographic" boundary condition that encodes the presence of an infinite volume beyond it. We check the quality of the distribution obtained in the case of the planar Ising model by comparing various observables with their infinite-plane prediction. We accurately reproduce planar two-, three- and four-point functions of spin and energy operators. We also define a lattice stress-energy tensor, and numerically obtain the associated conformal Ward identities and the Ising central charge.
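For context, the conventional finite-volume baseline that the proposed method improves upon is plain Metropolis sampling of the Ising model on a finite lattice with periodic boundaries, where boundary effects distort critical correlations. A minimal sketch (small lattice, short run, illustrative parameters; the paper's "holographic" boundary construction is not implemented here):

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_ising(L=8, beta=0.4407, sweeps=50):
    """Single-spin-flip Metropolis on a periodic L x L lattice near the
    critical coupling beta_c = ln(1 + sqrt(2)) / 2 ~ 0.4407."""
    s = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(L, size=2)
            nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2.0 * s[i, j] * nb  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i, j] = -s[i, j]
    return s

spins = metropolis_ising()
```

Measuring two-, three- and four-point functions on such finite-lattice samples is exactly where long-range boundary effects enter, which is the problem the infinite-volume algorithm is designed to correct.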
Sign learning kink-based (SiLK) quantum Monte Carlo for molecular systems
Ma, Xiaoyao; Loffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana
2015-01-01
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems
Energy Technology Data Exchange (ETDEWEB)
Ma, Xiaoyao [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Hall, Randall W. [Department of Natural Sciences and Mathematics, Dominican University of California, San Rafael, California 94901 (United States); Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Löffler, Frank [Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Kowalski, Karol [William R. Wiley Environmental Molecular Sciences Laboratory, Battelle, Pacific Northwest National Laboratory, Richland, Washington 99352 (United States); Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)
2016-01-07
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Monte Carlo method for critical systems in infinite volume: The planar Ising model.
Herdeiro, Victor; Doyon, Benjamin
2016-10-01
In this paper we propose a Monte Carlo method for generating finite-domain marginals of critical distributions of statistical models in infinite volume. The algorithm corrects the problem of the long-range effects of boundaries associated to generating critical distributions on finite lattices. It uses the advantage of scale invariance combined with ideas of the renormalization group in order to construct a type of "holographic" boundary condition that encodes the presence of an infinite volume beyond it. We check the quality of the distribution obtained in the case of the planar Ising model by comparing various observables with their infinite-plane prediction. We accurately reproduce planar two-, three-, and four-point functions of spin and energy operators. We also define a lattice stress-energy tensor, and numerically obtain the associated conformal Ward identities and the Ising central charge.
Williams, David E.; Spector, Lawrence N.
2010-01-01
Node 1 (Unity) flew to the International Space Station (ISS) on Flight 2A. Node 1 was the first module of the United States On-Orbit Segment (USOS) launched to ISS. The Node 1 ISS Environmental Control and Life Support (ECLS) design featured limited ECLS capability. The main purpose of Node 1 was to provide internal storage by providing four stowage rack locations within the module and to allow docking of multiple modules and a truss segment to it. The ECLS subsystems inside Node 1 were routed through the element prior to launch to allow for easy integration of the attached future elements, particularly the Habitation Module, which was planned to be located at the nadir docking port of Node 1. After Node 1 was on-orbit, the Program decided not to launch the Habitation Module and instead to replace it with Node 3 (Tranquility). In 2007, the Program became concerned with a potential Russian docking port approach issue for the Russian FGB nadir docking port after Node 3 is attached to Node 1. To address this concern, the Program decided to relocate Node 3 from Node 1 nadir to Node 1 port. To support the movement of Node 3, the Program decided to build a modification kit for Node 1, an on-orbit feedthrough leak test device, and new vestibule jumpers to support the ECLS part of the relocation. This paper provides a design overview of the modification kit for Node 1, a summary of the Node 1 ECLS re-verification to support the Node 3 relocation from Node 1 nadir to Node 1 port, and a status of the ECLS modification kit installation into Node 1.
Directory of Open Access Journals (Sweden)
Biniam Tesfamicael
2016-03-01
Purpose: The main purpose of this study was to monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate a proton therapy of prostate cancer. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm³ Delrin® blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were implemented to extract the energy deposited in each fiber and inside the scintillating block. Results: The transverse dose distributions from the detected secondary particles in both cases are symmetric and agree to within <3.6%. The energy deposited gradually increases, by a factor of approximately 5, as one moves from the peripheral row of fibers towards the center of the block (aligned with the center of the prostate). The energy deposited was also observed to decrease from the frontal to the distal region of the block. The ratio of the energy deposited in the prostate to the energy deposited in the middle two rows of fibers showed a linear relationship with a slope of (-3.55 ± 2.26) × 10⁻⁵ MeV per treatment Gy delivered. The distal detectors recorded a negligible amount of energy deposited due to higher attenuation of the secondary particles by the water in that direction. Conclusion: With a good calibration and with the ability to define a good correlation between the radiation flux recorded by the external fibers and the dose delivered to the prostate, such fibers can be used for real-time dose verification to the target. The system was also observed to respond to the series of Bragg peaks used to generate the…
Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex
2008-01-01
Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
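The "thousand chickens" idea, many cheap, independently diversified searches instead of one monolithic one, can be illustrated with a toy needle-in-a-haystack search. The uniform random probe below is a stand-in for the depth- and order-diversified model-checker instances that swarm verification actually launches:

```python
import random

def random_search(target, n_states, steps, seed):
    """One swarm member: a small, independently seeded random probe
    of the state space (toy stand-in for a diversified search)."""
    rng = random.Random(seed)
    for _ in range(steps):
        if rng.randrange(n_states) == target:
            return True
    return False

def swarm(target, n_states, steps, members):
    """Run many cheap, differently seeded members; the swarm succeeds
    if any member reaches the target state."""
    return any(random_search(target, n_states, steps, seed=s)
               for s in range(members))

# Each member alone is unlikely to find the needle, but the swarm's
# aggregate coverage makes success probable.
found = swarm(target=12345, n_states=100_000, steps=2_000, members=100)
```

The point is not that random search beats systematic search per member, but that embarrassingly parallel diversification sidesteps the difficulty of partitioning a single model-checking run.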
Biometric verification with correlation filters
Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit
2004-01-01
Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.
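Correlation-filter verification scores a probe image by cross-correlating it with a stored template and measuring how sharp the correlation peak is, commonly via the peak-to-sidelobe ratio (PSR). A sketch using a plain matched filter rather than an advanced SDF/MACE design; thresholds and sizes are illustrative:

```python
import numpy as np

def correlate_fft(template, image):
    """Circular cross-correlation via FFT. A trained correlation filter
    (SDF, MACE, ...) would replace the conjugated template spectrum."""
    T = np.fft.fft2(template)
    I = np.fft.fft2(image)
    return np.real(np.fft.ifft2(I * np.conj(T)))

def peak_to_sidelobe(corr, exclude=2):
    """PSR: correlation peak height relative to the sidelobe statistics,
    excluding a small region around the peak. High PSR -> likely match."""
    i, j = np.unravel_index(np.argmax(corr), corr.shape)
    peak = corr[i, j]
    mask = np.ones_like(corr, dtype=bool)
    mask[max(i - exclude, 0):i + exclude + 1,
         max(j - exclude, 0):j + exclude + 1] = False
    side = corr[mask]
    return (peak - side.mean()) / side.std()

rng = np.random.default_rng(1)
template = rng.standard_normal((16, 16))
genuine = np.roll(template, (2, 3), axis=(0, 1))  # same pattern, shifted
impostor = rng.standard_normal((16, 16))          # unrelated pattern
psr_true = peak_to_sidelobe(correlate_fft(template, genuine))
psr_false = peak_to_sidelobe(correlate_fft(template, impostor))
```

Note the genuine score survives the shift: shift invariance of correlation is precisely why these filters tolerate registration error, while filter training (not shown) adds tolerance to expression and illumination changes.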
Institute of Scientific and Technical Information of China (English)
Ahmed Shawky Shawata; Tarek El Nimr; Reda Ahmed Morsy; Khaled M. Elshahat
2013-01-01
Objective: This study aimed to evaluate the accuracy and efficiency of in-vivo dosimetry systems for routine patient dose verification in cancer treatment. Methods: In-vivo dosimetry using diodes and thermoluminescent dosimeters (TLDs) is performed in many radiotherapy departments to verify the dose delivered during treatment. A total of 40 TLDs, divided into two batches of 20, were used. Different doses from a Co-60 beam were delivered to the TLD chips at different depths. Diodes were irradiated at different depths in a 30 × 30 × 30 cm³ water slab phantom under various conditions of field size, monitor units and SSD. Results: A limitation of the in-vivo dosimetry technique lies in the readout difficulty and the type of readout system (TLD or diode), although the patient dose is directly measured. The mean deviation of the measurements was 1.3%, with a standard deviation of 2.6%, and the results were normally distributed; a second measurement set showed a mean deviation of 1.8%, with a standard deviation of 2.7%. These results are similar to studies conducted with diodes and TLDs. Conclusion: The diode is superior to the TLD, since diode measurements can be obtained on-line and allow an immediate check. Other advantages of diodes include high sensitivity, good spatial resolution, small size, and simplicity of use.
Validation and Verification (V&V) of Safety-Critical Systems Operating Under Off-Nominal Conditions
Belcastro, Christine M.
2012-01-01
Loss of control (LOC) remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft LOC accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or more often in combination. Hence, there is no single intervention strategy to prevent these accidents. Research is underway at the National Aeronautics and Space Administration (NASA) in the development of advanced onboard system technologies for preventing or recovering from loss of vehicle control and for assuring safe operation under off-nominal conditions associated with aircraft LOC accidents. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V&V) and ultimate certification. The V&V of complex integrated systems poses highly significant technical challenges and is the subject of a parallel research effort at NASA. This chapter summarizes the V&V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft LOC accidents. A summary of recent research accomplishments in this effort is referenced.
Energy Technology Data Exchange (ETDEWEB)
Ramirez Ros, J. C.; Jerez Sainz, M. I.; Jodar Lopez, C. A.; Lobato Munoz, M.; Ruiz Lopez, M. A.; Carrasco Rodriguez, J. L.; Pamos Urena, M.
2013-07-01
We evaluated the Monte Carlo-based Monaco planning system v2.0.3, following the SEFM protocol [1], for the modeling of the 6 MV photon beam of an Elekta Synergy linear accelerator with the Beam Modulator MLC collimator. We compared the Monte Carlo calculations with profiles measured in water at DFS = 100 cm, and with absorbed dose and dose levels for rectangular and asymmetric fields and different DFS. We also compared the results with those obtained with the Collapsed Cone algorithm of the Pinnacle planning system v8.0m. (Author)
Methods of Software Verification
Directory of Open Access Journals (Sweden)
R. E. Gurin
2015-01-01
Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods vary both in their mode of operation and in the way they achieve results. The article describes static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized, and a classification of test techniques for each method is considered. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, and some kinds of tools that can be applied to software when using dynamic analysis methods are considered. Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and
Kanematsu, Nobuyuki; Inaniwa, Taku; Nakao, Minoru
2016-07-01
In the conventional procedure for accurate Monte Carlo simulation of radiotherapy, a CT number given to each pixel of a patient image is directly converted to mass density and elemental composition using their respective functions that have been calibrated specifically for the relevant x-ray CT system. We propose an alternative approach that is a conversion in two steps: the first from CT number to density and the second from density to composition. Based on the latest compilation of standard tissues for reference adult male and female phantoms, we sorted the standard tissues into groups by mass density and defined the representative tissues by averaging the material properties per group. With these representative tissues, we formulated polyline relations between mass density and each of the following; electron density, stopping-power ratio and elemental densities. We also revised a procedure of stoichiometric calibration for CT-number conversion and demonstrated the two-step conversion method for a theoretically emulated CT system with hypothetical 80 keV photons. For the standard tissues, high correlation was generally observed between mass density and the other densities excluding those of C and O for the light spongiosa tissues between 1.0 g cm-3 and 1.1 g cm-3 occupying 1% of the human body mass. The polylines fitted to the dominant tissues were generally consistent with similar formulations in the literature. The two-step conversion procedure was demonstrated to be practical and will potentially facilitate Monte Carlo simulation for treatment planning and for retrospective analysis of treatment plans with little impact on the management of planning CT systems.
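The two-step conversion can be sketched as a pair of lookups, the second being a polyline (piecewise-linear) relation of the kind the authors describe. The calibration constants and node values below are illustrative assumptions, not the paper's fitted data:

```python
import numpy as np

# Hypothetical polyline nodes: mass density (g/cm^3) -> stopping-power
# ratio relative to water. Illustrative values only, not the paper's fits.
density_nodes = np.array([0.001, 0.95, 1.05, 1.10, 1.92])
spr_nodes     = np.array([0.001, 0.95, 1.04, 1.08, 1.77])

def ct_number_to_density(hu, a=1.0, b=1000.0):
    """Step 1: hypothetical linear CT-number (HU) -> mass density
    calibration; water (0 HU) maps to 1.0 g/cm^3."""
    return a * (hu + b) / b

def density_to_spr(rho):
    """Step 2: polyline (piecewise-linear) density -> stopping-power ratio."""
    return np.interp(rho, density_nodes, spr_nodes)

def two_step_conversion(hu):
    # CT number -> density -> composition-dependent quantity
    return density_to_spr(ct_number_to_density(hu))
```

Note that `np.interp` clamps to the endpoint values outside the node range, which is usually the desired behavior for extreme CT numbers.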
DEFF Research Database (Denmark)
Høg Peter Jensen, Troels; Thomsen Schmidt, Helge; Dyremose Bodin, Niels
2017-01-01
With the number of privately owned cars increasing, the issue of locating an available parking space becomes apparent. This paper deals with the verification of vacant parking spaces, using a vision-based system looking over parking areas. In particular the paper proposes a binary classifier...... system, based on a Convolutional Neural Network, that is capable of determining if a parking space is occupied or not. A benchmark database consisting of images captured from different parking areas, under different weather and illumination conditions, has been used to train and test the system...
Performance evaluation of Biograph PET/CT system based on Monte Carlo simulation
Wang, Bing; Gao, Fei; Liu, Hua-Feng
2010-10-01
The combined lutetium oxyorthosilicate (LSO) Biograph PET/CT was developed by Siemens and has been introduced into medical practice. There are no septa between the scintillator rings, so the acquisition mode is fully 3D. The PET component incorporates three rings of 48 detector blocks, each comprising a 13×13 matrix of 4×4×20 mm3 elements. The patient aperture is 70 cm, the transversal field of view (FOV) is 58.5 cm, and the axial field of view is 16.2 cm. The CT component is a 16-slice spiral CT scanner. The physical performance of this PET/CT scanner has been evaluated using the Monte Carlo simulation method according to the latest NEMA NU 2-2007 standard, and the results have been compared with real experimental results. For the PET part, in the center of the FOV the average transversal resolution is 3.67 mm, the average axial resolution is 3.94 mm, and the 3D-reconstructed scatter fraction is 31.7%. The sensitivities of the PET scanner are 4.21 kcps/MBq and 4.26 kcps/MBq at 0 cm and 10 cm off the center of the transversal FOV. The peak NEC is 95.6 kcps at a concentration of 39.2 kBq/ml. The spatial resolution of the CT part is up to 1.12 mm at 10 mm off the center. The differences between the simulated and real results are within acceptable limits.
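The count-rate figures quoted (peak NEC, scatter fraction) follow the standard NEMA NU 2 definitions; a minimal sketch of those formulas (standard expressions, not code from the study):

```python
def nec_rate(trues, scatters, randoms, k=1.0):
    """Noise-equivalent count rate, NEC = T^2 / (T + S + k*R).
    k = 1 for smoothed randoms estimation, k = 2 for direct
    delayed-window subtraction (standard NEMA NU 2 convention)."""
    return trues ** 2 / (trues + scatters + k * randoms)

def scatter_fraction(scatters, trues):
    """Scatter fraction SF = S / (S + T)."""
    return scatters / (scatters + trues)
```

With negligible scatters and randoms, NEC reduces to the true count rate, which is why the peak NEC is reported at the activity concentration where this trade-off is best.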
Energy Technology Data Exchange (ETDEWEB)
Rojas C, E.L.; Varon T, C.F.; Pedraza N, R. [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)]. e-mail: elrc@nuclear.inin.mx
2007-07-01
The treatment of breast cancer at early stages is of vital importance, so most investigations are dedicated to early detection of the disease and its treatment. As a consequence of such investigation and clinical practice, in 2002 a high-dose-rate irradiation system known as MammoSite was developed in the U.S.A. In this work we carry out dose calculations for a simplified MammoSite system with the Monte Carlo simulation codes PENELOPE and MCNPX, varying the concentration of the contrast material used in it. (Author)
A Model for Collaborative Runtime Verification
Testerink, Bas; Bulling, Nils; Dastani, Mehdi
2015-01-01
Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on t
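A local monitor of the kind described can be sketched as a small state machine that consumes events and reports violations; the property below ("every open is matched by a close before the next open") is an illustrative example, not one from the paper:

```python
class Monitor:
    """Minimal runtime monitor for the safety property:
    'every `open` event is matched by a `close` before the next `open`'.
    Illustrative property, not from the paper."""

    def __init__(self):
        self.open_pending = False
        self.violated = False

    def step(self, event):
        # Consume one event; return False once the property is violated.
        if event == "open":
            if self.open_pending:
                self.violated = True
            self.open_pending = True
        elif event == "close":
            self.open_pending = False
        return not self.violated

def verify(trace):
    """Check a finite execution trace against the monitor's property."""
    m = Monitor()
    return all(m.step(e) for e in trace)
```

In a collaborative setting, several such monitors would each observe their local event stream and exchange verdicts to decide a global property.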
Ishisaki, Y; Fujimoto, R; Ozaki, M; Ebisawa, K; Takahashi, T; Ueda, Y; Ogasaka, Y; Ptak, A; Mukai, K; Hamaguchi, K; Hirayama, M; Kotani, T; Kubo, H; Shibata, R; Ebara, M; Furuzawa, A; Iizuka, R; Inoue, H; Mori, H; Okada, S; Yokoyama, Y; Matsumoto, H; Nakajima, H; Yamaguchi, H; Anabuki, N; Tawa, N; Nagai, M; Katsuda, S; Hayashida, K; Bamba, A; Miller, E D; Sato, K; Yamasaki, N Y
2006-01-01
We have developed a framework for the Monte-Carlo simulation of the X-Ray Telescopes (XRT) and the X-ray Imaging Spectrometers (XIS) onboard Suzaku, mainly for the scientific analysis of spatially and spectroscopically complex celestial sources. A photon-by-photon instrumental simulator is built on the ANL platform, which has been successfully used in ASCA data analysis. The simulator has a modular structure, in which the XRT simulation is based on a ray-tracing library, while the XIS simulation utilizes a spectral "Redistribution Matrix File" (RMF), generated separately by other tools. Instrumental characteristics and calibration results, e.g., XRT geometry, reflectivity, mutual alignments, thermal shield transmission, build-up of the contamination on the XIS optical blocking filters (OBF), are incorporated as completely as possible. Most of this information is available in the form of the FITS (Flexible Image Transport System) files in the standard calibration database (CALDB). This simulator can also be ut...
Institute of Scientific and Technical Information of China (English)
YAO Xiao-yan; LI Peng-lei; DONG Shuai; LIU Jun-ming
2007-01-01
A three-dimensional Ising-like model doped with anti-ferromagnetic (AFM) bonds is proposed to investigate the magnetic properties of a doped triangular spin-chain system by using a Monte-Carlo simulation. The simulated results indicate that a steplike magnetization behavior is very sensitive to the concentration of AFM bonds. A low concentration of AFM bonds can suppress the stepwise behavior considerably, in accordance with doping experiments on Ca3Co2O6. The analysis of spin snapshots demonstrates that the AFM bond doping not only breaks the ferromagnetic ordered linear spin chains along the hexagonal c-axis but also has a great influence upon the spin configuration in the ab-plane.
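A minimal analogue of the AFM-bond-doping idea can be sketched with Metropolis sampling of a 1D Ising chain whose bonds are randomly made antiferromagnetic; this is a toy illustration of the doping mechanism, not the paper's 3D triangular-lattice model:

```python
import math
import random

def metropolis_chain(n_spins=64, p_afm=0.1, temperature=0.5, field=0.0,
                     n_sweeps=200, seed=1):
    """Metropolis sampling of a periodic 1D Ising chain in which each
    bond is antiferromagnetic (J = -1) with probability p_afm and
    ferromagnetic (J = +1) otherwise. Returns the final magnetization
    per spin. Toy model for illustration only."""
    rng = random.Random(seed)
    bonds = [-1 if rng.random() < p_afm else 1 for _ in range(n_spins)]
    spins = [1] * n_spins
    for _ in range(n_sweeps):
        for i in range(n_spins):
            left, right = spins[i - 1], spins[(i + 1) % n_spins]
            # Local field from the two (possibly AFM) bonds plus external field
            h = bonds[i - 1] * left + bonds[i] * right + field
            dE = 2.0 * spins[i] * h  # energy cost of flipping spin i
            if dE <= 0 or rng.random() < math.exp(-dE / temperature):
                spins[i] = -spins[i]
    return sum(spins) / n_spins
```

With p_afm = 0 the chain stays ferromagnetically ordered at low temperature; raising p_afm breaks the ordered chains, the 1D analogue of the suppression of the magnetization steps reported above.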
A GPU-based Large-scale Monte Carlo Simulation Method for Systems with Long-range Interactions
Liang, Yihao; Li, Yaohang
2016-01-01
In this work we present an efficient implementation of Canonical Monte Carlo simulation for Coulomb many body systems on graphics processing units (GPU). Our method takes advantage of the GPU Single Instruction, Multiple Data (SIMD) architectures. It adopts the sequential updating scheme of Metropolis algorithm, and makes no approximation in the computation of energy. It reaches a remarkable 440-fold speedup, compared with the serial implementation on CPU. We use this method to simulate primitive model electrolytes. We measure very precisely all ion-ion pair correlation functions at high concentrations, and extract renormalized Debye length, renormalized valences of constituent ions, and renormalized dielectric constants. These results demonstrate unequivocally physics beyond the classical Poisson-Boltzmann theory.
Horváthová, L; Mitas, L; Štich, I
2014-01-01
We present calculations of electronic and magnetic structures of vanadium-benzene multidecker clusters V$_{n}$Bz$_{n+1}$ ($n$ = 1 - 3) using advanced quantum Monte Carlo methods. These and related systems have been identified as prospective spin filters in spintronic applications, assuming that their ground states are half-metallic ferromagnets. Although we find that magnetic properties of these multideckers are consistent with ferromagnetic coupling, their electronic structures do not appear to be half-metallic as previously assumed. In fact, they are ferromagnetic insulators with large and broadly similar $\\uparrow$-/$\\downarrow$-spin gaps. This makes the potential of these and related materials as spin filtering devices very limited, unless they are further modified or functionalized.
Pasini, J M; Cordero, P
2001-04-01
We study a one-dimensional granular gas of pointlike particles not subject to gravity between two walls at temperatures T(left) and T(right). The system exhibits two distinct regimes, depending on the normalized temperature difference Delta=(T(right)-T(left))/(T(right)+T(left)): one completely fluidized and one in which a cluster coexists with the fluidized gas. When Delta is above a certain threshold, cluster formation is fully inhibited, obtaining a completely fluidized state. The mechanism that produces these two phases is explained. In the fluidized state the velocity distribution function exhibits peculiar non-Gaussian features. For this state, comparison between integration of the Boltzmann equation using the direct-simulation Monte Carlo method and results stemming from microscopic Newtonian molecular dynamics gives good coincidence, establishing that the non-Gaussian features observed do not arise from the onset of correlations.
National Aeronautics and Space Administration — A comprehensive commercial-grade system for the development of safe parallel and serial programs is developed. The system has the ability to perform efficient...
Energy Technology Data Exchange (ETDEWEB)
SETH, S.S.
2000-01-10
U.S. Department of Energy (DOE) Policy 450.4, Safety Management System Policy commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex as a means of accomplishing its missions safely. DOE Acquisition Regulation 970.5204-2 requires that contractors manage and perform work in accordance with a documented safety management system.
Directory of Open Access Journals (Sweden)
Dewei Tang
2017-03-01
Full Text Available The main task of the third Chinese lunar exploration project is to obtain soil samples that are greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized, based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the “safe working times” of every mode are given.
Amna, S.; Samreen, N.; Khalid, B.; Shamim, A.
2013-06-01
Depending upon the topography, there is extreme variation in the temperature of Pakistan. Heat waves are weather-related events with a significant impact on humans, including socioeconomic activities and health issues, and they vary with the climatic conditions of the area. Climate forecasting is of prime importance for anticipating future climatic changes in order to mitigate them. The study used the Ensemble Prediction System (EPS) to model seasonal weather hind-casts of three selected areas, i.e., Islamabad, Jhelum and Muzaffarabad. This research was carried out in order to suggest the most suitable climate model for Pakistan. Real-time and simulated data of five General Circulation Models, i.e., ECMWF, ERA-40, MPI, Meteo France and UKMO, for the selected areas were acquired from the Pakistan Meteorological Department. The data comprised statistical temperature records of 32 years for the months of June, July and August. This study was based on EPS calculation of probabilistic forecasts produced by single ensembles. Verification was carried out to assess the quality of the forecasts using the standard probabilistic measures of Brier Score, Brier Skill Score, Cross Validation and the Relative Operating Characteristic curve. The results showed ECMWF to be the most suitable model for Islamabad and Jhelum, and Meteo France for Muzaffarabad. The other models produced significant results only by omitting particular initial conditions.
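The Brier score and Brier skill score used for verification have standard definitions; a minimal sketch (standard formulas, not the study's code):

```python
def brier_score(probabilities, outcomes):
    """Brier score for probabilistic forecasts of a binary event:
    BS = mean over cases of (p_i - o_i)^2, where o_i is 1 if the
    event occurred and 0 otherwise. Lower is better; 0 is perfect."""
    n = len(probabilities)
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / n

def brier_skill_score(bs, bs_reference):
    """BSS = 1 - BS / BS_ref. 1 is a perfect forecast, 0 matches the
    reference (e.g. climatology), negative is worse than the reference."""
    return 1.0 - bs / bs_reference
```

A model is then preferred for a station when its BSS against the climatological reference is highest, which is how rankings like the one above are typically obtained.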
Night vision imaging system design, integration and verification in spacecraft vacuum thermal test
Shang, Yonghong; Wang, Jing; Gong, Zhe; Li, Xiyuan; Pei, Yifei; Bai, Tingzhu; Zhen, Haijing
2015-08-01
The purposes of a spacecraft vacuum thermal test are to characterize the thermal control systems of the spacecraft and its components in its cruise configuration and to allow for early retirement of risks associated with mission-specific and novel thermal designs. The orbital heat flux is simulated by infrared lamps, an infrared cage or electric heaters. Since infrared cages and electric heaters emit no visible light, and infrared lamps emit only limited visible light, an ordinary camera cannot operate under the low luminous density of the test. Moreover, some special instruments such as satellite-borne infrared sensors are sensitive to visible light, so supplementary lighting cannot be used during the test. To improve fine monitoring of the spacecraft and exhibition of test progress under ultra-low luminous density, a night vision imaging system was designed and integrated by BISEE. The system consists of a high-gain image-intensifier ICCD camera, an assistant luminance system, a glare protection system, a thermal control system and a computer control system. Multi-frame accumulation target detection technology is adopted for high-quality image recognition in the captive test. The optical, mechanical and electrical systems are designed and integrated to be highly adaptable to the vacuum environment. A molybdenum/polyimide thin-film electrical heater controls the temperature of the ICCD camera. Performance validation tests showed that the system can operate under a vacuum thermal environment of 1.33×10-3 Pa vacuum degree and 100 K shroud temperature in the space environment simulator, and its working temperature was maintained at 5 °C during the two-day test. The night vision imaging system can obtain video with a resolving power of 60 lp/mm.
Energy Technology Data Exchange (ETDEWEB)
L. M. Dittmer
2007-03-21
The 100-B-14:2 subsite encompasses the former sanitary sewer feeder lines associated with the 1607-B2 and 1607-B7 septic systems. Feeder lines associated with the 185/190-B building have also been identified as the 100-B-14:8 subsite, and feeder lines associated with the 1607-B7 septic system have also been identified as the 100-B-14:9 subsite. These two subsites have been administratively cancelled to resolve the redundancy. The results of verification sampling show that residual contaminant concentrations do not preclude any future uses and allow for unrestricted use of shallow zone soils. The results also demonstrate that residual contaminant concentrations are protective of groundwater and the Columbia River.
Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John
2006-01-01
Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper presents the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.
2013-09-01
The FP250 integrates a modified conventional micro-turbine (Ingersoll Rand MT250, now manufactured by FlexEnergy Energy Systems) of proven design with a proprietary gradual thermal oxidizer in
The EPA GHG Center collaborated with the New York State Energy Research and Development Authority (NYSERDA) to evaluate the performance of the Climate Energy freewatt Micro-Combined Heat and Power System. The system is a reciprocating internal combustion (IC) engine distributed e...
78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems
2013-01-25
... of a foreign country's food safety system. FSIS conducts these document reviews at least annually... comprehensive assessments of foreign countries' food safety regulatory systems while remaining at USDA... national government is adequately implementing the country's food safety laws and regulations, and...
Verification and controller synthesis for resource-constrained real-time systems
DEFF Research Database (Denmark)
Li, Shuhao; Pettersson, Paul
2010-01-01
integer functions to approximate the continuous resources in real-time embedded systems. Based on these formal models and techniques, we employ the realtime model checker UPPAAL to verify a system against a given functional and/or timing requirement. Furthermore, we employ the timed game solver UPPAAL...
Institute of Scientific and Technical Information of China (English)
Zhang Chunwei; Ou Jinping
2008-01-01
The electromagnetic mass damper (EMD) control system, as an innovative active control system to reduce structural vibration, offers many advantages over traditional active mass driver/damper (AMD) control systems. In this paper, studies of several EMD control strategies and bench-scale shaking table tests of a two-story model structure are described. First, two structural models corresponding to uncontrolled and Zeroed cases are developed, and parameters of these models are validated through sinusoidal sweep tests to provide a basis for establishing an accurate mathematical model for further studies. Then, a simplified control strategy for the EMD system based on the pole assignment control algorithm is proposed. Moreover, ideal pole locations are derived and validated through a series of shaking table tests. Finally, three benchmark earthquake ground motions and sinusoidal sweep waves are imposed onto the structure to investigate the effectiveness and feasibility of using this type of innovative active control system for structural vibration control. In addition, the robustness of the EMD system is examined. The test results show that the EMD system is an effective and robust system for the control of structural vibrations.
A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth
2009-01-01
The development of modern railway and tramway control systems represents a considerable challenge to both systems and software engineers: The goal to increase the traffic throughput while at the same time increasing the availability and reliability of railway operations leads to a demand for more...
Energy Technology Data Exchange (ETDEWEB)
Xie, X; Cao, D; Housley, D; Mehta, V; Shepard, D [Swedish Cancer Institute, Seattle, WA (United States)
2014-06-01
Purpose: In this work, we have tested the performance of new respiratory gating solutions for Elekta linacs. These solutions include the Response gating kit and the C-RAD Catalyst surface mapping system. Verification measurements have been performed for a series of clinical cases. We also examined the beam-on latency of the system and its impact on delivery efficiency. Methods: To verify the benefits of tighter gating windows, a Quasar Respiratory Motion Platform was used. Its vertical-motion plate acted as a respiration surrogate and was tracked by the Catalyst system to generate gating signals. A MatriXX ion-chamber array was mounted on its longitudinal-moving platform. Clinical plans were delivered to the stationary and moving MatriXX array at 100%, 50% and 30% gating windows, and gamma scores were calculated comparing the moving delivery results to the stationary result. It is important to note that as one moves to tighter gating windows, delivery efficiency is impacted by the linac's beam-on latency. Using a specialized software package, we generated beam-on signals of lengths 1000 ms, 600 ms, 450 ms, 400 ms, 350 ms and 300 ms. As the gating windows get tighter, one can expect to reach a point where the dose rate falls to nearly zero, indicating that the gating window is close to the beam-on latency. A clinically useful gating window needs to be significantly longer than the latency of the linac. Results: As expected, the use of tighter gating windows improved delivery accuracy. However, a lower limit on the gating window, largely defined by the linac beam-on latency, exists at around 300 ms. Conclusion: The Response gating kit, combined with the C-RAD Catalyst, provides an effective solution for respiratory-gated treatment delivery. Careful patient selection, gating window design, and even visual/audio coaching may be necessary to ensure both delivery quality and efficiency. This research project is funded by Elekta.
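The gamma scores mentioned follow the standard gamma-index criterion; a simplified 1D global gamma analysis can be sketched as follows (an illustrative implementation; clinical tools interpolate the evaluated distribution rather than comparing sampled points only):

```python
import math

def gamma_pass_rate(reference, evaluated, positions,
                    dose_tol=0.03, dist_tol=3.0):
    """Simplified 1D global gamma analysis (3%/3 mm by default).
    `reference` and `evaluated` are dose samples at `positions` (mm).
    Dose difference is normalized to the reference maximum (global
    normalization). A reference point passes when
    min over evaluated points of
    sqrt((dD / (dose_tol * Dmax))^2 + (dx / dist_tol)^2) <= 1."""
    d_max = max(reference)
    passed = 0
    for x_r, d_r in zip(positions, reference):
        gamma = min(
            math.sqrt(((d_e - d_r) / (dose_tol * d_max)) ** 2
                      + ((x_e - x_r) / dist_tol) ** 2)
            for x_e, d_e in zip(positions, evaluated)
        )
        if gamma <= 1.0:
            passed += 1
    return 100.0 * passed / len(reference)
```

A passing rate such as the 98.5% quoted above is the percentage of reference points with gamma at most 1.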
Energy Technology Data Exchange (ETDEWEB)
Ruiz Manzano, P.; Rivas Ballarin, M. A.; Ortega Pardina, P.; Villa Gazulla, D.; Calvo Carrillo, S.; Canellas Anoz, M.; Millan Cebrian, E.
2013-07-01
Following the entry into force of the new Spanish radiology quality control protocol in 2012, this work lists and discusses the results obtained after verification of the automatic exposure control in computed radiography systems. (Author)
Energy Technology Data Exchange (ETDEWEB)
Nuraslinda, Anuar; Kim, Dong Young; Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Uljugun (Korea, Republic of)
2012-10-15
The Steam Generator (SG) level control system in OPR1000 is one of the representative automatic systems that fall under the Supervisory Control level in Endsley's taxonomy. Supervisory control of automated systems is classified as a form of out-of-the-loop (OOTL) performance due to passive involvement in the system's operation, which can lead to loss of situation awareness (SA). A reported event caused by inadequate human-automation communication contributed to an unexpected reactor trip in July 2005. A high SG level trip occurred in Yeonggwang (YGN) Unit 6 Nuclear Power Plant (NPP) because the human operator failed to recognize the need to change the control mode of the economizer valve controller (EVC) to manual mode during swap over (the transition from low-power mode to high-power mode) after the loss of offsite power (LOOP) event was recovered. This paper models the human-system interaction in the NPP SG level control system using a Unified Modeling Language (UML) activity diagram. It then identifies the information missing to operators in the OPR1000 Main Control Room (MCR) and suggests some means of improving the human-system interaction.
Energy Technology Data Exchange (ETDEWEB)
Lee, Seung Kyu; Seo, Hee; Won, Byung Hee; Lee, Hyun Su; Park, Se-Hwan; Kim, Ho-Dong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2014-10-15
The XRF technique quantifies elemental U and Pu by comparing the measured pulse heights of the U and Pu peaks, i.e., the self-induced characteristic x-rays emitted from U and Pu. The measured ratio of the U and Pu x-ray peaks provides information on the relative concentrations of the U and Pu elements. Photon measurements of spent nuclear fuel using high-resolution spectrometers show a large background continuum in the low-energy x-ray region, in large part from Compton scattering of energetic gamma rays. The high Compton continuum can make measurements of plutonium x-rays difficult because of the relatively small signal-to-background ratio produced. In pressurized water reactor (PWR) spent fuels with low plutonium content (~1%), the signal-to-background ratio may be too low to obtain an accurate plutonium x-ray measurement. A Compton suppression system has been proposed to reduce the Compton continuum background. In the present study, the feasibility of a Compton suppression system for XRF was evaluated by Monte Carlo (MCNP) simulations and measurements of a radiation source. Experiments using a standard gamma-ray source showed that the peak-to-total ratios were improved by a factor of three when the Compton suppression system was used.
Mun, J S; Han, M Y
2012-01-01
The appropriate design and evaluation of a rainwater harvesting (RWH) system is necessary to improve system performance and the stability of the water supply. The main design parameters (DPs) of an RWH system are rainfall, catchment area, collection efficiency, tank volume and water demand. Its operational parameters (OPs) include rainwater use efficiency (RUE), water saving efficiency (WSE) and cycle number (CN). A sensitivity analysis of a rooftop RWH system's DPs with respect to its OPs reveals that the ratio of tank volume to catchment area (V/A) for an RWH system in Seoul, South Korea, is recommended to be between 0.03 and 0.08 in terms of the rate of change in RUE. The appropriate design value of V/A varies with D/A. Extra tank volume up to a V/A of 0.15∼0.2 is also available, if necessary, to secure more water. Accordingly, a suitable value or range of the DPs should be determined from the sensitivity analysis in order to optimize the design of an RWH system or improve its operational efficiency. The operational data employed in this study, which were used to validate the design and evaluation method of an RWH system, were obtained from the system in use at a dormitory complex at Seoul National University (SNU) in Korea. These operational data are in good agreement with those used in the initial simulation. The proposed method and the results of this research will be useful in evaluating and comparing the performance of RWH systems. It was found that RUE can be increased by expanding the variety of rainwater uses, particularly in the high-rainfall season.
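The RUE and WSE metrics can be illustrated with a simple daily tank water balance; the yield-after-spillage convention and parameter values below are assumptions for illustration, not the paper's model:

```python
def simulate_rwh(rainfall_mm, catchment_m2, tank_m3, demand_m3,
                 runoff_coeff=0.9):
    """Daily water balance of a rooftop rainwater-harvesting tank
    (yield-after-spillage convention; illustrative sketch only).
    Returns (RUE, WSE):
      RUE = rainwater supplied / total roof runoff
      WSE = rainwater supplied / total demand."""
    storage = 0.0
    supplied = 0.0
    inflow_total = 0.0
    for rain in rainfall_mm:
        inflow = runoff_coeff * rain / 1000.0 * catchment_m2  # m^3/day
        inflow_total += inflow
        storage = min(storage + inflow, tank_m3)  # overflow is lost
        use = min(storage, demand_m3)             # daily demand served
        supplied += use
        storage -= use
    rue = supplied / inflow_total if inflow_total else 0.0
    wse = supplied / (demand_m3 * len(rainfall_mm)) if rainfall_mm and demand_m3 else 0.0
    return rue, wse
```

Sweeping tank_m3 / catchment_m2 in such a balance and watching the rate of change of RUE is the kind of sensitivity analysis from which a recommended V/A range can be read off.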