WorldWideScience

Sample records for high accuracy verification

  1. Dose delivery verification and accuracy assessment of stereotaxy in stereotactic radiotherapy and radiosurgery

    International Nuclear Information System (INIS)

    Pelagade, S.M.; Bopche, T.T.; Namitha, K.; Munshi, M.; Bhola, S.; Sharma, H.; Patel, B.K.; Vyas, R.K.

    2008-01-01

    The outcome of stereotactic radiotherapy (SRT) and stereotactic radiosurgery (SRS) in both benign and malignant tumors within the cranial region depends highly on precision in dosimetry, dose delivery, and the accuracy assessment of the stereotaxy associated with the unit. The BRW (Brown-Roberts-Wells) and GTC (Gill-Thomas-Cosman) frames can facilitate accurate patient positioning as well as precise targeting of tumors. The implementation of this technique may result in a significant benefit compared to conventional therapy. As target localization accuracy improves, the demand on the treatment planning accuracy of a TPS also increases. The accuracy of the stereotactic X Knife treatment planning system has two components to verify: (i) dose delivery verification together with the accuracy assessment of stereotaxy; and (ii) ensuring that the associated Cartesian coordinate system is well established within the TPS for accurate determination of a target position. Both dose delivery verification and target positional accuracy affect the dose delivered to a defined target; hence both components need to be verified in a quality assurance protocol. The main intention of this paper is to present our dose delivery verification procedure using a cylindrical wax phantom, and the accuracy assessment (target position) of stereotaxy using a geometric phantom, on Elekta's Precise linear accelerator for a stereotactic installation.

  2. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    Directory of Open Access Journals (Sweden)

    Coomarasamy Aravinthan

    2004-05-01

    Full Text Available Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on the accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 had verification delayed > 24 hrs after testing. The effect of delay in verification of diagnosis on estimates of accuracy was evaluated using meta-regression with the diagnostic odds ratio (dOR) as the accuracy measure. This analysis was adjusted for study quality and type of test (miniature endometrial biopsy or endometrial ultrasound). Results Compared to studies with immediate verification of diagnosis (dOR 67.2, 95% CI 21.7–208.8), those with delayed verification (dOR 16.2, 95% CI 8.6–30.5) underestimated the diagnostic accuracy by 74% (95% CI 7%–99%; P value = 0.048). Conclusion Among studies of miniature endometrial biopsy and endometrial ultrasound, diagnostic accuracy is considerably underestimated if there is a delay in histological verification of diagnosis.
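
The dOR comparison above can be sketched numerically. The 2x2 tables below are invented for illustration (they are not the study's data); only the standard formula dOR = (TP x TN)/(FP x FN) and the ratio-of-dORs idea come from common test-accuracy methodology.

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """dOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN)."""
    return (tp * tn) / (fp * fn)

# Hypothetical 2x2 tables for illustration only (not the study's data).
dor_immediate = diagnostic_odds_ratio(tp=45, fp=5, fn=5, tn=45)    # 81.0
dor_delayed   = diagnostic_odds_ratio(tp=40, fp=10, fn=10, tn=40)  # 16.0

# Relative dOR: how much delayed verification shrinks the apparent accuracy.
rdor = dor_delayed / dor_immediate
print(f"underestimation: {(1 - rdor) * 100:.0f}%")  # prints "underestimation: 80%"
```

A meta-regression, as in the study, would additionally adjust this ratio for study quality and test type.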

  3. Impact of the frequency of online verifications on the patient set-up accuracy and set-up margins

    International Nuclear Information System (INIS)

    Rudat, Volker; Hammoud, Mohamed; Pillay, Yogin; Alaradi, Abdul Aziz; Mohamed, Adel; Altuwaijri, Saleh

    2011-01-01

    The purpose of the study was to evaluate the patient set-up error of different anatomical sites, to estimate the effect of different frequencies of online verifications on the patient set-up accuracy, and to calculate margins to accommodate the patient set-up error (ICRU set-up margin, SM). Alignment data of 148 patients treated with inverse-planned intensity-modulated radiotherapy (IMRT) or three-dimensional conformal radiotherapy (3D-CRT) of the head and neck (n = 31), chest (n = 72), abdomen (n = 15), and pelvis (n = 30) were evaluated. The patient set-up accuracy was assessed using orthogonal megavoltage electronic portal images of 2328 fractions of 173 planning target volumes (PTV). In 25 patients, two PTVs were analyzed, where the PTVs were located in different anatomical sites and treated in two different radiotherapy courses. The patient set-up error and the corresponding SM were retrospectively determined assuming no online verification, online verification once a week, and online verification every other day. The SM could be effectively reduced with increasing frequency of online verifications. However, a significant frequency of relevant set-up errors remained even after online verification every other day. For example, residual set-up errors larger than 5 mm were observed on average in 18% to 27% of all fractions of patients treated in the chest, abdomen and pelvis, and in 10% of fractions of patients treated in the head and neck after online verification every other day. In patients where high set-up accuracy is desired, daily online verification is highly recommended.

  4. Impact of the frequency of online verifications on the patient set-up accuracy and set-up margins

    Directory of Open Access Journals (Sweden)

    Mohamed Adel

    2011-08-01

    Full Text Available Abstract Purpose The purpose of the study was to evaluate the patient set-up error of different anatomical sites, to estimate the effect of different frequencies of online verifications on the patient set-up accuracy, and to calculate margins to accommodate the patient set-up error (ICRU set-up margin, SM). Methods and materials Alignment data of 148 patients treated with inverse-planned intensity-modulated radiotherapy (IMRT) or three-dimensional conformal radiotherapy (3D-CRT) of the head and neck (n = 31), chest (n = 72), abdomen (n = 15), and pelvis (n = 30) were evaluated. The patient set-up accuracy was assessed using orthogonal megavoltage electronic portal images of 2328 fractions of 173 planning target volumes (PTV). In 25 patients, two PTVs were analyzed, where the PTVs were located in different anatomical sites and treated in two different radiotherapy courses. The patient set-up error and the corresponding SM were retrospectively determined assuming no online verification, online verification once a week, and online verification every other day. Results The SM could be effectively reduced with increasing frequency of online verifications. However, a significant frequency of relevant set-up errors remained even after online verification every other day. For example, residual set-up errors larger than 5 mm were observed on average in 18% to 27% of all fractions of patients treated in the chest, abdomen and pelvis, and in 10% of fractions of patients treated in the head and neck after online verification every other day. Conclusion In patients where high set-up accuracy is desired, daily online verification is highly recommended.
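
As a rough illustration of how such set-up data translate into a margin, the sketch below applies the widely cited van Herk margin recipe (M = 2.5 Sigma + 0.7 sigma). The abstract does not state which margin formula the authors used, and all numbers here are invented.

```python
import statistics as st

def setup_margin(systematic_sd_mm, random_sd_mm):
    """van Herk-style CTV-to-PTV margin recipe: M = 2.5*Sigma + 0.7*sigma.
    (Assumed here for illustration; the abstract does not state its formula.)"""
    return 2.5 * systematic_sd_mm + 0.7 * random_sd_mm

# Hypothetical per-patient mean set-up errors (mm) along one axis.
patient_means = [1.2, -0.8, 2.1, 0.5, -1.5]
Sigma = st.stdev(patient_means)   # systematic component: SD of the patient means
sigma = 2.0                       # random component: pooled SD of daily errors (assumed)
print(f"SM = {setup_margin(Sigma, sigma):.1f} mm")
```

Increasing the online-verification frequency shrinks both components and hence the margin, which is the effect the study quantifies.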

  5. 40 CFR 1065.305 - Verifications for accuracy, repeatability, and noise.

    Science.gov (United States)

    2010-07-01

    ..., repeatability, and noise. 1065.305 Section 1065.305 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Verifications for accuracy, repeatability, and noise. (a) This section describes how to determine the accuracy, repeatability, and noise of an instrument. Table 1 of § 1065.205 specifies recommended values for individual...

  6. Bias associated with delayed verification in test accuracy studies: accuracy of tests for endometrial hyperplasia may be much higher than we think!

    OpenAIRE

    Clark, T Justin; ter Riet, Gerben; Coomarasamy, Aravinthan; Khan, Khalid S

    2004-01-01

    Abstract Background To empirically evaluate bias in estimation of accuracy associated with delay in verification of diagnosis among studies evaluating tests for predicting endometrial hyperplasia. Methods Systematic reviews of all published research on the accuracy of miniature endometrial biopsy and endometrial ultrasonography for diagnosing endometrial hyperplasia identified 27 test accuracy studies (2,982 subjects). Of these, 16 had immediate histological verification of diagnosis while 11 ha...

  7. Accuracy verification methods theory and algorithms

    CERN Document Server

    Mali, Olli; Repin, Sergey

    2014-01-01

    The importance of accuracy verification methods was understood at the very beginning of the development of numerical analysis. Recent decades have seen a rapid growth of results related to adaptive numerical methods and a posteriori estimates. However, in this important area there often exists a noticeable gap between the mathematicians creating the theory and the researchers developing applied algorithms that could be used in engineering and scientific computations for guaranteed and efficient error control. The goals of the book are to (1) give a transparent explanation of the underlying mathematical theory in a style accessible not only to advanced numerical analysts but also to engineers and students; (2) present detailed step-by-step algorithms that follow from the theory; and (3) discuss their advantages and drawbacks and areas of applicability, and give recommendations and examples.

  8. Verification of Kaplan turbine cam curves realization accuracy at power plant

    Directory of Open Access Journals (Sweden)

    Džepčeski Dane

    2016-01-01

    Full Text Available Sustainability of an approximately constant value of Kaplan turbine efficiency over relatively large net head changes is a result of the turbine runner's variable geometry. The dependence of the runner blade position on the guide vane opening represents the turbine cam curve. The accuracy with which the cam curve is realized is of great importance for the efficient and proper exploitation of the turbines and, consequently, of the complete units. For these reasons, special attention has been given to the tests designed for cam curve verification. The goal of this paper is to describe the methodology and the results of the tests performed in the process of Kaplan turbine cam curve verification.
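
The verification described above amounts to comparing measured runner-blade positions against the prescribed cam relationship. A minimal sketch, with an invented piecewise-linear cam curve and invented measurements (not plant data):

```python
import bisect

# Prescribed cam curve: runner-blade angle as a function of guide-vane opening.
gv_opening  = [20.0, 40.0, 60.0, 80.0, 100.0]   # guide vane opening, %
blade_angle = [5.0, 12.0, 20.0, 27.0, 32.0]     # prescribed blade angle, deg

def cam_angle(opening):
    """Prescribed blade angle at a given opening (piecewise-linear interpolation)."""
    i = bisect.bisect_left(gv_opening, opening)
    i = min(max(i, 1), len(gv_opening) - 1)
    x0, x1 = gv_opening[i - 1], gv_opening[i]
    y0, y1 = blade_angle[i - 1], blade_angle[i]
    return y0 + (y1 - y0) * (opening - x0) / (x1 - x0)

# Measured (opening, blade angle) pairs from a hypothetical verification run.
measured = [(30.0, 8.9), (50.0, 15.7), (70.0, 23.8)]
deviations = [angle - cam_angle(op) for op, angle in measured]
print(max(abs(d) for d in deviations))   # worst-case cam-curve realization error, deg
```

In a real test the acceptable deviation would be set by the plant's efficiency tolerance; here the comparison logic is the point.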

  9. Dosimetric accuracy of Kodak EDR2 film for IMRT verifications.

    Science.gov (United States)

    Childress, Nathan L; Salehpour, Mohammad; Dong, Lei; Bloch, Charles; White, R Allen; Rosen, Isaac I

    2005-02-01

    Patient-specific intensity-modulated radiotherapy (IMRT) verifications require an accurate two-dimensional dosimeter that is not labor-intensive. We assessed the precision and reproducibility of film calibrations over time, measured the elemental composition of the film, measured the intermittency effect, and measured the dosimetric accuracy and reproducibility of calibrated Kodak EDR2 film for single-beam verifications in a solid water phantom and for full-plan verifications in a Rexolite phantom. Repeated measurements of the film sensitometric curve in a single experiment yielded overall uncertainties in dose of 2.1% local and 0.8% relative to 300 cGy. A total of 547 film calibrations over an 18-month period, exposed to a range of doses from 0 to a maximum of 240 MU or 360 MU and using 6 MV or 18 MV energies, had optical density (OD) standard deviations that were 7%-15% of their average values. This indicates that daily film calibrations are essential when EDR2 film is used to obtain absolute dose results. An elemental analysis of EDR2 film revealed that it contains 60% as much silver and 20% as much bromine as Kodak XV2 film. EDR2 film also has an unusual 1.69:1 silver:halide molar ratio, compared with the XV2 film's 1.02:1 ratio, which may affect its chemical reactions. To test EDR2's intermittency effect, the OD generated by a single 300 MU exposure was compared to the ODs generated by exposing the film 1 MU, 2 MU, and 4 MU at a time to a total of 300 MU. An ion chamber recorded the relative dose of all intermittency measurements to account for machine output variations. Using small MU bursts to expose the film resulted in delivery times of 4 to 14 minutes and lowered the film's OD by approximately 2% for both 6 and 18 MV beams. This effect may result in EDR2 film underestimating absolute doses for patient verifications that require long delivery times. After using a calibration to convert EDR2 film's OD to dose values, film measurements agreed within 2% relative
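
The OD-to-dose conversion step described above can be sketched as a per-session calibration fit. The calibration points below are invented, and a simple linear fit stands in for the film's actual sensitometric curve (which in practice is nonlinear and refit daily, per the paper's finding):

```python
# Invented per-session calibration points: optical density vs. delivered dose.
cal_od   = [0.10, 0.45, 0.85, 1.30, 1.80]
cal_dose = [0.0, 50.0, 100.0, 150.0, 200.0]   # cGy

def linear_fit(xs, ys):
    """Ordinary least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

a, b = linear_fit(cal_od, cal_dose)

def od_to_dose(od):
    """Convert a measured film OD to dose using today's calibration."""
    return a * od + b

print(round(od_to_dose(1.0), 1))   # dose (cGy) for a film OD of 1.0
```

Because the paper found day-to-day OD variations of 7%-15%, the fit coefficients `a` and `b` would be recomputed for every measurement session rather than reused.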

  10. Verification of the accuracy of Doppler broadened, self-shielded multigroup cross sections for fast power reactor applications

    International Nuclear Information System (INIS)

    Ganesan, S.; Gopalakrishnan, V.; Ramanadhan, M.M.; Cullen, D.E.

    1988-01-01

    Verification results for Doppler broadening and self-shielding are presented. One of the important results presented is that the original SIGMA1 method of numerical Doppler broadening has now been demonstrated to be inaccurate and not capable of producing results to within required accuracies. Fortunately, due to this study, the SIGMA1 method has been significantly improved and the new SIGMA1 is now capable of producing results to within required accuracies. Although this paper presents results based upon using only one code system, it is important to realize that the original SIGMA1 method is presently used in many cross-section processing code systems; the results of this paper indicate that unless these other code systems are updated to include the new SIGMA1 method, the results produced by these code systems could be very inaccurate. The objectives of the IAEA nuclear data processing code verification project are reviewed as well as the requirements for the accuracy of calculation of Doppler coefficients and the present status of these calculations. The initial results of Doppler broadening and self-shielding calculations are presented and the inconsistency of the results which led to the discovery of errors in the original SIGMA1 method of Doppler broadening are pointed out. Analysis of the errors found and improvements in the SIGMA1 method are presented. Improved results are presented in order to demonstrate that the new SIGMA1 method can produce results within required accuracies. Guidelines are presented to limit the uncertainty introduced due to cross-section processing in order to balance available computer resources to accuracy requirements. Finally cross-section processing code users are invited to participate in the IAEA processing code verification project in order to verify the accuracy of their calculated results. (author)

  11. Determination of Solution Accuracy of Numerical Schemes as Part of Code and Calculation Verification

    Energy Technology Data Exchange (ETDEWEB)

    Blottner, F.G.; Lopez, A.R.

    1998-10-01

    This investigation is concerned with the accuracy of numerical schemes for solving partial differential equations used in science and engineering simulation codes. Richardson extrapolation methods for steady and unsteady problems with structured meshes are presented as part of the verification procedure to determine code and calculation accuracy. The local truncation error determination of a numerical difference scheme is shown to be a significant component of the verification procedure as it determines the consistency of the numerical scheme, the order of the numerical scheme, and the restrictions on the mesh variation with a non-uniform mesh. Generation of a series of co-located, refined meshes with the appropriate variation of mesh cell size is investigated and is another important component of the verification procedure. The importance of mesh refinement studies is shown to be more significant than just a procedure to determine solution accuracy. It is suggested that mesh refinement techniques can be developed to determine consistency of numerical schemes and to determine if governing equations are well posed. The present investigation provides further insight into the conditions and procedures required to effectively use Richardson extrapolation with mesh refinement studies to achieve confidence that simulation codes are producing accurate numerical solutions.
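
The Richardson-extrapolation step central to this procedure can be sketched with three manufactured solution values on meshes refined by a constant ratio r; the numbers below are illustrative, not from the report:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on meshes refined by ratio r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson(f_medium, f_fine, r, p):
    """Richardson-extrapolated estimate of the mesh-independent solution."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# Manufactured example: exact value 1.0, a second-order scheme, r = 2.
f1, f2, f3 = 1.16, 1.04, 1.01   # coarse, medium, fine solutions
p = observed_order(f1, f2, f3, r=2.0)
print(round(p, 2), round(richardson(f2, f3, 2.0, p), 4))   # prints "2.0 1.0"
```

Matching the observed order `p` against the scheme's formal order is exactly the consistency check the abstract emphasizes; a mismatch flags coding errors, mesh-quality problems, or an ill-posed formulation.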

  12. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  13. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  14. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  15. Verification of examination procedures in clinical laboratory for imprecision, trueness and diagnostic accuracy according to ISO 15189:2012: a pragmatic approach.

    Science.gov (United States)

    Antonelli, Giorgia; Padoan, Andrea; Aita, Ada; Sciacovelli, Laura; Plebani, Mario

    2017-08-28

    Background The International Standard ISO 15189 is recognized as a valuable guide in ensuring high-quality clinical laboratory services and promoting the harmonization of accreditation programmes in laboratory medicine. Examination procedures must be verified in order to guarantee that their performance characteristics are congruent with the intended scope of the test. The aim of the present study was to propose a practical model for implementing procedures for the verification of validated examination procedures already used for at least 2 years in our laboratory, in agreement with the ISO 15189 requirement in Section 5.5.1.2. Methods In order to identify the operative procedure to be used, approved guidance documents were identified, together with the definition of the performance characteristics to be evaluated for the different methods; the examination procedures used in the laboratory were analyzed and checked against the performance specifications reported by manufacturers. Operative flow charts were then drawn up to compare the laboratory's performance characteristics with those declared by the manufacturers. Results The choice of performance characteristics for verification was based on the approved documents used as guidance and on the specific purpose of the tests undertaken, considering imprecision and trueness for quantitative methods, diagnostic accuracy for qualitative methods, and imprecision together with diagnostic accuracy for semi-quantitative methods. Conclusions The described approach, which balances technological possibilities, risks, and costs while assuring the fundamental component of result accuracy, appears promising as an easily applicable and flexible procedure helping laboratories to comply with the ISO 15189 requirements.
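
For quantitative methods, the imprecision/trueness check described above reduces to comparing CV% and bias% from replicate measurements against the manufacturer's claimed specifications. All data and acceptance limits below are invented for illustration:

```python
import statistics as st

# Hypothetical replicate measurements of one QC material.
replicates = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]
target = 5.0                                   # assigned QC value

mean = st.mean(replicates)
cv_pct = 100.0 * st.stdev(replicates) / mean   # imprecision (coefficient of variation)
bias_pct = 100.0 * (mean - target) / target    # trueness (relative bias)

# Manufacturer's claimed performance specifications (assumed values).
claimed_cv_pct, claimed_bias_pct = 4.0, 3.0
verified = cv_pct <= claimed_cv_pct and abs(bias_pct) <= claimed_bias_pct
print(f"CV={cv_pct:.1f}% bias={bias_pct:.1f}% verified={verified}")
```

A real verification would follow an approved protocol (e.g. a CLSI EP-series design) for the number of runs and replicates; this sketch shows only the final comparison logic.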

  16. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  17. Two high accuracy digital integrators for Rogowski current transducers

    Science.gov (United States)

    Luo, Pan-dian; Li, Hong-bin; Li, Zhen-hua

    2014-01-01

    Rogowski current transducers are widely used in AC current measurement, but their accuracy mainly depends on the analog integrators, which have typical problems such as poor long-term stability and susceptibility to environmental conditions. Digital integrators can be another choice, but they cannot produce a stable and accurate output because any DC component in the original signal accumulates, leading to output DC drift. Unknown initial conditions can also result in an integral output DC offset. This paper proposes two improved digital integrators for Rogowski current transducers, in place of traditional analog integrators, for high measuring accuracy. A proportional-integral-derivative (PID) feedback controller and an attenuation coefficient are applied to improve the Al-Alaoui integrator, modifying its DC response to obtain an ideal frequency response. Owing to this dedicated digital-signal-processing design, the improved digital integrators perform better than analog integrators. Simulation models are built for verification and comparison. The experiments show that the designed integrators achieve higher accuracy than analog integrators in steady-state response, transient response, and under changing temperature conditions.
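
The DC-drift problem, and the effect of the attenuation coefficient, can be illustrated with a plain trapezoidal digital integrator. This is a simplified sketch with invented signal parameters; the paper's Al-Alaoui structure and PID feedback loop are omitted:

```python
import math

fs = 10_000.0   # sampling rate, Hz
f  = 50.0       # mains frequency, Hz
dc = 0.01       # small DC offset riding on the di/dt signal
n  = 200_000    # twenty seconds of samples

def integrate(leak):
    """Trapezoidal integrator with attenuation: y[k] = leak*y[k-1] + T/2*(x[k]+x[k-1]).
    leak = 1.0 is the ideal integrator; leak < 1.0 adds the attenuation coefficient."""
    T = 1.0 / fs
    y, x_prev = 0.0, 0.0
    for k in range(n):
        x = math.cos(2 * math.pi * f * k * T) + dc   # derivative-like input + DC
        y = leak * y + (T / 2.0) * (x + x_prev)
        x_prev = x
    return y

drift_plain = abs(integrate(1.0))     # DC accumulates: grows linearly with time
drift_leaky = abs(integrate(0.999))   # attenuation bounds the DC build-up
print(drift_plain > 10 * drift_leaky)   # prints "True"
```

The leak places the integrator pole just inside the unit circle, so the DC gain becomes finite while 50 Hz content is still integrated almost ideally; the paper's PID loop then corrects the residual low-frequency response error.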

  18. An angle encoder for super-high resolution and super-high accuracy using SelfA

    Science.gov (United States)

    Watanabe, Tsukasa; Kon, Masahito; Nabeshima, Nobuo; Taniguchi, Kayoko

    2014-06-01

    Angular measurement technology at high resolution for applications such as hard disk drive manufacturing machines, precision measurement equipment and aspherical process machines requires a rotary encoder with high accuracy, high resolution and high response speed. However, a rotary encoder is subject to angular deviation factors during operation due to scale error or installation error. It has been assumed to be impossible to achieve accuracy below 0.1″ in angular measurement or control after installation onto the rotating axis. Self-calibration (Lu and Trumper 2007 CIRP Ann. 56 499; Kim et al 2011 Proc. MacroScale; Probst 2008 Meas. Sci. Technol. 19 015101; Probst et al 1998 Meas. Sci. Technol. 9 1059; Tadashi and Makoto 1993 J. Robot. Mechatronics 5 448; Ralf et al 2006 Meas. Sci. Technol. 17 2811) and cross-calibration (Probst et al 1998 Meas. Sci. Technol. 9 1059; Just et al 2009 Precis. Eng. 33 530; Burnashev 2013 Quantum Electron. 43 130) technologies for rotary encoders have been actively discussed on the basis of the principle of circular closure. This discussion prompted the development of rotary tables which achieve reliable and high accuracy angular verification. We apply these technologies to the development of a rotary encoder to meet the requirement not only of super-high accuracy but also of super-high resolution. This paper presents the development of an encoder with 2^21 = 2,097,152 resolutions per rotation (360°), that is, corresponding to a 0.62″ signal period, achieved by the combination of a laser rotary encoder supplied by Magnescale Co., Ltd and a self-calibratable encoder (SelfA) supplied by the National Institute of Advanced Industrial Science & Technology (AIST). In addition, this paper introduces the development of a rotary encoder that guarantees ±0.03″ accuracy at any point of the interpolated signal, with respect to the encoder at the minimum resolution of 2^33, that is, corresponding to a 0.0015″ signal period after
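
The circular-closure principle that self-calibration relies on can be sketched on a toy scale with a handful of divisions; real encoders have millions, but the idea is the same. All numbers below are invented:

```python
# Circular closure: over one full turn, the per-division errors of the scale
# must sum to exactly zero (the turn always closes at 360 degrees). The mean
# of all interval readings is therefore pinned to 360/n, so each division's
# error can be estimated without any external angle reference.

n = 8                                              # scale divisions (toy example)
scale_error = [0.002, -0.001, 0.003, -0.004,
               0.001, 0.000, -0.002, 0.001]        # per-division error, deg (sums to 0)
measured = [360.0 / n + e for e in scale_error]    # interval readings

mean_interval = sum(measured) / n                  # closure forces this to 360/n
estimated_error = [m - mean_interval for m in measured]

worst = max(abs(a - b) for a, b in zip(estimated_error, scale_error))
print(worst < 1e-9)   # prints "True": the scale error is recovered exactly
```

Practical self-calibrating encoders such as SelfA exploit the same constraint with multiple read heads and Fourier analysis rather than this direct averaging, but the zero-sum closure condition is what makes reference-free calibration possible.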

  19. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    International Nuclear Information System (INIS)

    Zhao, J; Hu, W; Xing, Y; Wu, X; Li, Y

    2016-01-01

    Purpose: All plan verification systems for particle therapy are designed to perform plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we wrote a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we obtain the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. We then analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.

  20. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, J; Hu, W [Fudan University Shanghai Cancer Center, Shanghai (China)]; Xing, Y [Fudan University Shanghai Proton and Heavy Ion Center, Shanghai (China)]; Wu, X [Fudan University Shanghai Proton and Heavy Ion Center, Shanghai (China)]; Li, Y [Department of Medical Physics, Shanghai Proton and Heavy Ion Center, Shanghai (China)]

    2016-06-15

    Purpose: All plan verification systems for particle therapy are designed to perform plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we wrote a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we obtain the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. We then analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
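
The gamma analysis used above can be illustrated in one dimension. This is a minimal discrete sketch (no interpolation, global dose normalization) with invented profiles, not the authors' implementation:

```python
import math

def gamma_index(ref, eval_, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """Global 1-D gamma: for each evaluated point, the minimum combined
    dose-difference / distance-to-agreement metric over all reference points.
    A point passes when gamma <= 1."""
    d_max = max(ref)
    gammas = []
    for i, de in enumerate(eval_):
        best = math.inf
        for j, dr in enumerate(ref):
            dose_term = ((de - dr) / (d_max * dd_pct / 100.0)) ** 2
            dist_term = (((i - j) * spacing_mm) / dta_mm) ** 2
            best = min(best, math.sqrt(dose_term + dist_term))
        gammas.append(best)
    return gammas

# Invented TPS (reference) and log-file-reconstructed (evaluated) profiles, Gy.
ref   = [0.0, 0.5, 1.0, 2.0, 2.0, 2.0, 1.0, 0.5, 0.0]
eval_ = [0.0, 0.5, 1.1, 2.0, 2.1, 2.0, 0.9, 0.5, 0.0]
g = gamma_index(ref, eval_, spacing_mm=1.0)
passing = 100.0 * sum(x <= 1.0 for x in g) / len(g)
print(f"{passing:.1f}% pass at 3%/3mm")
```

Clinical implementations evaluate 2D or 3D grids with sub-grid interpolation and often a low-dose threshold, which is why this coarse sketch is stricter than a real 3%/3mm analysis.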

  1. Determining the Accuracy of Crowdsourced Tweet Verification for Auroral Research

    Directory of Open Access Journals (Sweden)

    Nathan A. Case

    2016-12-01

    Full Text Available The Aurorasaurus project harnesses volunteer crowdsourcing to identify sightings of an aurora (the “northern/southern lights”) posted by citizen scientists on Twitter. Previous studies have demonstrated that aurora sightings can be mined from Twitter with the caveat that there is a large background level of non-sighting tweets, especially during periods of low auroral activity. Aurorasaurus attempts to mitigate this, and thus increase the quality of its Twitter sighting data, by using volunteers to sift through a pre-filtered list of geolocated tweets to verify real-time aurora sightings. In this study, the current implementation of this crowdsourced verification system, including the process of geolocating tweets, is described and its accuracy (found, overall, to be 68.4%) is determined. The findings suggest that citizen science volunteers are able to accurately filter out unrelated, spam-like Twitter data but struggle when filtering out somewhat related, yet undesired, data. The citizen scientists particularly struggle with determining the real-time nature of the sightings, so care must be taken when relying on crowdsourced identification.

  2. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  3. ECG Sensor Verification System with Mean-Interval Algorithm for Handling Sport Issue

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2016-01-01

    Full Text Available With the development of biometric verification, we proposed a new algorithm and personal mobile sensor card system for ECG verification. The proposed new mean-interval approach can identify the user quickly with high accuracy and consumes a small amount of flash memory in the microprocessor. The new framework of the mobile card system makes ECG verification a feasible application for overcoming the issues of a centralized database. For a fair and comprehensive evaluation, the experimental results have been tested on public MIT-BIH ECG databases and our circuit system; they confirm that the proposed scheme is able to provide excellent accuracy and low complexity. Moreover, we also propose a multiple-state solution to handle heart rate changes caused by sport. To our knowledge, this is the first work to address the issue of sport in ECG verification.
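
    A minimal sketch of template-based ECG verification in this spirit follows. The abstract does not specify the mean-interval algorithm, so the R-peak averaging window and the correlation threshold below are assumptions:

```python
import numpy as np

def mean_beat_template(ecg, r_peaks, half_window):
    """Average the samples around each R-peak into a single template beat."""
    beats = [ecg[r - half_window:r + half_window]
             for r in r_peaks
             if r - half_window >= 0 and r + half_window <= len(ecg)]
    return np.mean(beats, axis=0)

def verify(enrolled_template, probe_template, threshold=0.9):
    """Accept the identity claim when the two templates correlate strongly."""
    r = float(np.corrcoef(enrolled_template, probe_template)[0, 1])
    return r >= threshold
```

    Storing only the averaged template, rather than the full recording, is what keeps the flash-memory footprint small on a sensor card.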

  4. Fission product model for BWR analysis with improved accuracy in high burnup

    International Nuclear Information System (INIS)

    Ikehara, Tadashi; Yamamoto, Munenari; Ando, Yoshihira

    1998-01-01

    A new fission product (FP) chain model has been studied for use in BWR lattice calculations. In establishing the model, two requirements, i.e. accuracy in predicting burnup reactivity and ease of practical application, were considered simultaneously. The resultant FP model consists of 81 explicit FP nuclides and two lumped pseudo-nuclides whose absorption cross sections are independent of burnup history and fuel composition. For verification, extensive numerical tests covering a wide range of operational conditions and fuel compositions have been carried out. The results indicate that the estimated errors in burnup reactivity are within 0.1%Δk for exposures up to 100 GWd/t. It is concluded that the present model offers a high degree of accuracy for FP representation in BWR lattice calculations. (author)

  5. An angle encoder for super-high resolution and super-high accuracy using SelfA

    International Nuclear Information System (INIS)

    Watanabe, Tsukasa; Kon, Masahito; Nabeshima, Nobuo; Taniguchi, Kayoko

    2014-01-01

    Angular measurement technology at high resolution for applications such as hard disk drive manufacturing machines, precision measurement equipment and aspherical process machines requires a rotary encoder with high accuracy, high resolution and high response speed. However, a rotary encoder has angular deviation factors during operation due to scale error or installation error. It has been assumed to be impossible to achieve accuracy below 0.1″ in angular measurement or control after installation onto the rotating axis. Self-calibration (Lu and Trumper 2007 CIRP Ann. 56 499; Kim et al 2011 Proc. MacroScale; Probst 2008 Meas. Sci. Technol. 19 015101; Probst et al Meas. Sci. Technol. 9 1059; Tadashi and Makoto 1993 J. Robot. Mechatronics 5 448; Ralf et al 2006 Meas. Sci. Technol. 17 2811) and cross-calibration (Probst et al 1998 Meas. Sci. Technol. 9 1059; Just et al 2009 Precis. Eng. 33 530; Burnashev 2013 Quantum Electron. 43 130) technologies for rotary encoders have been actively discussed on the basis of the principle of circular closure. This discussion prompted the development of rotary tables which achieve reliable and high-accuracy angular verification. We apply these technologies to the development of a rotary encoder to meet the requirements of both super-high accuracy and super-high resolution. This paper presents the development of an encoder with 2^21 = 2 097 152 resolutions per rotation (360°), that is, corresponding to a 0.62″ signal period, achieved by the combination of a laser rotary encoder supplied by Magnescale Co., Ltd and a self-calibratable encoder (SelfA) supplied by The National Institute of Advanced Industrial Science and Technology (AIST). In addition, this paper introduces the development of a rotary encoder to guarantee ±0.03″ accuracy at any point of the interpolated signal, with respect to the encoder at the minimum resolution of 2^33, that is, corresponding to a 0.0015″ signal period.

  6. The use of measurement uncertainty in nuclear materials accountancy and verification

    International Nuclear Information System (INIS)

    Alique, O.; Vaccaro, S.; Svedkauskaite, J.

    2015-01-01

    EURATOM nuclear safeguards are based on the nuclear operators’ accounting for and declaring of the amounts of nuclear materials in their possession, as well as on the European Commission verifying the correctness and completeness of such declarations by means of conformity assessment practices. Both the accountancy and the verification processes comprise measurements of the amounts and characteristics of nuclear materials. The uncertainties associated with these measurements play an important role in the reliability of the results of nuclear material accountancy and verification. The document “JCGM 100:2008 Evaluation of measurement data – Guide to the expression of uncertainty in measurement” - issued jointly by the International Bureau of Weights and Measures (BIPM) and international organisations for metrology, standardisation and accreditation in chemistry, physics and electrotechnology - describes a universal, internally consistent, transparent and applicable method for the evaluation and expression of uncertainty in measurements. This paper discusses different processes of nuclear materials accountancy and verification where measurement uncertainty plays a significant role. It also suggests ways in which measurement uncertainty could be used to enhance the reliability of the results of the nuclear materials accountancy and verification processes.
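
    The JCGM 100:2008 law of propagation of uncertainty for uncorrelated inputs can be sketched as follows; the accountancy example (a declared mass obtained from a concentration and a volume) and all its numbers are purely illustrative:

```python
import math

def combined_uncertainty(sensitivities, uncertainties):
    """GUM law of propagation for uncorrelated inputs:
    u_c(y) = sqrt( sum_i (c_i * u(x_i))^2 )."""
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, uncertainties)))

# Hypothetical accountancy example: declared mass m = conc * vol.
conc, u_conc = 4.0, 0.02   # concentration in g/L and its standard uncertainty
vol, u_vol = 250.0, 1.0    # volume in L and its standard uncertainty
m = conc * vol
# Sensitivity coefficients: dm/dconc = vol, dm/dvol = conc.
u_m = combined_uncertainty([vol, conc], [u_conc, u_vol])
```

    Here u_m = sqrt((250·0.02)² + (4·1)²) ≈ 6.4 g on a declared mass of 1000 g; it is this combined standard uncertainty that feeds into accountancy and verification conclusions.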

  7. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure, including the order-of-accuracy test, to a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code, and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type, we record the stage of the verification procedure at which the error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes, but rarely (1.44% of the time), and under certain circumstances, it can fail. (authors)
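
    The order-of-accuracy test itself is simple to state: with a manufactured (exact) solution, compute the discretization error on two grids refined by a ratio r and check that the observed order p = ln(e_coarse/e_fine)/ln(r) matches the scheme's formal order. A minimal sketch for a second-order central difference (illustrative only, not the AHOTN solver):

```python
import math

def second_diff_error(n):
    """Max nodal error of the central second difference applied to the
    manufactured solution u(x) = sin(x) on [0, pi], where u''(x) = -sin(x)."""
    h = math.pi / n
    return max(
        abs((math.sin((i - 1) * h) - 2.0 * math.sin(i * h) + math.sin((i + 1) * h)) / h ** 2
            + math.sin(i * h))
        for i in range(1, n)
    )

def observed_order(e_coarse, e_fine, r=2.0):
    """Observed order of accuracy from errors on two grids refined by r."""
    return math.log(e_coarse / e_fine) / math.log(r)
```

    For this scheme the observed order should approach 2; a coding mistake typically degrades it (or breaks convergence entirely), which is how the procedure flags the error.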

  8. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
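
    The dose/distance-to-agreement criterion recommended above is usually evaluated via the gamma index. A simplified 1D global-gamma sketch (the uniform grid, global normalization to the reference maximum, and default tolerances are assumptions):

```python
import numpy as np

def gamma_pass_rate(ref, ev, coords, dose_tol=0.03, dist_tol=3.0):
    """1D global gamma analysis (default 3%/3 mm).

    For each reference point, gamma is the minimum over evaluated points of
    sqrt((dose difference / (dose_tol * Dmax))^2 + (distance / dist_tol)^2);
    the point passes when gamma <= 1.
    """
    d_max = ref.max()
    passed = 0
    for x, d in zip(coords, ref):
        gamma = np.sqrt(((ev - d) / (dose_tol * d_max)) ** 2
                        + ((coords - x) / dist_tol) ** 2).min()
        passed += gamma <= 1.0
    return passed / len(ref)
```

    Reporting the pass rate as a single quantitative number, alongside a map of where gamma exceeds 1, matches the paper's recommendation of quantitative parameters plus discrepancy displays.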

  9. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  10. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different.

  11. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  12. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Full Text Available Motivated by the key observation that children generally resemble their parents more than other persons do with respect to facial appearance, distance metric (similarity) learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, are focused on learning a genetic similarity measure in a batch learning manner, leading to less scalability for practical applications with ever-growing amounts of data. To address this, we propose a new kinship verification approach by learning a sparse similarity measure in an online fashion. Experimental results on the kinship datasets show that our approach is highly competitive with the state-of-the-art alternatives in terms of verification accuracy, yet it is superior in terms of scalability for practical applications.

  13. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy

    Directory of Open Access Journals (Sweden)

    Irena Jekova

    2015-01-01

    Full Text Available Traditional means for identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads I (rI) and II (rII), the first principal ECG component (rPCA) calculated from them, and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, a one-to-one scenario is applied, and threshold values for rI, rII, rPCA, and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals with a short time interval between the two acquisitions) has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was not a significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.
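
    The one-to-many identification step can be sketched as a maximal-correlation search over enrolled records; the dictionary layout and single-lead input below are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

def identify(probe_beat, database):
    """One-to-many scenario: assign the probe the identity of the enrolled
    record with the maximal correlation coefficient."""
    best_id, best_r = None, -2.0
    for subject_id, enrolled_beat in database.items():
        r = float(np.corrcoef(probe_beat, enrolled_beat)[0, 1])
        if r > best_r:
            best_id, best_r = subject_id, r
    return best_id, best_r
```

    Verification (one-to-one) is the degenerate case: compare the probe against a single claimed identity and accept when the correlation exceeds a derived threshold.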

  14. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can effectively reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  15. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  16. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  17. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    Science.gov (United States)

    1989-01-01

    The design and verification requirements are defined which are appropriate to hardware at the detail, subassembly, component, and engine levels, and these requirements are correlated to the development demonstrations which provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  18. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available Handwritten signatures are broadly utilized for personal verification in financial institutions, which creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which to judge a signature verification tool in banking and other financial institutions.
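
    Matching SURF/SIFT-style descriptors against a stored template is commonly done with a nearest-neighbour ratio test; a small NumPy sketch (the toy descriptors and the 0.75 ratio are illustrative, and this is not the paper's exact pipeline):

```python
import numpy as np

def ratio_test_matches(desc_query, desc_stored, ratio=0.75):
    """Lowe-style ratio test: keep a match only when the nearest stored
    descriptor is clearly closer than the second-nearest one."""
    matches = []
    for i, d in enumerate(desc_query):
        dists = np.linalg.norm(desc_stored - d, axis=1)
        nearest, second = np.argsort(dists)[:2]
        if dists[nearest] < ratio * dists[second]:
            matches.append((i, int(nearest)))
    return matches
```

    A signature could then be accepted when the fraction of matched descriptors against the enrolled template exceeds a chosen threshold.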

  19. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near-minimal probability-of-error decision algorithm.

  20. MobileFaceNets: Efficient CNNs for Accurate Real-time Face Verification on Mobile Devices

    OpenAIRE

    Chen, Sheng; Liu, Yang; Gao, Xiang; Han, Zhen

    2018-01-01

    In this paper, we proposed a class of extremely efficient CNN models, MobileFaceNets, which use less than 1 million parameters and are specifically tailored for high-accuracy real-time face verification on mobile and embedded devices. We first make a simple analysis on the weakness of common mobile networks for face verification. The weakness has been well overcome by our specifically designed MobileFaceNets. Under the same experimental conditions, our MobileFaceNets achieve significantly sup...

  1. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases.

  2. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  3. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D.

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  4. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
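
    The discriminability index d' used in the human study is the difference of the z-transformed hit and false-alarm rates; a small stdlib sketch (the example rates below are made up):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection discriminability: d' = z(H) - z(FA),
    where z is the inverse standard-normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)
```

    For instance, a participant with an 0.84 hit rate and a 0.16 false-alarm rate has d' of roughly 1.99, while chance performance (equal rates) gives d' = 0.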

  5. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with the performance verification of a CT scanner using a 42 mm miniature replica step gauge developed for optical scanner verification. Quantification of errors and optimization of the CT system set-up, in terms of resolution and measurement accuracy, are fundamental for the use of CT scanning

  6. Standard practice for verification of constant amplitude dynamic forces in an axial fatigue testing system

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures for the dynamic verification of cyclic force amplitude control or measurement accuracy during constant amplitude testing in an axial fatigue testing system. It is based on the premise that force verification can be done with the use of a strain gaged elastic element. Use of this practice gives assurance that the accuracies of forces applied by the machine or dynamic force readings from the test machine, at the time of the test, after any user applied correction factors, fall within the limits recommended in Section 9. It does not address static accuracy which must first be addressed using Practices E 4 or equivalent. 1.2 Verification is specific to a particular test machine configuration and specimen. This standard is recommended to be used for each configuration of testing machine and specimen. Where dynamic correction factors are to be applied to test machine force readings in order to meet the accuracy recommended in Section 9, the verification is also specific to the c...

  7. Accuracy of cell calculation methods used for analysis of high conversion light water reactor lattice

    International Nuclear Information System (INIS)

    Jeong, Chang-Joon; Okumura, Keisuke; Ishiguro, Yukio; Tanaka, Ken-ichi

    1990-01-01

    Validation tests were made for the accuracy of cell calculation methods used in analyses of tight lattices of a mixed-oxide (MOX) fuel core in a high conversion light water reactor (HCLWR). A series of cell calculations was carried out for the lattices referred from an international HCLWR benchmark comparison, with emphasis placed on the resonance calculation methods: the NR and IR approximations, and the collision probability method with ultra-fine energy groups. Verification was also performed for the geometrical modelling (a hexagonal/cylindrical cell) and the boundary condition (mirror/white reflection). In the calculations, important reactor physics parameters, such as the neutron multiplication factor, the conversion ratio and the void coefficient, were evaluated using the above methods for various HCLWR lattices with different moderator-to-fuel volume ratios, fuel materials and fissile plutonium enrichments. The calculated results were compared with each other, and the accuracy and applicability of each method were clarified by comparison with continuous energy Monte Carlo calculations. It was verified that the accuracy of the IR approximation became worse when the neutron spectrum became harder. It was also concluded that the cylindrical cell model with the white boundary condition was not as suitable for MOX fuelled lattices as for UO2 fuelled lattices. (author)

  8. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    Energy Technology Data Exchange (ETDEWEB)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied for arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations

  9. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    OpenAIRE

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting requirements set by CATS consortium based on requirements in Euro NCAP AEB protocols regarding accuracy, repeatability and reproducibility using the developed test hardware. For the cases where verification t...

  10. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
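
Of the verification items listed in the record above, carryover has the most standardized arithmetic. The sketch below assumes the common protocol of three consecutive runs of a high sample followed by three runs of a low sample; the function name and example counts are illustrative, not taken from the paper:

```python
def carryover_percent(high_runs, low_runs):
    """Carryover estimate from three consecutive runs of a high
    sample (H1..H3) followed by three runs of a low sample (L1..L3):
    carryover (%) = (L1 - L3) / (H3 - L3) * 100."""
    h1, h2, h3 = high_runs
    l1, l2, l3 = low_runs
    return (l1 - l3) / (h3 - l3) * 100.0

# Hypothetical WBC counts (10^9/L): three high runs, then three low runs
print(round(carryover_percent((95.0, 94.8, 95.2), (4.6, 4.5, 4.5)), 3))  # -> 0.11
```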

  11. Motivation, Critical Thinking and Academic Verification of High School Students' Information-seeking Behavior

    Directory of Open Access Journals (Sweden)

    Z Hidayat

    2017-06-01

    High school students are known as Gen Y or Z, and their media use can be understood through their information-seeking behavior. The purposes of this research were: (1) to analyze the students' motivation; (2) to analyze their critical thinking and academic verification; (3) to analyze their information-seeking behavior. This study used a quantitative approach through a survey among 1125 respondents in nine clusters, i.e. Central, East, North, West, and South Jakarta, Tangerang, Bekasi, Depok, and Bogor. School sampling was based on "the best schools rank" by the government, while respondents were taken accidentally in each school. The questionnaire measured motivation, critical thinking and academic verification, and information-seeking behavior. The results showed that motivation for Internet use was dominated by the habit of interacting and being entertained, while academic needs were still relatively small but increasing significantly. Students' self-efficacy, performance, and achievement goals tended to be high motives; however, the science learning value and learning environment stimulation were, on average, low motives. High school students indicated that they think critically about the various things that become content, primarily in social media, but are less critical of academic information subjects. Unfortunately, high school students did not conduct academic verification of data and information, and tended to plagiarize. Key words: Student motivation, critical thinking, academic verification, information-seeking behavior, digital generation.

  12. Results of verification and investigation of wind velocity field forecast. Verification of wind velocity field forecast model

    International Nuclear Information System (INIS)

    Ogawa, Takeshi; Kayano, Mitsunaga; Kikuchi, Hideo; Abe, Takeo; Saga, Kyoji

    1995-01-01

    In the Environmental Radioactivity Research Institute, verification and investigation of the wind velocity field forecast model 'EXPRESS-1' have been carried out since 1991. In fiscal year 1994, as the general analysis, the validity of weather observation data, the local features of the wind field, and the validity of the positions of monitoring stations were investigated. The EXPRESS, which had adopted a 500 m mesh so far, was improved to a 250 m mesh, the improvement of forecast accuracy was examined, and a comparison with another wind velocity field forecast model, 'SPEEDI', was carried out. As the results, there are places where the correlation with other measurement points is high and places where it is low, and it was found that the forecast accuracy of the wind velocity field is improved by excluding the data of points with low correlation or by installing simplified observation stations and taking in their data. The outline of the investigation, the general analysis of weather observation data, and the improvements of the wind velocity field forecast model and forecast accuracy are reported. (K.I.)

  13. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
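
The density determination described in the record above follows directly from hydrostatics: the differential pressure between two bubbler probes separated by a known vertical distance gives the solution density, and the bottom-probe gauge pressure then gives the liquid column height. A minimal sketch (function names and example readings are hypothetical):

```python
G = 9.80665  # standard gravity, m/s^2

def solution_density(dp_pa, probe_separation_m):
    """Solution density from the differential pressure between two
    bubbler probes a known vertical distance apart: rho = dP/(g*dh)."""
    return dp_pa / (G * probe_separation_m)

def liquid_height(p_bottom_pa, rho_kg_m3):
    """Liquid column height above the bottom probe from its gauge
    pressure: h = P/(rho*g)."""
    return p_bottom_pa / (rho_kg_m3 * G)

# Hypothetical readings: 4.903325 kPa across probes 0.5 m apart
rho = solution_density(4903.325, 0.5)
print(rho, liquid_height(9806.65, rho))  # ~1000 kg/m^3, ~1.0 m
```

Comparing densities computed from several probe pairs at different heights is what lets the inspector check that the tank solution is homogeneous.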

  14. Geometrical verification system using Adobe Photoshop in radiotherapy.

    Science.gov (United States)

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films. It is possible to scan images and then view them simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared the following two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured distances between the isocenter on simulation films and that on portal films by adjusting the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured distances between the isocenter on simulation films and those on portal films by adjusting the bony structures. To obtain control data, lead balls were used as a fiducial point for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes. This verification system is available on any PC with Adobe Photoshop and is useful for improving the accuracy of verification.

  15. Assessment of Automated Measurement and Verification (M&V) Methods

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Custodio, Claudine [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sohn, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jump, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-01

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
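
The accuracy of baseline energy models in M&V work is conventionally summarized with goodness-of-fit statistics such as CV(RMSE) and NMBE. The report above applies a more general statistical methodology, but a sketch of these standard metrics illustrates the starting point (function names and data are illustrative, not from the report):

```python
import math

def cvrmse(measured, predicted, n_params=0):
    """Coefficient of variation of the RMSE (%) of a baseline model."""
    n = len(measured)
    ybar = sum(measured) / n
    sse = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    return math.sqrt(sse / (n - n_params)) / ybar * 100.0

def nmbe(measured, predicted, n_params=0):
    """Normalized mean bias error (%) of a baseline model."""
    n = len(measured)
    ybar = sum(measured) / n
    bias = sum(m - p for m, p in zip(measured, predicted))
    return bias / ((n - n_params) * ybar) * 100.0

# Hypothetical daily energy use (kWh): metered vs. baseline-model values
measured = [100.0, 110.0, 90.0, 105.0]
predicted = [98.0, 112.0, 91.0, 104.0]
print(round(cvrmse(measured, predicted), 2), round(nmbe(measured, predicted), 2))  # -> 1.56 0.0
```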

  16. Verification of the Time Accuracy of a Magnetometer by Using a GPS Pulse Generator

    Directory of Open Access Journals (Sweden)

    Yasuhiro Minamoto

    2011-05-01

    The time accuracy of geomagnetic data is an important specification for one-second data distributions. We tested a procedure to verify the time accuracy of a fluxgate magnetometer by using a GPS pulse generator. The magnetometer was equipped with a high-time-resolution (100 Hz) output, so the data delay could be checked directly. The delay detected from one-second data by a statistical method was larger than those detected from 0.1 s and 0.01 s resolution data. The test of the time accuracy revealed this larger delay and was useful for verifying the quality of the data.
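
The record above does not specify the statistical method used to detect the delay; one common approach is to take the lag that maximizes the cross-correlation between the generated pulse train and the recorded signal. A hypothetical toy-scale sketch of that idea:

```python
def estimate_delay(reference, recorded, max_lag):
    """Delay (in samples) of `recorded` relative to `reference`,
    taken as the lag that maximizes their cross-correlation."""
    best_lag, best_corr = 0, float("-inf")
    for lag in range(max_lag + 1):
        c = sum(r * s for r, s in zip(reference, recorded[lag:]))
        if c > best_corr:
            best_lag, best_corr = lag, c
    return best_lag

# Toy series with pulses every 10 samples, recorded 3 samples late
ref = [0.0] * 50
for i in range(0, 50, 10):
    ref[i] = 1.0
rec = [0.0] * 3 + ref[:-3]
print(estimate_delay(ref, rec, 10))  # -> 3
```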

  17. Application of virtual distances methodology to laser tracker verification with an indexed metrology platform

    International Nuclear Information System (INIS)

    Acero, R; Pueo, M; Santolaria, J; Aguilar, J J; Brau, A

    2015-01-01

    High-range measuring equipment like laser trackers need large-dimension calibrated reference artifacts in their calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points and the reference lengths derived from them is linked to the concept of the indexed metrology platform and to knowledge of the relative position and orientation of its upper and lower platforms with high accuracy. It is the measuring instrument, together with the indexed metrology platform, that remains still, the virtual mesh rotating around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and further compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of the volumetric performance of the laser tracker proved the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures. (paper)
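
The virtual-distance idea can be sketched as follows: a single physical point, seen from each indexed position of the platform, yields a mesh of virtual points whose pairwise distances serve as reference lengths. Reducing the platform's pose change to a rotation about one axis, and the example point, are simplifications for illustration only:

```python
import math

def rotate_z(p, angle_deg):
    """Rotate point p = (x, y, z) about the platform's vertical axis."""
    a = math.radians(angle_deg)
    x, y, z = p
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def virtual_points(measured_point, positions=6):
    """One physical point, seen from each of the platform's indexed
    positions, yields a mesh of virtual points around the axis."""
    return [rotate_z(measured_point, i * 360.0 / positions)
            for i in range(positions)]

# Hypothetical reflector position (mm) on a 6-position platform
pts = virtual_points((100.0, 0.0, 50.0))
print(round(math.dist(pts[0], pts[3]), 6))  # opposite positions: 200.0
```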

  19. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The analysis of the different measurement methods and their limits of accuracy is discussed in the paper

  20. High accuracy FIONA-AFM hybrid imaging

    International Nuclear Information System (INIS)

    Fronczek, D.N.; Quammen, C.; Wang, H.; Kisker, C.; Superfine, R.; Taylor, R.; Erie, D.A.; Tessmer, I.

    2011-01-01

    Multi-protein complexes are ubiquitous and play essential roles in many biological mechanisms. Single molecule imaging techniques such as electron microscopy (EM) and atomic force microscopy (AFM) are powerful methods for characterizing the structural properties of multi-protein and multi-protein-DNA complexes. However, a significant limitation to these techniques is the ability to distinguish different proteins from one another. Here, we combine high resolution fluorescence microscopy and AFM (FIONA-AFM) to allow the identification of different proteins in such complexes. Using quantum dots as fiducial markers in addition to fluorescently labeled proteins, we are able to align fluorescence and AFM information to ≥8 nm accuracy. This accuracy is sufficient to identify individual fluorescently labeled proteins in most multi-protein complexes. We investigate the limitations of localization precision and accuracy in fluorescence and AFM images separately and their effects on the overall registration accuracy of FIONA-AFM hybrid images. This combination of the two orthogonal techniques (FIONA and AFM) opens a wide spectrum of possible applications to the study of protein interactions, because AFM can yield high resolution (5-10 nm) information about the conformational properties of multi-protein complexes and the fluorescence can indicate spatial relationships of the proteins in the complexes. -- Research highlights: → Integration of fluorescent signals in AFM topography with high (<10 nm) accuracy. → Investigation of limitations and quantitative analysis of fluorescence-AFM image registration using quantum dots. → Fluorescence center tracking and display as localization probability distributions in AFM topography (FIONA-AFM). → Application of FIONA-AFM to a biological sample containing damaged DNA and the DNA repair proteins UvrA and UvrB conjugated to quantum dots.
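
The alignment of fluorescence and AFM data via quantum-dot fiducials described above can be illustrated with a least-squares registration sketch. A pure translation fit is shown for brevity (a full FIONA-AFM registration would also account for rotation and scaling); all function names and coordinates are hypothetical:

```python
def register_translation(fluo_fiducials, afm_fiducials):
    """Least-squares translation aligning fluorescence fiducial
    positions (quantum dots) to the same markers in the AFM image."""
    n = len(fluo_fiducials)
    dx = sum(a[0] - f[0] for f, a in zip(fluo_fiducials, afm_fiducials)) / n
    dy = sum(a[1] - f[1] for f, a in zip(fluo_fiducials, afm_fiducials)) / n
    return dx, dy

def residual_rms(fluo, afm, shift):
    """RMS residual of the fit: an estimate of registration accuracy."""
    dx, dy = shift
    n = len(fluo)
    sq = sum((f[0] + dx - a[0]) ** 2 + (f[1] + dy - a[1]) ** 2
             for f, a in zip(fluo, afm))
    return (sq / n) ** 0.5

# Hypothetical quantum-dot positions (nm) in each modality
fluo = [(10.0, 12.0), (50.0, 40.0), (80.0, 90.0)]
afm = [(15.0, 17.0), (55.2, 44.8), (84.8, 95.2)]
shift = register_translation(fluo, afm)
print(shift, residual_rms(fluo, afm, shift))  # shift ~ (5.0, 5.0), rms ~ 0.231
```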

  1. High Accuracy Transistor Compact Model Calibrations

    Energy Technology Data Exchange (ETDEWEB)

    Hembree, Charles E. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Mar, Alan [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robertson, Perry J. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic such as current capacity. Correspondingly, when using this approach, high degrees of accuracy of the transistor models are not expected since the set of models is a surrogate for a statistical description of the devices. The use of these types of models describes expected performances considering the extremes of process or transistor deviations. In contrast, circuits that have very stringent accuracy requirements require modeling techniques with higher accuracy. Since these accurate models have low error in transistor descriptions, they can be used to describe part-to-part variations as well as an accurate description of a single circuit instance. Thus, models that meet these stipulations also enable the quantification of margins with respect to a functional threshold and of the uncertainties in these margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.
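
The margin quantification mentioned above can be sketched in a few lines: the margin of a predicted quantity against a functional threshold, divided by the uncertainty in that margin, gives the familiar margin-to-uncertainty ratio. Function name and numbers are illustrative, not from the report:

```python
def margin_and_ratio(predicted, threshold, uncertainty):
    """Margin of a predicted circuit quantity against a functional
    threshold, plus the margin-to-uncertainty (M/U) ratio; M/U > 1
    means the margin exceeds the combined uncertainty."""
    margin = predicted - threshold
    return margin, margin / uncertainty

# Hypothetical numbers: predicted gain 5.2, required minimum 4.0,
# combined model + measurement uncertainty 0.4
m, mu = margin_and_ratio(5.2, 4.0, 0.4)
print(m, mu)  # margin ~ 1.2, M/U ~ 3.0
```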

  2. Automated Offline Arabic Signature Verification System using Multiple Features Fusion for Forensic Applications

    Directory of Open Access Journals (Sweden)

    Saad M. Darwish

    2016-12-01

    The signature of a person is one of the most popular and legally accepted behavioral biometrics that provides a secure means for verification and personal identification in many applications such as financial, commercial and legal transactions. The objective of the signature verification system is to classify between genuine and forged signatures, which are often associated with intrapersonal and interpersonal variability. Unlike other languages, Arabic has unique features; it contains diacritics, ligatures, and overlapping. Because of the lack of any form of dynamic information during the Arabic signature's writing process, it is more difficult to obtain higher verification accuracy. This paper addresses the above difficulty by introducing a novel offline Arabic signature verification algorithm. The key point is using multiple feature fusion with fuzzy modeling to capture different aspects of a signature individually in order to improve the verification accuracy. State-of-the-art techniques adopt the fuzzy set to describe the properties of the extracted features to handle a signature's uncertainty; this work also employs fuzzy variables to describe the degree of similarity of the signature's features to deal with the ambiguity of the questioned document examiner's judgment of signature similarity. It is concluded from the experimental results that the verification system performs well and has the ability to reduce both the False Acceptance Rate (FAR) and the False Rejection Rate (FRR).
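
The FAR/FRR evaluation referred to above reduces to counting threshold crossings over genuine and forged score sets. A minimal sketch with hypothetical similarity scores (higher score = more likely genuine):

```python
def far_frr(genuine_scores, forged_scores, threshold):
    """FAR: fraction of forged signatures accepted (score >= threshold).
    FRR: fraction of genuine signatures rejected (score < threshold)."""
    far = sum(s >= threshold for s in forged_scores) / len(forged_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.91, 0.85, 0.78, 0.96, 0.88]
forged = [0.35, 0.52, 0.81, 0.40, 0.27]
print(far_frr(genuine, forged, threshold=0.80))  # -> (0.2, 0.2)
```

Sweeping the threshold trades FAR against FRR; the operating point where they coincide is the equal error rate often quoted for verification systems.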

  3. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    Science.gov (United States)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully utilize the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge collimators) to image prompt gamma (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of the multi-slit collimator and the knife-edge collimator for non-invasive proton beam range verification. PG imaging was simulated by a validated GATE/GEANT4 Monte Carlo code to model spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10^8 protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors including the energy window setting, proton energy, phantom size, and phantom shift that may influence the accuracy of detecting the range were studied. Results indicated that both collimator systems achieve reasonable accuracy and good response to the phantom shift. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system can achieve higher detection efficiency, leading to a smaller deviation in predicting the range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of the two systems, especially the multi-slit system. Therefore, a neutron reduction technique for improving the accuracy of range verification in proton therapy is needed.

  4. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  5. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  7. Motivation, Critical Thinking and Academic Verification of High School Students' Information-seeking Behavior

    Directory of Open Access Journals (Sweden)

    Z Hidayat

    2018-01-01

High school students, known as Generation Y or Z, reveal their media use through their information-seeking behavior. The purposes of this research were: (1) to analyze the students' motivation; (2) to analyze their critical thinking and academic verification; and (3) to analyze their information-seeking behavior. The study used a quantitative approach, surveying 1125 respondents in nine clusters, i.e. Central, East, North, West, and South Jakarta, Tangerang, Bekasi, Depok, and Bogor. Schools were sampled based on the government's "best schools" ranking, while respondents were selected by accidental sampling in each school. The questionnaire measured motivation, critical thinking and academic verification, and information-seeking behavior. The results showed that Internet use was dominated by the habit of interacting and being entertained, while academic use was still relatively small but increasing significantly. Students' self-efficacy, performance, and achievement goals tended to be strong motives, whereas science learning value and learning-environment stimulation were on average weak motives. High school students thought critically about content encountered in social media but were less critical of academic information. Unfortunately, they did not perform academic verification of data and information and tended to commit plagiarism.

  8. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  9. The use of low density high accuracy (LDHA) data for correction of high density low accuracy (HDLA) point cloud

    Science.gov (United States)

    Rak, Michal Bartosz; Wozniak, Adam; Mayer, J. R. R.

    2016-06-01

Coordinate measuring techniques rely on computer processing of coordinate values of points gathered from physical surfaces using contact or non-contact methods. Contact measurements are characterized by low density and high accuracy. On the other hand, optical methods gather high-density data of the whole object in a short time, but with accuracy at least one order of magnitude lower than contact measurements. Thus the drawback of contact methods is low density of data, while for non-contact methods it is low accuracy. In this paper a method for fusion of data from two measurements of fundamentally different nature, high density low accuracy (HDLA) and low density high accuracy (LDHA), is presented to overcome the limitations of both measuring methods. In the proposed method the concept of virtual markers is used to find a representation of pairs of corresponding characteristic points in both sets of data. In each pair the coordinates of the point from the contact measurement are treated as a reference for the corresponding point from the non-contact measurement. A transformation enabling displacement of characteristic points from the optical measurement onto their matches from the contact measurement is determined and applied to the whole point cloud. The efficiency of the proposed algorithm was evaluated by comparison with data from a coordinate measuring machine (CMM). Three surfaces were used for this evaluation: a plane, a turbine blade and an engine cover. For the planar surface the achieved improvement was around 200 μm. Similar results were obtained for the turbine blade, but for the engine cover the improvement was smaller. For both freeform surfaces the improvement was higher for raw data than for data after creation of a mesh of triangles.
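The transformation step described above, aligning optical characteristic points to their contact-measured counterparts and then applying the result to the whole cloud, is in essence a rigid registration over corresponding point pairs. A minimal NumPy sketch using the classical SVD-based (Kabsch) solution; this illustrates the idea only and is not the authors' implementation:

```python
import numpy as np

def kabsch(P, Q):
    """Rigid transform (R, t) mapping points P onto corresponding points Q.

    P, Q: (N, 3) arrays of matched characteristic points, e.g. virtual
    markers from the optical (HDLA) and contact (LDHA) measurements.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

def correct_cloud(cloud, R, t):
    """Apply the transform estimated from marker pairs to the full cloud."""
    return cloud @ R.T + t
```

Estimating (R, t) from the few marker pairs and applying it with `correct_cloud` to every optical point mirrors the paper's correction of the HDLA cloud by the LDHA reference.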

  10. Treatment accuracy of fractionated stereotactic radiotherapy

    International Nuclear Information System (INIS)

    Kumar, Shaleen; Burke, Kevin; Nalder, Colin; Jarrett, Paula; Mubata, Cephas; A'Hern, Roger; Humphreys, Mandy; Bidmead, Margaret; Brada, Michael

    2005-01-01

Background and purpose: To assess the geometric accuracy of the delivery of fractionated stereotactic radiotherapy (FSRT) for brain tumours using the Gill-Thomas-Cosman (GTC) relocatable frame. Accuracy of treatment delivery was measured via portal images acquired with an amorphous-silicon-based electronic portal imager (EPI). Results were used to assess the existing verification process and to review the current margins used for the expansion of clinical target volume (CTV) to planning target volume (PTV). Patients and methods: Patients were immobilized in a GTC frame. Target volume definition was performed on localization CT and MRI scans and a CTV-to-PTV margin of 5 mm (based on initial experience) was introduced in 3D. A Brown-Roberts-Wells (BRW) fiducial system was used for stereotactic coordinate definition. The existing verification process consisted of an intercomparison of the coordinates of the isocentres and anatomy between the localization and verification CT scans. Treatment was delivered with 6 MV photons using four fixed non-coplanar conformal fields using a multi-leaf collimator. Portal imaging verification consisted of the acquisition of orthogonal images centred through the treatment isocentre. Digitally reconstructed radiographs (DRRs) created from the CT localization scans were used as reference images. Semi-automated matching software was used to quantify setup deviations (displacements and rotations) between reference and portal images. Results: One hundred and twenty-six anterior and 123 lateral portal images were available for analysis of setup deviations. For displacements, the total errors in the cranial/caudal direction were shown to have the largest SDs of 1.2 mm, while systematic and random errors reached SDs of 1.0 and 0.7 mm, respectively, in the cranial/caudal direction. The corresponding data for rotational errors (the largest deviation was found in the sagittal plane) were 0.7 deg. SD (total error), 0.5 deg.
(systematic) and 0
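The split into total, systematic, and random components quoted above follows the usual portal-imaging convention: the systematic component is the spread of per-patient mean deviations, and the random component is the pooled spread about each patient's own mean. A small sketch of that decomposition (the data layout is hypothetical, not the study's):

```python
import numpy as np

def setup_error_components(deviations_by_patient):
    """Decompose setup deviations (one array per patient, in mm along one
    axis) into total SD, systematic SD (spread of per-patient means) and
    random SD (pooled spread about each patient's mean)."""
    all_dev = np.concatenate(deviations_by_patient)
    means = np.array([d.mean() for d in deviations_by_patient])
    systematic = means.std(ddof=1)
    residuals = np.concatenate([d - d.mean() for d in deviations_by_patient])
    random_ = residuals.std(ddof=1)
    total = all_dev.std(ddof=1)
    return total, systematic, random_
```

Applied per direction (cranial/caudal, lateral, anterior/posterior), this yields the kind of total/systematic/random SDs the abstract reports.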

  11. A design of optical modulation system with pixel-level modulation accuracy

    Science.gov (United States)

    Zheng, Shiwei; Qu, Xinghua; Feng, Wei; Liang, Baoqiu

    2018-01-01

Vision measurement has been widely used in the field of dimensional measurement and surface metrology. However, traditional vision measurement methods have many limits, such as low dynamic range and poor reconfigurability. Optical modulation before image formation offers high dynamic range, high accuracy and more flexibility, and the modulation accuracy is the key parameter that determines the accuracy and effectiveness of an optical modulation system. In this paper, an optical modulation system with pixel-level accuracy is designed and built based on multi-point reflective imaging theory and a digital micromirror device (DMD). The system consists of the DMD, a CCD camera and a lens. First, we achieved accurate pixel-to-pixel correspondence between the DMD mirrors and the CCD pixels by means of moiré fringes and an image-processing step of sampling and interpolation. Then we built three coordinate systems and calculated the mathematical relationship between the coordinates of the digital micromirrors and the CCD pixels using a checkerboard pattern. A verification experiment proves that the correspondence error is less than 0.5 pixel. The results show that the modulation accuracy of the system meets the requirements of modulation. Furthermore, the highly reflective edge of a metal circular piece can be detected using the system, which proves the effectiveness of the optical modulation system.

  12. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

This paper describes the design and development of a prototype of a robust biometric system for verification. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images, and is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.

  13. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    Science.gov (United States)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.

  14. Diagnostic accuracy of high-definition CT coronary angiography in high-risk patients

    International Nuclear Information System (INIS)

    Iyengar, S.S.; Morgan-Hughes, G.; Ukoumunne, O.; Clayton, B.; Davies, E.J.; Nikolaou, V.; Hyde, C.J.; Shore, A.C.; Roobottom, C.A.

    2016-01-01

    Aim: To assess the diagnostic accuracy of computed tomography coronary angiography (CTCA) using a combination of high-definition CT (HD-CTCA) and high level of reader experience, with invasive coronary angiography (ICA) as the reference standard, in high-risk patients for the investigation of coronary artery disease (CAD). Materials and methods: Three hundred high-risk patients underwent HD-CTCA and ICA. Independent experts evaluated the images for the presence of significant CAD, defined primarily as the presence of moderate (≥50%) stenosis and secondarily as the presence of severe (≥70%) stenosis in at least one coronary segment, in a blinded fashion. HD-CTCA was compared to ICA as the reference standard. Results: No patients were excluded. Two hundred and six patients (69%) had moderate and 178 (59%) had severe stenosis in at least one vessel at ICA. The sensitivity, specificity, positive predictive value, and negative predictive value were 97.1%, 97.9%, 99% and 93.9% for moderate stenosis, and 98.9%, 93.4%, 95.7% and 98.3%, for severe stenosis, on a per-patient basis. Conclusion: The combination of HD-CTCA and experienced readers applied to a high-risk population, results in high diagnostic accuracy comparable to ICA. Modern generation CT systems in experienced hands might be considered for an expanded role. - Highlights: • Diagnostic accuracy of High-Definition CT Angiography (HD-CTCA) has been assessed. • Invasive Coronary angiography (ICA) is the reference standard. • Diagnostic accuracy of HD-CTCA is comparable to ICA. • Diagnostic accuracy is not affected by coronary calcium or stents. • HD-CTCA provides a non-invasive alternative in high-risk patients.
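The per-patient figures quoted for moderate stenosis are consistent with a simple 2×2 confusion table. The counts below (TP = 200, FP = 2, FN = 6, TN = 92) are reconstructed from the reported totals purely for illustration; they are not taken from the paper's tables:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Per-patient diagnostic accuracy of an index test (here HD-CTCA)
    against a reference standard (here ICA)."""
    sens = tp / (tp + fn)   # sensitivity: detected / all diseased
    spec = tn / (tn + fp)   # specificity: cleared / all non-diseased
    ppv = tp / (tp + fp)    # positive predictive value
    npv = tn / (tn + fn)    # negative predictive value
    return sens, spec, ppv, npv

# Reconstructed moderate-stenosis counts: 300 patients, 206 positive at ICA.
sens, spec, ppv, npv = diagnostic_metrics(tp=200, fp=2, fn=6, tn=92)
```

With these counts the four metrics round to the 97.1%, 97.9%, 99.0% and 93.9% the abstract reports.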

  15. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    International Nuclear Information System (INIS)

    Azmy, Yousry; Wang, Yaqi

    2013-01-01

The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code's numerical solution and its reference counterpart. The latter is either analytic or very fine-mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory's Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.

  16. SU-E-T-455: Impact of Different Independent Dose Verification Software Programs for Secondary Check

    International Nuclear Information System (INIS)

    Itano, M; Yamazaki, T; Kosaka, M; Kobayashi, N; Yamashita, M; Ishibashi, S; Higuchi, Y; Tachibana, H

    2015-01-01

Purpose: There have been many reports on different dose calculation algorithms for treatment planning systems (TPSs). An independent dose verification program (IndpPro) is essential to verify clinical plans from the TPS. However, the accuracy of different independent dose verification programs was not evident. We conducted a multi-institutional study to reveal the impact of different IndpPros used with different TPSs. Methods: Three institutes participated in this study. They used two different IndpPros (RADCALC and Simple MU Analysis (SMU)), both implementing the Clarkson algorithm. RADCALC required the input of the radiological path length (RPL) computed by the TPS (Eclipse or Pinnacle3); SMU computed the RPL from CT images, independently of the TPS. An ion-chamber measurement in a water-equivalent phantom was performed to evaluate the accuracy of the two IndpPros and the TPS in each institute. Next, the accuracy of dose calculation by the two IndpPros relative to the TPS was assessed for clinical plans. Results: The accuracy of the IndpPros and the TPSs in the homogeneous phantom was within ±1% of the measurement. 1543 treatment fields were collected from patients treated in the institutes. RADCALC showed better accuracy (0.9 ± 2.2%) than SMU (1.7 ± 2.1%). However, the accuracy depended on the TPS (Eclipse: 0.5%, Pinnacle3: 1.0%). The accuracy of RADCALC with Eclipse was similar to that of SMU in one of the institutes. Conclusion: Depending on the independent dose verification program, the accuracy shows systematic variation even though the measurement comparison showed similar variation. The variation was affected by the radiological path length calculation. An IndpPro with Pinnacle3 shows a different variation because Pinnacle3 computes the RPL using physical density. Eclipse and SMU use electron density, though

  17. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

The formulation, implementation and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady-state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
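The Richardson-extrapolation machinery such a code implements reduces to a few lines: given solutions f1 (finest), f2 and f3 on three grids with constant refinement ratio r, the observed order of accuracy and an error estimate for f1 follow directly. This is a sketch of the textbook formulas, not VIVID's actual code:

```python
import math

def observed_order(f1, f2, f3, r):
    """Observed order of accuracy p from solutions on three grids with
    constant refinement ratio r (f1 on the finest grid, f3 the coarsest)."""
    return math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)

def richardson_extrapolate(f1, f2, r, p):
    """Estimate of the zero-grid-spacing solution and the error in f1."""
    f_exact = f1 + (f1 - f2) / (r ** p - 1)
    return f_exact, f_exact - f1
```

For a discretization with leading error term proportional to h**2, e.g. f(h) = 5 + 2 h**2 sampled at h = 0.1, 0.2, 0.4, the observed order comes out as 2 and the extrapolated value as 5, matching the asymptotic behaviour the code checks for.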

  18. Systematic Calibration for Ultra-High Accuracy Inertial Measurement Units

    Directory of Open Access Journals (Sweden)

    Qingzhong Cai

    2016-06-01

An inertial navigation system (INS) has been widely used in challenging GPS environments. With the rapid development of modern physics, atomic gyroscopes will come into use in the near future, with a predicted accuracy of 5 × 10⁻⁶ °/h or better. However, existing calibration methods and devices cannot satisfy the accuracy requirements of future ultra-high accuracy inertial sensors. In this paper, an improved calibration model is established by introducing gyro g-sensitivity errors, accelerometer cross-coupling errors and lever-arm errors. A systematic calibration method is proposed based on a 51-state Kalman filter and smoother. Simulation results show that the proposed calibration method can estimate all the parameters using a common dual-axis turntable. Laboratory and sailing tests prove that the position accuracy over a five-day inertial navigation run can be improved by about 8% with the proposed calibration method, and by at least 20% when the position accuracy of an atomic-gyro INS reaches a level of 0.1 nautical miles/5 d. Compared with existing calibration methods, the proposed method, which calibrates more error sources and high-order small error parameters for ultra-high accuracy inertial measurement units (IMUs) using common turntables, has great application potential in future atomic-gyro INSs.

  19. bcROCsurface: an R package for correcting verification bias in estimation of the ROC surface and its volume for continuous diagnostic tests.

    Science.gov (United States)

    To Duc, Khanh

    2017-11-18

Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to a distorted conclusion on diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification-bias-corrected ROC surface analysis. A Shiny web application for the correction of verification bias in estimation of the ROC surface is also developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/.

  20. High-precision prostate cancer irradiation by clinical application of an offline patient setup verification procedure, using portal imaging

    International Nuclear Information System (INIS)

    Bel, Arjan; Vos, Pieter H.; Rodrigus, Patrick T. R.; Creutzberg, Carien L.; Visser, Andries G.; Stroom, Joep C.; Lebesque, Joos V.

    1996-01-01

Purpose: To investigate in three institutions, The Netherlands Cancer Institute (Antoni van Leeuwenhoek Huis [AvL]), Dr. Daniel den Hoed Cancer Center (DDHC), and Dr. Bernard Verbeeten Institute (BVI), how much the patient setup accuracy for irradiation of prostate cancer can be improved by an offline setup verification and correction procedure using portal imaging. Methods and Materials: The verification procedure consisted of two stages. During the first stage, setup deviations were measured during a number (N_max) of consecutive initial treatment sessions. The length of the average three-dimensional (3D) setup deviation vector was compared with an action level for corrections, which shrank with the number of setup measurements. After a correction was applied, N_max measurements had to be performed again. Each institution chose different values for the initial action level (6, 9, and 10 mm) and N_max (2 and 4). The choice of these parameters was based on a simulation of the procedure, using as input pre-estimated values of random and systematic deviations in each institution. During the second stage of the procedure, with weekly setup measurements, the AvL used a different criterion ('outlier detection') for corrective actions than the DDHC and the BVI ('sliding average'). After each correction the first stage of the procedure was restarted. The procedure was tested for 151 patients (62 in AvL, 47 in DDHC, and 42 in BVI) treated for prostate carcinoma. Treatment techniques and portal image acquisition and analysis differed in each institution. Results: The actual distributions of random and systematic deviations without corrections were estimated by eliminating the effect of the corrections. The percentage of mean (systematic) 3D deviations larger than 5 mm was 26% for the AvL and the DDHC, and 36% for the BVI. The setup accuracy after application of the procedure was considerably improved (the percentage of mean 3D deviations larger than 5 mm was 1.6% in the
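The first-stage decision rule described above can be sketched in a few lines. The shrinking action level is modelled here as the initial level divided by the square root of the number of measurements, a common choice for such protocols; the exact shrinking schedule used by each institution is not given in this excerpt:

```python
import numpy as np

def first_stage_decision(deviations_3d, alpha_mm):
    """After N initial sessions, compare the length of the mean 3D setup
    deviation vector against a shrinking action level alpha / sqrt(N).
    Returns (needs_correction, suggested couch shift = minus the mean).

    deviations_3d : (N, 3) measured setup deviations in mm
    alpha_mm      : initial action level in mm (6, 9 or 10 in the study)
    """
    d = np.asarray(deviations_3d, dtype=float)
    n = len(d)
    mean_vec = d.mean(axis=0)
    action_level = alpha_mm / np.sqrt(n)
    needs_correction = bool(np.linalg.norm(mean_vec) > action_level)
    return needs_correction, -mean_vec
```

A persistent (systematic) deviation trips the shrinking level after a few sessions, while purely random deviations average out and rarely trigger a correction, which is the point of the offline protocol.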

  1. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    Science.gov (United States)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual meeting of American Association of Physicists in Medicine.

  2. Accuracy of automated measurement and verification (M&V) techniques for energy savings in commercial buildings

    International Nuclear Information System (INIS)

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine; Sohn, Michael D.; Jump, David; Fernandes, Samuel

    2016-01-01

Highlights: • A testing procedure and metrics to assess the performance of whole-building M&V methods are presented. • The accuracy of ten baseline models is evaluated on measured data from 537 commercial buildings. • The impact of reducing the training period from 12 months to shorter time horizons is examined. - Abstract: Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today's methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming manual data acquisition and often do not deliver results until years after the program period has ended. The rising availability of "smart" meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. These meter- and software-based approaches, increasingly referred to as "M&V 2.0", are the subject of surging industry interest, particularly in the context of utility energy efficiency programs. Program administrators, evaluators, and regulators are asking how M&V 2.0 compares with more traditional methods, how proprietary software can be transparently performance tested, and how these techniques can be integrated into the next generation of whole-building focused efficiency programs. This paper expands recent analyses of public-domain whole-building M&V methods, focusing on more novel M&V 2.0 modeling approaches that are used in commercial technologies, as well as approaches that are documented in the literature, and/or developed by the academic building research community. We present a testing procedure and metrics to assess the performance of whole-building M&V methods. We then illustrate the test procedure
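The whole-building baseline-model idea behind these M&V methods can be reduced to two steps: fit a model of consumption against explanatory variables on the pre-measure (training) period, then take savings as the counterfactual baseline prediction minus metered use in the reporting period. The sketch below uses a single predictor (outdoor temperature) as an assumed minimal example, far simpler than the ten models the paper evaluates:

```python
import numpy as np

def baseline_fit(temp, energy):
    """Fit a linear baseline model energy ~ a + b * temp on the
    pre-measure period. Real M&V 2.0 models add time-of-week and
    change-point terms; this shows only the core idea."""
    A = np.column_stack([np.ones_like(temp), temp])
    coef, *_ = np.linalg.lstsq(A, energy, rcond=None)
    return coef

def avoided_energy(coef, temp_post, energy_post):
    """Savings = counterfactual baseline prediction minus metered use."""
    pred = coef[0] + coef[1] * np.asarray(temp_post, dtype=float)
    return float(np.sum(pred - energy_post))
```

Automating exactly this fit-and-predict loop on smart-meter data, and scoring the prediction accuracy on held-out periods, is what the paper's testing procedure formalizes.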

  3. Towards real-time VMAT verification using a prototype, high-speed CMOS active pixel sensor.

    Science.gov (United States)

    Zin, Hafiz M; Harris, Emma J; Osmond, John P F; Allinson, Nigel M; Evans, Philip M

    2013-05-21

This work investigates the feasibility of using a prototype complementary metal oxide semiconductor active pixel sensor (CMOS APS) for real-time verification of volumetric modulated arc therapy (VMAT) treatment. The prototype CMOS APS used region-of-interest readout on the chip to allow fast imaging at up to 403.6 frames per second (f/s). The sensor was made larger (5.4 cm × 5.4 cm) using recent advances in photolithographic technique but retains fast imaging speed through the sensor's regional readout. There is a paradigm shift in radiotherapy treatment verification with the advent of advanced treatment techniques such as VMAT. This work has demonstrated that the APS can track multi-leaf collimator (MLC) leaves moving at 18 mm s⁻¹ with an automatic edge-tracking algorithm, at an accuracy better than 1.0 mm even at the fastest imaging speed. The measured fluence distribution for an example VMAT delivery sampled at 50.4 f/s was shown to agree well with the planned fluence distribution, with an average gamma pass rate of 96% at 3%/3 mm. The MLC leaf motion and linac pulse-rate variation delivered throughout the VMAT treatment can also be measured. The results demonstrate the potential of CMOS APS technology as a real-time radiotherapy dosimeter for the delivery of complex treatments such as VMAT.
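Automatic leaf-edge tracking of the kind described typically reduces, per image row, to finding the sub-pixel position where the penumbra profile crosses half of its dynamic range. A minimal sketch (linear interpolation at the 50% crossing, with an assumed profile orientation from open field to blocked region; this is an illustration, not the authors' algorithm):

```python
import numpy as np

def leaf_edge_position(profile, pixel_pitch_mm=1.0):
    """Sub-pixel field-edge position along a 1-D sensor profile that falls
    from the open field (left) to under the leaf (right): linearly
    interpolate where the signal crosses 50% of its dynamic range."""
    p = np.asarray(profile, dtype=float)
    half = 0.5 * (p.max() + p.min())
    i = int(np.argmax(p < half)) - 1          # last pixel still above half
    frac = (p[i] - half) / (p[i] - p[i + 1])  # linear sub-pixel fraction
    return (i + frac) * pixel_pitch_mm
```

Repeating this per frame, at hundreds of frames per second, gives a leaf trajectory whose deviation from the planned trajectory can be reported in millimetres, the quantity the abstract bounds at 1.0 mm.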

  4. Verification of 3-D generation code package for neutronic calculations of WWERs

    International Nuclear Information System (INIS)

    Sidorenko, V.D.; Aleshin, S.S.; Bolobov, P.A.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Morozov, V.V.; Syslov, A.A.; Tsvetkov, V.M.

    2000-01-01

Materials on verification of the 3rd-generation code package for WWER neutronic calculations are presented. The package includes: the spectral code TVS-M; the 2-D fine-mesh diffusion code PERMAK-A for 4- or 6-group calculation of WWER core burnup; and the 3-D coarse-mesh diffusion code BIPR-7A for 2-group calculations of quasi-stationary WWER regimes. The materials include both TVS-M verification data and verification data on the PERMAK-A and BIPR-7A codes using constant libraries generated with TVS-M. All materials relate to fuel without Gd. The TVS-M verification materials include results of comparison both with benchmark calculations obtained by other codes and with experiments carried out at the ZR-6 critical facility. The PERMAK-A verification materials contain results of comparison with TVS-M calculations and with ZR-6 experiments. The BIPR-7A materials include comparison with operation data for the Dukovany-2 and Loviisa-1 NPPs (WWER-440) and for Balakovo NPP Unit 4 (WWER-1000). The verification materials demonstrate rather good accuracy of the calculations obtained with the 3rd-generation code package. (Authors)

  5. Advanced Collimators for Verification of the Pu Isotopic Composition in Fresh Fuel by High Resolution Gamma Spectrometry

    International Nuclear Information System (INIS)

    Lebrun, Alain; Berlizov, Andriy

    2013-06-01

    IAEA verification of the nuclear material contained in fresh nuclear fuel assemblies is usually based on neutron coincidence counting (NCC). In the case of uranium fuel, active NCC provides the total content of uranium-235 per unit of length which, combined with active length verification, fully supports the verification. In the case of plutonium fuel, passive NCC provides the plutonium-240 equivalent content which needs to be associated with a measurement of the isotopic composition and active length measurement to complete the verification. Plutonium isotopic composition is verified by high resolution gamma spectrometry (HRGS) applied on fresh fuel assemblies assuming all fuel rods are fabricated from the same plutonium batch. For particular verifications when such an assumption cannot be reasonably made, there is a need to optimize the HRGS measurement so that contributions of internal rods to the recorded spectrum are maximized, thus providing equally strong verification of the internal fuel rods. This paper reports on simulation work carried out to design special collimators aimed at reducing the relative contribution of external fuel rods while enhancing the signal recorded from internal rods. Both cases of square lattices (e.g. 17x17 pressurized water reactor (PWR) fuel) and hexagonal compact lattices (e.g. BN800 fast neutron reactor (FNR) fuel) have been addressed. In the case of PWR lattices, the relatively large optical path to internal pins compensates for low plutonium concentrations and the large size of the fuel assemblies. A special collimator based on multiple, asymmetrical, vertical slots allows recording a spectrum from internal rods only when needed. In the FNR case, the triangular lattice is much more compact and the optical path to internal rods is very narrow. However, higher plutonium concentration and use of high energy ranges allow the verification of internal rods to be significantly strengthened. Encouraging results from the simulation

  6. SU-C-207A-04: Accuracy of Acoustic-Based Proton Range Verification in Water

    International Nuclear Information System (INIS)

    Jones, KC; Sehgal, CM; Avery, S; Vander Stappen, F

    2016-01-01

    Purpose: To determine the accuracy and dose required for acoustic-based proton range verification (protoacoustics) in water. Methods: Proton pulses with 17 µs FWHM and instantaneous currents of 480 nA (5.6 × 10⁷ protons/pulse, 8.9 cGy/pulse) were generated by a clinical, hospital-based cyclotron at the University of Pennsylvania. The protoacoustic signal generated in a water phantom by the 190 MeV proton pulses was measured with a hydrophone placed at multiple known positions surrounding the dose deposition. The background random noise was measured. The protoacoustic signal was simulated for comparison with the experiments. Results: The maximum protoacoustic signal amplitude at 5 cm distance was 5.2 mPa per 1 × 10⁷ protons (1.6 cGy at the Bragg peak). The background random noise of the measurement was 27 mPa. Comparison between simulation and experiment indicates that the hydrophone introduced a delay of 2.4 µs. For acoustic data collected with a signal-to-noise ratio (SNR) of 21, deconvolution of the protoacoustic signal with the proton pulse provided the most precise time-of-flight range measurement (standard deviation of 2.0 mm), but a systematic error (−4.5 mm) was observed. Conclusion: Based on water phantom measurements at a clinical hospital-based cyclotron, protoacoustics is a potential technique for measuring the proton Bragg peak range with 2.0 mm standard deviation. Simultaneous use of multiple detectors is expected to reduce the standard deviation, but calibration is required to remove the systematic error. Based on the measured background noise and protoacoustic amplitude, an SNR of 5.3 is projected for a deposited dose of 2 Gy.
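
    The time-of-flight range estimate described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the speed of sound in water (about 1480 m/s) is an assumed textbook value, and the 2.4 µs hydrophone delay is the systematic offset reported in the abstract.

```python
# Illustrative sketch (not the authors' code) of time-of-flight (TOF) range
# estimation: the Bragg-peak-to-hydrophone distance is the acoustic
# propagation time, corrected for the detector delay, times the sound speed.

SPEED_OF_SOUND_WATER = 1480.0  # m/s, assumed value for water
DETECTOR_DELAY = 2.4e-6        # s, hydrophone delay from the abstract

def protoacoustic_range(arrival_time_s: float) -> float:
    """Distance (m) from the dose deposition to the hydrophone, delay-corrected."""
    return SPEED_OF_SOUND_WATER * (arrival_time_s - DETECTOR_DELAY)

# A signal arriving 36.2 us after the proton pulse corresponds to about 5 cm.
distance_m = protoacoustic_range(36.2e-6)
```

    At this sound speed, the reported 4.5 mm systematic range error corresponds to a timing error of roughly 3 µs, which is why the delay calibration matters.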

  7. SU-F-T-440: The Feasibility Research of Checking Cervical Cancer IMRT Pre- Treatment Dose Verification by Automated Treatment Planning Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X; Yin, Y; Lin, X [Shandong Cancer Hospital and Institute, China, Jinan, Shandong (China)

    2016-06-15

    Purpose: To assess the preliminary feasibility of an automated treatment planning verification system for cervical cancer IMRT pre-treatment dose verification. Methods: The study randomly selected clinical IMRT treatment planning data for twenty patients with cervical cancer; all IMRT plans were divided into 7 fields to meet the dosimetric goals using a commercial treatment planning system (Pinnacle Version 9.2 and Eclipse Version 13.5). The plans were exported to the Mobius 3D (M3D) server, and percentage differences in the volume of each region of interest (ROI) and in the dose calculated for the target region and organs at risk were evaluated, in order to validate the accuracy of the automated treatment planning verification system. Results: The volume differences for Pinnacle versus M3D were smaller than those for Eclipse versus M3D in the ROIs; the largest differences were 0.22 ± 0.69% and 3.5 ± 1.89% for Pinnacle and Eclipse, respectively. M3D showed slightly better agreement in the dose to the target and organs at risk compared with the TPS. After recalculating the plans with M3D, the dose difference for Pinnacle was on average smaller than for Eclipse, with results within 3%. Conclusion: Using the automated treatment planning system to validate the accuracy of plans is convenient, but more clinical patient cases are needed to determine the acceptable scope of differences. At present, it should be used as a secondary check tool to improve safety in clinical treatment planning.

  8. Translating Activity Diagram from Duration Calculus for Modeling of Real-Time Systems and its Formal Verification using UPPAAL and DiVinE

    Directory of Open Access Journals (Sweden)

    Muhammad Abdul Basit Ur Rehman

    2016-01-01

    Full Text Available The RTS (Real-Time Systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems need high accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is challenging. Formal methods make it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of this methodology for modeling and verification of large-scale real-time systems with reduced verification cost.

  9. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    Rahim, M.A.B.U.; Arif, F.

    2016-01-01

    The RTS (Real-Time Systems) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems need high accuracy, safety, and reliability. Accurate graphical modeling and verification of such systems is challenging. Formal methods make it possible to model such systems with greater accuracy. In this paper, we envision a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel filling machine. We have defined DC (Duration Calculus) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers for validation of timed and untimed properties with accelerated verification speed. Our results suggest the use of this methodology for modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  10. Verification of Positional Accuracy of ZVS3003 Geodetic Control ...

    African Journals Online (AJOL)

    The International GPS Service (IGS) has provided GPS orbit products to the scientific community with increased precision and timeliness. Many users interested in geodetic positioning have adopted the IGS precise orbits to achieve centimeter level accuracy and ensure long-term reference frame stability. Positioning with ...

  11. Independent verification of monitor unit calculation for radiation treatment planning system.

    Science.gov (United States)

    Chen, Li; Chen, Li-Xin; Huang, Shao-Min; Sun, Wen-Zhao; Sun, Hong-Qiang; Deng, Xiao-Wu

    2010-02-01

    Ensuring the accuracy of dose calculation for radiation treatment plans is an important part of quality assurance (QA) procedures for radiotherapy. This study evaluated the monitor unit (MU) calculation accuracy of a third-party QA software package and a 3-dimensional treatment planning system (3D TPS), to investigate the feasibility and reliability of independent verification for radiation treatment planning. Test plans in a homogeneous phantom were designed with the 3D TPS, according to International Atomic Energy Agency (IAEA) Technical Report No. 430, including open, blocked, wedge, and multileaf collimator (MLC) fields. The test plans were delivered and measured in the phantom. The delivered doses were input to the QA software, and the independently calculated MUs were compared with delivery. All test plans were verified with independent calculation and phantom measurements separately, and the differences between the two kinds of verification were then compared. The deviation of the independent calculation from the measurements was (0.1 ± 0.9)%; the largest difference occurred for plans using blocked and wedge fields (2.0%). The mean MU difference between the TPS and the QA software was (0.6 ± 1.0)%, ranging from -0.8% to 2.8%. The deviation in dose of the TPS calculation compared to the measurements was (-0.2 ± 1.7)%, ranging from -3.9% to 2.9%. The MU accuracy of the third-party QA software is clinically acceptable. Similar results were achieved with the independent calculations and the phantom measurements for all test plans. The tested independent calculation software can be used as an efficient tool for TPS plan verification.
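
    The kind of comparison reported above, a signed percentage deviation checked against a clinical action level, can be sketched as follows. This is an illustrative sketch, not the tested QA software, and the 3% tolerance is an assumed example value rather than one taken from the study.

```python
# Illustrative independent monitor-unit (MU) check: compare independently
# calculated MUs with the TPS value and flag plans whose deviation exceeds
# a tolerance. The 3% action level below is an assumed example.

def mu_deviation_percent(mu_tps: float, mu_independent: float) -> float:
    """Signed percentage deviation of the independent calculation from the TPS."""
    return (mu_independent - mu_tps) / mu_tps * 100.0

def within_tolerance(mu_tps: float, mu_independent: float, tol_pct: float = 3.0) -> bool:
    """True if the absolute deviation is within the action level."""
    return abs(mu_deviation_percent(mu_tps, mu_independent)) <= tol_pct
```

    On this scheme, the 2.8% maximum TPS-versus-software difference reported above would pass a 3% check but might still warrant review.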

  12. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity cards. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of one major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
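
    The decision rules of the model can be restated as a small classifier. The criterion names and return labels below paraphrase the abstract; this sketch is for illustration only and is not the published regression model.

```python
# Restatement of the prediction model's decision rules as a rule-based
# classifier: one major criterion (dystrophy area >= 25%) and two minor
# criteria (long horizontal / vertical lines), as summarized in the abstract.

def verification_risk(dystrophy_area_pct: float,
                      long_horizontal_lines: bool,
                      long_vertical_lines: bool) -> str:
    """Map the major/minor criteria to the predicted verification outcome."""
    if dystrophy_area_pct >= 25.0:          # major criterion present
        return "almost always fails"
    minors = int(long_horizontal_lines) + int(long_vertical_lines)
    if minors == 2:
        return "high risk of failure"
    if minors == 1:
        return "low risk of failure"
    return "almost always passes"           # no criteria met
```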

  13. Calibration and verification of thermographic cameras for geometric measurements

    Science.gov (United States)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and a growing range of applications. Quality assessment of measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire both temperature and geometric information, although calibration and verification procedures are usual only for thermal data; black bodies are used for these purposes. Moreover, geometric information is important for many fields, such as architecture, civil engineering, and industry. This work presents a calibration procedure that allows photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability, and drift of thermographic cameras. These results allow the incorporation of this information into companies' quality control processes. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both thermographic and visual cameras can detect it. Two thermographic cameras from the Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified, and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm for all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between thermographic cameras shows slightly better

  14. The verification of neutron activation analysis support system (cooperative research)

    Energy Technology Data Exchange (ETDEWEB)

    Sasajima, Fumio; Ichimura, Shigeju; Ohtomo, Akitoshi; Takayanagi, Masaji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sawahata, Hiroyuki; Ito, Yasuo [Tokyo Univ. (Japan). Research Center for Nuclear Science and Technology; Onizawa, Kouji [Radiation Application Development Association, Tokai, Ibaraki (Japan)

    2000-12-01

    The neutron activation analysis support system is a system with which even a user without much experience in neutron activation analysis can conveniently and accurately carry out multi-element analysis of a sample. In this verification test, subjects such as the functions, usability, and precision and accuracy of the analysis of the neutron activation analysis support system were confirmed. The verification test was carried out using the irradiation device, measuring device, automatic sample changer, and analyzer equipped in the JRR-3M PN-3 facility, and the analysis software KAYZERO/SOLCOI based on the k₀ method. With this equipment, calibration of the germanium detector, measurement of the parameters of the irradiation field, and analysis of three kinds of environmental standard samples were carried out. The k₀ method adopted in this system has recently been utilized primarily in Europe; it is an analysis method that can conveniently and accurately carry out multi-element analysis of a sample without requiring individual comparison standard samples. With this system, a total of 28 elements were determined quantitatively, and 16 elements with values guaranteed as analytical data of the NIST (National Institute of Standards and Technology) environmental standard samples were analyzed with an accuracy within 15%. This report describes the content and verification results of the neutron activation analysis support system. (author)

  15. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of the existing reliability and verification problems of widespread electric power system (EPS) simulation tools is presented in this article. All of these simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, along with the structure of its realization, which is based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time and over an unlimited range with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  16. Online pretreatment verification of high-dose rate brachytherapy using an imaging panel

    Science.gov (United States)

    Fonseca, Gabriel P.; Podesta, Mark; Bellezzo, Murillo; Van den Bosch, Michiel R.; Lutgens, Ludy; Vanneste, Ben G. L.; Voncken, Robert; Van Limbergen, Evert J.; Reniers, Brigitte; Verhaegen, Frank

    2017-07-01

    Brachytherapy is employed to treat a wide variety of cancers. However, an accurate treatment verification method is currently not available. This study describes a pre-treatment verification system that uses an imaging panel (IP) to verify important aspects of the treatment plan. Detailed modelling of the IP was only possible with an extensive calibration performed using a robotic arm. Irradiations were performed with a high dose rate (HDR) 192Ir source within a water phantom. An empirical fit was applied to measure the distance between the source and the detector so that the 3D Cartesian coordinates of the dwell positions can be obtained using a single panel. The IP acquires 7.14 fps to verify the dwell times, dwell positions, and air kerma strength (Sk). A gynecological applicator was used to create a treatment plan that was registered with a CT image of the water phantom used during the experiments for verification purposes. Errors (shifts, exchanged connections, and wrong dwell times) were simulated to test the proposed verification system. Cartesian source positions (in the panel measurement plane) have a standard deviation of about 0.02 cm. The measured distance between the source and the panel (z-coordinate) has a standard deviation of up to 0.16 cm and a maximum absolute error of ≈0.6 cm if the signal is close to the sensitivity limit of the panel. The average response of the panel is very linear with Sk; therefore, Sk measurements can be performed with relatively small errors. The measured dwell times show a maximum error of 0.2 s, which is consistent with the acquisition rate of the panel. All simulated errors were clearly identified by the proposed system. The use of IPs is not common in brachytherapy; however, it provides considerable advantages. It was demonstrated that the IP can accurately measure Sk, dwell times, and dwell positions.

  17. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so the operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  18. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas around the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  19. Partial verification bias and incorporation bias affected accuracy estimates of diagnostic studies for biomarkers that were part of an existing composite gold standard.

    Science.gov (United States)

    Karch, Annika; Koch, Armin; Zapf, Antonia; Zerr, Inga; Karch, André

    2016-10-01

    To investigate how the choice of gold standard biases estimates of sensitivity and specificity in studies reassessing the diagnostic accuracy of biomarkers that are already part of a lifetime composite gold standard (CGS). We performed a simulation study based on the real-life example of the biomarker "protein 14-3-3" used for diagnosing Creutzfeldt-Jakob disease. Three different types of gold standard were compared: the perfect gold standard "autopsy" (available in only a small fraction of cases; prone to partial verification bias), the lifetime CGS (including the biomarker under investigation; prone to incorporation bias), and the "best available" gold standard (autopsy if available, otherwise CGS). Sensitivity was unbiased when comparing 14-3-3 with autopsy but overestimated when using the CGS or the "best available" gold standard. Specificity of 14-3-3 was underestimated in scenarios comparing 14-3-3 with autopsy (by up to 24%). In contrast, overestimation (up to 20%) was observed for specificity compared with the CGS; this could be reduced to 0-10% when using the "best available" gold standard. The choice of gold standard considerably affects estimates of diagnostic accuracy. Using the "best available" gold standard (autopsy where available, otherwise CGS) leads to valid estimates of specificity, whereas sensitivity is best estimated against autopsy alone. Copyright © 2016 Elsevier Inc. All rights reserved.
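
    The mechanism of incorporation bias described above can be illustrated with a toy Monte Carlo simulation. All prevalences and accuracies below are made-up illustrative values, not estimates from the study; the point is only that measuring a biomarker against a composite gold standard that includes it inflates apparent sensitivity.

```python
import random

# Toy Monte Carlo illustration of incorporation bias: when the biomarker under
# study is itself part of the composite gold standard (CGS), its apparent
# sensitivity is inflated, because every biomarker-positive subject is
# automatically CGS-positive. All parameters here are illustrative.

def simulate(n=100_000, true_sens=0.85, true_spec=0.80, seed=1):
    rng = random.Random(seed)
    tp = fn = tp_cgs = fn_cgs = 0
    for _ in range(n):
        diseased = rng.random() < 0.3                          # assumed prevalence
        test_pos = rng.random() < (true_sens if diseased else 1.0 - true_spec)
        # The CGS fires if the biomarker OR an independent clinical criterion fires.
        other_pos = rng.random() < (0.7 if diseased else 0.01)
        cgs_pos = test_pos or other_pos
        if diseased:                 # sensitivity against the perfect reference
            tp += test_pos
            fn += not test_pos
        if cgs_pos:                  # "sensitivity" against the CGS (biased)
            tp_cgs += test_pos
            fn_cgs += not test_pos
    return tp / (tp + fn), tp_cgs / (tp_cgs + fn_cgs)

sens_true, sens_cgs = simulate()
```

    With these assumed parameters, the CGS-based estimate exceeds the true sensitivity by several percentage points, mirroring the overestimation pattern the study reports.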

  20. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    Science.gov (United States)

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. The cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved by the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation, and the potential mismatch between calculated control points and the detection grid in the verification process, were discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH in the patient CT was discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre
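
    The gamma passing rate quoted in several of the records above can be sketched in a simplified one-dimensional form. Clinical tools evaluate full 2-D or 3-D dose grids; this is a minimal illustration of the metric itself, with synthetic profiles standing in for measured and reference doses.

```python
import math

# Simplified 1-D global gamma analysis: a point passes if some reference point
# lies within the combined dose-difference / distance-to-agreement ellipsoid
# (gamma <= 1). Criteria default to 3%/3 mm with global normalization.

def gamma_pass_rate(ref, meas, positions_mm, dose_crit=0.03, dist_crit_mm=3.0):
    d_max = max(ref)  # global normalization to the reference maximum
    passed = 0
    for i, dm in enumerate(meas):
        gamma_sq = min(
            ((dm - dr) / (dose_crit * d_max)) ** 2
            + ((positions_mm[i] - positions_mm[j]) / dist_crit_mm) ** 2
            for j, dr in enumerate(ref)
        )
        passed += math.sqrt(gamma_sq) <= 1.0
    return 100.0 * passed / len(meas)

# A Gaussian-like profile compared against itself passes everywhere.
positions = [float(x) for x in range(0, 50, 2)]
profile = [math.exp(-((x - 25.0) / 10.0) ** 2) for x in positions]
rate = gamma_pass_rate(profile, profile, positions)
```

    A uniform 20% dose error, by contrast, fails near the profile maximum, where neither the dose criterion nor the distance-to-agreement search can absorb the discrepancy.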

  1. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics appears as one of the solutions capable of solving problems that occur with password-based data access; for example, passwords may be forgotten, and recalling various different passwords is hard. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether the user has the authority to access the data or not. Facial biometrics was chosen for its low-cost implementation and its fairly accurate results for user identification. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which aims to reduce the dimension size as well as encrypt the data in the form of a facial test image, where the image is represented as sparse signals. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, namely Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signals are then used to compute the Euclidean norm against the sparse signal of the user previously saved in the system, to determine the validity of the facial test image. The system accuracies obtained in this research are 99% for IRLS with a face verification response time of 4.917 seconds and 96.33% for OMP with a response time of 0.4046 seconds using a non-optimized sensing matrix, and 99% for IRLS with a response time of 13.4791 seconds and 98.33% for OMP with a response time of 3.1571 seconds using the optimized sensing matrix.
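
    The OMP reconstruction step mentioned above can be sketched in a few lines. This is a minimal generic OMP, not the paper's implementation: a real face verification system would use image data and a designed sensing matrix, whereas here a random Gaussian matrix and a synthetic sparse vector stand in for illustration.

```python
import numpy as np

# Minimal Orthogonal Matching Pursuit (OMP) sketch for recovering a sparse
# signal x from compressive measurements y = A @ x. At each step the most
# correlated dictionary column is added to the support, followed by a
# least-squares refit and residual update.

def omp(A, y, sparsity):
    residual = y.copy()
    support = []
    for _ in range(sparsity):
        # Select the column most correlated with the current residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        support.append(idx)
        # Least-squares fit on the selected columns, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200)) / np.sqrt(50)  # 50 measurements, 200-dim signal
x = np.zeros(200)
x[[5, 80, 150]] = [1.0, -2.0, 0.5]                # 3-sparse synthetic signal
x_hat = omp(A, A @ x, sparsity=3)
```

    In the noiseless, sufficiently sparse setting shown here, OMP recovers the signal exactly; verification then compares the recovered coefficients with the enrolled user's stored sparse signal via the Euclidean norm.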

  2. Research on Horizontal Accuracy Method of High Spatial Resolution Remotely Sensed Orthophoto Image

    Science.gov (United States)

    Xu, Y. M.; Zhang, J. X.; Yu, F.; Dong, S.

    2018-04-01

    At present, in the inspection and acceptance of high spatial resolution remotely sensed orthophoto images, horizontal accuracy detection tests and evaluates the accuracy of the images, mostly based on a set of check points with the same accuracy and reliability. However, it is difficult to obtain such a set of check points in areas where field measurement is difficult and high-accuracy reference data are scarce, so testing and evaluating the horizontal accuracy of the orthophoto image is difficult there. This uncertainty in horizontal accuracy has become a bottleneck for the application of satellite-borne high-resolution remote sensing imagery and for expanding its scope of service. Therefore, this paper proposes a new method to test the horizontal accuracy of orthophoto images. This method uses check points with different accuracy and reliability, sourced from high-accuracy reference data and field measurements. The new method solves the problem of horizontal accuracy detection of orthophoto images in difficult areas and provides the basis for delivering reliable orthophoto images to users.
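
    One standard way to combine check points of unequal accuracy, as the abstract proposes, is to weight each residual by the inverse of its variance, so that precise reference-data points count more than less precise field measurements. The weighted RMSE below is a common assumption for this idea, not the paper's exact method, and all values are illustrative.

```python
import math

# Illustrative inverse-variance weighting of planimetric residuals from check
# points of unequal accuracy: each residual r_i is weighted by 1/sigma_i^2,
# where sigma_i is that check point's own accuracy.

def weighted_horizontal_rmse(residuals_m, sigmas_m):
    """Weighted RMSE of residuals (m); sigma_i is each check point's accuracy."""
    weights = [1.0 / s ** 2 for s in sigmas_m]
    num = sum(w * r ** 2 for w, r in zip(weights, residuals_m))
    return math.sqrt(num / sum(weights))

# Two precise reference points (sigma 0.1 m) and two field points (sigma 0.5 m):
rmse = weighted_horizontal_rmse([0.2, -0.1, 0.6, -0.4], [0.1, 0.1, 0.5, 0.5])
```

    With equal sigmas this reduces to the ordinary RMSE, so the weighting only changes the result when the check points genuinely differ in reliability.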

  3. A Study on Accuracy Improvement of Dual Micro Patterns Using Magnetic Abrasive Deburring

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Dong-Hyun; Kwak, Jae-Seob [Pukyong Nat’l Univ., Busan (Korea, Republic of)

    2016-11-15

    In recent times, the requirement for micro patterns on the surfaces of products has been increasing, and high precision in the fabrication of these patterns is required. Hence, in this study, dual micro patterns were fabricated on a cylindrical workpiece, and deburring was performed by the magnetic abrasive deburring (MAD) process. A prediction model was developed, and the MAD process was optimized using the response surface method. When the predicted values were compared with the experimental results, the average prediction error was found to be approximately 7%. Experimental verification showed fabrication of a high-accuracy dual micro pattern and confirmed the reliability of the prediction model.

  4. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Full Text Available Biometric-based identification/verification systems provide a solution to the security concerns of the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. The minutiae-based algorithm depends upon the local discontinuities in the ridge flow pattern and is used when template size is important, while the image-based matching algorithm uses both the micro and macro features of a fingerprint and is used when a fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. The system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database. Datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being a perfect match).

  5. High accuracy 3-D laser radar

    DEFF Research Database (Denmark)

    Busck, Jens; Heiselberg, Henning

    2004-01-01

    We have developed a mono-static staring 3-D laser radar based on gated viewing, with range accuracy below 1 mm at 10 m and 1 cm at 100 m. We use a high-sensitivity, fast, intensified CCD camera and a passively Q-switched Nd:YAG pulsed green laser (532 nm, 32.4 kHz). The CCD has 752x582 pixels. Camera...

  6. High accuracy autonomous navigation using the global positioning system (GPS)

    Science.gov (United States)

    Truong, Son H.; Hart, Roger C.; Shoan, Wendy C.; Wood, Terri; Long, Anne C.; Oza, Dipak H.; Lee, Taesul

    1997-01-01

    The application of global positioning system (GPS) technology to the improvement of the accuracy and economy of spacecraft navigation is reported. High-accuracy autonomous navigation algorithms are currently being qualified in conjunction with the GPS attitude determination flyer (GADFLY) experiment for the small satellite technology initiative Lewis spacecraft. Preflight performance assessments indicate that these algorithms are able to provide a real-time total position accuracy of better than 10 m and a velocity accuracy of better than 0.01 m/s, with selective availability at typical levels. It is expected that the position accuracy will be improved to 2 m if corrections are provided by the GPS wide area augmentation system.

  7. Multimodal Personal Verification Using Likelihood Ratio for the Match Score Fusion

    Directory of Open Access Journals (Sweden)

    Long Binh Tran

    2017-01-01

    In this paper, the authors present a novel personal verification system based on the likelihood-ratio test for fusing match scores from multiple biometric matchers (face, fingerprint, hand shape, and palm print). In the proposed system, multimodal features are extracted by Zernike moments (ZM). After matching, the match scores from the multiple matchers are fused using the likelihood-ratio test, with a finite Gaussian mixture model (GMM) used to estimate the genuine and impostor score densities. The approach is compared against well-known alternatives such as the support vector machine and the sum rule with min-max normalization. The experimental results confirm that the proposed system achieves higher accuracy than these approaches and can thus be utilized in applications requiring personal verification.
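A minimal sketch of the score-level fusion idea, substituting a single Gaussian per class for the paper's finite Gaussian mixture; all names and parameter values below are hypothetical:

```python
import numpy as np

def gaussian_pdf(x, mean, std):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

def fused_likelihood_ratio(scores, gen_params, imp_params):
    """Product of per-matcher likelihood ratios p_genuine / p_impostor.
    scores: one match score per matcher (face, fingerprint, ...).
    gen_params / imp_params: (mean, std) per matcher, a single-Gaussian
    stand-in for the paper's Gaussian mixture densities."""
    lr = 1.0
    for s, (mg, sg), (mi, si) in zip(scores, gen_params, imp_params):
        lr *= gaussian_pdf(s, mg, sg) / gaussian_pdf(s, mi, si)
    return lr

def accept(scores, gen_params, imp_params, eta=1.0):
    """Neyman-Pearson decision: accept when the fused ratio exceeds eta."""
    return fused_likelihood_ratio(scores, gen_params, imp_params) > eta
```

The threshold eta trades false accepts against false rejects; in the paper the densities would be GMMs fitted on training scores rather than single Gaussians.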

  8. Verification of the plan dosimetry for high dose rate brachytherapy using metal-oxide-semiconductor field effect transistor detectors

    International Nuclear Information System (INIS)

    Qi Zhenyu; Deng Xiaowu; Huang Shaomin; Lu Jie; Lerch, Michael; Cutajar, Dean; Rosenfeld, Anatoly

    2007-01-01

    The feasibility of a recently designed metal-oxide-semiconductor field effect transistor (MOSFET) dosimetry system for dose verification of high dose rate (HDR) brachytherapy treatment planning was investigated. MOSFET detectors were calibrated with a 0.6 cm³ NE-2571 Farmer-type ionization chamber in water. Key characteristics of the MOSFET detectors that affect phantom measurements with HDR 192Ir sources, such as the energy dependence, were measured. The MOSFET detector was then applied to verify the dosimetric accuracy of HDR brachytherapy treatments in a custom-made water phantom. Three MOSFET detectors were calibrated independently, with calibration factors ranging from 0.187 to 0.215 cGy/mV. A distance-dependent energy response was observed, significant within 2 cm of the source. The new MOSFET detector showed good reproducibility and linearity (R² = 1); the detectors responded linearly to dose until the threshold voltage reached approximately 24 V for 192Ir source measurements. Further comparison of phantom measurements using MOSFET detectors with dose calculations by a commercial treatment planning system for computed tomography-based brachytherapy treatment plans showed a mean relative deviation of 2.2±0.2% for dose points 1 cm from the source and 2.0±0.1% for dose points 2 cm away. The percentage deviations between measured and planned doses were below 5% for all measurements. The MOSFET detector, with its advantages of small physical size and ease of use, is a reliable tool for quality assurance of HDR brachytherapy. The phantom verification method described here is universal and can be applied to other HDR brachytherapy treatments
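The dose readout reduces to multiplying the threshold-voltage shift by the per-detector calibration factor, with the ~24 V linearity limit as a guard. This is a hypothetical helper illustrating the arithmetic, not code from the study:

```python
def mosfet_dose(delta_v_mV, cal_factor_cGy_per_mV, v_threshold_V, v_limit_V=24.0):
    """Convert a MOSFET threshold-voltage shift (mV) to dose (cGy).

    cal_factor_cGy_per_mV is detector-specific (0.187-0.215 cGy/mV in
    the study). The linear dose response holds only while the
    accumulated threshold voltage stays below ~24 V for 192Ir
    measurements, so a detector past that limit must be replaced."""
    if v_threshold_V >= v_limit_V:
        raise ValueError("detector beyond linear range; recalibrate or replace")
    return delta_v_mV * cal_factor_cGy_per_mV
```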

  9. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Because of its easy accessibility, a protein biomarker would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are "masked" by many nonsignificant species present at concentrations orders of magnitude higher. The extreme requirements on measurement sensitivity, dynamic range and specificity make method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time consuming. The limited capability for assay multiplexing also makes the measurement extremely low-throughput, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput, quantitative candidate biomarker verification. Among them, the triple quadrupole MS-based assay is the most promising, particularly when coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source and operated in a selected reaction monitoring (SRM) mode

  10. High accuracy wavelength calibration for a scanning visible spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Scotti, Filippo; Bell, Ronald E. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)

    2010-10-15

    Spectroscopic applications for plasma velocity measurements often require wavelength accuracies ≤0.2 Å. An automated calibration, which is stable over time and environmental conditions without the need to recalibrate after each grating movement, was developed for a scanning spectrometer to achieve high wavelength accuracy over the visible spectrum. This method fits all relevant spectrometer parameters using multiple calibration spectra. With a stepping-motor controlled sine drive, an accuracy of ~0.25 Å has been demonstrated. With the addition of a high resolution (0.075 arc sec) optical encoder on the grating stage, greater precision (~0.005 Å) is possible, allowing absolute velocity measurements within ~0.3 km/s. This level of precision requires monitoring of atmospheric temperature and pressure and of grating bulk temperature to correct for changes in the refractive index of air and the groove density, respectively.

  11. Switched-capacitor techniques for high-accuracy filter and ADC design

    NARCIS (Netherlands)

    Quinn, P.J.; Roermund, van A.H.M.

    2007-01-01

    Switched capacitor (SC) techniques are well proven to be excellent candidates for implementing critical analogue functions with high accuracy, surpassing other analogue techniques when embedded in mixed-signal CMOS VLSI. Conventional SC circuits are primarily limited in accuracy by a) capacitor

  12. Independent verification of the delivered dose in High-Dose Rate (HDR) brachytherapy

    International Nuclear Information System (INIS)

    Portillo, P.; Feld, D.; Kessler, J.

    2009-01-01

    An important aspect of a Quality Assurance program in clinical dosimetry is an independent verification of the dosimetric calculation done by the Treatment Planning System for each radiation treatment. The present paper is aimed at creating a spreadsheet for verifying the dose delivered to a point of an implant with radioactive sources in HDR treatment of gynecological lesions. A 192Ir automatic afterloading unit (GammaMedplus model, Varian Medical Systems) with HDR, installed at the Angel H. Roffo Oncology Institute, has been used. The planning system used for obtaining the dose distribution is BraquiVision. The source coordinates as well as those of the calculation point (rectum) are entered into the Excel-based verification program, assuming a point source at each of the applicators' dwell positions. The rectum has been selected as the calculation point because it is an organ at risk and therefore determines the treatment planning. The dose verification is performed at points whose distance from the sources is at least twice the active source length, so that the sources may be regarded as point sources. Most sources used in HDR brachytherapy with 192Ir have a 5 mm active length for all equipment brands; consequently, the dose verification distance must be at least 10 mm. (author)
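The spreadsheet check described above amounts to an inverse-square point-source sum over dwell positions. A hedged numpy sketch: the geometry is deliberately simplified, the TG-43 radial dose and anisotropy functions are omitted, and the dose-rate constant value is only indicative:

```python
import numpy as np

def point_dose(calc_point, dwell_positions, dwell_times, sk, lam=1.109):
    """Independent dose check (cGy) at calc_point, treating each dwell
    position as a point source: D = Sk * Lambda * t / r^2, summed over
    dwells. Valid only when r is at least twice the active source
    length (>= 10 mm for a 5 mm source). Sk in U (cGy cm^2/h), dwell
    times in hours, coordinates in cm. TG-43 radial dose and anisotropy
    corrections are ignored in this simplified sketch."""
    p = np.asarray(calc_point, float)
    dose = 0.0
    for pos, t in zip(dwell_positions, dwell_times):
        r = np.linalg.norm(p - np.asarray(pos, float))
        if r < 1.0:  # 2 x active length of a 5 mm source, in cm
            raise ValueError("point-source approximation invalid below 10 mm")
        dose += sk * lam * t / r**2
    return dose
```

A full verification would multiply each term by the radial dose function g(r) and an anisotropy factor; at the distances used here those corrections are small, which is exactly why the point-source check works.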

  13. Leaf trajectory verification during dynamic intensity modulated radiotherapy using an amorphous silicon flat panel imager

    International Nuclear Information System (INIS)

    Sonke, Jan-Jakob; Ploeger, Lennert S.; Brand, Bob; Smitsmans, Monique H.P.; Herk, Marcel van

    2004-01-01

    An independent verification of the leaf trajectories during each treatment fraction improves the safety of IMRT delivery. In order to verify dynamic IMRT with an electronic portal imaging device (EPID), the EPID response should be accurate and fast, such that the effect of motion blurring on the detected moving field edge position is limited. In the past, it was shown that the errors in the detected position of a moving field edge determined by a scanning liquid-filled ionization chamber (SLIC) EPID are negligible in clinical practice. Furthermore, a method for leaf trajectory verification during dynamic IMRT was successfully applied using such an EPID. EPIDs based on amorphous silicon (a-Si) arrays are now widely available. Such a-Si flat panel imagers (FPIs) produce portal images with superior image quality compared to other portal imaging systems, but they have not yet been used for leaf trajectory verification during dynamic IMRT. The aim of this study is to quantify the effect of motion distortion and motion blurring on the detection accuracy of a moving field edge for an Elekta iViewGT a-Si FPI and to investigate its applicability to leaf trajectory verification during dynamic IMRT. We found the detection error for a moving field edge to be smaller than 0.025 cm at a speed of 0.8 cm/s. Hence, the effect of motion blurring on the detection accuracy of a moving field edge is negligible in clinical practice. Furthermore, the a-Si FPI was successfully applied to the verification of dynamic IMRT. The verification method revealed a delay in the control system of the experimental DMLC that was also found using a SLIC EPID, resulting in leaf positional errors of 0.7 cm at a leaf speed of 0.8 cm/s

  14. Verification of Treatment Planning System (TPS) on Beam Axis of Co-60 Teletherapy

    International Nuclear Information System (INIS)

    Nunung-Nuraeni; Budhy-Kurniawan; Purwanto; Sugiyantari; Heru-Prasetio; Nasukha

    2001-01-01

    Cancer can now be treated using surgery, chemotherapy and radiotherapy. A high level of precision and accuracy in the radiation dose is a very important requirement, and one related task is verification of the Treatment Planning System (TPS) used to treat patients. In this research, the TPS was verified on the beam axis of a Co-60 teletherapy unit. The differences between TPS calculations and measurements were found to be about -2.682% to 1.918% for simple geometry and homogeneous material, 5.278% to 4.990% for complex geometry, and -3.202% to -2.090% for more complex geometry. (author)

  15. High-accuracy mass spectrometry for fundamental studies.

    Science.gov (United States)

    Kluge, H-Jürgen

    2010-01-01

    Mass spectrometry for fundamental studies in metrology and atomic, nuclear and particle physics requires extreme sensitivity and efficiency as well as ultimate resolving power and accuracy. An overview is given of the global status of high-accuracy mass spectrometry for fundamental physics and metrology. Three quite different examples of modern mass spectrometric experiments in physics are presented: (i) the retardation spectrometer KATRIN at the Forschungszentrum Karlsruhe, employing electrostatic filtering in combination with magnetic-adiabatic collimation, the biggest mass spectrometer for determining the smallest mass, i.e. the mass of the electron anti-neutrino; (ii) the Experimental Cooler-Storage Ring at GSI, a mass spectrometer of medium size relative to other accelerators, for determining medium-heavy masses; and (iii) the Penning trap facility SHIPTRAP at GSI, the smallest mass spectrometer for determining the heaviest masses, those of super-heavy elements. Finally, a short view into the future addresses the HITRAP project at GSI for fundamental studies with highly charged ions.

  16. Accuracy Assessment for the Three-Dimensional Coordinates by High-Speed Videogrammetric Measurement

    Directory of Open Access Journals (Sweden)

    Xianglei Liu

    2018-01-01

    The high-speed CMOS camera is a new kind of transducer for videogrammetric measurement of the displacement of a high-speed shaking-table structure. The purpose of this paper is to validate the accuracy of the three-dimensional coordinates of the shaking-table structure acquired from the presented high-speed videogrammetric measuring system. All of the key intermediate links are discussed, including the high-speed CMOS videogrammetric measurement system, the layout of the control network, the elliptical target detection, and the accuracy validation of the final 3D spatial results. The accuracy analysis shows that submillimeter accuracy can be achieved for the final three-dimensional spatial coordinates, which certifies that the proposed high-speed videogrammetric technique is a viable alternative to traditional transducer techniques for monitoring the dynamic response of a shaking-table structure.

  17. Accuracy Analysis of a Box-wing Theoretical SRP Model

    Science.gov (United States)

    Wang, Xiaoya; Hu, Xiaogong; Zhao, Qunhe; Guo, Rui

    2016-07-01

    For the Beidou satellite navigation system (BDS), a high-accuracy solar radiation pressure (SRP) model is necessary for high-precision applications, especially with the global BDS constellation to be established in the future; the accuracy of the BDS broadcast ephemeris needs to be improved. Therefore, a box-wing theoretical SRP model with fine structure, including a conical shadow factor for the Earth and Moon, was established. We verified this SRP model using the GPS Block IIF satellites, with calculations done for the PRN 1, 24, 25 and 27 satellites. The results show that the physical SRP model has higher accuracy for precise orbit determination (POD) and orbit prediction of GPS IIF satellites than the Bern empirical model. The 3D RMS of the orbit is about 20 centimeters; the POD accuracy for both models is similar, but the prediction accuracy with the physical SRP model is more than doubled. We tested 1-day, 3-day and 7-day orbit predictions: the longer the prediction arc length, the more significant the improvement. The orbit prediction accuracies with the physical SRP model for 1-day, 3-day and 7-day arc lengths are 0.4 m, 2.0 m and 10.0 m respectively, versus 0.9 m, 5.5 m and 30 m with the Bern empirical model. We applied this approach to the BDS and derived an SRP model for the Beidou satellites, then tested and verified the model with one month of Beidou data. Initial results show the model is good but needs more data for verification and improvement. The orbit residual RMS is similar to that of our empirical force model, which only estimates forces in the along-track and across-track directions and a y-bias, but the orbit overlap and SLR observation evaluations show some improvement. The remaining empirical force is reduced significantly for the present Beidou constellation.

  18. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  19. A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables

    Science.gov (United States)

    Huang, Laura X.; Isaac, George A.; Sheng, Grant

    2014-01-01

    This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short-range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high-frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid spacing) were selected as the background gridded data for generating the integrated nowcasts. Seven forecast variables (temperature, relative humidity, wind speed, wind gust, visibility, ceiling and precipitation rate) are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models at 15 sites, the integrated weighted model was found to produce more accurate forecasts for the seven selected variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
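The multi-category Heidke skill score used for this verification can be computed directly from a forecast-observation contingency table; a minimal sketch (the example table in the test is invented):

```python
import numpy as np

def heidke_skill_score(table):
    """Multi-category Heidke skill score from an n x n contingency
    table (rows = forecast category, columns = observed category).

    HSS = (PC - E) / (1 - E), where PC is the proportion of correct
    forecasts (the table diagonal) and E the proportion expected by
    chance from the row and column marginals. HSS = 1 is a perfect
    forecast; HSS = 0 is no better than chance."""
    t = np.asarray(table, float)
    n = t.sum()
    pc = np.trace(t) / n
    expected = (t.sum(axis=1) * t.sum(axis=0)).sum() / n**2
    return (pc - expected) / (1.0 - expected)
```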

  20. Development of Genetic Markers for Triploid Verification of the Pacific Oyster,

    Directory of Open Access Journals (Sweden)

    Jung-Ha Kang

    2013-07-01

    The triploid Pacific oyster, produced by mating tetraploid and diploid oysters, is favored by the aquaculture industry because of its better flavor and firmer texture, particularly during the summer. However, tetraploid production is not yet feasible in all oysters, and the development of tetraploid lines is ongoing in some oyster species. Thus, a method for ploidy verification is necessary for this endeavor, as well as for ploidy verification in aquaculture farms and in the natural environment. In this study, a method for ploidy verification of triploid and diploid oysters was developed using multiplex polymerase chain reaction (PCR) panels containing primers for microsatellite markers. Two multiplex PCR panels, each consisting of three markers, were developed from previously published microsatellite markers that were optimized for performance. Both panels verified the ploidy levels of 30 triploid oysters with 100% accuracy, illustrating the utility of microsatellite markers as a tool for verifying the ploidy of individual oysters.

  1. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer with the acoustic method. A classification and systematization of the physical principles, detection methods and tests of partial discharge in power transformers is given at the beginning of the paper. The most important mathematical models, features, algorithms and practical problems that affect measurement accuracy are highlighted. The paper then describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. The verification and measurement accuracy are demonstrated both by computer simulation and by experimental results available in the literature.
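A common non-iterative formulation (an assumption here; the abstract does not disclose the paper's exact algorithm) uses the electric PD pulse as the time origin, so each acoustic arrival time gives a sensor-to-source range; subtracting one range equation from the others linearises the problem for least squares:

```python
import numpy as np

def locate_pd(sensors, times, v=1400.0):
    """Non-iterative acoustic partial-discharge location.

    With the electric PD pulse as time reference, each acoustic arrival
    time t_i gives a range r_i = v * t_i to sensor s_i. Expanding
    |x - s_i|^2 = r_i^2 and subtracting the first equation removes the
    quadratic |x|^2 term, leaving a linear system solved by least
    squares. v: assumed speed of sound in transformer oil (m/s);
    sensors: (n, 3) positions in metres; times in seconds; n >= 4."""
    s = np.asarray(sensors, float)
    r = v * np.asarray(times, float)
    # |x|^2 - 2 s_i.x + |s_i|^2 = r_i^2 ; row i minus row 0:
    A = 2.0 * (s[1:] - s[0])
    b = (r[0]**2 - r[1:]**2) + (np.sum(s[1:]**2, axis=1) - np.sum(s[0]**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With more than four sensors the least-squares solution also averages out small timing errors, which is one reason redundant sensor arrays improve measurement accuracy.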

  2. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS labeling, O and M drawing, and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or maintained, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work began to label tank farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single-shell tanks, i.e., the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for the drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by "smart" drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  3. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  4. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    Energy Technology Data Exchange (ETDEWEB)

    Ding, A; Han, B; Bush, K; Wang, L; Xing, L [Stanford University School of Medicine, Stanford, CA (United States)

    2015-06-15

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy: an independent 3D VMAT/SBRT plan verification system based on the combined use of an EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured, gantry-angle-resolved VMAT/SBRT field images were converted into fluence using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: The proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, good agreement within 1.5% was found for all the test fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: average γ-index passing rates of 99.2 ± 0.6% (3 mm/3%), 97.4 ± 2.4% (2 mm/2%), and 72.6 ± 8.4% (1 mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients.

  5. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    International Nuclear Information System (INIS)

    Ding, A; Han, B; Bush, K; Wang, L; Xing, L

    2015-01-01

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy: an independent 3D VMAT/SBRT plan verification system based on the combined use of an EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured, gantry-angle-resolved VMAT/SBRT field images were converted into fluence using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: The proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, good agreement within 1.5% was found for all the test fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: average γ-index passing rates of 99.2 ± 0.6% (3 mm/3%), 97.4 ± 2.4% (2 mm/2%), and 72.6 ± 8.4% (1 mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients
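The γ-index figures quoted above combine a dose-difference and a distance-to-agreement (DTA) criterion. A simplified 1-D global-gamma sketch (brute-force search; the profiles in the test are hypothetical) illustrates the passing-rate computation:

```python
import numpy as np

def gamma_pass_rate(pos_mm, d_ref, d_eval, dta_mm=3.0, dd_frac=0.03):
    """1-D global gamma index: for each reference point, minimise
    sqrt((dx/DTA)^2 + (dD / (dd_frac * Dmax))^2) over all evaluated
    points; a point passes when gamma <= 1. Positions in mm, doses in
    arbitrary units; returns the passing rate in percent."""
    pos_mm = np.asarray(pos_mm, float)
    d_ref = np.asarray(d_ref, float)
    d_eval = np.asarray(d_eval, float)
    dmax = d_ref.max()                       # global dose normalisation
    passed = []
    for x, d in zip(pos_mm, d_ref):
        dx = (pos_mm - x) / dta_mm
        dd = (d_eval - d) / (dd_frac * dmax)
        gamma = np.sqrt(dx**2 + dd**2).min()  # brute-force minimisation
        passed.append(gamma <= 1.0)
    return 100.0 * np.mean(passed)
```

Clinical tools use the same definition on 3-D grids with interpolation between sample points; the brute-force loop above is only meant to make the 3 mm/3% style criteria concrete.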

  6. Double-Difference Tomography for Sequestration MVA [monitoring, verification, and accounting

    Energy Technology Data Exchange (ETDEWEB)

    Westman, Erik [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States)

    2012-12-31

    Analysis of synthetic data was performed to determine the most cost-effective tomographic monitoring system for a geologic carbon sequestration injection site. Double-difference tomographic inversion was performed on 125 synthetic data sets: five stages of CO2 plume growth, five seismic event regions, and five geophone arrays. Each resulting velocity model was compared quantitatively to its respective synthetic velocity model to determine an accuracy value. The results were examined to determine the relationship between cost and accuracy in monitoring, verification, and accounting applications using double-difference tomography. The geophone arrays with widely varying geophone locations, both laterally and vertically, performed best. Additionally, double-difference seismic tomography was performed using travel-time data from a carbon sequestration site at the Aneth oil field in southeast Utah as part of a Department of Energy initiative on monitoring, verification, and accounting (MVA) of sequestered CO2. A total of 1,211 seismic events were recorded by a borehole array consisting of 22 geophones. Artificial velocity models were created to determine the ease with which different CO2 plume locations and sizes can be detected. Most likely because of the poor geophone arrangement, a low-velocity zone in the Desert Creek reservoir can be detected only when regions of the test site containing the highest ray-path coverage are considered. MVA accuracy and precision may be improved through the use of a receiver array that provides more comprehensive ray-path coverage.

  7. Development and verification of a high performance multi-group SP3 transport capability in the ARTEMIS core simulator

    International Nuclear Information System (INIS)

    Van Geemert, Rene

    2008-01-01

    For satisfaction of future global customer needs, dedicated efforts are being coordinated internationally and pursued continuously at AREVA NP. The currently ongoing CONVERGENCE project is committed to the development of the ARCADIA® next generation core simulation software package. ARCADIA® will be put to global use by all AREVA NP business regions, for the entire spectrum of core design processes, licensing computations and safety studies. As part of the currently ongoing trend towards more sophisticated neutronics methodologies, an SP3 nodal transport concept has been developed for ARTEMIS, which is the steady-state and transient core simulation part of ARCADIA®. To enable high computational performance, the SPN calculations are accelerated by applying multi-level coarse-mesh rebalancing. In the current implementation, SP3 is about 1.4 times as expensive computationally as SP1 (diffusion). The developed SP3 solution concept is foreseen as the future computational workhorse for many-group 3D pin-by-pin full-core computations by ARCADIA®. With the entire numerical workload being highly parallelizable through domain decomposition techniques, the associated CPU-time requirements that adhere to the efficiency needs of the nuclear industry can be expected to become feasible in the near future. The accuracy enhancement obtainable by using SP3 instead of SP1 has been verified by a detailed comparison of ARTEMIS 16-group pin-by-pin SPN results with KAERI's DeCART reference results for the 2D pin-by-pin Purdue UO2/MOX benchmark. This article presents the accuracy enhancement verification and quantifies the achieved ARTEMIS-SP3 computational performance for a number of 2D and 3D multi-group and multi-box (up to pin-by-pin) core computations. (authors)

  8. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    Energy Technology Data Exchange (ETDEWEB)

    Saotome, Naoya, E-mail: naosao@nirs.go.jp; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji [Department of Research Center for Charged Particle Therapy, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage-ku, Chiba 263-8555 (Japan)

    2016-04-15

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors’ facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference-of-Gaussian (DOG) method and the 80% distal dose of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and a DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors’ range check system is capable of quick and easy range verification with sufficient accuracy.

  9. Technical Note: Range verification system using edge detection method for a scintillator and a CCD camera system

    International Nuclear Information System (INIS)

    Saotome, Naoya; Furukawa, Takuji; Hara, Yousuke; Mizushima, Kota; Tansho, Ryohei; Saraya, Yuichi; Shirai, Toshiyuki; Noda, Koji

    2016-01-01

    Purpose: Three-dimensional irradiation with a scanned carbon-ion beam has been performed since 2011 at the authors’ facility. The authors have developed a rotating gantry equipped with the scanning irradiation system. The number of combinations of beam properties to measure for commissioning is more than 7200, i.e., 201 energy steps, 3 intensities, and 12 gantry angles. To compress the commissioning time, a quick and simple range verification system is required. In this work, the authors develop a quick range verification system using a scintillator and a charge-coupled device (CCD) camera and estimate the accuracy of the range verification. Methods: A cylindrical plastic scintillator block and a CCD camera were installed in a black box. The optical spatial resolution of the system is 0.2 mm/pixel. The camera control system was connected to and communicates with the measurement system that is part of the scanning system. The range was determined by image processing. The reference range for each beam energy was determined by a difference-of-Gaussian (DOG) method and the 80% distal dose of the depth-dose distribution measured by a large parallel-plate ionization chamber. The authors compared a threshold method and a DOG method. Results: The authors found that the edge detection method (i.e., the DOG method) is best for range detection. The accuracy of range detection using this system is within 0.2 mm, and the reproducibility of the same energy measurement is within 0.1 mm without setup error. Conclusions: The results of this study demonstrate that the authors’ range check system is capable of quick and easy range verification with sufficient accuracy.
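
The difference-of-Gaussian edge detection described in the Methods can be sketched on a 1-D depth-light profile: smooth the profile with two Gaussians of different width, subtract, and take the extremum of the band-pass response as the distal edge. This is a hedged illustration, not the authors' implementation; the profile, kernel widths, and function names are all invented:

```python
import math

def gaussian_kernel(sigma, radius):
    """Sampled, normalised 1-D Gaussian kernel."""
    k = [math.exp(-0.5 * (x / sigma) ** 2) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def smooth(profile, kernel):
    """Convolve with edge clamping so output length matches input length."""
    r = len(kernel) // 2
    n = len(profile)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), n - 1)
            acc += w * profile[idx]
        out.append(acc)
    return out

def dog_edge_position(profile, sigma_narrow=2.0, sigma_wide=4.0):
    """Index of the strongest DOG response; for a range profile this marks
    the distal fall-off. The DOG acts as a band-pass edge filter."""
    radius = int(3 * sigma_wide)
    narrow = smooth(profile, gaussian_kernel(sigma_narrow, radius))
    wide = smooth(profile, gaussian_kernel(sigma_wide, radius))
    dog = [a - b for a, b in zip(narrow, wide)]
    return max(range(len(dog)), key=lambda i: abs(dog[i]))

# Synthetic light-output profile: plateau, then a distal fall-off near pixel 60.
profile = [1.0] * 60 + [0.5] + [0.0] * 39
print(dog_edge_position(profile))
```

At 0.2 mm/pixel, a pixel-accurate edge estimate is already at the quoted 0.2 mm level; sub-pixel interpolation of the DOG extremum would refine it further.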

  10. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting, whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to Shutdown Systems No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  11. Discussion on verification criterion and method of human factors engineering for nuclear power plant controller

    International Nuclear Information System (INIS)

    Yang Hualong; Liu Yanzi; Jia Ming; Huang Weijun

    2014-01-01

    In order to prevent or reduce human error and ensure the safe operation of nuclear power plants, control devices should be verified from the perspective of human factors engineering (HFE). The domestic and international human factors engineering guidelines for nuclear power plant controllers were considered, the verification criteria and methods of human factors engineering for nuclear power plant controllers were discussed, and application examples were provided for reference in this paper. The results show that appropriate verification criteria and methods should be selected to ensure the objectivity and accuracy of the conclusion. (authors)

  12. Image Positioning Accuracy Analysis for Super Low Altitude Remote Sensing Satellites

    Directory of Open Access Journals (Sweden)

    Ming Xu

    2012-10-01

    Full Text Available Super low altitude remote sensing satellites maintain lower flight altitudes by means of ion propulsion in order to improve image resolution and positioning accuracy. The use of engineering data in design for achieving image positioning accuracy is discussed in this paper based on the principles of photogrammetry theory. The exact rebuilding of each detection element's line of sight, and the precise intersection of this direction with the Earth's ellipsoid at the moment the satellite's camera is imaging, are both ensured by the combined design of key parameters. These parameters include: orbit determination accuracy, attitude determination accuracy, camera exposure time, accurate synchronization of ephemeris and attitude data reception, geometric calibration and precise orbit verification. Precise simulation calculations show that the image positioning accuracy of super low altitude remote sensing satellites is not obviously improved. The attitude determination error of a satellite still restricts its positioning accuracy.
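
The "precise intersection with the Earth's ellipsoid" step reduces to solving a quadratic for the ray along the camera line of sight. A minimal sketch in an Earth-centred Cartesian frame using the WGS84 semi-axes; the function name and orbit altitude are illustrative, not from the paper:

```python
import math

# WGS84 semi-axes in metres.
A = 6378137.0
B = 6356752.314245

def los_ellipsoid_intersection(origin, direction):
    """Smallest positive t with (x/A)^2 + (y/A)^2 + (z/B)^2 = 1 along
    origin + t*direction, or None if the line of sight misses the Earth."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    a = (dx * dx + dy * dy) / A**2 + dz * dz / B**2
    b = 2.0 * ((ox * dx + oy * dy) / A**2 + oz * dz / B**2)
    c = (ox * ox + oy * oy) / A**2 + oz * oz / B**2 - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer root
    return t if t > 0.0 else None

# Satellite 268 km above the pole, looking straight down (a super-low orbit).
sat = (0.0, 0.0, B + 268e3)
t = los_ellipsoid_intersection(sat, (0.0, 0.0, -1.0))
print(round(t))  # slant distance to the sub-satellite ground point
```

Small attitude errors tilt `direction` and move the intersection point across the ground, which is why attitude determination error dominates the positioning budget.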

  13. Verification of Data Accuracy in Japan Congenital Cardiovascular Surgery Database Including Its Postprocedural Complication Reports.

    Science.gov (United States)

    Takahashi, Arata; Kumamaru, Hiraku; Tomotaki, Ai; Matsumura, Goki; Fukuchi, Eriko; Hirata, Yasutaka; Murakami, Arata; Hashimoto, Hideki; Ono, Minoru; Miyata, Hiroaki

    2018-03-01

    The Japan Congenital Cardiovascular Surgery Database (JCCVSD) is a nationwide registry whose data are used for health quality assessment and clinical research in Japan. We evaluated the completeness of case registration and the accuracy of recorded data components, including postprocedural mortality and complications, in the database via on-site data adjudication. We validated the records from JCCVSD 2010 to 2012 containing congenital cardiovascular surgery data performed in 111 facilities throughout Japan. We randomly chose nine facilities for site visits by the auditor team and conducted on-site data adjudication. We assessed whether the records in JCCVSD matched the data in the source materials. We identified 1,928 cases of eligible surgeries performed at the facilities, of which 1,910 were registered (99.1% completeness), with 6 cases of duplication and 1 inappropriate case registration. Data components including gender, age, and surgery time (hours) were highly accurate, with 98% to 100% concordance. Mortality at discharge and at 30 and 90 postoperative days was 100% accurate. Among the five complications studied, reoperation was the most frequently observed, with 16 and 21 cases recorded in the database and source materials, respectively, yielding a sensitivity of 0.67 and a specificity of 0.99. Validation of the JCCVSD database showed high registration completeness and high accuracy, especially in the categorical data components. Adjudicated mortality was 100% accurate. While limited in number, the recorded cases of postoperative complications all had high specificities but lower sensitivities (0.67-1.00). Continued activities for data quality improvement and assessment are necessary for optimizing the utility of these registries.
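
The sensitivity and specificity figures quoted above come from treating the on-site source documents as the reference standard and cross-tabulating them against the database flags. A generic sketch of that calculation; the toy counts below are illustrative and are not the JCCVSD audit data:

```python
def agreement_stats(db_flags, source_flags):
    """Sensitivity and specificity of database complication flags,
    with the on-site source documents as the reference standard."""
    tp = sum(1 for d, s in zip(db_flags, source_flags) if d and s)
    fn = sum(1 for d, s in zip(db_flags, source_flags) if not d and s)
    tn = sum(1 for d, s in zip(db_flags, source_flags) if not d and not s)
    fp = sum(1 for d, s in zip(db_flags, source_flags) if d and not s)
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

# Toy audit: 21 true reoperations in the source records, 14 of them flagged
# in the database, plus 4 spurious database flags among 979 negatives.
source = [True] * 21 + [False] * 979
db     = [True] * 14 + [False] * 7 + [False] * 975 + [True] * 4
sens, spec = agreement_stats(db, source)
print(round(sens, 2), round(spec, 3))  # → 0.67 0.996
```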

  14. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
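
The effectiveness of statistical sampling of in-core fuel elements can be made concrete with the standard attribute-sampling calculation: the probability that a random sample of n items out of N contains at least one defective (diverted or substituted) item. A sketch under assumed numbers, not the authors' inspection model:

```python
from math import comb

def detection_probability(population, defects, sample_size):
    """Hypergeometric probability that a random sample contains at least
    one defective item: 1 - C(N-D, n) / C(N, n)."""
    clean = comb(population - defects, sample_size)
    total = comb(population, sample_size)
    return 1.0 - clean / total

def sample_size_for(population, defects, confidence):
    """Smallest sample size giving at least the requested detection probability."""
    for n in range(population + 1):
        if detection_probability(population, defects, n) >= confidence:
            return n
    return population

# Illustrative case: 500 fuel elements in core, 25 assumed substituted;
# how many must the inspector verify for 95% detection probability?
print(sample_size_for(500, 25, 0.95))
```

Combining such a sample with integral reactivity measurements, as the abstract notes, raises detection probability beyond what either technique achieves alone.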

  15. Fast and High Accuracy Wire Scanner

    CERN Document Server

    Koujili, M; Koopman, J; Ramos, D; Sapinski, M; De Freitas, J; Ait Amira, Y; Djerdir, A

    2009-01-01

    Scanning of a high-intensity particle beam imposes challenging requirements on a wire scanner system. It is expected to reach a scanning speed of 20 m·s⁻¹ with a position accuracy of the order of 1 μm. In addition, a timing accuracy better than 1 millisecond is needed. The adopted solution consists of a fork holding a wire rotating by a maximum of 200°. Fork, rotor and angular position sensor are mounted on the same axis and located in a chamber connected to the beam vacuum. The requirements imply the design of a system with extremely low vibration, vacuum compatibility, and radiation and temperature tolerance. The adopted drive consists of a rotary brushless synchronous motor with the permanent-magnet rotor installed inside the vacuum chamber and the stator installed outside. The accurate position sensor will be mounted on the rotary shaft inside the vacuum chamber, and has to resist a bake-out temperature of 200 °C and ionizing radiation up to a dozen kGy/year. A digital feedback controller allows maxi...

  16. Real-Time Verification of a High-Dose-Rate Iridium 192 Source Position Using a Modified C-Arm Fluoroscope

    Energy Technology Data Exchange (ETDEWEB)

    Nose, Takayuki, E-mail: nose-takayuki@nms.ac.jp [Department of Radiation Oncology, Nippon Medical School Tamanagayama Hospital, Tama (Japan); Chatani, Masashi [Department of Radiation Oncology, Osaka Rosai Hospital, Sakai (Japan); Otani, Yuki [Department of Radiology, Kaizuka City Hospital, Kaizuka (Japan); Teshima, Teruki [Department of Radiation Oncology, Osaka Medical Center for Cancer and Cardiovascular Diseases, Osaka (Japan); Kumita, Shinichirou [Department of Radiology, Nippon Medical School Hospital, Tokyo (Japan)

    2017-03-15

    Purpose: High-dose-rate (HDR) brachytherapy misdeliveries can occur at any institution, and they can cause disastrous results. Even a patient's death has been reported. Misdeliveries could be avoided with real-time verification methods. In 1996, we developed a modified C-arm fluoroscopic verification of an HDR Iridium 192 source position to prevent these misdeliveries. This method provided excellent image quality sufficient to detect errors, and it has been in clinical use at our institutions for 20 years. The purpose of the current study is to introduce the mechanisms and validity of our straightforward C-arm fluoroscopic verification method. Methods and Materials: Conventional X-ray fluoroscopic images are degraded by spurious signals and quantum noise from Iridium 192 photons, which make source verification impractical. To improve image quality, we quadrupled the C-arm fluoroscopic X-ray dose per pulse. The pulse rate was reduced by a factor of 4 to keep the average exposure compliant with Japanese medical regulations. The images were then displayed at quarter frame rates. Results: Sufficient quality was obtained to enable observation of the source position relative to both the applicators and the anatomy. With this method, 2 errors were detected among 2031 treatment sessions for 370 patients within a 6-year period. Conclusions: With the use of a modified C-arm fluoroscopic verification method, treatment errors that were otherwise overlooked were detected in real time. This method should be given consideration for widespread use.

  17. Evaluation factors for verification and validation of low-level waste disposal site models

    International Nuclear Information System (INIS)

    Moran, M.S.; Mezga, L.J.

    1982-01-01

    The purpose of this paper is to identify general evaluation factors to be used to verify and validate LLW disposal site performance models in order to assess their site-specific applicability and to determine their accuracy and sensitivity. It is intended that the information contained in this paper be employed by model users involved with LLW site performance model verification and validation. It should not be construed as providing protocols, but rather as providing a framework for the preparation of specific protocols or procedures. A brief description of each evaluation factor is provided. The factors have been categorized according to recommended use during either the model verification or the model validation process. The general responsibilities of the developer and user are provided. In many cases it is difficult to separate the responsibilities of the developer and user, but the user is ultimately accountable for both verification and validation processes. 4 refs

  18. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is generally painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  19. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630–670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters a verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  20. Dosimetric verification of the dynamic intensity modulated radiotherapy (IMR) of 21 patients

    International Nuclear Information System (INIS)

    Tsai, J.-S.; Engler, Mark J.; Ling, Marilyn N.; Wu, Julian; Kramer, Bradley; Fagundes, Marcio; Dipetrillo, Thomas; Wazer, David E.

    1996-01-01

    exposed to known doses, and a high-speed, 300 dots/inch scanner driven by Photoshop software. Film data were analyzed with NIH Image software. Absolute dose verification was achieved with TLD in the anthropomorphic phantom and with diodes and ion chambers in calibration phantom slabs. The phantom setup closely simulated the patient's CT and treatment setups. Interslab spaces for films and phantom position were chosen to best sample conformity of the tumor prescription dose and compliance of the maximum measured dose in normal tissues with doses entered as constraints. Verifications applied to commission the system consisted of annealing the cost function for simulated targets in the anthropomorphic phantom and then comparing planned with measured doses. Subsequently, a 'hybrid' verification was performed where the beam set obtained from patient geometry was detached from the patient anatomic files and applied to calculate doses in the phantoms, followed by a comparison of measured with planned doses. Phantom slabs and positions were carefully selected to obtain an average TMR to the gantry isocenter in the phantom within 2% of the average within the patient. In vivo dosimetry was obtained with TLD under 1 cm of bolus at the location of the maximum skin surface dose. Plans were reoptimized including the contour of the added bolus to improve the accuracy of the measurement. The average leakage dose was assumed to be 0.4% of the total monitor units of the treatment. Results: Verification of planned dose distributions simulated in phantom indicates agreement of planned with measured doses within ±5% throughout numerous transverse-plane films of 18 of 21 patients. In three patients with unusually large and complex-shaped tumors, planned monitor units were altered to compensate for verifications indicating up to 10% differences between planned and measured doses. TLD in the phantom indicated improved agreement of absolute dose within ±5%. 
However, the accuracy of initial 'hybrid' verifications of patient

  1. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  2. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed with Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
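
An independent check of the kind the abstract describes reduces, for a simple open isocentric field, to dividing the prescribed dose by a product of tabulated beam factors and comparing against the TPS monitor units. The sketch below is a generic TG-114-style calculation with invented beam data; it is not the authors' Excel algorithm:

```python
def independent_mu_sad(prescribed_dose_cgy, dose_rate_cgy_per_mu,
                       output_factor, tmr, wedge_factor=1.0, tray_factor=1.0):
    """Independent MU estimate for an SAD (isocentric) field:
    MU = D / (D0 * Sc,p * TMR * WF * TF), with all factors looked up from
    commissioned beam-data tables at the equivalent square field size."""
    return prescribed_dose_cgy / (
        dose_rate_cgy_per_mu * output_factor * tmr * wedge_factor * tray_factor)

def percent_difference(mu_tps, mu_check):
    """Agreement metric between TPS and independent-check MUs."""
    return 100.0 * (mu_tps - mu_check) / mu_check

# Hypothetical beam data: 1 cGy/MU at reference conditions, 10x10 cm field,
# 200 cGy per fraction at 10 cm depth (TMR 0.78, output factor 1.000).
mu_check = independent_mu_sad(200.0, 1.0, 1.000, 0.78)
print(round(mu_check, 1), round(percent_difference(259.0, mu_check), 2))  # → 256.4 1.01
```

A disagreement beyond the action level (often a few percent for simple geometries) would trigger review of the plan before treatment.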

  3. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
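
Two accuracy metrics commonly used in this context (and defined in ASHRAE Guideline 14) are the coefficient of variation of the RMSE and the normalised mean bias error. The paper does not list its exact metrics, so the following is a generic sketch:

```python
import math

def cv_rmse(measured, predicted):
    """CV(RMSE) in percent: RMSE of the predictions relative to the mean
    measured consumption."""
    n = len(measured)
    mean = sum(measured) / n
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    return 100.0 * rmse / mean

def nmbe(measured, predicted):
    """Normalised mean bias error in percent; the sign reveals systematic
    over- or under-prediction by the baseline model."""
    n = len(measured)
    mean = sum(measured) / n
    return 100.0 * sum(m - p for m, p in zip(measured, predicted)) / (n * mean)

# Toy daily-kWh series: the model under-predicts by a steady 2 kWh.
measured  = [100.0, 110.0, 90.0, 105.0, 95.0]
predicted = [ 98.0, 108.0, 88.0, 103.0, 93.0]
print(round(cv_rmse(measured, predicted), 2), round(nmbe(measured, predicted), 2))  # → 2.0 2.0
```

A model can score well on CV(RMSE) yet poorly on NMBE (or vice versa), which is exactly the metric disagreement the abstract reports.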

  4. High-accuracy measurements of the normal specular reflectance

    International Nuclear Information System (INIS)

    Voarino, Philippe; Piombini, Herve; Sabary, Frederic; Marteau, Daniel; Dubard, Jimmy; Hameury, Jacques; Filtz, Jean Remy

    2008-01-01

    The French Laser Megajoule (LMJ) is designed and constructed by the French Commissariat à l'Énergie Atomique (CEA). Its amplifying section needs highly reflective multilayer mirrors for the flash lamps. To monitor and improve the coating process, the reflectors have to be characterized to high accuracy. The described spectrophotometer is designed to measure normal specular reflectance with high repeatability by using a small spot size of 100 μm. Results are compared with ellipsometric measurements. The instrument can also perform spatial characterization to detect coating nonuniformity

  5. A Self-Instructional Course in Student Financial Aid Administration. Module 13: Verification. Second Edition.

    Science.gov (United States)

    Washington Consulting Group, Inc., Washington, DC.

    Module 13 of the 17-module self-instructional course on student financial aid administration (designed for novice financial aid administrators and other institutional personnel) focuses on the verification procedure for checking the accuracy of applicant data used in making financial aid awards. The full course provides an introduction to the…

  6. Verification measurements of the IRMM-1027 and the IAEA large-sized dried (LSD) spikes

    International Nuclear Information System (INIS)

    Jakopic, R.; Aregbe, Y.; Richter, S.

    2017-01-01

    In the frame of the accountancy measurements of fissile materials, reliable determination of the plutonium and uranium content in spent nuclear fuel is required to comply with international safeguards agreements. Large-sized dried (LSD) spikes of enriched 235U and 239Pu for isotope dilution mass spectrometry (IDMS) analysis are routinely applied in reprocessing plants for this purpose. A correct characterisation of these elements is a prerequisite for achieving high accuracy in IDMS analyses. This paper will present the results of external verification measurements of such LSD spikes performed by the European Commission and the International Atomic Energy Agency. (author)
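
The IDMS principle behind LSD spiking reduces to the single-ratio equation: the sample's content of the reference isotope follows from the known spike amount and the measured isotope ratio of the blend. A sketch with invented numbers, where the spike amount is expressed in moles of the reference isotope:

```python
def idms_amount(spike_amount_mol, r_spike, r_sample, r_mix):
    """Classic single-ratio IDMS. With R = n(tracer)/n(reference isotope),
    the sample's reference-isotope amount is

        n_sample = n_spike * (R_spike - R_mix) / (R_mix - R_sample)
    """
    return spike_amount_mol * (r_spike - r_mix) / (r_mix - r_sample)

# Invented example: 1.0 mol of reference isotope in the spike, spike ratio 10,
# natural sample ratio 0.00725, measured blend ratio 1.0.
n_sample = idms_amount(1.0, 10.0, 0.00725, 1.0)
print(round(n_sample, 4))
```

The accuracy of the result is bounded by how well the spike is characterised, which is precisely why independent verification of the LSD spikes matters.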

  7. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  8. A high accuracy land use/cover retrieval system

    Directory of Open Access Journals (Sweden)

    Alaa Hefnawy

    2012-03-01

    Full Text Available The effects of spatial resolution on the accuracy of mapping land use/cover types have received increasing attention as a large number of multi-scale earth observation data become available. Although many methods of semi-automated image classification of remotely sensed data have been established to improve the accuracy of land use/cover classification during the past 40 years, most of them were employed in single-resolution image classification, which led to unsatisfactory results. In this paper, we propose a multi-resolution, fast, adaptive content-based retrieval system for satellite images. Through our proposed system, we apply a super-resolution technique to the Landsat-TM images to obtain a high-resolution dataset. The human–computer interactive system is based on a modified radial basis function for retrieval of satellite database images. We apply a backpropagation supervised artificial neural network classifier to both the multi- and single-resolution datasets. The results show significantly improved land use/cover classification accuracy for the multi-resolution approach compared with the single-resolution approach.

  9. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The necessity of verifying software products throughout the software life cycle is considered. Concepts of verification, software verification planning, and verification methodologies for products generated throughout the software life cycle are then discussed

  10. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  11. Example of material accounting and verification of reprocessing input

    International Nuclear Information System (INIS)

    Koch, L.; Schoof, S.

    1981-01-01

    This paper describes an example of material accounting at the reprocessing input point. Knowledge of the fuel history and chemical analyses of the spent fuel permitted testing of concepts developed for the determination of the input by the operator and for its verification by nuclear material safeguards, with the intention of detecting a protracted as well as an abrupt diversion. Accuracies obtained for a material balance of a PWR fuel reprocessing campaign are given. 6 refs
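The accounting logic behind such a campaign can be sketched as a simple material balance with propagated measurement uncertainty (the figures below are invented for illustration, not the campaign's actual accuracies):

```python
import math

# Hypothetical plutonium balance for one reprocessing campaign (kg; illustrative only).
# MUF = beginning inventory + receipts - shipments - ending inventory
beginning, receipts, shipments, ending = 12.0, 150.0, 148.5, 13.2
sigmas = [0.05, 0.60, 0.55, 0.06]   # 1-sigma measurement uncertainties, kg

muf = beginning + receipts - shipments - ending
sigma_muf = math.sqrt(sum(s * s for s in sigmas))   # independent errors add in quadrature

# A diversion would be flagged if |MUF| exceeded roughly 3 sigma.
significant = abs(muf) > 3 * sigma_muf
print(round(muf, 3), round(sigma_muf, 3), significant)
```

The quadrature sum makes explicit why input-measurement accuracy dominates the detectability of both abrupt and protracted diversion.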

  12. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal, e.g. criteria governing the selection of the best place to perform the verification measurements. Options for the verification place have been considered in this report. One option for a verification measurement place is the intermediate storage; the other option is the encapsulation plant. Crucial considerations are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  13. High accuracy satellite drag model (HASDM)

    Science.gov (United States)

    Storz, Mark F.; Bowman, Bruce R.; Branson, Major James I.; Casali, Stephen J.; Tobiska, W. Kent

    The dominant error source in force models used to predict low-perigee satellite trajectories is atmospheric drag. Errors in operational thermospheric density models cause significant errors in predicted satellite positions, since these models do not account for dynamic changes in atmospheric drag for orbit predictions. The Air Force Space Battlelab's High Accuracy Satellite Drag Model (HASDM) estimates and predicts (out to three days) a dynamically varying global density field. HASDM includes the Dynamic Calibration Atmosphere (DCA) algorithm, which solves for the phases and amplitudes of the diurnal and semidiurnal variations of thermospheric density in near real-time from the observed drag effects on a set of Low Earth Orbit (LEO) calibration satellites. The density correction is expressed as a function of latitude, local solar time and altitude. In HASDM, a time series prediction filter relates the extreme ultraviolet (EUV) energy index E10.7 and the geomagnetic storm index ap to the DCA density correction parameters. The E10.7 index is generated by the SOLAR2000 model, the first full-spectrum model of solar irradiance. The estimated and predicted density fields will be used operationally to significantly improve the accuracy of predicted trajectories for all low-perigee satellites.
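The DCA-style harmonic solve can be sketched as an ordinary least-squares fit of diurnal and semidiurnal terms (all values below are synthetic; the real algorithm estimates these corrections from drag data on calibration satellites, not from direct density samples):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic density-correction samples vs local solar time (hours):
# a diurnal wave peaking at 14 h plus a semidiurnal wave, with noise.
t = rng.uniform(0, 24, 200)
omega = 2 * np.pi / 24.0
rho = (1.0 + 0.3 * np.cos(omega * (t - 14.0))
           + 0.1 * np.cos(2 * omega * (t - 3.0))
           + rng.normal(0, 0.01, t.size))

# Linear model: rho = c0 + a1 cos(wt) + b1 sin(wt) + a2 cos(2wt) + b2 sin(2wt)
A = np.column_stack([np.ones_like(t),
                     np.cos(omega * t), np.sin(omega * t),
                     np.cos(2 * omega * t), np.sin(2 * omega * t)])
c0, a1, b1, a2, b2 = np.linalg.lstsq(A, rho, rcond=None)[0]

amp1 = np.hypot(a1, b1)                           # recovered diurnal amplitude
phase1 = (np.arctan2(b1, a1) / omega) % 24        # hour of diurnal maximum
amp2 = np.hypot(a2, b2)                           # recovered semidiurnal amplitude
print(round(amp1, 3), round(phase1, 2), round(amp2, 3))
```

The fit recovers the planted amplitudes and the 14 h diurnal maximum, which is the essence of solving for "phases and amplitudes" of the harmonic variations.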

  14. Verification of possible asymmetry of polarization of thermal neutrons reflected by a mirror

    International Nuclear Information System (INIS)

    Okorokov, A.I.; Runov, V.V.; Gukasov, A.G.; Shchebetov, A.F.

    1976-01-01

    Experiments with a polarizing neutron guide do not confirm the neutron polarization asymmetry observed previously by Berndorfer for neutrons traversing a polarizing neutron guide. In connection with the spin-orbit effects, a verification is carried out for single reflection of neutrons by magnetic or nonmagnetic mirrors. To within an accuracy of 10^-4 to 10^-3, no polarization asymmetry is observed

  15. A New Three-Dimensional High-Accuracy Automatic Alignment System For Single-Mode Fibers

    Science.gov (United States)

    Yun-jiang, Rao; Shang-lian, Huang; Ping, Li; Yu-mei, Wen; Jun, Tang

    1990-02-01

    In order to achieve low-loss splices of single-mode fibers, a new three-dimensional high-accuracy automatic alignment system for single-mode fibers has been developed. It includes a new type of three-dimensional high-resolution microdisplacement servo stage driven by piezoelectric elements, a new high-accuracy measurement system for the misalignment error of the fiber core axis, and a special single-chip microcomputer processing system. The experimental results show that an alignment accuracy of ±0.1 μm with a movable stroke of ±20 μm has been obtained. This new system has more advantages than those previously reported.

  16. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  17. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    Energy Technology Data Exchange (ETDEWEB)

    Ashcraft, C. Chace [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Niederhaus, John Henry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robinson, Allen C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-29

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
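The manufactured-solution idea behind such a convergence demonstration can be sketched on a 1-D model problem (a generic Poisson example, not ALEGRA's magnetic-diffusion equations): pick an exact solution, derive its source term, and confirm the discrete operator converges at second order.

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with f chosen so u_exact = sin(pi x)."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi ** 2 * np.sin(np.pi * x[1:-1])          # manufactured source term
    # Tridiagonal system from the standard 3-point stencil.
    A = (np.diag(np.full(n - 1, 2.0)) +
         np.diag(np.full(n - 2, -1.0), 1) +
         np.diag(np.full(n - 2, -1.0), -1)) / h ** 2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))  # max-norm error

e1, e2 = solve_poisson(20), solve_poisson(40)
order = np.log2(e1 / e2)   # observed convergence order under mesh halving
print(round(order, 2))
```

Halving the mesh spacing cuts the error by a factor near four, i.e. an observed order near two, which is the pattern the verification analysis looks for.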

  18. An ontology based trust verification of software license agreement

    Science.gov (United States)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When installing or downloading software, users are presented with a very large document stating rights and obligations, which many do not have the patience to read or understand. This can make users distrust the software. In this paper, we propose an ontology-based verification for software license agreements. First, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The license ontology can act as part of a generalized copyright-law knowledge model, and can also serve as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it improves the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  19. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures for reference and 5 signatures each from the original user, simple impostors and trained impostors as test signatures. The final system was tested with 50 participants and 3 references. This test showed that system accuracy without impostors is 90.45% at threshold 44, with a rejection error (FNMR of 5.2% and an acceptance error (FMR of 4.35%; with impostors, system accuracy is 80.14% at threshold 27, with a rejection error (FNMR of 15.6% and an average acceptance error (FMR of 4.26%, with details as follows: acceptance error 0.39%, acceptance error for simple impostors 3.2% and acceptance error for trained impostors 9.2%.
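The matching step can be sketched with a textbook dynamic-time-warping distance (the sequences and threshold below are invented for illustration; the paper's features come from real pen trajectories):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D feature sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical pen-pressure sequences: a reference, a genuine retry
# (same shape, slightly warped in time), and a forgery with different dynamics.
t = np.linspace(0, 1, 50)
reference = np.sin(2 * np.pi * t)
genuine = np.sin(2 * np.pi * (t - 0.05))
forgery = 0.4 * np.sin(6 * np.pi * t)

threshold = 5.0   # in practice tuned on enrollment data, as with the paper's thresholds
print(dtw_distance(reference, genuine) < threshold,
      dtw_distance(reference, forgery) < threshold)
```

Because DTW aligns sequences elastically in time, a genuine signature written slightly faster or slower still scores close to the reference, while a differently shaped forgery does not.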

  20. Evaluation of Gafchromic EBT-XD film, with comparison to EBT3 film, and application in high dose radiotherapy verification

    Science.gov (United States)

    Palmer, Antony L.; Dimitriadis, Alexis; Nisbet, Andrew; Clark, Catharine H.

    2015-11-01

    There is renewed interest in film dosimetry for the verification of dose delivery of complex treatments, particularly small fields, compared to treatment planning system calculations. A new radiochromic film, Gafchromic EBT-XD, is available for high-dose treatment verification and we present the first published evaluation of its use. We evaluate the new film for MV photon dosimetry, including calibration curves, performance with single- and triple-channel dosimetry, and comparison to existing EBT3 film. In the verification of a typical 25 Gy stereotactic radiotherapy (SRS) treatment, compared to TPS planned dose distribution, excellent agreement was seen with EBT-XD using triple-channel dosimetry, in isodose overlay, maximum 1.0 mm difference over 200-2400 cGy, and gamma evaluation, mean passing rate 97% at 3% locally-normalised, 1.5 mm criteria. In comparison to EBT3, EBT-XD gave improved evaluation results for the SRS-plan, had improved calibration curve gradients at high doses, and had reduced lateral scanner effect. The dimensions of the two films are identical. The optical density of EBT-XD is lower than EBT3 for the same dose. The effective atomic number for both may be considered water-equivalent in MV radiotherapy. We have validated the use of EBT-XD for high-dose, small-field radiotherapy, for routine QC and a forthcoming multi-centre SRS dosimetry intercomparison.

  1. Evaluation of Gafchromic EBT-XD film, with comparison to EBT3 film, and application in high dose radiotherapy verification

    International Nuclear Information System (INIS)

    Palmer, Antony L; Dimitriadis, Alexis; Nisbet, Andrew; Clark, Catharine H

    2015-01-01

    There is renewed interest in film dosimetry for the verification of dose delivery of complex treatments, particularly small fields, compared to treatment planning system calculations. A new radiochromic film, Gafchromic EBT-XD, is available for high-dose treatment verification and we present the first published evaluation of its use. We evaluate the new film for MV photon dosimetry, including calibration curves, performance with single- and triple-channel dosimetry, and comparison to existing EBT3 film. In the verification of a typical 25 Gy stereotactic radiotherapy (SRS) treatment, compared to TPS planned dose distribution, excellent agreement was seen with EBT-XD using triple-channel dosimetry, in isodose overlay, maximum 1.0 mm difference over 200–2400 cGy, and gamma evaluation, mean passing rate 97% at 3% locally-normalised, 1.5 mm criteria. In comparison to EBT3, EBT-XD gave improved evaluation results for the SRS-plan, had improved calibration curve gradients at high doses, and had reduced lateral scanner effect. The dimensions of the two films are identical. The optical density of EBT-XD is lower than EBT3 for the same dose. The effective atomic number for both may be considered water-equivalent in MV radiotherapy. We have validated the use of EBT-XD for high-dose, small-field radiotherapy, for routine QC and a forthcoming multi-centre SRS dosimetry intercomparison. (paper)

  2. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is the task of checking whether a given quantum state is close to an ideal state. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP circuits. Importantly, we do not assume that independent and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.
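The flavor of stabilizer-based verification can be sketched by checking a 3-qubit GHZ state against its stabilizer generators (a dense-vector toy computation, not the paper's measurement-only adaptive protocol):

```python
import numpy as np

# Single-qubit Pauli operators.
I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def kron(*ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# |GHZ> = (|000> + |111>) / sqrt(2)
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1.0 / np.sqrt(2)

# GHZ stabilizer generators: XXX, ZZI, IZZ. The ideal state is the unique
# common +1 eigenstate, so each expectation value should equal +1.
stabilizers = [kron(X, X, X), kron(Z, Z, I), kron(I, Z, Z)]
expectations = [float(ghz @ S @ ghz) for S in stabilizers]
print(expectations)
```

A verifier who only ever measures these Pauli strings on single qubits can certify closeness to the GHZ state, which is the basic mechanism the paper generalizes to ground states and hypergraph states.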

  3. Adaptive sensor-based ultra-high accuracy solar concentrator tracker

    Science.gov (United States)

    Brinkley, Jordyn; Hassanzadeh, Ali

    2017-09-01

    Conventional solar trackers use information on the sun's position, obtained either by direct sensing or by GPS. Our method uses the shading of the receiver. This, coupled with a nonimaging optics design, allows us to achieve ultra-high concentration. Incorporating a sensor-based shadow-tracking method with a two-stage concentration solar hybrid parabolic trough allows the system to maintain high concentration with acute accuracy.

  4. High accuracy digital aging monitor based on PLL-VCO circuit

    International Nuclear Information System (INIS)

    Zhang Yuejun; Jiang Zhidi; Wang Pengjun; Zhang Xuelong

    2015-01-01

    As the manufacturing process is scaled down to the nanoscale, the aging phenomenon significantly affects the reliability and lifetime of integrated circuits. Consequently, precise measurement of digital CMOS aging is a key aspect of nanoscale aging-tolerant circuit design. This paper proposes a high-accuracy digital aging monitor using a phase-locked loop and voltage-controlled oscillator (PLL-VCO) circuit. The proposed monitor eliminates the circuit self-aging effect thanks to a characteristic of the PLL, whose frequency is unaffected by the circuit aging phenomenon. The PLL-VCO monitor is implemented in TSMC low-power 65 nm CMOS technology, and its area occupies 303.28 × 298.94 μm². After accelerated aging tests, the experimental results show that the PLL-VCO monitor improves accuracy by 2.4% under high temperature and by 18.7% under high voltage. (semiconductor integrated circuits)

  5. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of the FMCT verification provision. This paper will explore the general concerns on FMCT verification and demonstrate what verification measures might be applied to reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non-explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of FMCT verification will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  6. WE-EF-303-06: Feasibility of PET Image-Based On-Line Proton Beam-Range Verification with Simulated Uniform Phantom and Human Brain Studies

    International Nuclear Information System (INIS)

    Lou, K; Sun, X; Zhu, X; Grosshans, D; Clark, J; Shao, Y

    2015-01-01

    Purpose: To study the feasibility of clinical on-line proton beam range verification with PET imaging. Methods: We simulated a 179.2-MeV proton beam with 5-mm diameter irradiating a PMMA phantom of human brain size, which was then imaged by a brain PET with a 300 × 300 × 100 mm³ FOV and different system sensitivities and spatial resolutions. We calculated the mean and standard deviation of the positron activity range (AR) from reconstructed PET images, with respect to different data acquisition times (from 5 sec to 300 sec in 5-sec steps). We also developed a technique, "Smoothed Maximum Value" (SMV), to improve AR measurement under a given dose. Furthermore, we simulated a human brain irradiated by a 110-MeV proton beam of 50-mm diameter with 0.3-Gy dose at the Bragg peak and imaged by the above PET system with 40% system sensitivity at the center of the FOV and 1.7-mm spatial resolution. Results: MC simulations on the PMMA phantom showed that, regardless of PET system sensitivities and spatial resolutions, the accuracy and precision of AR were proportional to the reciprocal of the square root of the image count if image smoothing was not applied. With image smoothing or the SMV method, the accuracy and precision could be substantially improved. For a cylindrical PMMA phantom (200 mm diameter and 290 mm long), the accuracy and precision of AR measurement could reach 1.0 and 1.7 mm with 100 sec of data acquired by the brain PET. The study with a human brain showed it is feasible to achieve sub-millimeter accuracy and precision of AR measurement with an acquisition time within 60 sec. Conclusion: This study established the relationship between count statistics and the accuracy and precision of activity-range verification. It showed the feasibility of clinical on-line beam-range verification with high-performance PET systems and improved AR measurement techniques. Cancer Prevention and Research Institute of Texas grant RP120326, NIH grant R21CA187717, The Cancer Center Support (Core) Grant CA016672
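The reported scaling (precision proportional to the reciprocal of the square root of the image count) can be sketched with a toy Monte Carlo; the Gaussian activity profile below is an illustrative stand-in for a reconstructed PET activity distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimated_range_std(n_counts, trials=500):
    """Std of a range estimate over repeated acquisitions with n_counts events.

    Toy model: detected event positions ~ N(100 mm, 5 mm); the "activity range"
    estimate for one acquisition is the mean detected position.
    """
    estimates = [rng.normal(100.0, 5.0, n_counts).mean() for _ in range(trials)]
    return float(np.std(estimates))

s1, s2 = estimated_range_std(100), estimated_range_std(400)
print(round(s1 / s2, 2))   # quadrupling counts roughly halves the spread
```

The ratio is close to 2, the 1/sqrt(N) behavior the study quantified: longer acquisitions (more counts) tighten the activity-range estimate predictably.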

  7. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  8. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  9. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed.

  10. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features are described of the IAEA safeguards verification system that non-nuclear weapon states parties of the NPT are obliged to accept. Verification activities/problems in Iraq and North Korea are discussed

  11. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Harpeneau, Evan M. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as-left radiological conditions.

  12. A proposal for limited criminal liability in high-accuracy endoscopic sinus surgery.

    Science.gov (United States)

    Voultsos, P; Casini, M; Ricci, G; Tambone, V; Midolo, E; Spagnolo, A G

    2017-02-01

    The aim of the present study is to propose legal reform limiting surgeons' criminal liability in high-accuracy and high-risk surgery such as endoscopic sinus surgery (ESS). The study includes a review of the medical literature, focusing on identifying and examining reasons why ESS carries a very high risk of serious complications related to inaccurate surgical manoeuvres, and a review of British and Italian legal theory and case-law on medical negligence, especially with regard to Italian Law 189/2012 (the so-called "Balduzzi" Law). It was found that serious complications due to inaccurate surgical manoeuvres may occur in ESS regardless of the skill, experience and prudence/diligence of the surgeon. Subjectivity should be essential to medical negligence, especially regarding high-accuracy surgery. Italian Law 189/2012 represents a good basis for the limitation of criminal liability resulting from inaccurate manoeuvres in high-accuracy surgery such as ESS. It is concluded that ESS surgeons should be relieved of criminal liability in cases of simple/ordinary negligence where guidelines have been observed. © Copyright by Società Italiana di Otorinolaringologia e Chirurgia Cervico-Facciale, Rome, Italy.

  13. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    Science.gov (United States)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
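A core ingredient of the grid-resolution studies mentioned here is the observed order of accuracy; a minimal sketch with synthetic second-order results (the values below are invented, not CFD output):

```python
import math

# A quantity computed on three systematically refined grids. Here we synthesize
# perfectly second-order data: f(h) = f_exact + C * h^2 (illustrative only).
f_exact, C = 1.2345, 0.8
f = [f_exact + C * h ** 2 for h in (0.04, 0.02, 0.01)]   # coarse, medium, fine

r = 2.0                                                  # grid refinement ratio
p = math.log((f[0] - f[1]) / (f[1] - f[2])) / math.log(r)  # observed order
f_rich = f[2] + (f[2] - f[1]) / (r ** p - 1)             # Richardson extrapolation
print(round(p, 3), round(f_rich, 4))
```

When successive grid differences shrink by the expected factor, the observed order matches the formal order and the extrapolated value approaches the grid-independent result; deviations signal implementation or consistency problems of the kind the database aims to expose.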

  14. High current high accuracy IGBT pulse generator

    International Nuclear Information System (INIS)

    Nesterov, V.V.; Donaldson, A.R.

    1995-05-01

    A solid state pulse generator capable of delivering high-current triangular or trapezoidal pulses into an inductive load has been developed at SLAC. Energy stored in a capacitor bank of the pulse generator is switched to the load through a pair of insulated gate bipolar transistors (IGBTs). The circuit can then recover the remaining energy and transfer it back to the capacitor bank without reversing the capacitor voltage. A third IGBT device is employed to control the initial charge to the capacitor bank, a command charging technique, and to compensate for pulse-to-pulse power losses. The rack-mounted pulse generator contains a 525 μF capacitor bank. It can deliver 500 A at 900 V into inductive loads up to 3 mH. The current amplitude and discharge time are controlled to 0.02% accuracy by a precision controller through the SLAC central computer system. This pulse generator drives a series pair of extraction dipoles
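The quoted ratings imply the ramp timescale through V = L dI/dt; a small sketch using the record's numbers (illustrative arithmetic that treats the bank voltage as constant, which the real circuit only approximates):

```python
# Ratings quoted in the record.
L = 3e-3         # maximum load inductance, H
I_peak = 500.0   # peak current, A
V_bank = 900.0   # capacitor-bank voltage, V
C_bank = 525e-6  # capacitor bank, F

dI_dt = V_bank / L                    # current slew rate from V = L * dI/dt, A/s
ramp_time = I_peak / dI_dt            # time to ramp a triangular pulse to peak, s
E_bank = 0.5 * C_bank * V_bank ** 2   # energy initially stored in the bank, J
print(f"ramp {ramp_time * 1e3:.2f} ms, bank energy {E_bank:.1f} J")
```

The millisecond-scale ramp and the few-hundred-joule bank energy explain why energy recovery back into the capacitor bank, rather than dissipation, is central to the design.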

  15. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from system thinking, supported by systems engineering techniques. We propose a multi-level multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front-end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for the performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  16. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies
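The quantitative comparison described here can be sketched with a relative root-mean-square metric (the head profiles below are hypothetical illustrations, not FLASH output):

```python
import numpy as np

def relative_rms(simulated, reference):
    """RMS of the difference, normalized by the RMS of the reference solution."""
    simulated = np.asarray(simulated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((simulated - reference) ** 2))
                 / np.sqrt(np.mean(reference ** 2)))

# Hypothetical hydraulic heads (m) along a 1-D profile: an analytic solution
# and a code result to be compared against it.
head_analytic = [10.0, 9.5, 9.0, 8.6, 8.3]
head_code = [10.0, 9.4, 9.1, 8.5, 8.3]
print(round(relative_rms(head_code, head_analytic), 4))
```

A small relative RMS (here well under 1%) is the kind of quantitative agreement the verification tests report, while the validation tests rely on this together with graphical comparison against experimental data.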

  17. High-accuracy determination for optical indicatrix rotation in ferroelectric DTGS

    OpenAIRE

    O.S.Kushnir; O.A.Bevz; O.G.Vlokh

    2000-01-01

    Optical indicatrix rotation in deuterated ferroelectric triglycine sulphate is studied with the high-accuracy null-polarimetric technique. The behaviour of the effect in ferroelectric phase is referred to quadratic spontaneous electrooptics.

  18. Achieving High Accuracy in Calculations of NMR Parameters

    DEFF Research Database (Denmark)

    Faber, Rasmus

    quantum chemical methods have been developed, the calculation of NMR parameters with quantitative accuracy is far from trivial. In this thesis I address some of the issues that make accurate calculation of NMR parameters so challenging, with the main focus on SSCCs. High accuracy quantum chemical......, but no programs were available to perform such calculations. As part of this thesis the CFOUR program has therefore been extended to allow the calculation of SSCCs using the CC3 method. CC3 calculations of SSCCs have then been performed for several molecules, including some difficult cases. These results show...... vibrations must be included. The calculation of vibrational corrections to NMR parameters has been reviewed as part of this thesis. A study of the basis set convergence of vibrational corrections to nuclear shielding constants has also been performed. The basis set error in vibrational correction...

  19. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  20. PET/CT imaging for treatment verification after proton therapy: a study with plastic phantoms and metallic implants.

    Science.gov (United States)

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B; Bonab, Ali A; Alpert, Nathaniel M; Lohmann, Kevin; Bortfeld, Thomas

    2007-02-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in-vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium-oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of

  1. Groundwater flow code verification ''benchmarking'' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  2. Standard artifact for the geometric verification of terrestrial laser scanning systems

    Science.gov (United States)

    González-Jorge, H.; Riveiro, B.; Armesto, J.; Arias, P.

    2011-10-01

    Terrestrial laser scanners are geodetic instruments with applications in areas such as architecture, civil engineering and the environment. Although it is common to receive the technical specifications of the systems from their manufacturers, no data verification solutions are available on the market for users. This work proposes a standard artifact and a methodology to perform, in a simple way, the metrological verification of laser scanners. The artifact is manufactured from aluminium and delrin, materials that make it robust and portable. The system consists of a set of five spheres situated at equal distances from one another, and a set of seven cubes of different sizes. A coordinate measuring machine with sub-millimetre precision is used for calibration purposes under controlled environmental conditions. After its calibration, the artifact can be used for the verification of the metrological specifications given by manufacturers of laser scanners. The elements of the artifact are designed to test different metrological characteristics, such as accuracy, precision and resolution. The distance between the centres of the spheres is used to obtain the accuracy data, the standard deviation of the top face of the largest cube is used to establish the precision (repeatability), and the error in the measurement of the cubes provides the resolution value in the X, Y and Z axes. The evaluation methodology is mainly supported by least squares fitting algorithms developed in Matlab. The artifact and methodology proposed were tested using a Riegl LMSZ-390i terrestrial laser scanner at three different ranges (10, 30 and 50 m) and four stepwidths (0.002°, 0.005°, 0.010° and 0.020°), both for horizontal and vertical displacements. The results obtained are in agreement with the accuracy and precision data given by the manufacturer, 6 and 4 mm, respectively. On the other hand, important influences between resolution and range and between resolution and
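
The sphere-centre estimation underlying the accuracy test (distances between sphere centres) relies on least squares fitting; a minimal algebraic sphere fit can be sketched in Python as follows (an illustration, not the authors' Matlab code):

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit.

    Rewrites |p|^2 = 2 c.p + (r^2 - |c|^2) as a linear system in
    (cx, cy, cz, r^2 - |c|^2), then recovers the radius.
    """
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic "scan" of a sphere centred at (1, 2, 3) with radius 0.5
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, np.pi, 200)
phi = rng.uniform(0.0, 2.0 * np.pi, 200)
pts = np.c_[1.0 + 0.5 * np.sin(theta) * np.cos(phi),
            2.0 + 0.5 * np.sin(theta) * np.sin(phi),
            3.0 + 0.5 * np.cos(theta)]
center, radius = fit_sphere(pts)
print(center, radius)  # ~[1, 2, 3], ~0.5
```

With noisy scanner data, the residuals of such fits also feed the precision (repeatability) figure quoted above.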

  3. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  4. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  5. SU-E-J-34: Setup Accuracy in Spine SBRT Using CBCT 6D Image Guidance in Comparison with 6D ExacTrac

    Energy Technology Data Exchange (ETDEWEB)

    Han, Z; Yip, S; Lewis, J; Mannarino, E; Friesen, S; Wagar, M; Hacker, F [Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA (United States)

    2015-06-15

    Purpose: Volumetric information about the spine captured on CBCT can potentially improve the accuracy of spine SBRT setup, which has commonly been performed using 2D radiographs. This work evaluates the setup accuracy in spine SBRT using the 6D CBCT image guidance that recently became available on Varian systems. Methods: ExacTrac radiographs have commonly been used for spine SBRT setup. The setup process involves first positioning patients with lasers, followed by localization imaging, registration, and repositioning. Verification images are then taken, providing the residual errors (ExacTracRE) before beam on. CBCT verification is also acquired at our institution. The availability of both ExacTrac and CBCT verifications allows a comparison study. 41 verification CBCTs of 16 patients were retrospectively registered with the planning CT enabling 6D corrections, giving CBCT residual errors (CBCTRE), which were compared with ExacTracRE. Results: The RMS discrepancies between CBCTRE and ExacTracRE are 1.70 mm, 1.66 mm, 1.56 mm in the vertical, longitudinal and lateral directions and 0.27°, 0.49°, 0.35° in yaw, roll and pitch, respectively. The corresponding mean discrepancies (and standard deviations) are 0.62 mm (1.60 mm), 0.00 mm (1.68 mm), −0.80 mm (1.36 mm) and 0.05° (0.58°), 0.11° (0.48°), −0.16° (0.32°). Of the 41 CBCTs, 17 had high-Z surgical implants. No significant difference in ExacTrac-to-CBCT discrepancy was observed between patients with and without the implants. Conclusion: Multiple factors can contribute to the discrepancies between CBCT and ExacTrac: 1) the imaging iso-centers of the two systems, while calibrated to coincide, can be different; 2) the ROI used for registration can be different, especially if ribs were included in ExacTrac images; 3) small patient motion can occur between the two verification image acquisitions; 4) the algorithms differ between CBCT (volumetric) and ExacTrac (radiographic) registration.
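
Per-axis RMS and mean (± standard deviation) discrepancies of the kind reported above reduce to elementary statistics on paired residuals; a sketch (the function name and sample residuals are hypothetical):

```python
import math

def discrepancy_stats(cbct_re, exactrac_re):
    """RMS, mean and (population) standard deviation of the per-fraction
    discrepancy between CBCT and ExacTrac residual errors on one axis."""
    d = [c - e for c, e in zip(cbct_re, exactrac_re)]
    n = len(d)
    mean = sum(d) / n
    rms = math.sqrt(sum(x * x for x in d) / n)
    std = math.sqrt(sum((x - mean) ** 2 for x in d) / n)
    return rms, mean, std

# Hypothetical vertical residuals (mm) for three fractions
cbct_vertical = [1.2, -0.5, 0.8]
exactrac_vertical = [0.4, -1.0, 0.2]
print(discrepancy_stats(cbct_vertical, exactrac_vertical))
```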

  6. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  7. High-Accuracy Spherical Near-Field Measurements for Satellite Antenna Testing

    DEFF Research Database (Denmark)

    Breinbjerg, Olav

    2017-01-01

    The spherical near-field antenna measurement technique is unique in combining several distinct advantages and it generally constitutes the most accurate technique for experimental characterization of radiation from antennas. From the outset in 1970, spherical near-field antenna measurements have...... matured into a well-established technique that is widely used for testing antennas for many wireless applications. In particular, for high-accuracy applications, such as remote sensing satellite missions in ESA's Earth Observation Programme with uncertainty requirements at the level of 0.05 dB - 0.10 dB, the spherical near-field antenna measurement technique is generally superior. This paper addresses the means to achieving high measurement accuracy; these include the measurement technique per se, its implementation in terms of proper measurement procedures, the use of uncertainty estimates, as well as facility...

  8. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    International Nuclear Information System (INIS)

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a compatible systems approach, reinforcing the need to use compatible components in user-friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search covered present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums.

  9. High Accuracy Piezoelectric Kinemometer; Cinemometro piezoelectrico de alta exactitud (VUAE)

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez Martinez, F. J.; Frutos, J. de; Pastor, C.; Vazquez Rodriguez, M.

    2012-07-01

    We have developed a portable, computerized, low-consumption system called the High Accuracy Piezoelectric Kinemometer (herein VUAE). The high accuracy achieved by the VUAE makes it suitable for providing reference measurements for systems that measure vehicle speed; it could therefore be used as reference equipment to estimate the error of installed kinemometers. The VUAE was built with n (n=2) pairs of ultrasonic transmitter-receivers (herein E-Rult). The transmitters of the n E-Rult pairs generate n ultrasonic barriers, and the receivers capture the echoes when a vehicle crosses the barriers. Digital processing of the echo signals yields usable signals; cross-correlation techniques then make a highly exact estimation of the vehicle's speed possible. The log of the interception times and the distance between each of the n ultrasonic barriers allow a highly exact estimation of the vehicle's speed. VUAE speed measurements were compared to a speed reference system based on piezoelectric cables. (Author) 11 refs.

  10. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Science.gov (United States)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET
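
Range comparisons of this kind are commonly made at a reference point on the distal falloff of the depth-activity profile; a minimal sketch of extracting such a point (the 50% level and sample profile are illustrative, not the study's analysis code):

```python
def distal_falloff_position(depths, activity, level=0.5):
    """Depth at which the activity drops to `level` of its maximum on the
    distal (far) side of the profile, using linear interpolation."""
    peak = max(activity)
    thresh = level * peak
    i_peak = activity.index(peak)
    for i in range(i_peak, len(activity) - 1):
        if activity[i] >= thresh > activity[i + 1]:
            # interpolate between samples i and i+1
            frac = (activity[i] - thresh) / (activity[i] - activity[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    return None  # profile never crosses the threshold distally

depths = [0.0, 1.0, 2.0, 3.0, 4.0]                # cm
activity = [1.0, 2.0, 4.0, 2.0, 0.0]              # arbitrary units
print(distal_falloff_position(depths, activity))  # 3.0
```

A range discrepancy between measurement and simulation is then the difference between the two falloff positions.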

  11. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Knopf, A; Paganetti, H; Cascio, E; Bortfeld, T [Department of Radiation Oncology, MGH and Harvard Medical School, Boston, MA 02114 (United States); Parodi, K [Heidelberg Ion Therapy Center, Heidelberg (Germany); Bonab, A [Department of Radiology, MGH and Harvard Medical School, Boston, MA 02114 (United States)

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm.
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the

  12. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    Science.gov (United States)

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. 
However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the

  13. Virtual reality verification of workplace design guidelines for the process plant control room

    International Nuclear Information System (INIS)

    Droeivoldsmo, Asgeir; Nystad, Espen; Helgar, Stein

    2001-02-01

    Early identification of potential human factors guideline violations, with corrective input into the design process, is desired for efficient and cost-effective control room design. Virtual reality (VR) technology makes it possible to evaluate the design of the control room at an early stage of the design process, but can we trust the results from such evaluations? This paper describes an experimental validation of a VR model against the real world in five different guideline verification tasks. Results indicate that guideline verification in the VR model can be done with satisfactory accuracy for a number of evaluations. However, some guideline categories require further development of measurement tools and use of a model with higher resolution than the model used in this study. (Author). 30 refs., 4 figs., 1 tab

  14. [A Quality Assurance (QA) System with a Web Camera for High-dose-rate Brachytherapy].

    Science.gov (United States)

    Hirose, Asako; Ueda, Yoshihiro; Oohira, Shingo; Isono, Masaru; Tsujii, Katsutomo; Inui, Shouki; Masaoka, Akira; Taniguchi, Makoto; Miyazaki, Masayoshi; Teshima, Teruki

    2016-03-01

    A quality assurance (QA) system that simultaneously quantifies the position and duration of an (192)Ir source (dwell position and time) was developed, and its performance was evaluated in high-dose-rate brachytherapy. This QA system has two functions: to verify and to quantify dwell position and time by using a web camera. The web camera records 30 images per second over a range from 1,425 mm to 1,505 mm. A user verifies the source position from the web camera in real time. The source position and duration were quantified from the recorded movie using in-house software based on a template-matching technique. This QA system allowed verification of the absolute position in real time and simultaneous quantification of dwell position and time. Verification of the system showed that the mean step-size error was 0.31±0.1 mm and the mean dwell-time error 0.1±0.0 s. Absolute position errors can be determined with an accuracy of 1.0 mm at all dwell points for three step sizes, and dwell-time errors with an accuracy of 0.1% for planned times longer than 10.0 s. This system provides quick verification and quantification of dwell position and time with high accuracy at various dwell positions, independent of the step size.
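
The template-matching step can be illustrated with a 1-D normalized cross-correlation search (a generic sketch; the abstract does not describe the in-house software beyond "template matching"):

```python
import math

def best_match(signal, template):
    """Slide `template` over `signal` and return (offset, score) of the
    window with the highest normalized cross-correlation."""
    m = len(template)
    t_mean = sum(template) / m
    t0 = [t - t_mean for t in template]
    t_norm = math.sqrt(sum(t * t for t in t0))
    best_off, best_score = 0, -2.0
    for off in range(len(signal) - m + 1):
        win = signal[off:off + m]
        w_mean = sum(win) / m
        w0 = [w - w_mean for w in win]
        w_norm = math.sqrt(sum(w * w for w in w0))
        if w_norm == 0.0:
            continue  # flat window: correlation undefined
        score = sum(a * b for a, b in zip(w0, t0)) / (w_norm * t_norm)
        if score > best_score:
            best_off, best_score = off, score
    return best_off, best_score

# A bright-source template located inside a 1-D intensity profile
template = [0.0, 1.0, 3.0, 1.0, 0.0]
signal = [0.0, 0.1, 0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.1]
off, score = best_match(signal, template)
print(off, score)  # offset 3, score close to 1.0
```

In a camera frame, the winning offset maps to the source's absolute position through the pixel-to-millimetre calibration.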

  15. A quality assurance (QA) system with a web camera for high-dose-rate brachytherapy

    International Nuclear Information System (INIS)

    Hirose, Asako; Ueda, Yoshihiro; Ohira, Shingo

    2016-01-01

    A quality assurance (QA) system that simultaneously quantifies the position and duration of a 192Ir source (dwell position and time) was developed, and its performance was evaluated in high-dose-rate brachytherapy. This QA system has two functions: to verify and to quantify dwell position and time by using a web camera. The web camera records 30 images per second over a range from 1,425 mm to 1,505 mm. A user verifies the source position from the web camera in real time. The source position and duration were quantified from the recorded movie using in-house software based on a template-matching technique. This QA system allowed verification of the absolute position in real time and simultaneous quantification of dwell position and time. Verification of the system showed that the mean step-size error was 0.3±0.1 mm and the mean dwell-time error 0.1±0.0 s. Absolute position errors can be determined with an accuracy of 1.0 mm at all dwell points for three step sizes, and dwell-time errors with an accuracy of 0.1% for planned times longer than 10.0 s. This system provides quick verification and quantification of dwell position and time with high accuracy at various dwell positions, independent of the step size. (author)

  16. MR image-guided portal verification for brain treatment field

    International Nuclear Information System (INIS)

    Yin Fangfang; Gao Qinghuai; Xie Huchen; Nelson, Diana F.; Yu Yan; Kwok, W. Edmund; Totterman, Saara; Schell, Michael C.; Rubin, Philip

    1998-01-01

    features in both DRR-MRI and portal image. Moreover, target volume could be accurately visualized in the DRR-MRI and mapped over to the corresponding portal image for treatment verification. The accuracy of DRR-MRI was also examined by comparing it to the corresponding simulation image. The matching results indicated that the maximum deviation of anatomical features was less than 2.5 mm. Conclusion: A method for MR image-guided portal verification of brain treatment field was developed. Although the radiographic appearance in the DRR-MRI is different from that in the portal image, DRR-MRI provides essential anatomical features (landmarks and target volume) as well as their relative locations to be used as references for computerized portal verification

  17. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
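
For fixed-length feature vectors, the likelihood ratio can be computed directly once user and background densities are modelled; the sketch below assumes independent Gaussian features (an illustrative model choice, not necessarily the paper's):

```python
import math

def log_likelihood_ratio(x, user_mean, user_var, bg_mean, bg_var):
    """Per-feature log N(x; user) - log N(x; background), summed.

    Accept the identity claim when the result exceeds a threshold chosen
    for the desired trade-off between false accepts and false rejects.
    """
    def log_gauss(v, mu, var):
        return -0.5 * (math.log(2.0 * math.pi * var) + (v - mu) ** 2 / var)
    return sum(log_gauss(v, mu, vu) - log_gauss(v, mb, vb)
               for v, mu, vu, mb, vb
               in zip(x, user_mean, user_var, bg_mean, bg_var))

# A sample near the user's template scores higher than an impostor sample
llr_genuine = log_likelihood_ratio([0.9, 1.1], [1.0, 1.0], [0.1, 0.1],
                                   [0.0, 0.0], [1.0, 1.0])
llr_impostor = log_likelihood_ratio([0.1, -0.2], [1.0, 1.0], [0.1, 0.1],
                                    [0.0, 0.0], [1.0, 1.0])
print(llr_genuine > llr_impostor)  # True
```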

  18. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  19. Production of plastic scintillation survey meter for clearance verification measurement

    International Nuclear Information System (INIS)

    Tachibana, Mitsuo; Shiraishi, Kunio; Ishigami, Tsutomu; Tomii, Hiroyuki

    2008-03-01

    In the Nuclear Science Research Institute, the decommissioning of various nuclear facilities is carried out according to the plan for meeting the midterm goal of the Japan Atomic Energy Agency (JAEA). An increase in the clearance verification measurement of concrete in buildings and in the radiation measurement for releasing controlled areas is expected along with the dismantlement of nuclear facilities in the future. The radiation measurement for releasing controlled areas has been carried out in small-scale nuclear facilities including the JPDR (Japan Power Demonstration Reactor). However, radiation measurement with an existing measuring device was difficult owing to the effects of radiation from radioactive materials remaining in buried piping. On the other hand, the JAEA has no experience in performing clearance verification measurement. The generation of a large amount of clearance objects is expected along with the decommissioning of nuclear facilities in the future. A plastic scintillation survey meter (hereafter, 'PL measuring device') was produced for application to clearance verification measurement and to the radiation measurement for releasing controlled areas. A basic characteristic test and an actual test were performed using the PL measuring device. As a result of these tests, it was found that the radioactivity evaluated with the PL measuring device was of accuracy equal to that of the existing measuring device. The PL measuring device offers the features of the existing measuring device together with light weight and easy operability, and it can also measure gamma rays. The PL measuring device is effective for the clearance verification measurement of concrete in buildings and the radiation measurement for releasing controlled areas. (author)

  20. Integration of PET-CT and cone-beam CT for image-guided radiotherapy with high image quality and registration accuracy

    Science.gov (United States)

    Wu, T.-H.; Liang, C.-H.; Wu, J.-K.; Lien, C.-Y.; Yang, B.-H.; Huang, Y.-H.; Lee, J. J. S.

    2009-07-01

    Hybrid positron emission tomography-computed tomography (PET-CT) enhances the differentiation of tissue uptake of 18F-fluorodeoxyglucose (18F-FDG) and provides much more diagnostic value in non-small-cell lung cancer and nasopharyngeal carcinoma (NPC). In PET-CT, high quality CT images not only offer diagnostic value in anatomic delineation of the tissues but also shorten the acquisition time for attenuation correction (AC) compared with PET-alone imaging. Linear accelerators equipped with an X-ray cone-beam computed tomography (CBCT) imaging system for image-guided radiotherapy (IGRT) provide excellent verification of patient setup error. The purposes of our study were to optimize the CT acquisition protocols of PET-CT and to integrate PET-CT and CBCT for IGRT. The CT imaging parameters in PET-CT were modified to increase image quality and thereby enhance the diagnostic value of tumour delineation. Reproducibility and registration accuracy via a bone co-registration algorithm between PET-CT and CBCT were evaluated using a head phantom to simulate a head and neck treatment condition. Dose, as the computed tomography dose index (CTDI), was also estimated. Optimization of the CT acquisition protocols of PET-CT was feasible in this study. Co-registration accuracy between CBCT and PET-CT on axial and helical modes was in the range of 1.06 to 2.08 mm and 0.99 to 2.05 mm, respectively; co-registration with CBCT on helical mode was more accurate than on axial mode. Radiation doses in CTDI were 4.76 to 18.5 mGy and 4.83 to 18.79 mGy on axial and helical modes, respectively. Registration between PET-CT and CBCT is a state-of-the-art registration technology which could provide much information for diagnosis and accurate tumour contouring in radiotherapy while implementing radiotherapy procedures. This novel technology of PET-CT and cone-beam CT integration for IGRT may have a
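
    Co-registration accuracy in millimetres, as reported above, is commonly quantified as the mean residual distance between corresponding fiducial points after registration. A minimal sketch of that calculation (the point coordinates are hypothetical, not data from this study):

```python
import math

def mean_registration_error(points_a, points_b):
    """Mean Euclidean distance between corresponding fiducial points
    of two co-registered image sets (units follow the inputs, e.g. mm)."""
    dists = [math.dist(p, q) for p, q in zip(points_a, points_b)]
    return sum(dists) / len(dists)

# two fiducials, each displaced by 1 mm after registration
err = mean_registration_error([(0, 0, 0), (10, 0, 0)],
                              [(1, 0, 0), (10, 0, 1)])  # -> 1.0 mm
```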

  1. Integration of PET-CT and cone-beam CT for image-guided radiotherapy with high image quality and registration accuracy

    International Nuclear Information System (INIS)

    Wu, T-H; Liang, C-H; Wu, J-K; Lien, C-Y; Yang, B-H; Lee, J J S; Huang, Y-H

    2009-01-01

    Hybrid positron emission tomography-computed tomography (PET-CT) enhances the differentiation of tissue uptake of 18F-fluorodeoxyglucose (18F-FDG) and provides much more diagnostic value in non-small-cell lung cancer and nasopharyngeal carcinoma (NPC). In PET-CT, high quality CT images not only offer diagnostic value in anatomic delineation of the tissues but also shorten the acquisition time for attenuation correction (AC) compared with PET-alone imaging. Linear accelerators equipped with an X-ray cone-beam computed tomography (CBCT) imaging system for image-guided radiotherapy (IGRT) provide excellent verification of patient setup error. The purposes of our study were to optimize the CT acquisition protocols of PET-CT and to integrate PET-CT and CBCT for IGRT. The CT imaging parameters in PET-CT were modified to increase image quality and thereby enhance the diagnostic value of tumour delineation. Reproducibility and registration accuracy via a bone co-registration algorithm between PET-CT and CBCT were evaluated using a head phantom to simulate a head and neck treatment condition. Dose, as the computed tomography dose index (CTDI), was also estimated. Optimization of the CT acquisition protocols of PET-CT was feasible in this study. Co-registration accuracy between CBCT and PET-CT on axial and helical modes was in the range of 1.06 to 2.08 mm and 0.99 to 2.05 mm, respectively; co-registration with CBCT on helical mode was more accurate than on axial mode. Radiation doses in CTDI were 4.76 to 18.5 mGy and 4.83 to 18.79 mGy on axial and helical modes, respectively. Registration between PET-CT and CBCT is a state-of-the-art registration technology which could provide much information for diagnosis and accurate tumour contouring in radiotherapy while implementing radiotherapy procedures. This novel technology of PET-CT and cone-beam CT integration for IGRT may have a

  2. Why is a high accuracy needed in dosimetry

    International Nuclear Information System (INIS)

    Lanzl, L.H.

    1976-01-01

    Dose and exposure intercomparisons on a national or international basis have become an important component of quality assurance in the practice of good radiotherapy. A high degree of accuracy of γ and x radiation dosimetry is essential in our international society, where medical information is so readily exchanged and used. The value of accurate dosimetry lies mainly in the avoidance of complications in normal tissue and an optimal degree of tumor control

  3. Quantitative dosimetric verification of an IMRT planning and delivery system

    International Nuclear Information System (INIS)

    Low, D.A.; Mutic, S.; Dempsey, J.F.; Gerber, R.L.; Bosch, W.R.; Perez, C.A.; Purdy, J.A.

    1998-01-01

    Background and purpose: The accuracy of dose calculation and delivery of a commercial serial tomotherapy treatment planning and delivery system (Peacock, NOMOS Corporation) was experimentally determined. Materials and methods: External beam fluence distributions were optimized and delivered to test treatment plan target volumes, including three with cylindrical targets with diameters ranging from 2.0 to 6.2 cm and lengths of 0.9 through 4.8 cm, one using three cylindrical targets and two using C-shaped targets surrounding a critical structure, each with different dose distribution optimization criteria. Computer overlays of film-measured and calculated planar dose distributions were used to assess the dose calculation and delivery spatial accuracy. A 0.125 cm³ ionization chamber was used to conduct absolute point dosimetry verification. Thermoluminescent dosimetry chips, a small-volume ionization chamber and radiochromic film were used as independent checks of the ion chamber measurements. Results: Spatial localization accuracy was found to be better than ±2.0 mm in the transverse axes (with one exception of 3.0 mm) and ±1.5 mm in the longitudinal axis. Dosimetric verification using single slice delivery versions of the plans showed that the relative dose distribution was accurate to ±2% within and outside the target volumes (in high dose and low dose gradient regions) with a mean and standard deviation for all points of -0.05% and 1.1%, respectively. The absolute dose per monitor unit was found to vary by ±3.5% of the mean value due to the lack of consideration for leakage radiation and the limited scattered radiation integration in the dose calculation algorithm. To deliver the prescribed dose, adjustment of the monitor units by the measured ratio would be required. Conclusions: The treatment planning and delivery system offered suitably accurate spatial registration and dose delivery of serial tomotherapy generated dose distributions. The quantitative dose
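
    The relative-dose agreement statistics quoted above (mean -0.05%, standard deviation 1.1%) are point-by-point percent differences between measured and calculated planar doses. A minimal sketch, normalising the differences to the prescription dose (one common convention; the paper's exact normalisation is an assumption):

```python
import numpy as np

def dose_difference_stats(measured, calculated, prescription_dose):
    """Mean and standard deviation of the percent dose differences
    (relative to the prescription dose) between a measured and a
    calculated planar dose map."""
    measured = np.asarray(measured, dtype=float)
    calculated = np.asarray(calculated, dtype=float)
    diff_pct = 100.0 * (calculated - measured) / prescription_dose
    return diff_pct.mean(), diff_pct.std()

# toy example: the calculated map is uniformly 1% hot
meas = np.full((4, 4), 2.0)          # Gy
calc = meas * 1.01
mean_pct, std_pct = dose_difference_stats(meas, calc, prescription_dose=2.0)
```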

  4. Present scenery of cuban legislation in the field of legal verification of dosimetric instruments used in radiological protection

    International Nuclear Information System (INIS)

    Salas G, Walwyn; Morales Monzon, J.A.; Hernandez Blanche, E.

    2001-01-01

    The main objective of legal metrology is to guarantee public safety and the suitable accuracy of the measurements made in health, environmental and trade applications. The International Organization of Legal Metrology has included the ionizing radiation field among those for which the use of verified measuring instruments is recommended. The paper presents the advances of Cuban legislation in this field, promoted by the issue of Decree-Law 183 on Metrology. As part of these advances, the Cuban verification standard NC 44:1999, 'X and Gamma Radiation Measuring Instruments. Verification Methods', is discussed. This standard was elaborated in the Cuban Secondary Standard Dosimetry Laboratory and is based on the relevant available international standards. Results from the verification service during the year 2000 are also provided. (author)

  5. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  6. Innovative Fiber-Optic Gyroscopes (FOGs) for High Accuracy Space Applications, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to develop a compact, highly innovative Inertial Reference/Measurement Unit (IRU/IMU) that pushes the state-of-the-art in high accuracy performance...

  7. High accuracy acoustic relative humidity measurement in duct flow with air.

    Science.gov (United States)

    van Schaik, Wilhelm; Grooten, Mart; Wernaart, Twan; van der Geld, Cees

    2010-01-01

    An acoustic relative humidity sensor for air-steam mixtures in duct flow is designed and tested. Theory, construction, calibration, considerations on dynamic response and results are presented. The measurement device is capable of measuring line-averaged values of gas velocity, temperature and relative humidity (RH) instantaneously, by applying two ultrasonic transducers and an array of four temperature sensors. Measurement ranges are: gas velocity of 0-12 m/s with an error of ±0.13 m/s, temperature 0-100 °C with an error of ±0.07 °C and relative humidity 0-100% with accuracy better than 2% RH above 50 °C. The main advantage over conventional humidity sensors is the high sensitivity at high RH at temperatures exceeding 50 °C, with accuracy increasing with increasing temperature. The sensors are non-intrusive and resist highly humid environments.
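
    The line-averaged gas velocity and sound speed come from contrapropagating ultrasonic transit times. As an illustration, the standard transit-time relations for a path of length L aligned with the flow (not necessarily the authors' exact equations) can be sketched as:

```python
def transit_times_to_c_and_v(L, t_down, t_up):
    """Path-averaged sound speed c and axial gas velocity v from the
    downstream and upstream ultrasonic transit times over a path of
    length L aligned with the flow."""
    c = 0.5 * L * (1.0 / t_down + 1.0 / t_up)
    v = 0.5 * L * (1.0 / t_down - 1.0 / t_up)
    return c, v

# round-trip check: c = 343 m/s, v = 5 m/s over a 0.5 m path
L = 0.5
t_down = L / (343.0 + 5.0)   # with the flow
t_up = L / (343.0 - 5.0)     # against the flow
c, v = transit_times_to_c_and_v(L, t_down, t_up)
```

    Temperature and humidity then follow from c via the speed-of-sound relation for the gas mixture, which is where the RH dependence enters.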

  8. Alien Registration Number Verification via the U.S. Citizenship and Immigration Service's Systematic Alien Verification for Entitlements System

    National Research Council Canada - National Science Library

    Ainslie, Frances M; Buck, Kelly R

    2008-01-01

    The purpose of this study was to evaluate the implications of conducting high-volume automated checks of the United States Citizenship and Immigration Services' Systematic Alien Verification for Entitlements System (SAVE...

  9. A field study of the accuracy and reliability of a biometric iris recognition system.

    Science.gov (United States)

    Latman, Neal S; Herb, Emily

    2013-06-01

    The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A sub-set of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons, respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that could potentially circumvent the use of iris recognition systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification in this study. It may have special applications in subjects with nystagmus. Copyright © 2012 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
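
    "No false verifications or false identifications" corresponds to zero false-accept and false-reject counts over the comparison set. A minimal sketch of how false accept and false reject rates are computed from similarity scores (the scores and threshold below are hypothetical, not data from this study):

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """False accept rate (impostor comparisons scoring at or above the
    threshold) and false reject rate (genuine comparisons scoring
    below it) for a similarity-score verifier."""
    false_accepts = sum(s >= threshold for s in impostor_scores)
    false_rejects = sum(s < threshold for s in genuine_scores)
    return (false_accepts / len(impostor_scores),
            false_rejects / len(genuine_scores))

far, frr = far_frr(genuine_scores=[0.9, 0.8, 0.95, 0.7],
                   impostor_scores=[0.2, 0.1, 0.65, 0.3],
                   threshold=0.6)   # far = 0.25, frr = 0.0
```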

  10. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system, based as it was on nuclear material accountancy, was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Inspectors' experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and provide them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  11. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains to be a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP-model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Face in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
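
    With equal spherical covariances, the maximum-likelihood descriptor for each GMM part is simply the one nearest the component mean in the location-augmented feature space. The helper below is a hypothetical NumPy sketch of that selection-and-concatenation step, not the authors' code:

```python
import numpy as np

def pep_representation(descriptors, component_means):
    """Build a PEP-style vector: for each spherical-Gaussian part
    (rows of component_means, K x d), pick the location-augmented
    descriptor (rows of descriptors, n x d) with the highest
    likelihood, i.e. the smallest squared distance to the mean,
    and concatenate the winners into one K*d vector."""
    descriptors = np.asarray(descriptors, dtype=float)
    means = np.asarray(component_means, dtype=float)
    # squared distances, shape (K, n)
    d2 = ((means[:, None, :] - descriptors[None, :, :]) ** 2).sum(axis=-1)
    best = d2.argmin(axis=1)           # most likely descriptor per part
    return descriptors[best].ravel()

# two toy parts over three 2-D descriptors
desc = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
parts = np.array([[0.1, 0.1], [1.9, 2.1]])
rep = pep_representation(desc, parts)   # -> [0., 0., 2., 2.]
```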

  12. Electron ray tracing with high accuracy

    International Nuclear Information System (INIS)

    Saito, K.; Okubo, T.; Takamoto, K.; Uno, Y.; Kondo, M.

    1986-01-01

    An electron ray tracing program is developed to investigate the overall geometrical and chromatic aberrations in electron optical systems. The program also computes aberrations due to manufacturing errors in lenses and deflectors. Computation accuracy is improved by (1) calculating electrostatic and magnetic scalar potentials using the finite element method with third-order isoparametric elements, and (2) solving the modified ray equation which the aberrations satisfy. Computation accuracy of 4 nm is achieved for calculating optical properties of the system with an electrostatic lens

  13. High accuracy 3D electromagnetic finite element analysis

    International Nuclear Information System (INIS)

    Nelson, Eric M.

    1997-01-01

    A high accuracy 3D electromagnetic finite element field solver employing quadratic hexahedral elements and quadratic mixed-order one-form basis functions will be described. The solver is based on an object-oriented C++ class library. Test cases demonstrate that frequency errors less than 10 ppm can be achieved using modest workstations, and that the solutions have no contamination from spurious modes. The role of differential geometry and geometrical physics in finite element analysis will also be discussed

  14. High accuracy 3D electromagnetic finite element analysis

    International Nuclear Information System (INIS)

    Nelson, E.M.

    1996-01-01

    A high accuracy 3D electromagnetic finite element field solver employing quadratic hexahedral elements and quadratic mixed-order one-form basis functions will be described. The solver is based on an object-oriented C++ class library. Test cases demonstrate that frequency errors less than 10 ppm can be achieved using modest workstations, and that the solutions have no contamination from spurious modes. The role of differential geometry and geometrical physics in finite element analysis will also be discussed

  15. A High-Throughput, High-Accuracy System-Level Simulation Framework for System on Chips

    Directory of Open Access Journals (Sweden)

    Guanyi Sun

    2011-01-01

    Today's System-on-Chip (SoC) design is extremely challenging because it involves complicated design tradeoffs and heterogeneous design expertise. To explore the large solution space, system architects have to rely on system-level simulators to identify an optimized SoC architecture. In this paper, we propose a system-level simulation framework, the System Performance Simulation Implementation Mechanism (SPSIM). Based on SystemC TLM2.0, the framework consists of an executable SoC model, a simulation tool chain, and a modeling methodology. Compared with the large body of existing research in this area, this work aims at delivering a high simulation throughput while at the same time guaranteeing high accuracy on real industrial applications. Integrating the leading TLM techniques, our simulator can attain a simulation speed within a factor of 35 of hardware execution on a set of real-world applications. SPSIM incorporates effective timing models, which can achieve a high accuracy after hardware-based calibration. Experimental results on a set of mobile applications showed that the difference between the simulated and measured timing performance is within 10%, which in the past could only be attained by cycle-accurate models.

  16. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  17. Analysis of an indirect neutron signature for enhanced UF₆ cylinder verification

    International Nuclear Information System (INIS)

    Kulisek, J.A.; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-01-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF₆) cylinders. The current method provides relatively low accuracy for the assay of ²³⁵U enrichment, especially for natural and depleted UF₆. Furthermore, the current method provides no capability to assay the absolute mass of ²³⁵U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from ²³⁵U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA-NT). HEVA-NT enables full-volume assay of UF₆ cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF₆. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA-NT in terms of the individual contributions to HEVA-NT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA-NT signature to manipulation by the nearby placement of neutron-conversion materials.
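
    The full-volume sensitivity of the neutron signature rests on neutrons' comparatively long mean free paths. A toy Monte Carlo, far simpler than the transport modeling described in this record, shows how sampled exponential free paths reproduce slab attenuation (all numbers are invented for illustration):

```python
import math
import random

def transmitted_fraction(wall_cm, mean_free_path_cm, n=100_000, seed=1):
    """Fraction of normally incident particles crossing a slab without
    interacting, by sampling exponential free paths; converges to
    exp(-wall / mfp)."""
    rng = random.Random(seed)
    crossings = sum(rng.expovariate(1.0 / mean_free_path_cm) > wall_cm
                    for _ in range(n))
    return crossings / n

f = transmitted_fraction(wall_cm=1.0, mean_free_path_cm=3.0)
analytic = math.exp(-1.0 / 3.0)   # ~0.72
```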

  18. Accuracy of hiatal hernia detection with esophageal high-resolution manometry

    NARCIS (Netherlands)

    Weijenborg, P. W.; van Hoeij, F. B.; Smout, A. J. P. M.; Bredenoord, A. J.

    2015-01-01

    The diagnosis of a sliding hiatal hernia is classically made with endoscopy or barium esophagogram. Spatial separation of the lower esophageal sphincter (LES) and diaphragm, the hallmark of hiatal hernia, can also be observed on high-resolution manometry (HRM), but the diagnostic accuracy of this

  19. Verification of Bioanalytical Method for Quantification of Exogenous Insulin (Insulin Aspart) by the Analyser Advia Centaur® XP.

    Science.gov (United States)

    Mihailov, Rossen; Stoeva, Dilyana; Pencheva, Blagovesta; Pentchev, Eugeni

    2018-03-01

    In a number of cases the monitoring of patients with type I diabetes mellitus requires measurement of exogenous insulin levels. For the purpose of a clinical investigation of the efficacy of a medical device for administration of exogenous insulin aspart, verification of a method for measuring this synthetic analogue of the hormone was needed. The information available in the medical literature on the measurement of the different exogenous insulin analogues is insufficient; verification therefore had to comply with the standards in force in the Republic of Bulgaria. A manufacturer's method developed for the ADVIA Centaur XP Immunoassay (Siemens Healthcare) was used, which we verified using standard solutions and a patient serum pool spiked with the appropriate quantity of exogenous insulin aspart. The method was verified in accordance with the bioanalytical method verification criteria and the regulatory requirements for using a standard method: the CLIA chemiluminescence immunoassay on the ADVIA Centaur® XP. The following parameters were determined and monitored: intra-day precision and accuracy, inter-day precision and accuracy, limit of detection and lower limit of quantification, linearity, and analytical recovery. The routine application of the method for measurement of immunoreactive insulin with the ADVIA Centaur® XP analyser is directed at endogenous insulin, but the method is applicable to measuring different types of exogenous insulin, including insulin aspart.
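
    The verification parameters listed here reduce to simple statistics over replicate runs. A minimal sketch for one concentration level (the replicate values are hypothetical; acceptance limits depend on the applicable guideline):

```python
import statistics

def precision_and_recovery(replicates, nominal):
    """Within-run precision as CV% and analytical recovery as % of the
    nominal (spiked) concentration for one level of replicates."""
    mean = statistics.fmean(replicates)
    cv_pct = 100.0 * statistics.stdev(replicates) / mean
    recovery_pct = 100.0 * mean / nominal
    return cv_pct, recovery_pct

# three replicates of a nominal 100-unit spike (invented values)
cv, rec = precision_and_recovery([98.0, 102.0, 100.0], nominal=100.0)
# cv = 2.0 %, rec = 100.0 %
```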

  20. High Accuracy Acoustic Relative Humidity Measurement in Duct Flow with Air

    Directory of Open Access Journals (Sweden)

    Cees van der Geld

    2010-08-01

    An acoustic relative humidity sensor for air-steam mixtures in duct flow is designed and tested. Theory, construction, calibration, considerations on dynamic response and results are presented. The measurement device is capable of measuring line-averaged values of gas velocity, temperature and relative humidity (RH) instantaneously, by applying two ultrasonic transducers and an array of four temperature sensors. Measurement ranges are: gas velocity of 0–12 m/s with an error of ±0.13 m/s, temperature 0–100 °C with an error of ±0.07 °C and relative humidity 0–100% with accuracy better than 2% RH above 50 °C. The main advantage over conventional humidity sensors is the high sensitivity at high RH at temperatures exceeding 50 °C, with accuracy increasing with increasing temperature. The sensors are non-intrusive and resist highly humid environments.

  1. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias
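
    The detection power of random verification sampling can be expressed with a hypergeometric "at least one defective in the sample" calculation. The sketch below is a standard attribute-sampling illustration with invented numbers, not the NRC's specific sampling plan:

```python
from math import comb

def detection_probability(population, defectives, sample_size):
    """Probability that a random sample drawn without replacement
    contains at least one of the defective (e.g. falsified) items."""
    miss = comb(population - defectives, sample_size) / comb(population, sample_size)
    return 1.0 - miss

# 20 items sampled from an inventory of 100, of which 5 are assumed bad
p = detection_probability(population=100, defectives=5, sample_size=20)
# p ~ 0.68
```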

  2. Verification and accreditation schemes for climate change activities: A review of requirements for verification of greenhouse gas reductions and accreditation of verifiers—Implications for long-term carbon sequestration

    Science.gov (United States)

    Roed-Larsen, Trygve; Flach, Todd

    The purpose of this chapter is to provide a review of existing national and international requirements for verification of greenhouse gas reductions and associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit for best practices. The UN Framework Convention on Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors also work intensely on these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes of their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company entity and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of the efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts of securing a comprehensive, trustworthy, and robust framework for verification activities of CO2 capture, transport, and storage.

  3. The Fundamentals of the Air Sampler Calibration-Verification Process

    International Nuclear Information System (INIS)

    Gavila, F.M.

    2011-01-01

    The calibration of an air sampling instrument using a reference air flow calibrator requires attention to scientific detail in order to establish that the instrument's reported values are correctly stated and valid under the actual operating conditions of the air sampling instrument. The primary objective of an air flow calibration-verification is to ensure that the device under test (DUT) is within the manufacturer's stated accuracy range of temperature, pressure and humidity conditions under which the instrument was designed to operate. The DUT output values are compared to those obtained from a reference instrument (REF) measuring the same physical parameter that the DUT is measuring. An accurate comparison of air flow rates or air volumes requires that the comparison of the DUT and REF values be made under the same temperature and pressure conditions. It is absolutely necessary that the REF be more accurate than the DUT; otherwise, it cannot be considered a reference instrument. The REF should be at least twice as accurate and, if possible, four times as accurate as the DUT. Upon confirmation that the DUT meets the manufacturer's accuracy criteria, the technician must place a calibration sticker or label indicating the date of calibration, the expiration date of the calibration and an authorized signature. If it is a limited-use instrument, the label should state the limited-use operating range. The serial number and model number of the instrument should also be shown on the calibration sticker. A specific calibration file for each instrument, filed by serial number, should be kept in the calibration laboratory records. Instruments that display gas flow or gas volume values corrected to a reference temperature and pressure are very desirable. The ideal situation is when both the DUT and the REF output flow rate or volume values at the same conditions of T and P. The calibration-verification is then a simple process. The credibility of an air
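The temperature and pressure correction described in this record amounts to an ideal-gas adjustment of the measured flow rate. A minimal sketch of that adjustment is below; the function name and the default reference conditions (21.1 °C, 101.325 kPa) are illustrative assumptions, not values taken from the source:

```python
def to_reference_conditions(q_meas, t_meas_c, p_meas_kpa,
                            t_ref_c=21.1, p_ref_kpa=101.325):
    """Convert a volumetric flow rate measured at ambient conditions to the
    equivalent flow rate at reference temperature and pressure, using the
    ideal gas law (Q * P / T = constant for a fixed mass flow)."""
    t_meas_k = t_meas_c + 273.15
    t_ref_k = t_ref_c + 273.15
    return q_meas * (p_meas_kpa / p_ref_kpa) * (t_ref_k / t_meas_k)
```

With both instruments reporting at the same corrected conditions, the DUT/REF comparison reduces to a direct ratio of the two corrected values.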

  4. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  5. Status on development and verification of reactivity initiated accident analysis code for PWR (NODAL3)

    International Nuclear Information System (INIS)

    Peng Hong Liem; Surian Pinem; Tagor Malem Sembiring; Tran Hoai Nam

    2015-01-01

    A coupled neutronics thermal-hydraulics code NODAL3 has been developed based on the nodal few-group neutron diffusion theory in 3-dimensional Cartesian geometry for typical pressurized water reactor (PWR) static and transient analyses, especially for reactivity initiated accidents (RIA). The spatial variables are treated by using a polynomial nodal method (PNM), while for the neutron dynamics solver the adiabatic and improved quasi-static methods are adopted. A simple single-channel thermal-hydraulics module and its steam table are implemented into the code. Verification work on static and transient benchmarks has been conducted to assess the accuracy of the code. For the static benchmark verification, the IAEA-2D, IAEA-3D, BIBLIS and KOEBERG light water reactor (LWR) benchmark problems were selected, while for the transient benchmark verification, the OECD NEACRP 3-D LWR Core Transient Benchmark and the NEA-NSC 3-D/1-D PWR Core Transient Benchmark (Uncontrolled Withdrawal of Control Rods at Zero Power) were selected. Excellent agreement of the NODAL3 results with the reference solutions and other validated nodal codes was confirmed. (author)

  6. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.), respectively. The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (367 in total), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
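The confidence limits quoted in this abstract (3% of the prescribed dose or 6 cGy in the high-dose region; 5% or 10 cGy for off-axis and low-dose points) can be applied as a simple either/or acceptance test. A hedged sketch follows; the function and argument names are hypothetical, not from the paper:

```python
def within_tolerance(d_tps_cgy, d_indep_cgy, d_prescribed_cgy,
                     pct_limit=3.0, abs_limit_cgy=6.0):
    """Accept the independent (MUV-style) point dose if it agrees with the
    TPS dose within EITHER the relative limit (percent of prescribed dose)
    OR the absolute limit in cGy. For off-axis (>5 cm) and low-dose points,
    pass pct_limit=5.0 and abs_limit_cgy=10.0 instead."""
    diff = abs(d_tps_cgy - d_indep_cgy)
    return diff <= d_prescribed_cgy * pct_limit / 100.0 or diff <= abs_limit_cgy
```

The absolute fallback matters in low-dose regions, where a small deviation in cGy can be a large percentage of the local dose.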

  7. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2009-01-01

    A tritium permeation analyses code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems including integrated hydrogen production systems. A MATLAB SIMULINK software package was used for development of the code. The TPAC is based on the mass balance equations of tritium-containing species and various forms of hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered as the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility for the system configurations, and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and experimental data based on the Peach Bottom reactor design. The preliminary results calculated with a former tritium analysis code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behaviors in a very high temperature reactor/high temperature steam electrolysis system have been analyzed by the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007). This analysis showed that only 0.4% of tritium released from the core is transferred to the product hydrogen
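The mass-balance formulation that TPAC is built on can be illustrated with a single-compartment sketch: production minus radioactive decay and removal, with purification, leakage, and permeation lumped into one removal rate constant. This is a generic illustration under those assumptions, not TPAC's actual multi-species model:

```python
import math

# Tritium decay constant in 1/s (half-life 12.32 years).
LAMBDA_T = math.log(2) / (12.32 * 365.25 * 24 * 3600)

def tritium_inventory(source_rate, removal_rate, n0=0.0,
                      dt=3600.0, steps=24 * 365):
    """Integrate a single-compartment tritium mass balance
        dN/dt = S - (lambda + k_removal) * N
    with explicit Euler steps. source_rate S is in atoms/s (or any
    consistent unit); removal_rate lumps purification, leakage and
    permeation into one first-order rate constant in 1/s."""
    n = n0
    for _ in range(steps):
        n += (source_rate - (LAMBDA_T + removal_rate) * n) * dt
    return n
```

With a removal rate much faster than decay, the inventory settles near the steady state N* = S / (lambda + k), which is a useful sanity check on any such solver.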

  8. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking appears to be an appropriate complementary method. However, it is not common to use model checking in industry yet, as this method needs typically formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  9. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by both forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  10. High accuracy 3D electromagnetic finite element analysis

    International Nuclear Information System (INIS)

    Nelson, E.M.

    1997-01-01

    A high accuracy 3D electromagnetic finite element field solver employing quadratic hexahedral elements and quadratic mixed-order one-form basis functions will be described. The solver is based on an object-oriented C++ class library. Test cases demonstrate that frequency errors less than 10 ppm can be achieved using modest workstations, and that the solutions have no contamination from spurious modes. The role of differential geometry and geometrical physics in finite element analysis will also be discussed. copyright 1997 American Institute of Physics

  11. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click.

    Science.gov (United States)

    Collins, Jennifer; Pecher, Diane; Zeelenberg, René; Coulson, Seana

    2011-01-01

    The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

  12. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0
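One standard way an integrated model can combine per-technology detection results is independent-sensor fusion of detection probabilities; the abstract does not state that IVSEM uses exactly this rule, so the sketch below is illustrative only:

```python
def integrated_detection_probability(per_technology_probs):
    """Probability that at least one of several monitoring technologies
    (e.g. seismic, infrasound, radionuclide, hydroacoustic) detects an
    event, assuming the technologies detect independently:
        P_detect = 1 - product(1 - p_i)."""
    p_miss = 1.0
    for p in per_technology_probs:
        p_miss *= 1.0 - p
    return 1.0 - p_miss
```

This combination rule is also why synergy among technologies matters: two mediocre sensors (say 0.5 each) together already reach 0.75 under independence.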

  13. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  14. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  15. Age verification cards fail to fully prevent minors from accessing tobacco products.

    Science.gov (United States)

    Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi

    2011-03-01

    Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchased cigarettes, whether or not they had used age verification cards, and, if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Somebody outside the family was the top source for obtaining cards. Surprisingly, around 5% of males and females belonging to the group with the highest smoking levels applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban on tobacco vending machines, not an age verification system, is needed to prevent sales to minors.

  16. A New Approach for High Pressure Pixel Polar Distribution on Off-line Signature Verification

    Directory of Open Access Journals (Sweden)

    Jesús F. Vargas

    2010-06-01

    Full Text Available Features representing information of High Pressure Points from a static image of a handwritten signature are analyzed for an off-line verification system. From grayscale images, a new approach for High Pressure threshold estimation is proposed. Two images, one containing the High Pressure Points extracted and the other a binary version of the original signature, are transformed to polar coordinates, where a pixel density ratio between them is calculated. The polar space is divided into angular and radial segments, which permit a local analysis of the high pressure distribution. Finally, two vectors containing the density distribution ratio are calculated for the nearest and farthest points from the geometric center of the original signature image. Experiments were carried out using a database containing signatures from 160 individuals. The robustness of the analyzed system against simple forgeries is tested with Support Vector Machine models. For the sake of completeness, a comparison of the results obtained by the proposed approach with similar published works is presented.
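The angular/radial segmentation and per-segment density ratio described above can be sketched in pure Python. The function name, segment counts, and image representation (2-D lists of 0/1 pixels) are assumptions for illustration, not the paper's implementation:

```python
import math

def polar_density_ratio(high_pressure, binary, center, n_ang=12, n_rad=4):
    """Per-segment ratio of high-pressure pixels to signature pixels.
    high_pressure and binary are 2-D lists of 0/1 values of equal shape;
    center is the (row, col) geometric center of the signature.
    Returns an n_ang x n_rad grid of density ratios."""
    pts = [(r, c) for r, row in enumerate(binary)
           for c, v in enumerate(row) if v]
    r_max = max(math.hypot(r - center[0], c - center[1])
                for r, c in pts) + 1e-9
    total = [[0] * n_rad for _ in range(n_ang)]
    hp = [[0] * n_rad for _ in range(n_ang)]
    for r, c in pts:
        rad = math.hypot(r - center[0], c - center[1])
        ang = math.atan2(r - center[0], c - center[1]) % (2 * math.pi)
        ai = min(int(ang / (2 * math.pi) * n_ang), n_ang - 1)
        ri = min(int(rad / r_max * n_rad), n_rad - 1)
        total[ai][ri] += 1
        hp[ai][ri] += high_pressure[r][c]
    return [[hp[a][i] / total[a][i] if total[a][i] else 0.0
             for i in range(n_rad)] for a in range(n_ang)]
```

Flattening this grid gives a feature vector of the kind the paper feeds to a Support Vector Machine classifier.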

  17. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  18. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  19. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models and measurements, and to propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two of the aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined by the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification

  20. Automated synthesis and verification of configurable DRAM blocks for ASIC's

    Science.gov (United States)

    Pakkurti, M.; Eldin, A. G.; Kwatra, S. C.; Jamali, M.

    1993-01-01

    A highly flexible embedded DRAM compiler is developed which can generate DRAM blocks in the range of 256 bits to 256 Kbits. The compiler is capable of automatically verifying the functionality of the generated DRAM modules. The fully automated verification capability is a key feature that ensures the reliability of the generated blocks. The compiler's architecture, algorithms, verification techniques and the implementation methodology are presented.

  1. The effect of pattern overlap on the accuracy of high resolution electron backscatter diffraction measurements

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Vivian, E-mail: v.tong13@imperial.ac.uk [Department of Materials, Imperial College London, Prince Consort Road, London SW7 2AZ (United Kingdom); Jiang, Jun [Department of Materials, Imperial College London, Prince Consort Road, London SW7 2AZ (United Kingdom); Wilkinson, Angus J. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Britton, T. Ben [Department of Materials, Imperial College London, Prince Consort Road, London SW7 2AZ (United Kingdom)

    2015-08-15

    High resolution, cross-correlation-based, electron backscatter diffraction (EBSD) measures the variation of elastic strains and lattice rotations from a reference state. Regions near grain boundaries are often of interest, but overlap of patterns from the two grains could reduce the accuracy of the cross-correlation analysis. To explore this concern, patterns from the interior of two grains have been mixed to simulate the interaction volume crossing a grain boundary so that the effect on the accuracy of the cross-correlation results can be tested. It was found that the accuracy of HR-EBSD strain measurements performed in a FEG-SEM on zirconium remains good until the incident beam is less than 18 nm from a grain boundary. A simulated microstructure was used to measure how often pattern overlap occurs at any given EBSD step size, and a simple relation was found linking the probability of overlap with step size. - Highlights: • Pattern overlap occurs at grain boundaries and reduces HR-EBSD accuracy. • A test is devised to measure the accuracy of HR-EBSD in the presence of overlap. • High pass filters can sometimes, but not generally, improve HR-EBSD measurements. • Accuracy of HR-EBSD remains high until the reference pattern intensity is <72%. • 9% of points near a grain boundary will have significant error for a 200 nm step size in Zircaloy-4.

  2. Read-only high accuracy volume holographic optical correlator

    Science.gov (United States)

    Zhao, Tian; Li, Jingming; Cao, Liangcai; He, Qingsheng; Jin, Guofan

    2011-10-01

    A read-only volume holographic correlator (VHC) is proposed. After the recording of all of the correlation database pages by angular multiplexing, a stand-alone read-only high accuracy VHC is separated from the VHC recording facilities, which include the high-power laser and the angular multiplexing system. The stand-alone VHC has its own low-power readout laser and a very compact and simple structure. Since two lasers are employed, one for recording and one for readout, the optical alignment tolerance of the laser illumination on the SLM is very sensitive. The two-dimensional angular tolerance is analyzed based on the theoretical model of the volume holographic correlator. The experimental demonstration of the proposed read-only VHC is introduced and discussed.

  3. Analysis of an indirect neutron signature for enhanced UF{sub 6} cylinder verification

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, J.A., E-mail: Jonathan.Kulisek@pnnl.gov; McDonald, B.S.; Smith, L.E.; Zalavadia, M.A.; Webster, J.B.

    2017-02-21

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF{sub 6}) cylinders. The current method provides relatively low accuracy for the assay of {sup 235}U enrichment, especially for natural and depleted UF{sub 6}. Furthermore, the current method provides no capability to assay the absolute mass of {sup 235}U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from {sup 235}U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVA{sub NT}). HEVA{sub NT} enables full-volume assay of UF{sub 6} cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF{sub 6}. In this work, Monte Carlo modeling is used as the basis for characterizing HEVA{sub NT} in terms of the individual contributions to HEVA{sub NT} from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVA{sub NT} signature to manipulation by the nearby placement of neutron-conversion materials.

  4. Verification of aspheric contact lens back surfaces.

    Science.gov (United States)

    Dietze, Holger H; Cox, Michael J; Douthwaite, William A

    2003-08-01

    To suggest a tolerance level for the degree of asphericity of aspheric rigid gas-permeable contact lenses and to find a simple method for its verification. Using existing tolerances for the vertex radius, tolerance limits for eccentricity and p value were calculated. A keratometer-based method and a method based on sag measurements were used to measure the vertex radius and eccentricity of eight concave progressively aspheric surfaces and six concave ellipsoidal surfaces. The results were compared with a gold standard measurement made using a high-precision mechanical instrument (Form Talysurf). The suggested tolerance for eccentricity and p value is +/-0.05. The keratometer method was very accurate and precise at measuring the vertex radius (mean deviation +/- SD from Talysurf results, -0.002 +/- 0.008 mm). The keratometer was more precise than, and similar in accuracy to, the sag method for measurement of asphericity (mean deviation of keratometer method results from Talysurf results, 0.017 +/- 0.018; mean deviation of sag method results from Talysurf results using five semichords, -0.016 +/- 0.032). Neither method was precise enough to verify the asphericity within the suggested tolerance. The keratometer can be efficiently used to verify the back vertex radius within its International Organization for Standardization tolerance and the back surface asphericity within an eccentricity/p value tolerance of +/-0.1. The method is poor for progressive aspheres with large edge blending zones. Deriving the eccentricity from sag measurements is a potential alternative if the mathematical description of the surface is known. The limiting factor of this method is the accuracy and precision of individual sag measurements.
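Deriving the p value from sag measurements, as the abstract suggests, follows from the conicoid sag relation y² = 2·r₀·z − p·z² (semichord y, sag z, vertex radius r₀): two sag readings at different semichords give a 2×2 linear system in (r₀, p). The sketch below assumes that conicoid surface model; the function names are illustrative:

```python
import math

def conic_sag(y, r0, p):
    """Sag z of a conicoid y^2 = 2*r0*z - p*z^2 at semichord y (p != 0)."""
    return (r0 - math.sqrt(r0 * r0 - p * y * y)) / p

def conic_from_sags(y1, z1, y2, z2):
    """Recover vertex radius r0 and p value from two (semichord, sag)
    measurements by solving the linear system
        [2*z1, -z1^2] [r0]   [y1^2]
        [2*z2, -z2^2] [p ] = [y2^2]
    via Cramer's rule."""
    det = 2 * z1 * (-z2 ** 2) - (-z1 ** 2) * 2 * z2
    r0 = (y1 ** 2 * (-z2 ** 2) - (-z1 ** 2) * y2 ** 2) / det
    p = (2 * z1 * y2 ** 2 - y1 ** 2 * 2 * z2) / det
    return r0, p
```

In practice more than two semichords would be measured and fitted by least squares, which is why the precision of the individual sag readings is the limiting factor.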

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES HIGH EFFICIENCY MINI PLEAT

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  6. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  7. Development and performance validation of a cryogenic linear stage for SPICA-SAFARI verification

    Science.gov (United States)

    Ferrari, Lorenza; Smit, H. P.; Eggens, M.; Keizer, G.; de Jonge, A. W.; Detrain, A.; de Jonge, C.; Laauwen, W. M.; Dieleman, P.

    2014-07-01

    In the context of the SAFARI instrument (SpicA FAR-infrared Instrument), SRON is developing a test environment to verify the SAFARI performance. The characterization of the detector focal plane will be performed with a back-illuminated pinhole over a reimaged SAFARI focal plane by an XYZ scanning mechanism that consists of three linear stages stacked together. In order to reduce background radiation that can couple into the high-sensitivity cryogenic detectors (goal NEP of 2×10⁻¹⁹ W/√Hz and saturation power of a few femtowatts), the scanner is mounted inside the cryostat in the 4 K environment. The required readout accuracy is 3 μm and reproducibility 1 μm along the total travel of 32 mm. The stage will be operated in "on the fly" mode to prevent vibrations of the scanner mechanism and will move at a constant speed varying from 60 μm/s to 400 μm/s. In order to meet the requirements of large stroke, low dissipation (low friction) and high accuracy, a DC motor plus spindle stage solution has been chosen. In this paper we present the stage design and stage characterization, also describing the measurement setup. The room temperature performance has been measured with a 3D measuring machine cross-calibrated with a laser interferometer and a 2-axis tilt sensor. The low temperature verification has been performed in a wet 4 K cryostat using a laser interferometer for measuring the linear displacements and a theodolite for measuring the angular displacements. The angular displacements can be calibrated with a precision of 4 arcsec and the position could be determined with high accuracy. The presence of friction caused higher values of torque than predicted and consequently higher dissipation. The thermal model of the stage has also been verified at 4 K.

  8. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers

    Directory of Open Access Journals (Sweden)

    Zheng You

    2013-04-01

    Full Text Available The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Up to now, research in this field has lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker, without complicated theoretical derivation. This approach can determine the error propagation relationship of the star tracker, and can build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are proved to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers.

  9. Optical system error analysis and calibration method of high-accuracy star trackers.

    Science.gov (United States)

    Sun, Ting; Xing, Fei; You, Zheng

    2013-04-08

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Research in this field has so far lacked a systematic and universal analysis. This paper proposes a detailed approach for the synthetic error analysis of the star tracker that avoids complicated theoretical derivation. This approach can determine the error propagation relationship of the star tracker, and can intuitively and systematically build an error model. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method prove to be adequate and precise, and can provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers.

  10. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    Science.gov (United States)

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  11. SU-E-J-82: Intra-Fraction Proton Beam-Range Verification with PET Imaging: Feasibility Studies with Monte Carlo Simulations and Statistical Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Lou, K [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Rice University, Houston, TX (United States); Mirkovic, D; Sun, X; Zhu, X; Poenisch, F; Grosshans, D; Shao, Y [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Clark, J [Rice University, Houston, TX (United States)

    2014-06-01

    Purpose: To study the feasibility of intra-fraction proton beam-range verification with PET imaging. Methods: Two homogeneous cylindrical PMMA phantoms (290 mm axial length; 38 mm and 200 mm diameter, respectively) were studied using PET imaging: the small phantom with a mouse-sized PET scanner (61 mm diameter field of view (FOV)) and the larger phantom with a human brain-sized PET scanner (300 mm FOV). Monte Carlo (MC) simulations (MCNPX and GATE) were used to simulate 179.2 MeV proton pencil beams irradiating the two phantoms and being imaged by the two PET systems. A total of 50 simulations were conducted to generate 50 positron activity distributions and, correspondingly, 50 measured activity-ranges. The accuracy and precision of these activity-ranges were calculated under different conditions (including count statistics and other factors, such as crystal cross-section). Separate from the MC simulations, an activity distribution measured from a simulated PET image was modeled as a noiseless positron activity distribution corrupted by Poisson counting noise. The results from these two approaches were compared to assess the impact of count statistics on the accuracy and precision of activity-range calculations. Results: MC simulations show that the accuracy and precision of an activity-range are dominated by the number (N) of coincidence events in the reconstructed image, improving in proportion to 1/sqrt(N), which can be understood from the statistical modeling. MC simulations also indicate that the coincidence events acquired within the first 60 seconds with 10⁹ protons (small phantom) and 10¹⁰ protons (large phantom) are sufficient to achieve both sub-millimeter accuracy and precision. Conclusion: Under the current MC simulation conditions, the initial study indicates that the accuracy and precision of beam-range verification are dominated by count statistics, and intra-fraction PET image-based beam-range verification is
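
The 1/sqrt(N)-like behavior of the activity-range precision can be reproduced with a toy statistical model in the spirit of the authors' Poisson-corrupted-profile approach. Everything below (profile shape, falloff position, count levels, smoothing) is invented for illustration, not taken from the simulations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noiseless positron-activity depth profile: plateau with a sigmoid distal falloff
z = np.linspace(0.0, 200.0, 401)                   # depth, mm (0.5 mm bins)
profile = 1.0 / (1.0 + np.exp((z - 150.0) / 2.0))  # 50% falloff at z = 150 mm

def measured_range(total_counts):
    """Poisson-corrupt the profile at a given count level and return the
    depth where the (lightly smoothed) activity drops to half the plateau."""
    expected = profile / profile.sum() * total_counts
    noisy = rng.poisson(expected).astype(float)
    smooth = np.convolve(noisy, np.ones(5) / 5.0, mode="same")
    half = 0.5 * smooth[50:200].mean()              # plateau level from 25-100 mm
    return z[200 + np.argmax(smooth[200:] < half)]  # first half-crossing past 100 mm

ranges_lo = [measured_range(1e4) for _ in range(40)]
ranges_hi = [measured_range(1e6) for _ in range(40)]
spread_lo, spread_hi = np.std(ranges_lo), np.std(ranges_hi)
```

With 100× more counts the spread of the estimated range shrinks sharply, while the mean stays at the true 150 mm falloff, mirroring the count-statistics-dominated behavior reported above.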

  12. Innovative Fiber-Optic Gyroscopes (FOGs) for High Accuracy Space Applications, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's future science and exploratory missions will require much lighter, smaller, and longer life rate sensors that can provide high accuracy navigational...

  13. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that transforms the bytecode into static single assignment form.

  14. E-Visas Verification Schemes Based on Public-Key Infrastructure and Identity Based Encryption

    OpenAIRE

    Najlaa A. Abuadhmah; Muawya Naser; Azman Samsudin

    2010-01-01

    Problem statement: A visa is a very important travel document, an essential requirement at the point of entry of any country being visited. However, such an important document is still handled manually, which affects the accuracy and efficiency of visa processing. Work on e-visas is almost unexplored. Approach: This study provided a detailed description of a newly proposed e-visa verification system prototyped based on RFID technology. The core technology of the proposed e-visa...

  15. Geometric accuracy of field alignment in fractionated stereotactic conformal radiotherapy of brain tumors

    International Nuclear Information System (INIS)

    Kortmann, Rolf D.; Becker, Gerd; Perelmouter, Jury; Buchgeister, Markus; Meisner, Christoph; Bamberg, Michael

    1999-01-01

    Purpose: To assess the accuracy of field alignment in patients undergoing three-dimensional (3D) conformal radiotherapy of brain tumors, and to evaluate the impact on the definition of planning target volume and control procedures. Methods and Materials: Geometric accuracy was analyzed in 20 patients undergoing fractionated stereotactic conformal radiotherapy for brain tumors. Rigid head fixation was achieved by using cast material. Transfer of stereotactic coordinates was performed by an external positioning device. The accuracy during treatment planning was quantitatively assessed by using repeated computed tomography (CT) examinations in treatment position (reproducibility of isocenter). Linear discrepancies were measured between treatment plan and CT examination. In addition, for each patient, a series of 20 verifications were taken in orthogonal projections. Linear discrepancies were measured between the first and all subsequent verifications (accuracy during treatment delivery). Results: For the total group of patients, the distribution of deviations during treatment setup showed mean values between -0.3-1.2 mm, with standard deviations (SD) of 1.3-2.0 mm. During treatment delivery, the distribution of deviations revealed mean values between 0.7-0.8 mm, with SDs of 0.5-0.6 mm, respectively. For all patients, deviations for the transition to the treatment machine were similar to deviations during subsequent treatment delivery, with 95% of all absolute deviations between less than 2.8 and 4.6 mm. Conclusion: Random fluctuations of field displacements during treatment planning and delivery prevail. Therefore, our quantitative data should be considered when prescribing the safety margins of the planning target volume. Repeated CT examinations are useful to detect operator errors and large random or systematic deviations before the start of treatment. Control procedures during treatment delivery appear to be of limited importance. In addition, our findings should help to

  16. Verification of spectrophotometric method for nitrate analysis in water samples

    Science.gov (United States)

    Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu

    2017-12-01

    The aim of this research was to verify the spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were linearity, method detection limit, limit of quantitation, level of linearity, accuracy and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression equation was 0.9981. The method detection limit (MDL) was 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% confidence level. The accuracy, determined through the recovery value, was 109.1907%. The precision, expressed as the percent relative standard deviation (%RSD) of repeatability measurements, was 1.0886%. The tested performance criteria showed that the method was verified under the laboratory conditions.
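
The verification parameters reported above are simple to compute. The sketch below uses made-up calibration, recovery and repeatability data (the abstract does not give the raw readings) to show how linearity, recovery and %RSD are obtained:

```python
import numpy as np

# Hypothetical raw data for illustration; the paper's actual readings are not given.
conc = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])          # standards, mg/L
absorb = np.array([0.002, 0.105, 0.209, 0.310, 0.418, 0.520])  # absorbance

# Linearity: correlation coefficient of the calibration line
r = np.corrcoef(conc, absorb)[0, 1]

# Accuracy: percent recovery of a spiked sample
spiked_known, spiked_measured = 20.0, 21.8
recovery = 100.0 * spiked_measured / spiked_known

# Precision: percent relative standard deviation (%RSD) of repeat measurements
repeats = np.array([19.9, 20.1, 20.0, 20.3, 19.8])
rsd = 100.0 * repeats.std(ddof=1) / repeats.mean()
```

With these invented numbers the recovery is 109.0% and the %RSD is about 1%, the same order as the values reported in the verification.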

  17. MUSCLE: multiple sequence alignment with high accuracy and high throughput.

    Science.gov (United States)

    Edgar, Robert C

    2004-01-01

    We describe MUSCLE, a new computer program for creating multiple alignments of protein sequences. Elements of the algorithm include fast distance estimation using k-mer counting, progressive alignment using a new profile function we call the log-expectation score, and refinement using tree-dependent restricted partitioning. The speed and accuracy of MUSCLE are compared with T-Coffee, MAFFT and CLUSTALW on four test sets of reference alignments: BAliBASE, SABmark, SMART and a new benchmark, PREFAB. MUSCLE achieves the highest, or joint highest, rank in accuracy on each of these sets. Without refinement, MUSCLE achieves average accuracy statistically indistinguishable from T-Coffee and MAFFT, and is the fastest of the tested methods for large numbers of sequences, aligning 5000 sequences of average length 350 in 7 min on a current desktop computer. The MUSCLE program, source code and PREFAB test data are freely available at http://www.drive5.com/muscle.
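
The fast distance estimation step can be illustrated with a minimal k-mer counting sketch. This is a simplified stand-in for MUSCLE's actual k-mer similarity measure, not its implementation:

```python
from collections import Counter

def kmer_counts(seq, k=3):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kmer_distance(a, b, k=3):
    """Approximate distance: 1 minus the fraction of shared k-mers,
    a cheap proxy for sequence similarity used to build a guide tree."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    shared = sum(min(ca[m], cb[m]) for m in ca)
    denom = min(sum(ca.values()), sum(cb.values()))
    return 1.0 - shared / denom

# Two nearly identical toy peptides versus two unrelated ones
d_close = kmer_distance("MKVLITGAGSG", "MKVLITGAGTG")
d_far = kmer_distance("MKVLITGAGSG", "QQWERTYPASD")
```

Because no alignment is computed, such distances can be evaluated in time linear in sequence length, which is what makes the progressive stage fast for thousands of sequences.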

  18. Remote Sensing Based Two-Stage Sampling for Accuracy Assessment and Area Estimation of Land Cover Changes

    Directory of Open Access Journals (Sweden)

    Heinz Gallaun

    2015-09-01

    Full Text Available Land cover change processes are accelerating at the regional to global level. The remote sensing community has developed reliable and robust methods for wall-to-wall mapping of land cover changes; however, land cover changes often occur at rates below the mapping errors. In the current publication, we propose a cost-effective approach to complement wall-to-wall land cover change maps with a sampling approach, which is used for accuracy assessment and accurate estimation of areas undergoing land cover changes, including provision of confidence intervals. We propose a two-stage sampling approach in order to keep accuracy, efficiency, and effort of the estimations in balance. Stratification is applied in both stages in order to gain control over the sample size allocated to rare land cover change classes on the one hand and the cost constraints for very high resolution reference imagery on the other. Bootstrapping is used to complement the accuracy measures and the area estimates with confidence intervals. The area estimates and verification estimations rely on a high quality visual interpretation of the sampling units based on time series of satellite imagery. To demonstrate the cost-effective operational applicability of the approach we applied it for assessment of deforestation in an area characterized by frequent cloud cover and very low change rate in the Republic of Congo, which makes accurate deforestation monitoring particularly challenging.
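
The bootstrapping step for attaching a confidence interval to an area estimate can be sketched as follows. The sample values and region size are invented for illustration, and the real design is two-stage and stratified, which this simple resample omits:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical reference sample: 1 = deforestation confirmed by visual
# interpretation of the sampling unit, 0 = no change (2.4% change rate)
sample = np.array([1] * 12 + [0] * 488)  # 500 sampling units
region_area_ha = 1_000_000               # assumed mapped area, ha

def change_area(units):
    """Estimated change area: sample change proportion scaled to the region."""
    return units.mean() * region_area_ha

# Bootstrap: resample the units with replacement and re-estimate the area
estimates = np.array([change_area(rng.choice(sample, size=sample.size, replace=True))
                      for _ in range(2000)])
low, high = np.percentile(estimates, [2.5, 97.5])
point = change_area(sample)
```

For rare change classes like this one, the resulting interval is wide relative to the point estimate, which is exactly why the paper stratifies the sample to gain control over the units allocated to rare classes.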

  19. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  20. High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data

    Science.gov (United States)

    Morelli, Eugene A.

    1997-01-01

    Many system identification and signal processing procedures can be done advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time domain data for high accuracy, and the chirp Zeta-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
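
The benefit of arbitrary frequency resolution can be seen in a small sketch that evaluates the finite Fourier transform by direct summation at any desired frequencies. This rectangle-rule sum is a simple stand-in for the paper's cubic-interpolation and chirp Zeta-transform machinery, with an invented test signal:

```python
import numpy as np

# Sampled time-domain data: 2 s of a 3.3 Hz sine sampled at 100 Hz
fs, T = 100.0, 2.0
t = np.arange(0.0, T, 1.0 / fs)
x = np.sin(2.0 * np.pi * 3.3 * t)

def finite_fourier(x, t, freqs):
    """Rectangle-rule approximation of the finite Fourier transform
    X(f) = integral of x(t) exp(-i 2 pi f t) dt over the record,
    evaluated at arbitrary frequencies."""
    dt = t[1] - t[0]
    return np.array([np.sum(x * np.exp(-2j * np.pi * f * t)) * dt for f in freqs])

# Sub-bin frequency resolution: 0.01 Hz steps, much finer than the
# 0.5 Hz native FFT resolution of a 2 s record
freqs = np.arange(2.0, 5.0, 0.01)
X = finite_fourier(x, t, freqs)
f_peak = freqs[np.argmax(np.abs(X))]
```

The peak lands at the true 3.3 Hz even though that frequency falls between FFT bins, illustrating why arbitrary resolution matters for capturing details in the frequency domain.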

  1. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies - the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW) - share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States - India, Pakistan and Israel - from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. 
The world is

  2. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    International Nuclear Information System (INIS)

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).

  3. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Crowell, Michael W [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).

  4. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    Science.gov (United States)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
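
The difference-plotting idea behind the comparison can be reduced to a toy check-case: the same scenario propagated by two independently implemented integrators, with the discrepancy quantified by differencing. The constant-gravity point-mass scenario below is invented for illustration and is not one of the report's actual check-cases:

```python
import numpy as np

g = 9.80665                          # m/s^2, standard gravity
v0 = np.array([50.0, 0.0, 30.0])     # initial velocity, m/s
dt, n = 0.001, 4000                  # 4 s of flight, 1 ms steps

def deriv(state):
    """State is [x, y, z, vx, vy, vz]; constant gravity, no drag."""
    return np.concatenate([state[3:], [0.0, 0.0, -g]])

def propagate_euler(dt, n):
    state = np.concatenate([np.zeros(3), v0])
    traj = [state[:3].copy()]
    for _ in range(n):
        state = state + dt * deriv(state)
        traj.append(state[:3].copy())
    return np.array(traj)

def propagate_rk4(dt, n):
    state = np.concatenate([np.zeros(3), v0])
    traj = [state[:3].copy()]
    for _ in range(n):
        k1 = deriv(state)
        k2 = deriv(state + 0.5 * dt * k1)
        k3 = deriv(state + 0.5 * dt * k2)
        k4 = deriv(state + dt * k3)
        state = state + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        traj.append(state[:3].copy())
    return np.array(traj)

traj_euler = propagate_euler(dt, n)
traj_rk4 = propagate_rk4(dt, n)

# "Difference plot" summarized as the maximum position discrepancy
max_diff = np.abs(traj_euler - traj_rk4).max()
final_z = traj_rk4[-1, 2]
```

For this quadratic-in-time solution RK4 is essentially exact, so the difference isolates the first-order integrator's truncation error, which is the kind of conclusion a family of independently derived solutions makes possible.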

  5. Legal verification of the dosimetric instrumentation using for radiation protection in Cuba

    International Nuclear Information System (INIS)

    Walwyn, A.; Morales, J.A.

    1999-01-01

    In April 1998 the Decree-Law 183 of Metrology was published in the Gaceta Oficial de la Republica de Cuba. It establishes the principles and general regulations for the organisation and juridical system of metrological activity in Cuba. In the radiation protection field, this legislation promotes the establishment of a verification service for radiation measuring instruments used in practices with radiation sources in the country. The limitations of the old Cuban verification standards with respect to dosimetric quantities and the types of instruments to which they apply, together with the publication of new international standards that include the operational quantities used for instrument measurements, led to the elaboration of the X and Gamma Radiation Meters Used in Radiation Protection standard. The requirements of metrological aptitude are taken from test procedures described in the International Electrotechnical Commission (IEC) standards on photon monitoring equipment. The Secondary Standard Dosimetry Laboratory of the Centre for Radiation Protection and Hygiene will start the verification service for radiation protection instruments. The start of this service is an essential element in improving the accuracy of ionising radiation metrology in Cuba, and has an evident impact on the protection of occupationally exposed workers, because keeping the instruments in good technical condition becomes a legal requirement for users of ionising radiation

  6. Dosimetric pre-treatment verification of IMRT using an EPID; clinical experience

    International Nuclear Information System (INIS)

    Zijtveld, Mathilda van; Dirkx, Maarten L.P.; Boer, Hans C.J. de; Heijmen, Ben J.M.

    2006-01-01

    Background and purpose: In our clinic a QA program for IMRT verification, fully based on dosimetric measurements with electronic portal imaging devices (EPID), has been running for over 3 years. The program includes a pre-treatment dosimetric check of all IMRT fields. During a complete treatment simulation at the linac, a portal dose image (PDI) is acquired with the EPID for each patient field and compared with a predicted PDI. In this paper, the results of this pre-treatment procedure are analysed, and intercepted errors are reported. An automated image analysis procedure is proposed to limit the number of fields that need human intervention in PDI comparison. Materials and methods: Most of our analyses are performed using the γ index with 3% local dose difference and 3 mm distance to agreement as reference values. Scalar parameters are derived from the γ values to summarize the agreement between measured and predicted 2D PDIs. Areas with all pixels having γ values larger than one are evaluated, making decisions based on clinically relevant criteria more straightforward. Results: In 270 patients, the pre-treatment checks revealed four clinically relevant errors. Calculation of statistics for a group of 75 patients showed that the patient-averaged mean γ value inside the field was 0.43 ± 0.13 (1 SD) and only 6.1 ± 6.8% of pixels had a γ value larger than one. With the proposed automated image analysis scheme, visual inspection of images can be avoided in 2/3 of the cases. Conclusion: EPIDs may be used for high accuracy and high resolution routine verification of IMRT fields to intercept clinically relevant dosimetric errors prior to the start of treatment. For the majority of fields, PDI comparison can fully rely on an automated procedure, avoiding excessive workload
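
The γ evaluation used throughout this QA program can be sketched in one dimension. The brute-force implementation below applies the same 3% local dose difference and 3 mm distance-to-agreement criteria to invented dose profiles; the real comparison is between 2D portal dose images:

```python
import numpy as np

def gamma_1d(x_mm, measured, predicted, dd=0.03, dta_mm=3.0):
    """1D gamma index with a local dose-difference criterion dd and a
    distance-to-agreement criterion dta_mm, brute force over all points."""
    gam = np.empty_like(measured)
    for i, (xi, mi) in enumerate(zip(x_mm, measured)):
        dose_term = (mi - predicted) / (dd * predicted)  # local dose difference
        dist_term = (xi - x_mm) / dta_mm                 # distance to agreement
        gam[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gam

# Invented profiles: predicted sigmoid field edge, measured 1% hot everywhere
x = np.arange(0.0, 100.0, 1.0)                               # position, mm
predicted = 100.0 / (1.0 + np.exp((x - 70.0) / 5.0)) + 1.0   # avoid zero dose
measured = predicted * 1.01                                  # uniform 1% error
g = gamma_1d(x, measured, predicted)
```

A uniform 1% error yields γ = 1/3 everywhere, well inside the γ ≤ 1 pass criterion; scalar summaries such as the mean γ or the fraction of pixels with γ > 1, as used above, follow directly from the γ map.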

  7. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  8. High accuracy interface characterization of three phase material systems in three dimensions

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Quantification of interface properties such as two phase boundary area and triple phase boundary length is important in the characterization of many material microstructures, in particular for solid oxide fuel cell electrodes. Three-dimensional images of these microstructures can be obtained by tomography schemes such as focused ion beam serial sectioning or micro-computed tomography. We present a high accuracy method of calculating two phase surface areas and triple phase length of triple phase systems from subvoxel accuracy segmentations of constituent phases. The method performs a three phase polygonization of the interface boundaries which results in a non-manifold mesh of connected faces. We show how the triple phase boundaries can be extracted as connected curve loops without branches. The accuracy of the method is analyzed by calculations on geometrical primitives...

  9. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
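
The statistical heart of such a guideline, the Weibull failure distribution and its transfer from elementary coupon data to a full-scale structure, can be sketched with a two-parameter Weibull model and a volume (size-effect) scaling factor. The modulus, characteristic strength and volume ratio below are illustrative, not values from the guideline:

```python
import math

def weibull_failure_prob(stress, m, sigma0, volume_ratio=1.0):
    """Two-parameter Weibull failure probability with a volume scaling
    factor, as used to transfer coupon-level statistics to a larger part.
    m is the Weibull modulus, sigma0 the characteristic strength from
    elementary tests; all numbers here are hypothetical."""
    return 1.0 - math.exp(-volume_ratio * (stress / sigma0) ** m)

# Coupon scale: 63.2% failure probability at the characteristic strength
p_coupon = weibull_failure_prob(300.0, m=10.0, sigma0=300.0)

# The same stress applied to a part with 100x the stressed volume
# fails far more often - the brittle-material size effect
p_part = weibull_failure_prob(300.0, m=10.0, sigma0=300.0, volume_ratio=100.0)
```

This size effect is why the guideline insists on a consistent elementary-test database before any full-scale reliability claim can be made.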

  10. DeepMitosis: Mitosis detection via deep detection, verification and segmentation networks.

    Science.gov (United States)

    Li, Chao; Wang, Xinggang; Liu, Wenyu; Latecki, Longin Jan

    2018-04-01

    Mitotic count is a critical predictor of tumor aggressiveness in the breast cancer diagnosis. Nowadays mitosis counting is mainly performed by pathologists manually, which is extremely arduous and time-consuming. In this paper, we propose an accurate method for detecting the mitotic cells from histopathological slides using a novel multi-stage deep learning framework. Our method consists of a deep segmentation network for generating mitosis region when only a weak label is given (i.e., only the centroid pixel of mitosis is annotated), an elaborately designed deep detection network for localizing mitosis by using contextual region information, and a deep verification network for improving detection accuracy by removing false positives. We validate the proposed deep learning method on two widely used Mitosis Detection in Breast Cancer Histological Images (MITOSIS) datasets. Experimental results show that we can achieve the highest F-score on the MITOSIS dataset from ICPR 2012 grand challenge merely using the deep detection network. For the ICPR 2014 MITOSIS dataset that only provides the centroid location of mitosis, we employ the segmentation model to estimate the bounding box annotation for training the deep detection network. We also apply the verification model to eliminate some false positives produced from the detection model. By fusing scores of the detection and verification models, we achieve the state-of-the-art results. Moreover, our method is very fast with GPU computing, which makes it feasible for clinical practice. Copyright © 2018 Elsevier B.V. All rights reserved.
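
The final fusion step can be illustrated with a minimal score-combination rule. The weighted average and threshold below are hypothetical, since the abstract does not specify the actual fusion formula:

```python
def fuse_scores(det_score, ver_score, w=0.5, threshold=0.5):
    """Keep a candidate mitosis only if the weighted combination of the
    detection score and the verification score clears a threshold.
    The weight and threshold are illustrative, not from the paper."""
    return w * det_score + (1.0 - w) * ver_score >= threshold

# A confident detection confirmed by verification survives; a detection
# rejected by the verification network (a likely false positive) does not
decisions = [fuse_scores(d, v) for d, v in [(0.9, 0.8), (0.9, 0.05), (0.2, 0.6)]]
```

The point of the second-stage verification network is visible in the middle case: a high detection score alone is not enough once the verifier disagrees.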

  11. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  12. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)

    2008-06-15

    - Technical aspect · It is intended that digital I and C software be safe and reliable. The project results help the software acquire licensing. The software verification techniques resulting from this project can be used for digital NPPs (nuclear power plants) in the future. · This research presents many meaningful verification results on digital protection logic and suggests an I and C software testing strategy. These results can be applied to verify nuclear fusion devices, accelerators, nuclear waste management systems and nuclear medical devices that require dependable software and highly reliable controllers. Moreover, they can be used for military, medical or aerospace-related software. - Economical and industrial aspect · Since the safety of digital I and C software is highly important, it is essential for the software to be verified, but verification and licence acquisition for digital I and C software involve high costs. This project benefits the domestic economy by using the introduced verification and testing techniques instead of foreign ones. · The operation rate of NPPs will rise when NPP safety-critical software is verified with intelligent V and V tools. It is expected that this software will substitute for safety-critical software that currently depends wholly on foreign suppliers. Consequently, the result of this project has high commercial value, and recognition of the software development work can spread to industrial circles. - Social and cultural aspect People expect nuclear power generation to contribute to relieving environmental problems because it emits fewer harmful air pollutants than other forms of power generation. To earn society's trust in nuclear power generation, we should demonstrate that NPPs are highly safe systems. From that point of view, we can present highly reliable I and C, proven by intelligent V and V techniques, as evidence

  13. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  14. A study into the review and verification of breast images treated with isocentric technique

    International Nuclear Information System (INIS)

    Mitchell, Fiona

    2007-01-01

    In radiation therapy practice, portal imaging is a common occurrence. Radiation Oncologists want to be able to view the actual treatment port and compare it to the simulated view for quality assurance. Historically, this has been the domain of oncologists only, but with changes in imaging technology this area of practice is now more commonly shared with radiation therapists. Purpose: The primary aim of this study was to compare the Radiation Therapists' review and verification of electronic portal images against the Radiation Oncologist's practice in the treatment of breast cancer. A secondary aim was to enhance the use of electronic portal imaging. Methods: The study was divided into two parts. Part 1 reviewed imaging of tangential breast treatment and part 2 reviewed the mono-isocentric four-field breast technique. The review and verification of the images were conducted by the Radiation Therapists and Radiation Oncologists and their results were compared. Results: Overall the Radiation Oncologist agreed with 96.9% of the images approved by the Radiation Therapists, giving a rejection rate of 3.1%. In general, the Radiation Therapists adhered to the guidelines more closely than the Radiation Oncologist; hence the Radiation Therapists' rejection rate was 7.0% greater than the Radiation Oncologist's. Conclusions: The practice of electronic portal image review and verification in the treatment of breast cancer can be streamlined and achieved more efficiently. The Radiation Therapists consistently demonstrated their ability to review and verify the portal images at a level equivalent to the Radiation Oncologist. Given the high standard of accuracy demonstrated, the process of portal image review should be transferred to the Radiation Therapist. This transfer leads to a reduction in duplication of tasks, an increase in the use of technology, an improvement in efficiencies, and an increase in the quality of care, which will potentially lead to more

  15. High-accuracy dosimetry study for intensity-modulated radiation therapy(IMRT) commissioning

    International Nuclear Information System (INIS)

    Jeong, Hae Sun

    2010-02-01

    Intensity-modulated radiation therapy (IMRT), an advanced modality of high-precision radiotherapy, allows for an increase in dose to the tumor volume without increasing the dose to nearby critical organs. In order to successfully achieve the treatment, intensive dosimetry with accurate dose verification is necessary. Dosimetry for IMRT, however, is a challenging task due to dosimetrically unfavorable phenomena such as dramatic dose changes at the field boundaries, electron disequilibrium, non-uniformity between the detector and phantom materials, and distortion of scanner-read doses. In the present study, therefore, a LEGO-type multi-purpose dosimetry phantom was developed and used for studies on dose measurement and correction. Phantom materials for muscle, fat, bone, and lung tissue were selected after considering mass density, atomic composition, effective atomic number, and photon interaction coefficients. The phantom also includes dosimeter holders for several different types of detectors including films, which accommodates the construction of different phantom designs as necessary. In order to evaluate its performance, the developed phantom was tested by measuring the point dose and the percent depth dose (PDD) for small fields under several heterogeneous conditions. However, the measurements with the two types of dosimeter did not agree well for field sizes less than 1 × 1 cm² in muscle and bone, and less than 3 × 3 cm² in air cavity. Thus, it was recognized that several studies on small-field dosimetry and correction methods for the calculation with a PMCEPT code are needed. The under-estimated values from the ion chamber were corrected with a convolution method employed to eliminate the volume effect of the chamber. As a result, the discrepancies between the EBT film and the ion chamber measurements were significantly decreased, from 14% to 1% (1 × 1 cm²), 10% to 1% (0.7 × 0.7 cm²), and 42% to 7% (0.5 × 0
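
The volume-effect correction mentioned in the abstract can be illustrated with a simple 1-D model in which the chamber reading is the true small-field profile averaged over the cavity width. The Gaussian field, the 6 mm cavity, and all numbers below are illustrative assumptions, not values from the thesis:

```python
import numpy as np

x = np.linspace(-20.0, 20.0, 4001)       # lateral position, mm

# "True" profile of a very small field, modelled here as a Gaussian
sigma = 3.0                              # mm, assumed penumbra scale
true_dose = np.exp(-0.5 * (x / sigma) ** 2)

# Chamber response kernel: uniform averaging over an assumed 6 mm cavity
cavity = 6.0                             # mm
kernel = np.where(np.abs(x) <= cavity / 2.0, 1.0, 0.0)
kernel /= kernel.sum()

# Chamber reading = true dose blurred by the cavity (the volume effect)
reading = np.convolve(true_dose, kernel, mode="same")

# Central-axis correction factor: how much the chamber under-reads the peak
centre = np.argmin(np.abs(x))
correction = true_dose[centre] / reading[centre]
print(correction > 1.0)   # the blurred reading underestimates the peak
```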

  16. A Smart High Accuracy Silicon Piezoresistive Pressure Sensor Temperature Compensation System

    Directory of Open Access Journals (Sweden)

    Guanwu Zhou

    2014-07-01

    Theoretical analysis in this paper indicates that the accuracy of a silicon piezoresistive pressure sensor is mainly affected by thermal drift, and varies nonlinearly with the temperature. Here, a smart temperature compensation system to reduce its effect on accuracy is proposed. Firstly, an effective conditioning circuit for signal processing and data acquisition is designed. The hardware to implement the system is fabricated. Then, a program is developed on LabVIEW which incorporates an extreme learning machine (ELM) as the calibration algorithm for the pressure drift. The implementation of the algorithm was ported to a micro-control unit (MCU) after calibration in the computer. Practical pressure measurement experiments are carried out to verify the system's performance. The temperature compensation is solved in the interval from −40 to 85 °C. The compensated sensor is aimed at providing pressure measurement in oil-gas pipelines. Compared with other algorithms, ELM acquires higher accuracy and is more suitable for batch compensation because of its higher generalization and faster learning speed. The accuracy, linearity, zero temperature coefficient and sensitivity temperature coefficient of the tested sensor are 2.57% FS, 2.49% FS, 8.1 × 10−5/°C and 29.5 × 10−5/°C before compensation, and are improved to 0.13% FS, 0.15% FS, 1.17 × 10−5/°C and 2.1 × 10−5/°C respectively, after compensation. The experimental results demonstrate that the proposed system is valid for the temperature compensation and high accuracy requirement of the sensor.
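
A minimal sketch of the ELM calibration idea, assuming a synthetic quadratic drift model and an arbitrary network size (neither taken from the paper): the hidden layer is random and frozen, and only the output weights are solved, in closed form, by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: the sensor output drifts nonlinearly with
# temperature. This quadratic drift model is an illustrative assumption.
T = np.linspace(-40.0, 85.0, 200).reshape(-1, 1)   # temperature, degC
y = (0.02 * T + 1e-4 * T ** 2).ravel()             # assumed drift to learn

# ELM: fixed random hidden layer; only the output weights are trained
Tn = T / 85.0                                      # normalise so tanh does not saturate
n_hidden = 30
W = rng.normal(size=(1, n_hidden))                 # random input weights (frozen)
b = rng.normal(size=n_hidden)                      # random biases (frozen)
H = np.tanh(Tn @ W + b)                            # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)       # closed-form least squares

# Compensate a reading at 25 degC: predict the drift and compare to truth
T_test = np.array([[25.0]])
predicted = (np.tanh((T_test / 85.0) @ W + b) @ beta)[0]
residual = abs(predicted - (0.02 * 25.0 + 1e-4 * 25.0 ** 2))
print(residual < 0.05)
```

The closed-form solve is what gives ELM the fast "training" the abstract credits it with; no iterative backpropagation is involved.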

  17. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" implies a user matching a fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  18. Dosimetric Uncertainties in Verification of Intensity Modulated Photon Beams

    International Nuclear Information System (INIS)

    Jurkovic, S.

    2010-01-01

    The doctoral thesis presents a method for calculating the shape of compensators used to modulate linear accelerators' beams. A characteristic of the method is a stricter calculation of the scattered radiation in beams with an inhomogeneous cross-section than was previously available. The method can be applied in various clinical situations. Its dosimetric verification was performed in phantoms, measuring dose distributions using ionization chambers as well as radiographic film: ionization chambers were used for the evaluation of the modulator shape and film was used for the evaluation of two-dimensional dose distributions. It is well known that dosimetry of intensity-modulated photon beams is rather complicated owing to the inhomogeneity of the dose distribution. The main reason for that is the beam modulator, which changes the spectral distribution of the beam. The possibility of using different types of detectors for measuring dose distributions in modulated photon beams, and their accuracy, was examined. Small-volume ionization chambers, different diodes, an amorphous silicon detector and radiographic film were used. Measured dose distributions were compared with each other as well as with distributions simulated using a Monte Carlo particle transport algorithm. In this way the most accurate method for the verification of modulated photon beams is suggested. (author)

  19. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
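
As a toy illustration of integrating the technology subsystems, the combined probability of detection can be computed under an independence assumption. IVSEM's actual model is not described here, and the probabilities below are hypothetical:

```python
def combined_detection_probability(probs):
    """P(at least one technology detects the event), assuming the
    technologies miss the event independently of one another."""
    p_miss = 1.0
    for p in probs:
        p_miss *= 1.0 - p
    return 1.0 - p_miss

# Hypothetical per-technology detection probabilities for a single event:
# seismic, infrasound, radionuclide, hydroacoustic (illustrative only)
probs = [0.80, 0.30, 0.50, 0.10]
print(round(combined_detection_probability(probs), 4))  # 0.937
```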

  20. Monitoring/Verification Using DMS: TATP Example

    International Nuclear Information System (INIS)

    Kevin Kyle; Stephan Weeks

    2008-01-01

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations management systems can be readily deployed and optimized for changing application scenarios. Developing selective DMS motes for a 'smart dust' sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling way to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species, owing to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15-300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements

  1. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload before power operation. A Vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on various corrections and assumptions between the ex-core detectors and the core used in traditional physics testing programs. This program also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  2. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  3. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    In the implementation of all arms control agreements, accurate verification is essential. In setting a course for verifying compliance with a given treaty - whether the NPT or the CWC - one must make a technical comparison of existing information-gathering capabilities against the constraints in an agreement. Then it must be decided whether this level of verifiability is good enough. Generally, the policy standard of 'effective verification' includes the ability to detect significant violations, with high confidence, in sufficient time to respond effectively with policy adjustments or other responses, as needed. It is at this juncture that verification approaches have traditionally diverged. Nuclear safeguards requirements have taken one path while chemical verification methods have pursued another. However, recent technological advances have brought a number of changes affecting verification, and lately their pace has been accelerating. First, all verification regimes have more and better information as a result of new kinds of sensors, imagery, and other technologies. Second, the verification provisions in agreements have also advanced, to include on-site inspections, portal monitoring, data exchanges, and a variety of transparency, confidence-building, and other cooperative measures. Together these developments translate into a technological overlap of certain institutional verification measures, such as the NPT's safeguards requirements implemented by the IAEA and the CWC's verification provisions implemented by the OPCW. Hence, a priority of international treaty-implementing organizations is exploring the development of a synergistic and coordinated approach to WMD policy making that takes into account existing inter-linkages between nuclear, chemical, and biological weapons issues.
Specific areas of coordination include harmonizing information systems and information exchanges and the shared application of scientific mechanisms, as well as collaboration on technological developments

  4. SU-F-T-218: Validation of An In-Vivo Proton Range Verification Method for Reducing the Risk of Permanent Alopecia in the Treatment of Pediatric Medulloblastoma

    Energy Technology Data Exchange (ETDEWEB)

    Lucconi, G [Department of Medical Physics, S. Orsola-Malpighi University Hospital, Bologna (Italy); Department of Radiation Oncology, Massachusetts General Hospital, Boston, MA (United States); Bentefour, E; Janssens, G [Advanced Technology Group, Ion Beam Applications (IBA), Louvain la Neuve (Belgium); Deepak, S [Department of Physics, Central University of Karnataka, Karnataka 585367 (India); Weaver, K; Moteabbed, M; Lu, H-M [Department of Radiation Oncology, Massachusetts General Hospital, Boston, MA (United States)

    2016-06-15

    Purpose: The clinical commissioning of a workflow for pre-treatment range verification/adjustment for the head treatment of pediatric medulloblastoma patients, including dose monitoring during treatment. Methods: An array of Si-diodes (DIODES Incorporated) is placed on the patient skin on the opposite side to the beam entrance. A “scout” SOBP beam, with a longer beam range to cover the diodes in its plateau, is delivered; the measured signal is analyzed and the extracted water equivalent path lengths (WEPL) are compared to the expected values, revealing if a range correction is needed. Diodes stay in place during treatment to measure dose. The workflow was tested in solid water and head phantoms and validated against independent WEPL measurements. Both measured WEPL and skin doses were compared to computed values from the TPS (XiO); a Markus chamber was used for reference dose measurements. Results: The WEPL accuracy of the method was verified by comparing it with the dose extinction method. It resulted, for both solid water and head phantom, in the sub-millimeter range, with a deviation less than 1% to the value extracted from the TPS. The accuracy of dose measurements in the fall-off part of the dose profile was validated against the Markus chamber. The entire range verification workflow was successfully tested for the mock-treatment of head phantom with the standard delivery of 90 cGy per field per fraction. The WEPL measurement revealed no need for range correction. The dose measurements agreed to better than 4% with the prescription dose. The robustness of the method and workflow, including detector array, hardware set and software functions, was successfully stress-tested with multiple repetitions. Conclusion: The performance of the in-vivo range verification system and related workflow meet the clinical requirements in terms of the needed WEPL accuracy for pretreatment range verification with acceptable dose to the patient.

  5. Development of a three-dimensional high-order strand-grids approach

    Science.gov (United States)

    Tong, Oisin

    Development of a novel high-order flux correction method on strand grids is presented. The method uses a combination of flux correction in the unstructured plane and summation-by-parts operators in the strand direction to achieve high-fidelity solutions. Low-order truncation errors are cancelled with accurate flux and solution gradients in the flux correction method, thereby achieving a formal order of accuracy of 3, although higher orders are often obtained, especially for highly viscous flows. In this work, the scheme is extended to high-Reynolds number computations in both two and three dimensions. Turbulence closure is achieved with a robust version of the Spalart-Allmaras turbulence model that accommodates negative values of the turbulence working variable, and the Menter SST turbulence model, which blends the k-epsilon and k-omega turbulence models for better accuracy. A major advantage of this high-order formulation is the ability to implement traditional finite-volume-like limiters to cleanly capture shocked and discontinuous flows. In this work, this approach is explored via a symmetric limited positive (SLIP) limiter. Extensive verification and validation is conducted in two and three dimensions to determine the accuracy and fidelity of the scheme for a number of different cases. Verification studies show that the scheme achieves better than third-order accuracy for low and high-Reynolds number flows. Cost studies show that in three dimensions, the third-order flux correction scheme requires only 30% more walltime than a traditional second-order scheme on strand grids to achieve the same level of convergence. In order to overcome meshing issues at sharp corners and other small-scale features, a unique approach to traditional geometry, coined "asymptotic geometry," is explored. Asymptotic geometry is achieved by filtering out small-scale features in a level set domain through min/max flow. This approach is combined with a curvature-based strand shortening

  6. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains in the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation from theoretical research on WVSNs to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module, designs a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve the functions of image acquisition, coding and wireless transmission, and can obtain effective distance parameters between nodes, which lays a good foundation for follow-up applications of WVSNs.

  7. Ultra-high accuracy optical testing: creating diffraction-limitedshort-wavelength optical systems

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, Kenneth A.; Naulleau, Patrick P.; Rekawa, Senajith B.; Denham, Paul E.; Liddle, J. Alexander; Gullikson, Eric M.; Jackson, KeithH.; Anderson, Erik H.; Taylor, John S.; Sommargren, Gary E.; Chapman,Henry N.; Phillion, Donald W.; Johnson, Michael; Barty, Anton; Soufli,Regina; Spiller, Eberhard A.; Walton, Christopher C.; Bajt, Sasa

    2005-08-03

    Since 1993, research in the fabrication of extreme ultraviolet (EUV) optical imaging systems, conducted at Lawrence Berkeley National Laboratory (LBNL) and Lawrence Livermore National Laboratory (LLNL), has produced the highest resolution optical systems ever made. We have pioneered the development of ultra-high-accuracy optical testing and alignment methods, working at extreme ultraviolet wavelengths, and pushing wavefront-measuring interferometry into the 2-20 nm wavelength range (60-600 eV). These coherent measurement techniques, including lateral shearing interferometry and phase-shifting point-diffraction interferometry (PS/PDI), have achieved RMS wavefront measurement accuracies of 0.5-1 Å and better for primary aberration terms, enabling the creation of diffraction-limited EUV optics. The measurement accuracy is established using careful null-testing procedures, and has been verified repeatedly through high-resolution imaging. We believe these methods are broadly applicable to the advancement of short-wavelength optical systems including space telescopes, microscope objectives, projection lenses, synchrotron beamline optics, diffractive and holographic optics, and more. Measurements have been performed on a tunable undulator beamline at LBNL's Advanced Light Source (ALS), optimized for high coherent flux; although many of these techniques should be adaptable to alternative ultraviolet, EUV, and soft x-ray light sources. To date, we have measured nine prototype all-reflective EUV optical systems with NA values between 0.08 and 0.30 (f/6.25 to f/1.67). These projection-imaging lenses were created for the semiconductor industry's advanced research in EUV photolithography, a technology slated for introduction in 2009-13. This paper reviews the methods used and our program's accomplishments to date.

  8. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Science.gov (United States)

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  9. Application of round grating angle measurement composite error amendment in the online measurement accuracy improvement of large diameter

    Science.gov (United States)

    Wang, Biao; Yu, Xiaofen; Li, Qinzhao; Zheng, Yu

    2008-10-01

    Aiming at the influence of round grating dividing error, rolling-wheel eccentricity and surface shape errors, the paper provides an amendment method based on the rolling wheel to obtain a composite error model that includes all of the influence factors above, and then corrects the non-circular angle measurement error of the rolling wheel. Software simulation and experiments were performed for verification; the results indicate that the composite error amendment method can improve the diameter measurement accuracy obtained with rolling-wheel theory. It has wide application prospects for measurement accuracy requirements higher than 5 μm/m.

  10. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    Science.gov (United States)

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometric identifier, namely the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score-level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without greatly increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
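
The adaptive binary fusion idea, switching per comparison between the pre- and post-reconstruction matching distances, can be sketched as below. The specific rule and threshold are illustrative assumptions rather than the rule from the paper:

```python
# Illustrative sketch of score-level adaptive binary fusion of matching
# distances before (d_pre) and after (d_post) reconstruction.

def adaptive_binary_fusion(d_pre, d_post, ratio_threshold=0.7):
    """Use the post-reconstruction distance only when reconstruction
    shrank the distance markedly (likely a genuine match distorted by
    finger pose); otherwise keep the original distance, so impostor
    distances are not reduced as well."""
    if d_post < ratio_threshold * d_pre:
        return d_post     # large reduction: trust the reconstruction
    return d_pre          # small reduction: keep the raw distance

# Genuine pair whose distance was inflated by pose variation
print(adaptive_binary_fusion(0.80, 0.40))  # 0.4
# Impostor pair: reconstruction shrinks the distance only slightly
print(adaptive_binary_fusion(0.85, 0.75))  # 0.85
```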

  11. Verification of gamma knife based fractionated radiosurgery with newly developed head-thorax phantom

    International Nuclear Information System (INIS)

    Bisht, Raj Kishor; Kale, Shashank Sharad; Natanasabapathi, Gopishankar; Singh, Manmohan Jit; Agarwal, Deepak; Garg, Ajay; Rath, Goura Kishore; Julka, Pramod Kumar; Kumar, Pratik; Thulkar, Sanjay; Sharma, Bhawani Shankar

    2016-01-01

    Objective: The purpose of the study is to verify Gamma Knife Extend™ system (ES) based fractionated stereotactic radiosurgery with a newly developed head-thorax phantom. Methods: Phantoms are extensively used to measure radiation dose and verify treatment plans in radiotherapy. A human upper-body-shaped phantom with thorax was designed to simulate fractionated stereotactic radiosurgery using the Extend™ system of the Gamma Knife. The central component of the phantom aids in performing radiological precision tests, dosimetric evaluation and treatment verification. A hollow right circular cylindrical space of diameter 7.0 cm was created at the centre of this component to place various dosimetric devices using suitable adaptors. The phantom is made of poly(methyl methacrylate) (PMMA), a transparent thermoplastic material. Two sets of disk assemblies were designed to place dosimetric films in (1) horizontal (xy) and (2) vertical (xz) planes. Specific cylindrical adaptors were designed to place a thimble ionization chamber inside the phantom for point dose recording along the xz axis. EBT3 Gafchromic films were used to analyze and map the radiation field. The focal precision test was performed using a 4 mm collimator shot in the phantom to check the radiological accuracy of treatment. The phantom head position within the Extend™ frame was estimated using encoded aperture measurement of the repositioning check tool (RCT). For treatment verification, the phantom with inserts for film and ion chamber was scanned in the reference treatment position using an X-ray computed tomography (CT) machine and the acquired stereotactic images were transferred into Leksell GammaPlan (LGP). A patient treatment plan with a hypo-fractionated regimen was delivered and identical fractions were compared using EBT3 films and in-house MATLAB codes. Results: RCT measurement showed an overall positional accuracy of 0.265 mm (range 0.223 mm–0.343 mm). Gamma index analysis across fractions exhibited close agreement between LGP and film
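
The film-versus-plan comparison above relies on gamma-index analysis. As a rough illustration of the metric itself (not the authors' in-house MATLAB code), a 1-D gamma computation with a 3%/3 mm criterion can be written as:

```python
import math

def gamma_index_1d(ref_pos, ref_dose, eval_pos, eval_dose,
                   dose_tol=0.03, dist_tol=3.0):
    """Per-reference-point gamma: the minimum combined dose-difference /
    distance-to-agreement metric over all evaluated points.
    A reference point passes the test when its gamma <= 1."""
    return [min(math.sqrt(((xe - xr) / dist_tol) ** 2 +
                          ((de - dr) / dose_tol) ** 2)
                for xe, de in zip(eval_pos, eval_dose))
            for xr, dr in zip(ref_pos, ref_dose)]
```

Identical dose profiles yield gamma = 0 everywhere; a uniform 5% dose offset (with a 3% tolerance) fails every point.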

  12. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    Science.gov (United States)

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts, and therefore causes self-verification behavior. Two levels of self verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness seeking behavior as the personal-level self verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach, and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  13. Accuracy of field alignment in abdominal radiation therapy

    International Nuclear Information System (INIS)

    Kortmann, R. D.; Hess, C. F.; Meisner, C.; Schmidberger, H.; Bamberg, M.

    1996-01-01

    Purpose: To assess the accuracy of field alignment in a homogeneous group of patients undergoing radiotherapy of the abdomen (adjuvant treatment of the paraaortic region in Stage I testicular seminoma). To evaluate the predictive value of the first verification on field placement errors during subsequent treatment delivery. Methods and Materials: In 45 patients, linear and rotational discrepancies were measured between simulation and first check and between 10 consecutive verification films. Results: For the total group of patients, the distribution of all deviations showed mean values between 2.3 mm and -2.7 mm with standard deviations of 3.9 mm to 4.7 mm for linear discrepancies, and -0.5 deg. to 0.3 deg. with standard deviations of 1.2 deg. to 2.1 deg. for rotational discrepancies, respectively. For all patients, deviations for the transition from simulator to the treatment machine were similar to deviations during subsequent treatment delivery, with 95% of all absolute deviations < 10.0 mm and 4 deg. , respectively. When performing correlation analysis between deviations at first check and during treatment delivery, a correlation for lateral displacements and a borderline correlation for caudal displacements could be found. There was no correlation for cranial and rotational displacements. Conclusions: Although a trend of deviations for subsequent treatment delivery may be shown at first check, our analysis indicates that the first verification cannot reliably predict inaccuracies during treatment delivery. Random fluctuations of field displacements of up to 1.0 cm prevail. They must be considered when prescribing the safety margins of the planned target volume and determining cutoff points for corrective actions in abdominal radiation therapy
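
Safety margins of the planned target volume are usually derived from the systematic and random components of such measured displacements. The decomposition below is a standard margin-recipe convention, sketched here for illustration; it is not code from the paper:

```python
import math
import statistics

def setup_error_components(per_patient_shifts):
    """Population systematic error (SD of per-patient mean shifts) and
    population random error (RMS of per-patient SDs), both in the same
    units as the input shifts (e.g. mm along one axis)."""
    means = [statistics.mean(s) for s in per_patient_shifts]
    sds = [statistics.stdev(s) for s in per_patient_shifts]
    systematic = statistics.stdev(means)
    random_ = math.sqrt(sum(sd ** 2 for sd in sds) / len(sds))
    return systematic, random_
```

Feeding in each patient's series of verification-film shifts for one axis gives the two numbers that margin formulas weight differently.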

  14. Verification of the linac isocenter for stereotactic radiosurgery using cine-EPID imaging and arc delivery

    International Nuclear Information System (INIS)

    Rowshanfarzad, Pejman; Sabet, Mahsheed; O' Connor, Daryl J.; Greer, Peter B.

    2011-01-01

    Purpose: Verification of the mechanical isocenter position is required as part of comprehensive quality assurance programs for stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. Several techniques have been proposed for this purpose but each of them has certain drawbacks. In this paper, a new efficient and more comprehensive method using cine-EPID images has been introduced for automatic verification of the isocenter with sufficient accuracy for stereotactic applications. Methods: Using a circular collimator fixed to the gantry head to define the field, EPID images of a Winston-Lutz phantom were acquired in cine-imaging mode during 360 deg. gantry rotations. A robust MATLAB code was developed to analyze the data by finding the center of the field and the center of the ball bearing shadow in each image with sub-pixel accuracy. The distance between these two centers was determined for every image. The method was evaluated by comparison to results of a mechanical pointer and also by detection of a manual shift applied to the phantom position. The repeatability and reproducibility of the method were tested and it was also applied to detect couch and collimator wobble during rotation. Results: The accuracy of the algorithm was 0.03 ± 0.02 mm. The repeatability was less than 3 μm and the reproducibility was less than 86 μm. The time elapsed for the analysis of more than 100 cine images of Varian aS1000 and aS500 EPIDs were ∼65 and 20 s, respectively. Processing of images taken in integrated mode took 0.1 s. The output of the analysis software is printable and shows the isocenter shifts as a function of angle in both in-plane and cross-plane directions. It gives warning messages where the shifts exceed the criteria for SRS/SRT and provides useful data for the necessary adjustments in the system including bearing system and/or room lasers. Conclusions: The comprehensive method introduced in this study uses cine-images, is highly accurate, fast, and independent
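
Each cine-EPID frame reduces to finding two centroids, the circular radiation field and the ball-bearing shadow, and reporting their separation. The toy version of that per-frame step below is illustrative only; the pixel pitch and names are assumptions, not taken from the paper:

```python
import math

def centroid(pixels):
    """Mean position of a set of (x, y) pixel coordinates."""
    n = len(pixels)
    return (sum(x for x, _ in pixels) / n, sum(y for _, y in pixels) / n)

def isocenter_offset_mm(field_pixels, bb_pixels, pixel_mm=0.392):
    """Distance between the field centre and the ball-bearing centre,
    converted to millimetres at the detector plane."""
    fx, fy = centroid(field_pixels)
    bx, by = centroid(bb_pixels)
    return math.hypot(fx - bx, fy - by) * pixel_mm
```

Running this over every frame of a 360-degree arc gives the offset-versus-gantry-angle curve that the paper's software plots.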

  15. Verification of the linac isocenter for stereotactic radiosurgery using cine-EPID imaging and arc delivery

    Energy Technology Data Exchange (ETDEWEB)

    Rowshanfarzad, Pejman; Sabet, Mahsheed; O' Connor, Daryl J.; Greer, Peter B. [School of Mathematical and Physical Sciences, University of Newcastle, Newcastle, New South Wales 2308 (Australia); Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales 2310, Australia and School of Mathematical and Physical Sciences, University of Newcastle, Newcastle, New South Wales 2308 (Australia)

    2011-07-15

    Purpose: Verification of the mechanical isocenter position is required as part of comprehensive quality assurance programs for stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. Several techniques have been proposed for this purpose but each of them has certain drawbacks. In this paper, a new efficient and more comprehensive method using cine-EPID images has been introduced for automatic verification of the isocenter with sufficient accuracy for stereotactic applications. Methods: Using a circular collimator fixed to the gantry head to define the field, EPID images of a Winston-Lutz phantom were acquired in cine-imaging mode during 360 deg. gantry rotations. A robust MATLAB code was developed to analyze the data by finding the center of the field and the center of the ball bearing shadow in each image with sub-pixel accuracy. The distance between these two centers was determined for every image. The method was evaluated by comparison to results of a mechanical pointer and also by detection of a manual shift applied to the phantom position. The repeatability and reproducibility of the method were tested and it was also applied to detect couch and collimator wobble during rotation. Results: The accuracy of the algorithm was 0.03 ± 0.02 mm. The repeatability was less than 3 μm and the reproducibility was less than 86 μm. The time elapsed for the analysis of more than 100 cine images of Varian aS1000 and aS500 EPIDs were ∼65 and 20 s, respectively. Processing of images taken in integrated mode took 0.1 s. The output of the analysis software is printable and shows the isocenter shifts as a function of angle in both in-plane and cross-plane directions. It gives warning messages where the shifts exceed the criteria for SRS/SRT and provides useful data for the necessary adjustments in the system including bearing system and/or room lasers. Conclusions: The comprehensive method introduced in this study uses cine-images, is highly accurate, fast, and

  16. High-accuracy self-mixing interferometer based on single high-order orthogonally polarized feedback effects.

    Science.gov (United States)

    Zeng, Zhaoli; Qu, Xueming; Tan, Yidong; Tan, Runtao; Zhang, Shulian

    2015-06-29

    A simple and high-accuracy self-mixing interferometer based on single high-order orthogonally polarized feedback effects is presented. The single high-order feedback effect is realized when the dual-frequency laser reflects numerous times in a Fabry-Perot cavity and then goes back to the laser resonator along the same route. In this case, two orthogonally polarized feedback fringes with nanoscale resolution are obtained. This self-mixing interferometer has the advantage of higher sensitivity to weak signals than a conventional interferometer. In addition, the two orthogonally polarized fringes are useful for discriminating the moving direction of the measured object. An experiment measuring a 2.5 nm step was conducted, which shows great potential for nanometrology.

  17. Automated novel high-accuracy miniaturized positioning system for use in analytical instrumentation

    Science.gov (United States)

    Siomos, Konstadinos; Kaliakatsos, John; Apostolakis, Manolis; Lianakis, John; Duenow, Peter

    1996-01-01

    The development of three-dimensional automated devices (micro-robots) for applications in analytical instrumentation, clinical chemical diagnostics and advanced laser optics depends strongly on the ability of such a device: firstly, to be positioned automatically, with high accuracy and reliability, by means of user-friendly interface techniques; secondly, to be compact; and thirdly, to operate under vacuum conditions, free of most of the problems connected with conventional micropositioners using stepping-motor gear techniques. The objective of this paper is to develop and construct a mechanically compact computer-based micropositioning system for coordinated motion in the X-Y-Z directions with: (1) a positioning accuracy of less than 1 micrometer (the accuracy of the end-position of the system is controlled by a hardware/software assembly using a self-constructed optical encoder); (2) a heat-free propulsion mechanism for vacuum operation; and (3) synchronized X-Y motion.

  18. Identification and delineation of areas flood hazard using high accuracy of DEM data

    Science.gov (United States)

    Riadi, B.; Barus, B.; Widiatmaka; Yanuar, M. J. P.; Pramudya, B.

    2018-05-01

    Flood incidents that often occur in Karawang regency need to be mitigated. Hopes rest on technologies that can predict, anticipate and reduce disaster risks. Flood modeling techniques using Digital Elevation Model (DEM) data can be applied in mitigation activities. High-accuracy DEM data used in modeling will result in better flood models. Processing high-accuracy DEM data yields information about surface morphology which can be used to identify indications of flood hazard areas. The purpose of this study was to identify and delineate flood hazard areas by identifying wetland areas using DEM data and Landsat-8 images. High-resolution TerraSAR-X data are used to detect wetlands from the landscape, while land cover is identified from Landsat image data. The Topography Wetness Index (TWI) method is used to detect and identify wetland areas from the basic DEM data, while land-cover analysis uses the Tasseled Cap Transformation (TCT) method. TWI modeling yields information on land potentially subject to flooding. Overlaying the TWI map with the land-cover map shows that in Karawang regency the areas most vulnerable to flooding are rice fields. The spatial accuracy of the flood hazard area in this study was 87%.
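
The TWI itself is a simple per-cell quantity, ln(a / tan β), where a is the specific catchment area and β the local slope derived from the DEM. A minimal sketch of the per-cell computation (the slope floor is an assumption to handle flat cells, not a value from the paper):

```python
import math

def twi(specific_catchment_area, slope_deg, min_slope_deg=0.1):
    """Topographic Wetness Index ln(a / tan(beta)) for one DEM cell.
    The slope floor avoids division by zero on perfectly flat cells."""
    beta = math.radians(max(slope_deg, min_slope_deg))
    return math.log(specific_catchment_area / math.tan(beta))
```

Cells that accumulate more upslope area, or that are flatter, score higher and are flagged as potential wetlands.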

  19. Accuracy of Estimating Highly Eccentric Binary Black Hole Parameters with Gravitational-wave Detections

    Science.gov (United States)

    Gondán, László; Kocsis, Bence; Raffai, Péter; Frei, Zsolt

    2018-03-01

    Mergers of stellar-mass black holes on highly eccentric orbits are among the targets for ground-based gravitational-wave detectors, including LIGO, VIRGO, and KAGRA. These sources may commonly form through gravitational-wave emission in high-velocity dispersion systems or through the secular Kozai–Lidov mechanism in triple systems. Gravitational waves carry information about the binaries’ orbital parameters and source location. Using the Fisher matrix technique, we determine the measurement accuracy with which the LIGO–VIRGO–KAGRA network could measure the source parameters of eccentric binaries using a matched filtering search of the repeated burst and eccentric inspiral phases of the waveform. We account for general relativistic precession and the evolution of the orbital eccentricity and frequency during the inspiral. We find that the signal-to-noise ratio and the parameter measurement accuracy may be significantly higher for eccentric sources than for circular sources. This increase is sensitive to the initial pericenter distance, the initial eccentricity, and the component masses. For instance, compared to a 30 M⊙–30 M⊙ non-spinning circular binary, the chirp mass and sky-localization accuracy can improve by a factor of ∼129 (38) and ∼2 (11) for an initially highly eccentric binary assuming an initial pericenter distance of 20 M_tot (10 M_tot).
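
The Fisher-matrix technique mentioned above estimates parameter accuracy from the sensitivity of the signal to each parameter: F_ij = Σ_k (∂h/∂θ_i)(∂h/∂θ_j)/σ², with 1-σ errors given by the square roots of the diagonal of F⁻¹. The generic finite-difference sketch below only illustrates the mechanics; the toy model and white-noise assumption are placeholders, not a gravitational waveform:

```python
import math

def fisher_matrix(model, theta, times, sigma, eps=1e-6):
    """Fisher information F_ij = sum_k d_i h(t_k) d_j h(t_k) / sigma^2,
    with partial derivatives taken by central finite differences."""
    n = len(theta)
    derivs = []
    for i in range(n):
        tp, tm = list(theta), list(theta)
        tp[i] += eps
        tm[i] -= eps
        derivs.append([(model(t, tp) - model(t, tm)) / (2 * eps)
                       for t in times])
    return [[sum(derivs[i][k] * derivs[j][k] for k in range(len(times)))
             / sigma ** 2
             for j in range(n)] for i in range(n)]

# Toy model h(t; A) = A * t sampled at t = 1, 2 with unit noise:
F = fisher_matrix(lambda t, th: th[0] * t, [1.0], [1.0, 2.0], sigma=1.0)
sigma_A = 1.0 / math.sqrt(F[0][0])   # 1-sigma accuracy on the amplitude A
```

For the single-parameter toy model F reduces to Σ t²/σ² = 5, so the forecast error on A is 1/√5.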

  20. Verification of fluid-structure-interaction algorithms through the method of manufactured solutions for actuator-line applications

    Science.gov (United States)

    Vijayakumar, Ganesh; Sprague, Michael

    2017-11-01

    Demonstrating expected convergence rates with spatial- and temporal-grid refinement is the ``gold standard'' of code and algorithm verification. However, the lack of analytical solutions and the difficulty of generating manufactured solutions present challenges for verifying codes for complex systems. The application of the method of manufactured solutions (MMS) to verification of coupled multi-physics phenomena like fluid-structure interaction (FSI) has only seen recent investigation. While many FSI algorithms for aeroelastic phenomena have focused on boundary-resolved CFD simulations, the actuator-line representation of the structure is widely used for FSI simulations in wind-energy research. In this work, we demonstrate the verification of an FSI algorithm using MMS for actuator-line CFD simulations with a simplified spring-mass-damper (SMD) structural model. We use a manufactured solution for the fluid velocity field and the displacement of the SMD system. We demonstrate the convergence of both the fluid and structural solvers to second-order accuracy with grid and time-step refinement. This work was funded by the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Wind Energy Technologies Office, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.
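
The essence of MMS verification is to measure a discrete operator's error against a known manufactured solution at two grid spacings and check the observed order. A self-contained toy version, using a centred second difference on u(x) = sin x rather than the paper's FSI solver:

```python
import math

def second_diff_error(h, x=1.0):
    """Error of the centred second difference applied to the
    manufactured solution u(x) = sin(x), whose exact u'' is -sin(x)."""
    u = math.sin
    approx = (u(x - h) - 2.0 * u(x) + u(x + h)) / h ** 2
    return abs(approx + math.sin(x))

# Halving the spacing should cut the error ~4x for a 2nd-order scheme.
e_coarse = second_diff_error(0.1)
e_fine = second_diff_error(0.05)
observed_order = math.log(e_coarse / e_fine, 2)
```

An observed order near 2 is the pass criterion; a lower order would flag a bug in the operator or its boundary handling.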

  1. A verification methodology for in vivo dosimetry in stereotactic radiotherapy; Uma metodologia para verificacao dosimetrica in vivo em radioterapia estereotaxica

    Energy Technology Data Exchange (ETDEWEB)

    Amaral, Leonardo L.; Oliveira, Harley F.; Fairbanks, Leandro R., E-mail: leonardo.fis@usp.br [Universidade de Sao Paulo (HCFMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina. Hospital das Clinicas; Nicolucci, Patricia; Netto, Thomaz G. [Universidade de Sao Paulo (FFCLRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Filosofia, Ciencias e Letras. Departamento de Fisica

    2012-12-15

    Radiotherapy of brain lesions near critical structures requires high accuracy in both localization and dose. High localization precision is achieved with the stereotactic apparatus. Accurate dose delivery should be supported by rigorous quality control of the devices involved in the practice; this alone, however, does not guarantee the dose actually delivered at treatment time. The large number of fields and their small size make conventional dosimetry during treatment difficult. The objective of this work was to develop a methodology for in vivo dosimetric verification in stereotactic radiotherapy using radiochromic film on a linear accelerator with ModuLeaf micro multileaf collimators. The technique uses segments of Gafchromic EBT2 radiochromic film, 1 x 1 cm² in area, coupled outside the Siemens ModuLeaf micro multileaf collimator; the films were placed in the region of the central axis of the beam. The films were irradiated and calibrated to obtain the factors that describe the dependence of the dosimetric response on field size. With these data, a computer program was designed that calculates the density a film should acquire when exposed in this setting. Five non-coplanar plans were evaluated, the first with 15 fields and the others with 25 fields. Before starting the procedure, the film segment is coupled to the device; after treatment, the measured density is evaluated and compared with the calculated one. The average difference between the dosimetric measurement at treatment time and the value calculated by the program was 1.5%. The data collected in this study showed satisfactory agreement between the densities measured in the densitometer and those calculated by the program. Thus, a methodology was developed for in vivo dosimetric verification in stereotactic radiotherapy on a linear accelerator with ModuLeaf collimators. (author)

  2. Commissioning Procedures for Mechanical Precision and Accuracy in a Dedicated LINAC

    International Nuclear Information System (INIS)

    Ballesteros-Zebadua, P.; Larrga-Gutierrez, J. M.; Garcia-Garduno, O. A.; Juarez, J.; Prieto, I.; Moreno-Jimenez, S.; Celis, M. A.

    2008-01-01

    Mechanical precision measurements are fundamental procedures for the commissioning of a dedicated LINAC. At our Radioneurosurgery Unit, these procedures serve as quality assurance routines that allow verification of the equipment's geometrical accuracy and precision. In this work mechanical tests were performed for gantry and table rotation, obtaining mean associated uncertainties of 0.3 mm and 0.71 mm, respectively. Using an anthropomorphic phantom and a series of localized surface markers, isocenter accuracy was shown to be smaller than 0.86 mm for radiosurgery procedures and 0.95 mm for fractionated treatments with mask. All uncertainties were below tolerances. The largest contribution to mechanical variation is due to table rotation, so it is important to correct such variations using a localization frame with printed overlays. Knowledge of the mechanical precision allows these statistical errors to be considered in the treatment-planning volume margins.

  3. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  4. Tools and Methods for RTCP-Nets Modeling and Verification

    Directory of Open Access Journals (Sweden)

    Szpyrka Marcin

    2016-09-01

    RTCP-nets are high-level Petri nets similar to timed colored Petri nets, but with a different time model and some structural restrictions. The paper deals with practical aspects of using RTCP-nets for modeling and verification of real-time systems. It contains a survey of software tools developed to support RTCP-nets. Verification of RTCP-nets is based on coverability graphs, which represent the set of reachable states in the form of a directed graph. Two approaches to verification of RTCP-nets are considered in the paper. The former is oriented towards states and is based on translation of a coverability graph into a nuXmv (NuSMV) finite state model. The latter is oriented towards transitions and uses the CADP toolkit to check whether requirements given as μ-calculus formulae hold for a given coverability graph. All presented concepts are discussed using illustrative examples.
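
Both verification routes above start from a graph of reachable markings. The breadth-first construction below is a minimal sketch for an ordinary place/transition net; a coverability graph proper additionally needs ω-abstraction for unbounded places, and RTCP-nets add colour and time, both omitted here:

```python
from collections import deque

def reachability_graph(initial, transitions):
    """BFS over markings of a place/transition net.
    Each transition is a (consume, produce) pair of per-place
    token counts; markings are tuples of token counts."""
    seen = {initial}
    edges = []
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for t, (consume, produce) in enumerate(transitions):
            # a transition is enabled when every place holds enough tokens
            if all(m[p] >= consume[p] for p in range(len(m))):
                m2 = tuple(m[p] - consume[p] + produce[p]
                           for p in range(len(m)))
                edges.append((m, t, m2))
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen, edges
```

The resulting state graph is what gets translated into a nuXmv model or fed to CADP for μ-calculus checking.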

  5. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity......The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...... is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  6. Sustaining a verification regime in a nuclear weapon-free world. VERTIC research report no. 4

    International Nuclear Information System (INIS)

    Moyland, S. van

    1999-01-01

    Sustaining high levels of commitment to and enthusiasm for the verification regime in a nuclear weapon-free world (NWFW) would be a considerable challenge, but the price of failure would be high. No verification system for a complete ban on a whole class of weapons of mass destruction (WMD) has been in existence long enough to provide a precedent or the requisite experience. Nevertheless, lessons from the International Atomic Energy Agency's (IAEA) nuclear safeguards system are instructive. A potential problem over the long haul is the gradual erosion of the deterrent effect of verification that may result from the continual overlooking of minor instances of non-compliance. Flaws in the verification system must be identified and dealt with early lest they also corrode the system. To achieve this the verification organisation's inspectors and analytical staff will need sustained support, encouragement, resources and training. In drawing attention to weaknesses, they must be supported by management and at the political level. The leaking of sensitive information, either industrial or military, by staff of the verification regime is a potential problem. 'Managed access' techniques should be constantly examined and improved. The verification organisation and states parties will need to sustain close co-operation with the nuclear and related industries. Frequent review mechanisms must be established. States must invest time and effort to make them effective. Another potential problem is the withering of resources for sustained verification. Verification organisations tend to be pressured by states to cut or at least cap costs, even if the verification workload increases. The verification system must be kept as effective as knowledge and experience allow. The organisation will need continuously to update its scientific methods and technology. This requires in-house resources plus external research and development (R and D). 
Universities, laboratories and industry need incentives to

  7. Verification of space weather forecasts at the UK Met Office

    Science.gov (United States)

    Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.

    2017-12-01

    The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users to understand the performance of these forecasts and also strengths and weaknesses to enable further development. Met Office terrestrial near real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves and Reliability diagrams, and rolling Ranked Probability Skill Scores (RPSSs) thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
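
Skill relative to a climatological benchmark, as used for the MOSWOC flare forecasts above, is exactly what a Brier skill score captures for binary events. The compact illustration below is a generic sketch of the metric, not the Met Office verification suite:

```python
def brier(probs, outcomes):
    """Mean squared error of probability forecasts for binary events."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes):
    """Skill relative to always forecasting the climatological frequency:
    1 = perfect, 0 = no better than climatology, < 0 = worse."""
    clim = sum(outcomes) / len(outcomes)
    return 1.0 - brier(probs, outcomes) / brier([clim] * len(outcomes),
                                                outcomes)
```

A forecast that merely repeats the observed base rate scores exactly 0, which is the benchmark the issued flare forecasts are reported to struggle against.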

  8. High accuracy positioning using carrier-phases with the opensource GPSTK software

    OpenAIRE

    Salazar Hernández, Dagoberto José; Hernández Pajares, Manuel; Juan Zornoza, José Miguel; Sanz Subirana, Jaume

    2008-01-01

    The objective of this work is to show how, using a proper GNSS data management strategy combined with the flexibility provided by the open source "GPS Toolkit" (GPSTk), it is possible to easily develop both simple code-based processing strategies and basic high-accuracy carrier-phase positioning techniques like Precise Point Positioning (PPP).

  9. Variability in interhospital trauma data coding and scoring: A challenge to the accuracy of aggregated trauma registries.

    Science.gov (United States)

    Arabian, Sandra S; Marcus, Michael; Captain, Kevin; Pomphrey, Michelle; Breeze, Janis; Wolfe, Jennefer; Bugaev, Nikolay; Rabinovici, Reuven

    2015-09-01

    Analyses of data aggregated in state and national trauma registries provide the platform for clinical, research, development, and quality improvement efforts in trauma systems. However, the interhospital variability and accuracy in data abstraction and coding have not yet been directly evaluated. This multi-institutional, Web-based, anonymous study examines interhospital variability and accuracy in data coding and scoring by registrars. Eighty-two American College of Surgeons (ACS)/state-verified Level I and II trauma centers were invited to determine different data elements including diagnostic, procedure, and Abbreviated Injury Scale (AIS) coding as well as selected National Trauma Data Bank definitions for the same fictitious case. Variability and accuracy in data entries were assessed by the maximal percent agreement among the registrars for the tested data elements, and 95% confidence intervals were computed to compare this level of agreement to the ideal value of 100%. Variability and accuracy in all elements were compared (χ² testing) based on Trauma Quality Improvement Program (TQIP) membership, level of trauma center, ACS verification, and registrar's certifications. Fifty registrars (61%) completed the survey. The overall accuracy for all tested elements was 64%. Variability was noted in all examined parameters except for the place of occurrence code in all groups and the lower extremity AIS code in Level II trauma centers and in the Certified Specialist in Trauma Registry- and Certified Abbreviated Injury Scale Specialist-certified registrar groups. No differences in variability were noted when groups were compared based on TQIP membership, level of center, ACS verification, and registrar's certifications, except for prehospital Glasgow Coma Scale (GCS), where TQIP respondents agreed more than non-TQIP centers (p = 0.004). There is variability and inaccuracy in interhospital data coding and scoring of injury information. This finding casts doubt on the
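
Comparing an observed agreement proportion against the ideal 100% calls for a confidence interval on a proportion. The Wilson score interval below is a common choice, shown as an illustration; the abstract does not specify which interval the authors used:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% by default).
    Unlike the naive Wald interval, it behaves well near 0 and 1."""
    p = successes / n
    denom = 1.0 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half
```

For example, 32 agreeing registrars out of 50 (64%, the overall accuracy reported above) gives an interval that clearly excludes 100%.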

  10. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with larger completeness. Additional experiments reveal that based on the proposed method a highly reliable semi-automatic approach for road data base verification can be designed.
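
Mapping each module's output to masses on {correct}, {incorrect} and the whole frame (the "unknown" state) lets the modules be merged with Dempster's rule of combination. The two-module sketch below illustrates the rule itself; the mass values and the reduction to a three-entry dict are illustrative, not the paper's full formulation:

```python
def dempster_combine(m1, m2):
    """Dempster's rule over the frame {C(orrect), I(ncorrect)}, where
    the mass on 'U' sits on the whole frame and models 'unknown'."""
    # conflict: one source commits to 'correct', the other to 'incorrect'
    conflict = m1['C'] * m2['I'] + m1['I'] * m2['C']
    norm = 1.0 - conflict
    return {
        'C': (m1['C'] * m2['C'] + m1['C'] * m2['U']
              + m1['U'] * m2['C']) / norm,
        'I': (m1['I'] * m2['I'] + m1['I'] * m2['U']
              + m1['U'] * m2['I']) / norm,
        'U': m1['U'] * m2['U'] / norm,
    }
```

Combining with a vacuous module (all mass on 'U') leaves the other module's belief unchanged, which is why modules whose road model does not apply simply drop out of the decision.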

  11. Portal verification for breast cancer radiotherapy

    International Nuclear Information System (INIS)

    Petkovska, Sonja; Pejkovikj, Sasho; Apostolovski, Nebojsha

    2013-01-01

    At the University Clinic in Skopje, breast cancer irradiation is planned and performed using a mono-isocentric technique, meaning that a single isocenter (IC) is used for all irradiation fields. The goal of this paper is to present the patient's position in all coordinates before the first treatment session, relative to the position determined during the CT simulation. Deviation of up to 5 mm is allowed. The analysis was made using portal verification. Sixty randomly selected female patients were reviewed. The matching results show that for each patient a deviation exists on at least one axis. The largest deviations are in the longitudinal direction (head-feet), up to 4 mm, mean 1.8 mm. In 60 of 85 analysed fields, the deviation is towards the head. In the lateral direction, the median deviation is 1.1 mm, and in 65% of the analysed portals these deviations are in the medial direction, towards the contralateral breast, which can increase the dose in the lung and in the contralateral breast. For the supraclavicular field, this deviation can increase the dose in the spinal cord. Although these doses are well below the limit, this fact should be taken into account in setting the treatment fields. The final conclusion from the research is that, despite the fact that we are dealing with small deviations, when positioning accuracy is verified with portal imaging, portal verification needs to be repeated in the following weeks of treatment, not only before the first session. This provides information on intra-fractional set-up deviation. (Author)

  12. A New Approach to High-accuracy Road Orthophoto Mapping Based on Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2011-12-01

    Full Text Available Existing orthophoto maps based on satellite and aerial photography are not precise enough for road marking. This paper proposes a new approach to high-accuracy orthophoto mapping. The approach uses inverse perspective transformation to process the image information and generates orthophoto fragments. An offline interpolation algorithm is used to process the location information: it processes the dead-reckoning and EKF location information and uses the result to transform the fragments to the global coordinate system. Finally, it uses the wavelet transform to divide the image into two frequency bands and applies a weighted median algorithm to each band separately. Experimental results show that the map produced with this method has high accuracy.
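    A one-dimensional sketch of the split-into-bands-then-fuse idea: a single-level Haar transform and a two-fragment weighted median stand in for the paper's full 2D pipeline, and all function names here are our own illustration, not the authors' implementation.

    ```python
    def haar_split(x):
        # One-level Haar transform of an even-length signal:
        # approximation (low band) and detail (high band) coefficients.
        a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
        d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
        return a, d

    def haar_merge(a, d):
        # Inverse one-level Haar transform.
        out = []
        for ai, di in zip(a, d):
            out += [ai + di, ai - di]
        return out

    def weighted_median(vals, wts):
        # Smallest value at which cumulative weight reaches half the total.
        pairs = sorted(zip(vals, wts))
        half, acc = sum(wts) / 2, 0.0
        for v, w in pairs:
            acc += w
            if acc >= half:
                return v

    def fuse(frag1, frag2, w1=0.5, w2=0.5):
        # Fuse two overlapping fragments band by band via weighted median.
        a1, d1 = haar_split(frag1)
        a2, d2 = haar_split(frag2)
        a = [weighted_median([x, y], [w1, w2]) for x, y in zip(a1, a2)]
        d = [weighted_median([x, y], [w1, w2]) for x, y in zip(d1, d2)]
        return haar_merge(a, d)
    ```

    Fusing a fragment with itself reproduces it exactly, which is a handy sanity check that the split/merge pair is lossless.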

  13. Experimental verification of the gas pumping theory within fission ionisation chambers

    International Nuclear Information System (INIS)

    Bartlett, A.C.

    1975-01-01

    Experimental verification of a theory for gas loss from in-core ionization chambers is reported. A value of the gas pressure within an irradiated miniature fission chamber was derived indirectly by use of published data on Townsend first coefficient/field across the detector as a function of field/pressure. In practice the voltage corresponding to 10% current multiplication is measured. From the current saturation characteristics measured on the detector during irradiation, the change in gas pressure as a function of fluence was derived and compared to theoretically predicted values. Within the limited accuracy obtainable substantial agreement between measurement and theory is obtained. (O.T.)

  14. A new ultra-high-accuracy angle generator: current status and future direction

    Science.gov (United States)

    Guertin, Christian F.; Geckeler, Ralf D.

    2017-09-01

    The lack of an extremely high-accuracy angular positioning device in the United States has left a gap in industrial and scientific efforts conducted there, requiring certain user groups to undertake time-consuming work with overseas laboratories. Specifically, in x-ray mirror metrology the global research community is advancing the state-of-the-art to unprecedented levels. We aim to fill this U.S. gap by developing a versatile high-accuracy angle generator as part of the national metrology tool set for x-ray mirror metrology and other important industries. Using an established calibration technique to measure the errors of the encoder scale graduations for full-rotation rotary encoders, we implemented an optimized arrangement of sensors positioned to minimize propagation of calibration errors. Our initial feasibility research shows that upon scaling to a full prototype and including additional calibration techniques we can expect to achieve uncertainties at the level of 0.01 arcsec (50 nrad) or better and offer the immense advantage of a highly automatable and customizable product to the commercial market.
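    The error-separation idea behind multi-sensor encoder calibration can be illustrated numerically: averaging the readings of n equally spaced heads cancels every graduation-error harmonic whose order is not a multiple of n. A toy sketch (the function and its signature are ours, not from the paper):

    ```python
    import math

    def head_average(error_fn, n_heads, theta):
        # Average the angular reading error over n_heads read heads spaced
        # evenly around the encoder circle at angle theta (radians).
        return sum(error_fn(theta + 2 * math.pi * k / n_heads)
                   for k in range(n_heads)) / n_heads
    ```

    With four heads, a 3rd-order graduation-error harmonic averages to zero at every angle, while a 4th-order harmonic survives unchanged; this is why optimized head arrangements target the harmonics that matter most.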

  15. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  16. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    International Nuclear Information System (INIS)

    Magazzù, G; Borgese, G; Costantino, N; Fanucci, L; Saponara, S; Incandela, J

    2013-01-01

    In many research fields such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility to have a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setting up of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  17. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    Science.gov (United States)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility to have a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setting up of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  18. High-Precision Attitude Post-Processing and Initial Verification for the ZY-3 Satellite

    Directory of Open Access Journals (Sweden)

    Xinming Tang

    2014-12-01

    Full Text Available Attitude data, which are strongly correlated with the geometric accuracy of optical remote sensing satellite images, are generally obtained using a real-time Extended Kalman Filter (EKF) with star-tracker and gyro data for current high-resolution satellites such as Orb-view, IKONOS, Quickbird, Pleiades, and ZY-3. We propose a forward-backward Unscented Kalman Filter (UKF) for post-processing; the proposed method employs a UKF to suppress noise by using an unscented transformation (UT) rather than an EKF in a nonlinear attitude system. Moreover, this method makes full use of the data collected in the fixed interval and of the computational resources available on the ground, and it determines optimal attitude results by forward-backward filtering and weighted smoothing of the raw star-tracker and gyro data collected over a fixed period. In this study, the principle and implementation of the proposed method are described. The post-processed attitude was compared with the on-board attitude, and the absolute accuracy was evaluated by two methods. One method compares the positioning accuracy of the object space coordinates computed with the post-processed and on-board attitude data without using ground control points (GCPs). The other method compares the tie-point residuals of the image coordinates after a free net adjustment. In addition, the internal and external parameters of the camera were accurately calibrated before use for an objective evaluation of the attitude accuracy. The experimental results reveal that the accuracy of the post-processed attitude is superior to that of the on-board processed attitude. This method has been applied to the ZiYuan-3 satellite system for daily processing of the raw star-tracker and gyro data.
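    The unscented transformation at the heart of the UKF replaces the EKF's linearization with a small set of deterministically chosen sigma points propagated through the nonlinearity. A minimal scalar sketch (illustrative only; the satellite filter operates on full quaternion and gyro-bias states, and the parameter defaults here are conventional choices, not values from the paper):

    ```python
    import math

    def unscented_transform(mu, var, f, alpha=0.1, beta=2.0, kappa=0.0):
        # Scalar (n = 1) unscented transform: propagate a Gaussian (mu, var)
        # through a nonlinear function f using 2n+1 sigma points.
        n = 1
        lam = alpha ** 2 * (n + kappa) - n
        spread = math.sqrt((n + lam) * var)
        sigmas = [mu, mu + spread, mu - spread]
        wm = [lam / (n + lam), 1 / (2 * (n + lam)), 1 / (2 * (n + lam))]
        wc = list(wm)
        wc[0] += 1 - alpha ** 2 + beta  # covariance weight correction
        ys = [f(s) for s in sigmas]
        y_mean = sum(w * y for w, y in zip(wm, ys))
        y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(wc, ys))
        return y_mean, y_var
    ```

    On a linear function the UT is exact, which makes a convenient unit test; on nonlinear attitude dynamics it captures the posterior mean and covariance to higher order than the EKF's first-order expansion.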

  19. Ultra-high accuracy optical testing: creating diffraction-limited short-wavelength optical systems

    International Nuclear Information System (INIS)

    Goldberg, Kenneth A.; Naulleau, Patrick P.; Rekawa, Senajith B.; Denham, Paul E.; Liddle, J. Alexander; Gullikson, Eric M.; Jackson, Keith H.; Anderson, Erik H.; Taylor, John S.; Sommargren, Gary E.; Chapman, Henry N.; Phillion, Donald W.; Johnson, Michael; Barty, Anton; Soufli, Regina; Spiller, Eberhard A.; Walton, Christopher C.; Bajt, Sasa

    2005-01-01

    Since 1993, research in the fabrication of extreme ultraviolet (EUV) optical imaging systems, conducted at Lawrence Berkeley National Laboratory (LBNL) and Lawrence Livermore National Laboratory (LLNL), has produced the highest resolution optical systems ever made. We have pioneered the development of ultra-high-accuracy optical testing and alignment methods, working at extreme ultraviolet wavelengths, and pushing wavefront-measuring interferometry into the 2-20-nm wavelength range (60-600 eV). These coherent measurement techniques, including lateral shearing interferometry and phase-shifting point-diffraction interferometry (PS/PDI), have achieved RMS wavefront measurement accuracies of 0.5-1 Å and better for primary aberration terms, enabling the creation of diffraction-limited EUV optics. The measurement accuracy is established using careful null-testing procedures, and has been verified repeatedly through high-resolution imaging. We believe these methods are broadly applicable to the advancement of short-wavelength optical systems including space telescopes, microscope objectives, projection lenses, synchrotron beamline optics, diffractive and holographic optics, and more. Measurements have been performed on a tunable undulator beamline at LBNL's Advanced Light Source (ALS), optimized for high coherent flux, although many of these techniques should be adaptable to alternative ultraviolet, EUV, and soft x-ray light sources. To date, we have measured nine prototype all-reflective EUV optical systems with NA values between 0.08 and 0.30 (f/6.25 to f/1.67). These projection-imaging lenses were created for the semiconductor industry's advanced research in EUV photolithography, a technology slated for introduction in 2009-13. This paper reviews the methods used and our program's accomplishments to date.

  20. Optimal metering plan for measurement and verification on a lighting case study

    International Nuclear Information System (INIS)

    Ye, Xianming; Xia, Xiaohua

    2016-01-01

    M&V (Measurement and Verification) has become an indispensable process in various incentive EEDSM (energy efficiency and demand side management) programmes to accurately and reliably measure and verify the project performance in terms of energy and/or cost savings. Due to the uncertain nature of the unmeasurable savings, there is an inherent trade-off between the M&V accuracy and M&V cost. In order to achieve the required M&V accuracy cost-effectively, we propose a combined spatial and longitudinal MCM (metering cost minimisation) model to assist the design of optimal M&V metering plans, which minimises the metering cost whilst satisfying the required measurement and sampling accuracy of M&V. The objective function of the proposed MCM model is the M&V metering cost that covers the procurement, installation and maintenance of the metering system whereas the M&V accuracy requirements are formulated as the constraints. Optimal solutions to the proposed MCM model offer useful information in designing the optimal M&V metering plan. The advantages of the proposed MCM model are demonstrated by a case study of an EE lighting retrofit project and the model is widely applicable to other M&V lighting projects with different population sizes and sampling accuracy requirements. - Highlights: • A combined spatial and longitudinal optimisation model is proposed to reduce M&V cost. • The combined optimisation model handles M&V sampling uncertainty cost-effectively. • The model exhibits a better performance than the separate spatial or longitudinal models. • The required 90/10 criterion sampling accuracy is satisfied for each M&V report.
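    The 90/10 criterion mentioned in the highlights ties the required sample size, and hence the metering cost, to the variability of the metered population. A common sketch of that calculation (assuming the usual IPMVP-style formula with an initial coefficient-of-variation estimate and a finite-population correction; the paper's optimisation model is not reproduced here):

    ```python
    import math

    def sample_size(population, cv=0.5, precision=0.10, z=1.645):
        # 90/10 criterion: 90% confidence (z = 1.645), 10% relative precision.
        # cv is the assumed coefficient of variation of the metered quantity.
        n0 = (z * cv / precision) ** 2       # initial (infinite-population) size
        n = n0 / (1 + n0 / population)       # finite-population correction
        return math.ceil(n)
    ```

    For the default cv of 0.5 the infinite-population sample size is about 68 lamps; smaller retrofit populations need proportionally fewer meters, which is exactly the spatial trade-off the MCM model optimises against metering cost.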

  1. Self-verification and depression among youth psychiatric inpatients.

    Science.gov (United States)

    Joiner, T E; Katz, J; Lew, A S

    1997-11-01

    According to self-verification theory (e.g., W.B. Swann, 1983), people are motivated to preserve stable self-concepts by seeking self-confirming interpersonal responses, even if the responses are negative. In the current study of 72 youth psychiatric inpatients (36 boys; 36 girls; ages 7-17, M = 13.18; SD = 2.59), the authors provide the 1st test of self-verification theory among a youth sample. Participants completed self-report questionnaires on depression, self-esteem, anxiety, negative and positive affect, and interest in negative feedback from others. The authors made chart diagnoses available, and they collected peer rejection ratings. Consistent with hypotheses, the authors found that interest in negative feedback was associated with depression, was predictive of peer rejection (but only within relatively longer peer relationships), was more highly related to cognitive than emotional aspects of depression, and was specifically associated with depression, rather than being generally associated with emotional distress. The authors discuss implications for self-verification theory and for the phenomenology of youth depression.

  2. SNP Data Quality Control in a National Beef and Dairy Cattle System and Highly Accurate SNP Based Parentage Verification and Identification

    Directory of Open Access Journals (Sweden)

    Matthew C. McClure

    2018-03-01

    Full Text Available A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012 the international standard for single nucleotide polymorphism (SNP) verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide an increased level of parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at misconcordance rates of ≤1%, indicating that more SNP are needed if a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds from a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and large to small herd size), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities to determine that a minimum of ≥500 SNP are needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected based on SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require sample and SNP quality control (QC). Most publications only deal with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, but not sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines to deal with the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors from mis-tagging of animals, lab errors, farm errors, and multiple other issues that can arise. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC handles situations where the genotype might not belong to the listed individual by identifying: >1 non
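    The core parent-progeny conflict check behind SNP parentage verification can be sketched as counting opposing homozygotes: loci where child and candidate parent carry different homozygous genotypes, which is impossible under true parentage barring genotype error. This is a generic illustration, not the ICBF pipeline:

    ```python
    def opposing_homozygotes(child, parent, missing="--"):
        # Count loci where child and putative parent are opposite homozygotes
        # (e.g. AA vs BB); missing calls are excluded from the comparison.
        conflicts = total = 0
        for c, p in zip(child, parent):
            if missing in (c, p):
                continue
            total += 1
            if c[0] == c[1] and p[0] == p[1] and c[0] != p[0]:
                conflicts += 1
        return conflicts, total

    def is_parent(child, parent, max_rate=0.01):
        # Accept the pairing if the misconcordance rate is at or below the
        # threshold (1% by default, mirroring the rate discussed above).
        conflicts, total = opposing_homozygotes(child, parent)
        return total > 0 and conflicts / total <= max_rate
    ```

    With only a few hundred SNP the expected conflict count for a wrong parent can fall near the 1% threshold by chance, which is the statistical reason the abstract argues for ≥500 SNP.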

  3. Impact of gantry rotation time on plan quality and dosimetric verification. Volumetric modulated arc therapy (VMAT) vs. intensity modulated radiotherapy (IMRT)

    Energy Technology Data Exchange (ETDEWEB)

    Pasler, Marlies; Wirtz, Holger; Lutterbach, Johannes [Gemeinschaftspraxis fuer Strahlentherapie Singen-Friedrichshafen, Singen (Germany)

    2011-12-15

    To compare plan quality criteria and dosimetric accuracy of step-and-shoot intensity-modulated radiotherapy (ss-IMRT) and volumetric modulated arc radiotherapy (VMAT) using two different gantry rotation times. This retrospective planning study of 20 patients comprised 10 prostate cancer (PC) and 10 head and neck (HN) cancer cases. Each plan contained two target volumes: a primary planning target volume (PTV) and a boost volume. For each patient, one ss-IMRT plan and two VMAT plans at 90 s (VMAT90) and 120 s (VMAT120) per arc were generated with the Pinnacle© planning system. Two arcs were provided for the PTV plans and a single arc for boost volumes. Dosimetric verification of the plans was performed using a 2D ionization chamber array placed in a full scatter phantom. VMAT reduced delivery time and monitor units for both treatment sites compared to IMRT. VMAT120 vs. VMAT90 increased delivery time and monitor units in PC plans without improving plan quality. For HN cases, VMAT120 provided comparable organ-at-risk sparing and better target coverage and conformity than VMAT90. In the VMAT plan verification, an average of 97.1% of the detector points passed the 3 mm/3% γ criterion, while in IMRT verification it was 98.8%. VMAT90, VMAT120, and IMRT achieved comparable treatment plans. Slower gantry movement in VMAT120 plans only improves dosimetric quality for highly complex targets.

  4. Impact of gantry rotation time on plan quality and dosimetric verification. Volumetric modulated arc therapy (VMAT) vs. intensity modulated radiotherapy (IMRT)

    International Nuclear Information System (INIS)

    Pasler, Marlies; Wirtz, Holger; Lutterbach, Johannes

    2011-01-01

    To compare plan quality criteria and dosimetric accuracy of step-and-shoot intensity-modulated radiotherapy (ss-IMRT) and volumetric modulated arc radiotherapy (VMAT) using two different gantry rotation times. This retrospective planning study of 20 patients comprised 10 prostate cancer (PC) and 10 head and neck (HN) cancer cases. Each plan contained two target volumes: a primary planning target volume (PTV) and a boost volume. For each patient, one ss-IMRT plan and two VMAT plans at 90 s (VMAT90) and 120 s (VMAT120) per arc were generated with the Pinnacle© planning system. Two arcs were provided for the PTV plans and a single arc for boost volumes. Dosimetric verification of the plans was performed using a 2D ionization chamber array placed in a full scatter phantom. VMAT reduced delivery time and monitor units for both treatment sites compared to IMRT. VMAT120 vs. VMAT90 increased delivery time and monitor units in PC plans without improving plan quality. For HN cases, VMAT120 provided comparable organ-at-risk sparing and better target coverage and conformity than VMAT90. In the VMAT plan verification, an average of 97.1% of the detector points passed the 3 mm/3% γ criterion, while in IMRT verification it was 98.8%. VMAT90, VMAT120, and IMRT achieved comparable treatment plans. Slower gantry movement in VMAT120 plans only improves dosimetric quality for highly complex targets.
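    The γ criterion used for the plan verification above combines a dose-difference tolerance (3% of the maximum dose) with a distance-to-agreement tolerance (3 mm). A simplified 1D global-gamma sketch, our illustration rather than the study's QA software (clinical tools evaluate 2D/3D dose grids with interpolation):

    ```python
    import math

    def gamma_index(ref_pos, ref_dose, meas_pos, meas_dose, dta=3.0, dd=0.03):
        # For each measured point, take the minimum combined metric over all
        # reference points; the point passes if that minimum is <= 1.
        d_max = max(ref_dose)  # global normalization
        passed = 0
        for xm, dm in zip(meas_pos, meas_dose):
            g = min(math.sqrt(((xm - xr) / dta) ** 2 +
                              ((dm - dr) / (dd * d_max)) ** 2)
                    for xr, dr in zip(ref_pos, ref_dose))
            passed += g <= 1.0
        return passed / len(meas_pos)  # pass rate, e.g. the 97.1% above
    ```

    The distance term is what makes γ more forgiving than a pure dose-difference test in steep-gradient regions: a point can fail on dose yet pass because a nearby reference point agrees.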

  5. Balancing dose and image registration accuracy for cone beam tomosynthesis (CBTS) for breast patient setup

    International Nuclear Information System (INIS)

    Winey, B. A.; Zygmanski, P.; Cormack, R. A.; Lyatskaya, Y.

    2010-01-01

    Purpose: To balance dose reduction and image registration accuracy in breast setup imaging. In particular, the authors demonstrate the relationship between scan angle and dose delivery for cone beam tomosynthesis (CBTS) when employed for setup verification of breast cancer patients with surgical clips. Methods: The dose measurements were performed in a female torso phantom for varying scan angles of CBTS. Setup accuracy was measured using three registration methods: Clip centroid localization accuracy and the accuracy of two semiautomatic registration algorithms. The dose to the organs outside of the ipsilateral breast and registration accuracy information were compared to determine the optimal scan angle for CBTS for breast patient setup verification. Isocenter positions at the center of the patient and at the breast-chest wall interface were considered. Results: Image registration accuracy was within 1 mm for the CBTS scan angles θ above 20 deg. for some scenarios and as large as 80 deg. for the worst case, depending on the imaged breast and registration algorithm. Registration accuracy was highest based on clip centroid localization. For left and right breast imaging with the isocenter at the chest wall, the dose to the contralateral side of the patient was very low (<0.5 cGy) for all scan angles considered. For central isocenter location, the optimal scan angles were 30 deg. - 50 deg. for the left breast imaging and 40 deg. - 50 deg. for the right breast imaging, with the difference due to the geometric asymmetry of the current clinical imaging system. Conclusions: The optimal scan angles for CBTS imaging were found to be between 10 deg. and 50 deg., depending on the isocenter location and ipsilateral breast. Use of the isocenter at the breast-chest wall locations always resulted in greater accuracy of image registration (<1 mm) at smaller angles (10 deg. - 20 deg.) and at lower doses (<0.1 cGy) to the contralateral organs. For chest wall isocenters, doses
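    Clip-centroid localization, the most accurate registration method in the study above, reduces setup verification to comparing two centroids. A minimal sketch (coordinate conventions and function names are ours, not the authors'):

    ```python
    def centroid(points):
        # Mean position of a set of 3D clip coordinates (mm).
        n = len(points)
        return tuple(sum(p[i] for p in points) / n for i in range(3))

    def setup_shift(planned_clips, imaged_clips):
        # Couch correction vector: planned (CT) centroid minus the centroid
        # localized in the CBTS reconstruction.
        cp, ci = centroid(planned_clips), centroid(imaged_clips)
        return tuple(cp[i] - ci[i] for i in range(3))
    ```

    Because the centroid averages over all clips, a localization error on a single clip is diluted, which helps explain why centroid matching stayed within 1 mm even at small scan angles.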

  6. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    Science.gov (United States)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  7. High-accuracy dosimetry study for intensity-modulated radiation therapy(IMRT) commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Hae Sun

    2010-02-15

    Intensity-modulated radiation therapy (IMRT), an advanced modality of high-precision radiotherapy, allows for an increase in dose to the tumor volume without increasing the dose to nearby critical organs. In order to successfully achieve the treatment, intensive dosimetry with accurate dose verification is necessary. Dosimetry for IMRT, however, is a challenging task due to dosimetrically unfavorable phenomena such as dramatic changes of the dose at the field boundaries, dis-equilibrium of the electrons, non-uniformity between the detector and the phantom materials, and distortion of scanner-read doses. In the present study, therefore, a LEGO-type multi-purpose dosimetry phantom was developed and used for studies on dose measurements and correction. Phantom materials for muscle, fat, bone, and lung tissue were selected after considering mass density, atomic composition, effective atomic number, and photon interaction coefficients. The phantom also includes dosimeter holders for several different types of detectors including films, which accommodates the construction of different designs of phantoms as necessary. In order to evaluate its performance, the developed phantom was tested by measuring the point dose and the percent depth dose (PDD) for small fields under several heterogeneous conditions. However, the measurements with the two types of dosimeter did not agree well for field sizes less than 1 x 1 cm² in muscle and bone, and less than 3 x 3 cm² in an air cavity. Thus, it was recognized that several studies on small-field dosimetry and on correction methods for the calculation with a PMCEPT code are needed. The under-estimated values from the ion chamber were corrected with a convolution method employed to eliminate the volume effect of the chamber. As a result, the discrepancies between the EBT film and the ion chamber measurements were significantly decreased, from 14% to 1% (1 x 1 cm²), 10% to 1% (0.7 x 0.7 cm²), and 42
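    The volume effect that the convolution method addresses can be illustrated by convolving a narrow-field dose profile with a normalized detector-response kernel: the chamber reads the blurred profile, so the correction works in the opposite direction. An illustrative sketch (the actual kernel and correction in the study are not reproduced here):

    ```python
    def convolve_profile(profile, kernel):
        # Discrete convolution of a dose profile with a normalized detector
        # response kernel; models volume averaging by a finite-size chamber.
        # Edge samples are clamped (edge-extension boundary handling).
        k = [w / sum(kernel) for w in kernel]
        half = len(k) // 2
        out = []
        for i in range(len(profile)):
            s = 0.0
            for j, w in enumerate(k):
                idx = min(max(i + j - half, 0), len(profile) - 1)
                s += w * profile[idx]
            out.append(s)
        return out
    ```

    For a narrow field the blur lowers the reading near the field edges while leaving a sufficiently wide plateau intact, which is why the ion chamber under-reads small fields relative to film.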

  8. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    Science.gov (United States)

    Montgomery, Todd L.

    1995-01-01

    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP refutes this. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, the implementation model, and the verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.
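    The delivery side of a totally ordered multicast can be sketched as a hold-back queue keyed by global sequence numbers. This is a generic illustration of in-order delivery only, not RMP itself, where sequence numbers are assigned by a rotating token site:

    ```python
    import heapq

    class TotalOrderBuffer:
        # Deliver messages strictly in global sequence order, holding any
        # out-of-order arrivals until the gap before them is filled.
        def __init__(self):
            self.next_seq = 0
            self.pending = []  # min-heap of (seq, msg)

        def receive(self, seq, msg):
            heapq.heappush(self.pending, (seq, msg))
            delivered = []
            while self.pending and self.pending[0][0] == self.next_seq:
                delivered.append(heapq.heappop(self.pending)[1])
                self.next_seq += 1
            return delivered
    ```

    Every site applying this rule to the same sequence numbering delivers the same messages in the same order, which is the total-ordering guarantee; the protocol's real work lies in agreeing on the numbering and recovering lost datagrams.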

  9. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  10. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of)]; Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of)]; Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)]

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying the accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. The proposed system is realized with an ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  11. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, SJ.

    2005-01-01

    Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. This paper describes the development of an independent MU program, concentrating on the implementation of the Enhanced Dynamic Wedge (EDW) component. The difficult case of non-centre-of-field (COF) calculation points under the EDW was studied in some detail. Results of a survey of Australasian centres regarding the use of independent MU check systems are also presented. The system was developed with reference to MU calculations made by the Pinnacle3 3D Radiotherapy Treatment Planning (RTP) system (ADAC - Philips) for 4 MV, 6 MV and 18 MV X-ray beams used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. A small systematic error was detected in the equation used for the EDW calculations. Results indicate that COF equations may be used in the non-COF situation with similar accuracy to that achieved with profile-corrected methods. Further collaborative work with other centres is planned to extend these findings.
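    An independent MU check of the kind described reduces to recomputing MU = dose / (reference dose per MU × relative dosimetric factors) and comparing against the TPS value. A minimal sketch, with illustrative numbers rather than NMMH beam data or the EDW factors themselves:

```python
# Hedged sketch of an independent monitor-unit (MU) check; factor names
# follow common convention, numbers are illustrative only.
def independent_mu(dose_cGy, dose_rate_cGy_per_MU=1.0,
                   output_factor=1.0, wedge_factor=1.0, tpr=1.0):
    """MU = dose / (reference dose per MU x product of relative factors)."""
    return dose_cGy / (dose_rate_cGy_per_MU * output_factor * wedge_factor * tpr)

tps_mu = 208.0                       # hypothetical TPS monitor units
check_mu = independent_mu(200.0, output_factor=1.02,
                          wedge_factor=0.72, tpr=1.31)
deviation_pct = 100.0 * (tps_mu - check_mu) / check_mu  # flag if too large
```

    In practice a centre sets an action level (often a few percent); deviations beyond it trigger investigation before treatment.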

  12. Density scaling of phantom materials for a 3D dose verification system.

    Science.gov (United States)

    Tani, Kensuke; Fujita, Yukio; Wakita, Akihisa; Miyasaka, Ryohei; Uehara, Ryuzo; Kodama, Takumi; Suzuki, Yuya; Aikawa, Ako; Mizuno, Norifumi; Kawamori, Jiro; Saitoh, Hidetoshi

    2018-05-21

    In this study, the optimum density scaling factors of phantom materials for a commercially available three-dimensional (3D) dose verification system (Delta4) were investigated in order to improve the accuracy of the calculated dose distributions in the phantom materials. At field sizes of 10 × 10 and 5 × 5 cm² with the same geometry, tissue-phantom ratios (TPRs) in water, polymethyl methacrylate (PMMA), and Plastic Water Diagnostic Therapy (PWDT) were measured, and TPRs in water with various density scaling factors were calculated by Monte Carlo simulation, Adaptive Convolve (AdC, Pinnacle3), Collapsed Cone Convolution (CCC, RayStation), and AcurosXB (AXB, Eclipse). Effective linear attenuation coefficients (μ_eff) were obtained from the TPRs. The ratios of μ_eff in phantom and water ((μ_eff)_pl,water) were compared between the measurements and calculations. For each phantom material, the density scaling factor proposed in this study (DSF) was set to be the value providing a match between the calculated and measured (μ_eff)_pl,water. The optimum density scaling factor was verified through the comparison of the dose distributions measured by Delta4 and calculated with three different density scaling factors: the nominal physical density (PD), nominal relative electron density (ED), and DSF. Three plans were used for the verifications: a static field of 10 × 10 cm² and two intensity-modulated radiation therapy (IMRT) treatment plans. The DSF was determined to be 1.13 for PMMA and 0.98 for PWDT. The DSF for PMMA showed good agreement for AdC and CCC with 6 MV x-rays, and for AdC with 10 MV x-rays. The DSF for PWDT showed good agreement regardless of the dose calculation algorithm and x-ray energy. The DSF can be considered one of the references for the density scaling factor of Delta4 phantom materials and may help improve the accuracy of IMRT dose verification using Delta4. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley
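    Beyond the depth of dose maximum, TPR falls roughly exponentially with depth, so μ_eff can be extracted from two TPR measurements and a density scaling factor then follows as the ratio of μ_eff in phantom to that in water. A hedged sketch with illustrative TPR values, not the study's measured data:

```python
# Sketch: extract an effective linear attenuation coefficient from two TPR
# points (TPR ~ exp(-mu_eff * depth) beyond buildup) and form a density
# scaling factor as the phantom/water ratio. TPR values are illustrative.
import math

def mu_eff(tpr_shallow, tpr_deep, depth_shallow_cm, depth_deep_cm):
    return math.log(tpr_shallow / tpr_deep) / (depth_deep_cm - depth_shallow_cm)

mu_water = mu_eff(0.90, 0.60, 10.0, 20.0)      # hypothetical water TPRs
mu_pmma  = mu_eff(0.89, 0.56, 10.0, 20.0)      # hypothetical PMMA TPRs
dsf = mu_pmma / mu_water                        # density scaling factor
```

    The study's approach tunes the scaling factor fed to the dose algorithm until the calculated (μ_eff)_pl,water matches the measured one; this sketch only shows the attenuation-coefficient step.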

  13. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to the IAEA for the implementation of the NPT, the Treaty of Tlatelolco and the Treaty of Rarotonga reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with the IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on the IAEA's overall performance. The additional task given to the Director General of the IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage, verification may lead to consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events, and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: EXEL INDUSTRIAL AIRMIX SPRAY GUN

    Science.gov (United States)

    The Environmental Technology Verification Program has partnered with Concurrent Technologies Corp. to verify innovative coatings and coating equipment technologies for reducing air emissions. This report describes the performance of EXEL Industrial's Kremlin Airmix high transfer ...

  15. Accuracy of simple plain radiographic signs and measures to diagnose acute scapholunate ligament injuries of the wrist

    Energy Technology Data Exchange (ETDEWEB)

    Dornberger, Jenny E. [Unfallkrankenhaus Berlin, Department of Plastic Surgery and Burn Care, Berlin (Germany); Rademacher, Grit; Mutze, Sven [Unfallkrankenhaus Berlin, Institute of Radiology, Berlin (Germany); Eisenschenk, Andreas [Unfallkrankenhaus Berlin, Department of Hand-, Replantation- and Microsurgery, Berlin (Germany); University Medicine Greifswald, Department of Hand Surgery and Microsurgery, Greifswald (Germany); Stengel, Dirk [Unfallkrankenhaus Berlin, Centre for Clinical Research, Berlin (Germany); Charite Medical University Centre, Julius Wolff Institute, Centre for Musculoskeletal Surgery, Berlin (Germany)

    2015-12-15

    To determine the accuracy of common radiological indices for diagnosing ruptures of the scapholunate (SL) ligament, the most relevant soft tissue injury of the wrist. This was a prospective diagnostic accuracy study with independent verification of index test findings by a reference standard (wrist arthroscopy). Bilateral digital radiographs in posteroanterior (pa), lateral and Stecher's projection were evaluated by two independent expert readers. Diagnostic accuracy of radiological signs was expressed as sensitivity, specificity, and positive (PPV) and negative (NPV) predictive values with 95 % confidence intervals (CI). The prevalence of significant acute SL tears (grade ≥ III according to Geissler's classification) was 27/72 (38 %, 95 % CI 26-50 %). The SL distance on Stecher's projection proved the most accurate index for ruling the presence of an SL rupture in or out. SL distance on plain pa radiographs, Stecher's projection and the radiolunate angle contributed independently to the final diagnostic model. These three simple indices explained 97 % of the diagnostic variance. In the era of computed tomography and magnetic resonance imaging, plain radiographs remain a highly sensitive and specific primary tool for triaging patients with a suspected SL tear to further diagnostic work-up and surgical care. (orig.)
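    The accuracy measures quoted above come from the standard 2×2 contingency-table definitions, which can be sketched as follows (the counts are illustrative, not the study's data):

```python
# Standard 2x2 diagnostic-accuracy computations; counts are illustrative.
def diagnostic_accuracy(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# hypothetical table: 27 arthroscopy-positive wrists among 72 patients
stats = diagnostic_accuracy(tp=24, fp=4, fn=3, tn=41)
```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence in the study sample (here 38 %), which is why they should not be transferred uncritically to populations with a different pre-test probability.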

  16. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for the verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models; a combination of the models is also taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.) [de
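    The attribute-sampling idea underlying such data verification can be sketched with the standard hypergeometric detection probability: the chance that an inspector who verifies n of N items catches at least one of r falsified items. This is the textbook formula, not necessarily the exact model of the report:

```python
# Hypergeometric detection probability for attribute sampling:
# P(detect >= 1 of r falsified items when inspecting n of N).
from math import comb

def detection_probability(N, n, r):
    if r > N - n:                 # too many falsifications to all evade the sample
        return 1.0
    return 1.0 - comb(N - r, n) / comb(N, n)

p = detection_probability(N=100, n=20, r=5)   # inspect 20 of 100, 5 falsified
```

    Sample sizes in such studies are then chosen (often game-theoretically, as in this report) so that the detection probability meets a target for the assumed diversion strategy.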

  17. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs, as it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bends based on SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the huge time of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurements of the fabricated devices have been compared to the derived models and show very good agreement; the match can reach 100% by calibrating a certain parameter in the model.
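    As a rough illustration of the kind of closed-form shortcut such models provide, standard coupled-mode theory gives the cross-coupled power of a directional coupler as sin²(κL); the coupling coefficient below is illustrative, not the paper's fitted SOI model:

```python
# Coupled-mode-theory estimate for a directional coupler: cross-coupled
# power fraction is sin^2(kappa * L). kappa here is an illustrative value,
# not the paper's layout-extracted SOI parameter.
import math

def coupled_power(kappa_per_um, length_um):
    return math.sin(kappa_per_um * length_um) ** 2

kappa = 0.1                          # 1/um, illustrative
L_full = (math.pi / 2) / kappa       # length for full power transfer
p_full = coupled_power(kappa, L_full)
p_3dB = coupled_power(kappa, L_full / 2)   # half-length gives a 3 dB split
```

    A closed-form expression like this evaluates in microseconds per instance, which is why replacing 3D FEM/FDTD sweeps with calibrated models makes layout-level PV of large PICs tractable.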

  18. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration with other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on the definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for each subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, the examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
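    The allocation idea can be sketched as a greedy loop that repeatedly spends the next unit of examination effort on the subsystem with the largest marginal verification risk. The risk model here (risk halves per examination) is an assumption for illustration, not the paper's verification risk function:

```python
# Greedy allocation sketch: each examination unit goes to the subsystem with
# the largest remaining risk. The halving-per-examination risk model is an
# illustrative assumption, not the paper's risk function.
def allocate_effort(initial_risks, total_units):
    risks = dict(initial_risks)
    allocation = {k: 0 for k in risks}
    for _ in range(total_units):
        worst = max(risks, key=risks.get)      # largest marginal risk
        allocation[worst] += 1
        risks[worst] /= 2.0                    # assumed diminishing returns
    return allocation, risks

alloc, residual = allocate_effort({"propulsion": 0.8, "ballast": 0.4}, 3)
```

    A stop criterion of the kind the paper discusses would end the loop once every residual risk falls below a threshold rather than after a fixed budget.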

  19. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  20. National Scale Monitoring Reporting and Verification of Deforestation and Forest Degradation in Guyana

    Science.gov (United States)

    Bholanath, P.; Cort, K.

    2015-04-01

    Monitoring deforestation and forest degradation at national scale has been identified as a national priority under Guyana's REDD+ Programme. Based on Guyana's MRV (Monitoring Reporting and Verification) System Roadmap developed in 2009, Guyana sought to establish a comprehensive national system to monitor, report and verify forest carbon emissions resulting from deforestation and forest degradation in Guyana. To date, four national annual assessments have been conducted: 2010, 2011, 2012 and 2013. Monitoring of forest change in 2010 was completed with medium-resolution imagery, mainly Landsat 5. In 2011, the assessment was conducted using a combination of Landsat (5 and 7) and, for the first time, 5 m high-resolution imagery, with RapidEye coverage for approximately half of Guyana, where the majority of land use changes were taking place. Forest change in 2013 was determined using high-resolution imagery for the whole of Guyana. The current method is an automated-assisted process of careful systematic manual interpretation of satellite imagery to identify deforestation based on different drivers of change. The minimum mapping unit (MMU) for deforestation is 1 ha (Guyana's forest definition), with a country-specific definition of 0.25 ha for degradation. The total forested area of Guyana is estimated as 18.39 million hectares (ha). In 2012, as planned, Guyana's forest area was re-evaluated using RapidEye 5 m imagery. Deforestation in 2013 is estimated at 12 733 ha, which equates to a total deforestation rate of 0.068%. Significant progress was made in 2012 and 2013 in mapping forest degradation. The area of forest degradation, as measured by interpretation of 5 m RapidEye satellite imagery in 2013, was 4 352 ha. All results are subject to accuracy assessment and independent third-party verification.

  1. High-accuracy user identification using EEG biometrics.

    Science.gov (United States)

    Koike-Akino, Toshiaki; Mahajan, Ruhi; Marks, Tim K; Ye Wang; Watanabe, Shinji; Tuzel, Oncel; Orlik, Philip

    2016-08-01

    We analyze brain waves acquired through a consumer-grade EEG device to investigate its capabilities for user identification and authentication. First, we show the statistical significance of the P300 component in event-related potential (ERP) data from 14-channel EEGs across 25 subjects. We then apply a variety of machine learning techniques, comparing the user identification performance of various combinations of a dimensionality reduction technique followed by a classification algorithm. Experimental results show that an identification accuracy of 72% can be achieved using only a single 800 ms ERP epoch. In addition, we demonstrate that the user identification accuracy can be significantly improved to more than 96.7% by joint classification of multiple epochs.
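    The pipeline structure described above, dimensionality reduction followed by classification, can be sketched minimally; here the "reduction" is window averaging and the classifier is nearest-centroid, far simpler than the combinations the paper evaluates:

```python
# Minimal sketch of a reduce-then-classify EEG identification pipeline.
# Window averaging and nearest-centroid stand in for the paper's richer
# dimensionality-reduction / classifier combinations.
def reduce_epoch(epoch, n_windows=4):
    """Average an epoch (list of samples) over n_windows time windows."""
    w = len(epoch) // n_windows
    return [sum(epoch[i * w:(i + 1) * w]) / w for i in range(n_windows)]

def nearest_centroid(features, centroids):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda user: dist(features, centroids[user]))

# per-user template vectors learned from enrollment epochs (illustrative)
centroids = {"alice": [1.0, 0.0, 0.0, 0.0], "bob": [0.0, 0.0, 1.0, 0.0]}
epoch = [0.9, 1.1, 0.1, -0.1, 0.1, -0.1, 0.0, 0.0]   # 8 samples, 4 windows
user = nearest_centroid(reduce_epoch(epoch), centroids)
```

    Joint classification of multiple epochs, which drives accuracy above 96.7% in the paper, amounts to combining the per-epoch decisions or distances before picking a user.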

  2. MO-AB-BRA-03: Development of Novel Real Time in Vivo EPID Treatment Verification for Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, G; Podesta, M [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands)]; Reniers, B [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Research Group NuTeC, CMK, Hasselt University, Agoralaan Gebouw H, Diepenbeek B-3590 (Belgium)]; Verhaegen, F [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Medical Physics Unit, Department of Oncology, McGill University, Montreal, Quebec H3G 1A4 (Canada)]

    2016-06-15

    Purpose: High Dose Rate (HDR) brachytherapy treatments are employed worldwide to treat a wide variety of cancers. However, in vivo dose verification remains a challenge, with no commercial dosimetry system available to verify the treatment dose delivered to the patient. We propose a novel dosimetry system that couples an independent Monte Carlo (MC) simulation platform and an amorphous silicon Electronic Portal Imaging Device (EPID) to provide real-time treatment verification. Methods: MC calculations predict the EPID response to the photon fluence emitted by the HDR source by simulating the patient, the source dwell positions and times, and treatment complexities such as tissue compositions/densities and different applicators. Simulated results are then compared against EPID measurements acquired with ∼0.14 s time resolution, which allows dose measurements for each dwell position. The EPID was calibrated using an Ir-192 HDR source, and experiments were performed using different phantoms, including tissue-equivalent materials (PMMA, lung and bone). A source positioning accuracy of 0.2 mm, not including the afterloader uncertainty, was ensured using a robotic arm moving the source. Results: An EPID can acquire 3D Cartesian source positions, and its response varies significantly with differences in the material composition/density of the irradiated object, allowing detection of changes in patient geometry. The panel time resolution allows dose rate and dwell time measurements. Moreover, predicted EPID images obtained from clinical treatment plans provide anatomical information that can be related to the patient anatomy, mostly bone and air cavities, localizing the source inside the patient using the anatomy as reference. Conclusion: The results obtained show the feasibility of the proposed dose verification system, which is capable of verifying all the brachytherapy treatment steps in real time, providing data about treatment delivery quality and also applicator
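    Given the ~0.14 s frame rate, dwell positions and times can in principle be recovered by grouping consecutive frames in which the source is detected at the same location. A sketch of that bookkeeping step only; the image-based source detection itself is outside its scope:

```python
# Sketch of dwell extraction from time-resolved EPID frames: consecutive
# frames with the source at the same position form one dwell. The source
# positions per frame are assumed inputs from an upstream detection step.
FRAME_DT = 0.14                      # seconds per EPID frame (from the paper)

def dwells_from_frames(frame_positions):
    dwells = []
    for pos in frame_positions:
        if dwells and dwells[-1][0] == pos:
            dwells[-1][1] += FRAME_DT          # extend current dwell
        else:
            dwells.append([pos, FRAME_DT])     # new dwell position
    return [(pos, round(t, 2)) for pos, t in dwells]

frames = [(0, 0, 0)] * 5 + [(0, 5, 0)] * 10   # two hypothetical dwells
dwell_list = dwells_from_frames(frames)
```

    Comparing the recovered dwell list against the planned dwell positions and times is then a direct per-dwell consistency check.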

  3. Spatial Accuracy of Embedded Surface Coloring in Color 3D Printing

    DEFF Research Database (Denmark)

    Pedersen, David Bue; Hansen, Hans Nørgaard; Eiríksson, Eyþór Rúnar

    2015-01-01

    ... capable of full-color printing in polymers [1]. Industrial service providers increasingly expand their product range of full-color print services, and as of today the industry for full-color parts has grown rapidly into a million-dollar industry [2]. With a new market emerging at such pace, it is believed ... Coordinate Measurement Machines (CMMs) and Machine Tools, which has already been transferred to be applicable to AM machine tools [3], in order to determine the spatial accuracy of embedded color features on artifacts printed on a zCorp 650 color 3D Printer. The spatial color verification artifact is a flat plate with ...

  4. Project W-320, WRSS PCP: Procedure implementation verification

    International Nuclear Information System (INIS)

    Bailey, J.W.

    1998-01-01

    This document provides verification that the methodology for the safe retrieval of high-heat waste from Tank 241-C-106 as specified in the WRSS Process Control Plan HNF-SD-PCP-013, Revision 1, has been adequately implemented into the Tank Waste Remediation System (TWRS) operational procedures. Tank 241-C-106 is listed on the High Heat Load Watch List

  5. Verification of nuclear material balances: General theory and application to a highly enriched uranium fabrication plant

    International Nuclear Information System (INIS)

    Avenhaus, R.; Beedgen, R.; Neu, H.

    1980-08-01

    In the theoretical part it is shown that, under the assumption that in the case of diversion the operator falsifies all data by a class-specific amount, it is optimal in the sense of the probability of detection to use the difference MUF-D as the test statistic. However, as there are arguments for keeping the two tests separate, and furthermore, as it is not clear that the combined test statistic is optimal for any diversion strategy, the overall guaranteed probability of detection for the bivariate test is determined. A numerical example applying the theoretical part is given. Using the material balance data of a highly enriched uranium fabrication plant, the variances of MUF, D (no diversion) and MUF-D are calculated with the help of the standard deviations of operator and inspector measurements. The two inventories of the material balance are stratified. The sample sizes of the strata and the total inspection effort for data verification are determined by game-theoretical methods (attribute sampling). On the basis of these results the overall detection probability of the combined system (data verification and material accountancy) is determined both for the MUF-D test and the bivariate (D, MUF) test as a function of the goal quantity. The results of both tests are evaluated for different diversion strategies. (orig./HP) [de
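    For a normally distributed test statistic with standard deviation σ, false-alarm probability α and diverted amount M, the detection probability takes the standard form P_D = 1 − Φ(u₁₋α − M/σ). A sketch with illustrative numbers, not the HEU plant data:

```python
# Standard detection-probability formula for a normal test statistic:
# P_D = 1 - Phi(u_{1-alpha} - M / sigma). Numbers are illustrative.
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p, lo=-10.0, hi=10.0):
    for _ in range(100):               # bisection is enough for a sketch
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def detection_probability(M, sigma, alpha=0.05):
    return 1.0 - norm_cdf(norm_ppf(1.0 - alpha) - M / sigma)

p_d = detection_probability(M=8.0, sigma=4.0, alpha=0.05)  # M = 2 sigma
```

    The same form applies to MUF, D, or MUF-D by inserting the corresponding variance; the MUF-D statistic is attractive precisely because its variance under the assumed falsification model yields the largest M/σ.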

  6. High Accuracy Attitude Control System Design for Satellite with Flexible Appendages

    Directory of Open Access Journals (Sweden)

    Wenya Zhou

    2014-01-01

    Full Text Available In order to realize high-accuracy attitude control of a satellite with flexible appendages, an attitude control system consisting of the controller and a structural filter was designed. When a low-order vibration frequency of the flexible appendages approaches the bandwidth of the attitude control system, the vibration signal enters the control system through the measurement device and affects the accuracy or even the stability of the system. In order to reduce the impact of appendage vibration on the attitude control system, the structural filter is designed to reject the vibration of the flexible appendages. Considering the potential problem of in-orbit frequency variation of the flexible appendages, a design method for an adaptive notch filter is proposed based on in-orbit identification technology. Finally, simulation results are given to demonstrate the feasibility and effectiveness of the proposed design techniques.
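    A structural filter of the kind described is commonly realized as a second-order IIR notch whose zeros sit on the unit circle at the flexible-mode frequency; the adaptive version would re-tune the notch from the in-orbit identified frequency. A hedged sketch with illustrative parameters:

```python
# Second-order IIR notch filter sketch: zeros on the unit circle at the
# flexible-mode frequency, poles just inside for a narrow notch. The mode
# frequency, sample rate and pole radius are illustrative.
import math

def notch_coeffs(f_notch_hz, fs_hz, r=0.95):
    w0 = 2.0 * math.pi * f_notch_hz / fs_hz
    b = [1.0, -2.0 * math.cos(w0), 1.0]            # zeros on the unit circle
    a = [1.0, -2.0 * r * math.cos(w0), r * r]      # poles just inside
    return b, a

def filter_signal(b, a, x):
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(3) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, 3) if n - k >= 0)
        y.append(acc)
    return y

fs, f0 = 100.0, 5.0                   # 5 Hz flexible mode, 100 Hz sampling
b, a = notch_coeffs(f0, fs)
x = [math.sin(2 * math.pi * f0 * n / fs) for n in range(2000)]  # tone at notch
y = filter_signal(b, a, x)
residual = max(abs(v) for v in y[1000:])   # tone is rejected after transient
```

    Re-tuning only w0 while keeping the pole radius fixed is what makes the adaptive version cheap: the identified mode frequency maps directly to new coefficients.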

  7. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
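    The traditional accuracy-based measures mentioned above reduce to simple comparisons of model output against observations; a generic sketch (these function names are not LVT's API):

```python
# Generic accuracy measures for model-vs-observation comparison; the data
# values are illustrative (e.g. volumetric soil moisture fractions).
import math

def bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

model = [0.30, 0.28, 0.35, 0.40]       # hypothetical simulated values
obs   = [0.32, 0.25, 0.33, 0.41]       # hypothetical observations
b = bias(model, obs)
r = rmse(model, obs)
```

    The toolkit's value lies less in these formulas than in applying them uniformly across many native-format datasets, scales, and the richer uncertainty and information-theory diagnostics layered on top.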

  8. Land surface Verification Toolkit (LVT) - a generalized framework for land surface model evaluation

    Science.gov (United States)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J.; Harrison, K.; Liu, Y.; Shaw, M.

    2012-06-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model-data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  9. Verification tests for remote controlled inspection system in nuclear power plants

    International Nuclear Information System (INIS)

    Kohno, Tadaaki

    1986-01-01

    Following the increase in the number of nuclear power plants, the total radiation exposure dose accompanying inspection and maintenance work has tended to increase. Japan Power Engineering and Inspection Corp. carried out verification tests of practical automatic inspection systems for power reactors from November 1981 to March 1986, and this report describes how these verification tests were carried out. The objects of the verification tests were equipment that is urgently required for reducing radiation exposure dose, that has a high possibility of practical realization, and that is important for ensuring the safety and reliability of plants: an automatic ultrasonic flaw detector for the welded parts of bend pipes, an automatic disassembling and inspection system for control rod driving mechanisms, an automatic fuel inspection system, and automatic decontamination equipment for steam generator water chambers, primary-system crud and radioactive gas in coolant. The results of the verification tests of this equipment were judged satisfactory; therefore, application to actual plants is possible. (Kako, I.)

  10. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  11. Quantile Acoustic Vectors vs. MFCC Applied to Speaker Verification

    Directory of Open Access Journals (Sweden)

    Mayorga-Ortiz Pedro

    2014-02-01

Full Text Available In this paper we describe speaker and command recognition experiments based on quantile vectors and Gaussian Mixture Modelling (GMM). Over the past several years, GMM and MFCC have become two of the dominant approaches for modelling speaker and speech recognition applications. However, memory and computational costs are important drawbacks, because autonomous systems suffer processing and power consumption constraints; thus, a good trade-off between accuracy and computational requirements is mandatory. We therefore explored another approach (quantile vectors) in several tasks and compared it with MFCC. Quantile acoustic vectors are proposed for speaker verification and command recognition tasks, and the results showed very good recognition efficiency. The method offered a good trade-off between computation time, feature vector complexity and overall achieved efficiency.
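The verification step described above — scoring test frames against a claimed-speaker model and a background model — can be sketched as follows. This is an illustrative one-dimensional GMM with made-up parameters, not the authors' quantile-vector implementation:

```python
import math

def log_gaussian(x, mean, var):
    """Log density of a 1-D Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def gmm_loglik(frames, weights, means, vars_):
    """Average per-frame log-likelihood under a 1-D GMM."""
    total = 0.0
    for x in frames:
        # log-sum-exp over mixture components for numerical stability
        comps = [math.log(w) + log_gaussian(x, m, v)
                 for w, m, v in zip(weights, means, vars_)]
        mx = max(comps)
        total += mx + math.log(sum(math.exp(c - mx) for c in comps))
    return total / len(frames)

def verify(frames, speaker_gmm, background_gmm, threshold=0.0):
    """Accept the identity claim if the log-likelihood ratio exceeds the threshold."""
    llr = gmm_loglik(frames, *speaker_gmm) - gmm_loglik(frames, *background_gmm)
    return llr > threshold

# Toy models: the claimed speaker is centred near 1.0, the background near 0.0.
speaker = ([0.5, 0.5], [0.8, 1.2], [0.1, 0.1])
background = ([0.5, 0.5], [-0.5, 0.5], [0.5, 0.5])
print(verify([0.9, 1.1, 1.0, 0.85], speaker, background))  # frames match the speaker
```

A real system would use multi-dimensional feature vectors (MFCC or quantile vectors) and models trained on enrolment data; the decision rule is the same.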

  12. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    International Nuclear Information System (INIS)

    Lips, Irene M; Dehnad, Homan; Gils, Carla H van; Boeken Kruger, Arto E; Heide, Uulke A van der; Vulpen, Marco van

    2008-01-01

We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU toxicity, and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with a severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation, provided the same dose constraints for the organs at risk are used.

  13. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    Directory of Open Access Journals (Sweden)

    Boeken Kruger Arto E

    2008-05-01

Full Text Available Abstract We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU toxicity, and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with a severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation, provided the same dose constraints for the organs at risk are used.

  14. Technics study on high accuracy crush dressing and sharpening of diamond grinding wheel

    Science.gov (United States)

    Jia, Yunhai; Lu, Xuejun; Li, Jiangang; Zhu, Lixin; Song, Yingjie

    2011-05-01

Mechanical grinding of artificial diamond grinding wheels has been the traditional wheel-dressing process, with the rotation speed and infeed depth of the tool wheel as the main process parameters. Suitable parameters for high-accuracy crush dressing of metal-bonded and resin-bonded diamond grinding wheels were obtained through extensive experiments on a super-hard-material wheel-dressing grinding machine and through analysis of the grinding force. At the same time, the effects of machine sharpening and sprinkle-granule sharpening were compared. These analyses and experiments offer practical guidance for the accurate crush dressing of artificial diamond grinding wheels.

  15. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we presented a detailed study of the face verification problem on the mobile device, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  16. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  17. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Directory of Open Access Journals (Sweden)

    Tomás Grimm

    2018-05-01

Full Text Available The high degree of miniaturization in the electronics industry has been, for several years, a driver to push embedded systems to different fields and applications. One example is safety-critical systems, where the compactness in the form factor helps to reduce the costs and allows for the implementation of new techniques. The automotive industry is a great example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard with the goal of guaranteeing a high level of dependability in the designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines to make sure that designs reach the desired dependability level without explicit instructions. In the end, the success of the design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as standards conformity, but as these are complex hardware designs, complete functional verification is a difficult task. From the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  18. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
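The variables-based alternative evaluated in the report can be illustrated with a one-sided acceptance-sampling-by-variables decision (standard deviation unknown, k-method). This is a minimal sketch with hypothetical numbers; in practice the acceptability constant k comes from a sampling plan table or calculator matched to the required quality level:

```python
import math

def variables_acceptance(sample, upper_spec, k):
    """One-sided acceptance sampling by variables (sigma unknown):
    accept the lot if the sample mean sits at least k sample standard
    deviations below the upper specification limit."""
    n = len(sample)
    mean = sum(sample) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    return (upper_spec - mean) / s >= k

# Hypothetical lot: measurements well below the spec limit of 10.0, with k = 2.0.
print(variables_acceptance([9.0, 9.1, 8.9, 9.2, 8.8], 10.0, 2.0))  # True
```

Because the decision uses the measured values rather than just pass/fail counts, variables plans typically reach the same protection with far smaller sample sizes than attributes plans, which is the resource saving the report targets.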

  19. High accuracy subwavelength distance measurements: A variable-angle standing-wave total-internal-reflection optical microscope

    International Nuclear Information System (INIS)

    Haynie, A.; Min, T.-J.; Luan, L.; Mu, W.; Ketterson, J. B.

    2009-01-01

    We describe an extension of the total-internal-reflection microscopy technique that permits direct in-plane distance measurements with high accuracy (<10 nm) over a wide range of separations. This high position accuracy arises from the creation of a standing evanescent wave and the ability to sweep the nodal positions (intensity minima of the standing wave) in a controlled manner via both the incident angle and the relative phase of the incoming laser beams. Some control over the vertical resolution is available through the ability to scan the incoming angle and with it the evanescent penetration depth.
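A standard result consistent with the technique described (assumed here; it is not stated explicitly in the abstract): two counter-propagating evanescent waves launched at incident angle \(\theta\) in a medium of refractive index \(n\) interfere to give a standing-wave fringe period

```latex
\Lambda = \frac{\lambda}{2\, n \sin\theta}
```

where \(\lambda\) is the vacuum wavelength. Sweeping \(\theta\) (or the relative phase of the two beams) therefore translates the nodal positions in a controlled way, which is what underlies the sub-10 nm in-plane distance measurements.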

  20. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  1. Gamma-ray isotopic ratio measurements for the plutonium inventory verification program

    International Nuclear Information System (INIS)

    Lemming, J.F.; Haas, F.X.; Jarvis, J.Y.

    1976-01-01

The Plutonium Inventory Verification Program at Mound Laboratory provides a nondestructive means of assaying bulk plutonium-bearing material. The assay is performed by combining the calorimetrically determined heat output of the sample and the relative abundances of the heat-producing isotopes. This report describes the method used for the nondestructive determination of plutonium-238, -240, -241 and americium-241 relative to plutonium-239 using gamma-ray spectroscopy for 93 percent plutonium-239 material. Comparison of chemical data on aliquots of samples to the nondestructive data shows accuracies of ±7 percent for ²³⁸Pu/²³⁹Pu, ±15 percent for ²⁴⁰Pu/²³⁹Pu, ±3 percent for ²⁴¹Pu/²³⁹Pu, and ±7 percent for ²⁴¹Am/²³⁹Pu.
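The calorimetric assay described above combines the measured heat output with the isotopic composition. A minimal sketch, using nominal specific-power values in W/g (illustrative; assay work should take these constants from current ASTM/ANSI references):

```python
# Nominal specific powers of the heat-producing isotopes, in watts per gram.
SPECIFIC_POWER_W_PER_G = {
    "Pu238": 0.5675, "Pu239": 0.0019288, "Pu240": 0.007097,
    "Pu241": 0.003390, "Am241": 0.1142,
}

def effective_specific_power(mass_fractions):
    """Heat output per gram of item, weighted by the isotopic mass fractions."""
    return sum(SPECIFIC_POWER_W_PER_G[iso] * f for iso, f in mass_fractions.items())

def plutonium_mass_grams(measured_watts, mass_fractions):
    """Calorimetric assay: item mass = measured heat / effective specific power."""
    return measured_watts / effective_specific_power(mass_fractions)

# Hypothetical 93% Pu-239 material measured at 1.0 W of heat output:
fractions = {"Pu238": 0.0001, "Pu239": 0.93, "Pu240": 0.06,
             "Pu241": 0.008, "Am241": 0.0005}
print(round(plutonium_mass_grams(1.0, fractions)), "g")
```

The gamma-ray spectroscopy step supplies the mass fractions; the calorimeter supplies the watts. Errors in the isotopic ratios propagate directly into the assayed mass, which is why the ratio accuracies quoted above matter.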

  2. Verification station for Sandia/Rockwell Plutonium Protection system

    International Nuclear Information System (INIS)

    Nicholson, N.; Hastings, R.D.; Henry, C.N.; Millegan, D.R.

    1979-04-01

    A verification station has been designed to confirm the presence of plutonium within a container module. These container modules [about 13 cm (5 in.) in diameter and 23 cm (9 in.) high] hold sealed food-pack cans containing either plutonium oxide or metal and were designed by Sandia Laboratories to provide security and continuous surveillance and safety. After the plutonium is placed in the container module, it is closed with a solder seal. The verification station discussed here is used to confirm the presence of plutonium in the container module before it is placed in a carousel-type storage array inside the plutonium storage vault. This measurement represents the only technique that uses nuclear detectors in the plutonium protection system

  3. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  4. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  5. Towards the Verification of Human-Robot Teams

    Science.gov (United States)

    Fisher, Michael; Pearce, Edward; Wooldridge, Mike; Sierhuis, Maarten; Visser, Willem; Bordini, Rafael H.

    2005-01-01

Human-Agent collaboration is increasingly important. Not only do high-profile activities such as NASA missions to Mars intend to employ such teams, but our everyday activities involving interaction with computational devices fall into this category. In many of these scenarios, we are expected to trust that the agents will do what we expect and that the agents and humans will work together as expected. But how can we be sure? In this paper, we bring together previous work on the verification of multi-agent systems with work on the modelling of human-agent teamwork. Specifically, we target human-robot teamwork. This paper provides an outline of the way we are using formal verification techniques in order to analyse such collaborative activities. A particular application is the analysis of human-robot teams intended for use in future space exploration.

  6. DIFFERENTIAL APPROACH TO URINARY SYNDROME VERIFICATION IN MEDICOPROPHYLACTIC FACILITIES IN CHILDREN WITH URINARY TRACT INFECTIONS

    Directory of Open Access Journals (Sweden)

    E.M. Pleshkova

    2011-01-01

Full Text Available Urinary syndrome is an invariable and often the only manifestation of renal and urinary tract injury. Modern laboratory diagnostics prioritize rapid tests such as «dry chemistry» urinalysis using dipstick tests. Study objective: to evaluate the diagnostic accuracy of dipstick tests in urinary syndrome verification in pediatric urinary tract infections (UTI). Methods: examination of urine samples using standard methods and rapid analysis with a urine biochemical composition analyser in 66 children aged from 2 months to 16 years; of this group, 28 children had UTI and 38 had other somatic diseases. Results: nitrite test sticks showed low diagnostic sensitivity (69%), a high prognostic value of a positive result (90%) and high specificity (94%). The diagnostic sensitivity of leucocyte esterase was 73%, its prognostic value of a positive result 92% and its diagnostic specificity 94%. The erythrocyturia test had a diagnostic sensitivity of 80% and a specificity of 95%. The protein test had a diagnostic sensitivity of 61%, a prognostic value of a positive result of 64% and a specificity of 81%. Conclusion: implementing dipstick tests with regard to the specifications of the method will allow a more differentiated approach to their use in the laboratories of medicoprophylactic facilities, reduce the time required for laboratory urine examinations, and increase the reliability of diagnostic information. Key words: children, urinary tract infections, dipstick tests, «dry chemistry», diagnostic accuracy, method, urinalysis. (Voprosy sovremennoi pediatrii — Current Pediatrics. 2011; 10(6): 89–95)
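The diagnostic-accuracy figures quoted above all derive from a standard 2×2 contingency table. A minimal sketch, using illustrative counts chosen to be consistent with the reported nitrite-test percentages (these are not the study's raw data):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table
    (tp/fp/fn/tn = true positives, false positives, false negatives, true negatives)."""
    return {
        "sensitivity": tp / (tp + fn),   # fraction of diseased correctly flagged
        "specificity": tn / (tn + fp),   # fraction of healthy correctly cleared
        "ppv": tp / (tp + fp),           # prognostic value of a positive result
    }

# Hypothetical counts roughly matching the nitrite figures (69% / 94% / 90%):
metrics = diagnostic_accuracy(tp=9, fp=1, fn=4, tn=16)
print(metrics)
```

Note that PPV depends on disease prevalence in the tested group, so the 90% figure holds for this cohort's UTI rate, not for screening in a general population.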

  7. Verification of the DUCT-III for calculation of high energy neutron streaming

    Energy Technology Data Exchange (ETDEWEB)

    Masukawa, Fumihiro; Nakano, Hideo; Nakashima, Hiroshi; Sasamoto, Nobuo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tayama, Ryu-ichi; Handa, Hiroyuki; Hayashi, Katsumi [Hitachi Engineering Co., Ltd., Hitachi, Ibaraki (Japan); Hirayama, Hideo [High Energy Accelerator Research Organization, Tsukuba, Ibaraki (Japan); Shin, Kazuo [Kyoto Univ., Kyoto (Japan)

    2003-03-01

A large number of radiation streaming calculations under a variety of conditions are required as a part of shielding design for a high energy proton accelerator facility. Since sophisticated methods are very time consuming, simplified methods are employed in many cases. For accuracy evaluation of a simplified code DUCT-III for high energy neutron streaming calculations, two kinds of benchmark problems based on the experiments were analyzed. Through comparison of the DUCT-III calculations with both the measurements and the sophisticated Monte Carlo calculations, DUCT-III was shown to be reliable enough to apply to the shielding design for the Intense Proton Accelerator Facility. (author)

  8. Verification of the DUCT-III for calculation of high energy neutron streaming

    CERN Document Server

    Masukawa, F; Hayashi, K; Hirayama, H; Nakano, H; Nakashima, H; Sasamoto, N; Shin, K; Tayama, R I

    2003-01-01

A large number of radiation streaming calculations under a variety of conditions are required as a part of shielding design for a high energy proton accelerator facility. Since sophisticated methods are very time consuming, simplified methods are employed in many cases. For accuracy evaluation of a simplified code DUCT-III for high energy neutron streaming calculations, two kinds of benchmark problems based on the experiments were analyzed. Through comparison of the DUCT-III calculations with both the measurements and the sophisticated Monte Carlo calculations, DUCT-III was shown to be reliable enough to apply to the shielding design for the Intense Proton Accelerator Facility.

  9. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    Science.gov (United States)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software verification is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper will present the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  10. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of the compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  11. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    Science.gov (United States)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard DRMs will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
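The onboard signal-finding step described above — histogram the received event times and flag bins whose counts are statistically significant against random background — can be sketched as below. This is a simplified illustration with a uniform Poisson noise model, not the actual ATLAS flight algorithm:

```python
import math
from collections import Counter

def find_signal_bins(event_times, bin_width, p_threshold=1e-6):
    """Histogram photon event times and flag bins whose counts are
    statistically improbable under a uniform Poisson background."""
    bins = Counter(int(t // bin_width) for t in event_times)
    n_bins = max(bins) - min(bins) + 1
    noise_rate = len(event_times) / n_bins   # expected counts/bin if all noise
    signal = []
    for b, count in bins.items():
        # One-sided Poisson tail probability P(X >= count) at the noise-only rate.
        tail = 1.0 - sum(math.exp(-noise_rate) * noise_rate ** k / math.factorial(k)
                         for k in range(count))
        if tail < p_threshold:
            signal.append(b)
    return sorted(signal)

# Uniform background plus a clump of correlated surface-echo photons near t = 50:
events = [i * 0.9 for i in range(100)] + [50.05 + 0.01 * j for j in range(30)]
print(find_signal_bins(events, bin_width=1.0))  # only the clump's bin is flagged
```

Because surface echoes arrive correlated in time while background photons are spread uniformly, even a modest clump stands far above the Poisson tail, which is what lets the flight algorithm locate the signal with limited onboard computation.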

  12. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  13. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  14. Reduction of Cone-Beam CT scan time without compromising the accuracy of the image registration in IGRT

    DEFF Research Database (Denmark)

    Westberg, Jonas; Jensen, Henrik R; Bertelsen, Anders

    2010-01-01

In modern radiotherapy, accelerators are equipped with 3D cone-beam CT (CBCT), which is used to verify patient position before treatment. The verification is based on an image registration between the CBCT acquired just before treatment and the CT scan made for treatment planning. The purpose of this study is to minimise the scan time of the CBCT without compromising the accuracy of the image registration in IGRT.

  15. Treatment accuracy of hypofractionated spine and other highly conformal IMRT treatments

    International Nuclear Information System (INIS)

    Sutherland, B.; Hanlon, P.; Charles, P.

    2011-01-01

Full text: Spinal cord metastases pose difficult challenges for radiation treatment due to tight dose constraints and a concave PTV. This project aimed to thoroughly test the treatment accuracy of the Eclipse Treatment Planning System (TPS) for highly modulated IMRT treatments, in particular of the thoracic spine, using an Elekta Synergy linear accelerator. The increased understanding obtained through different quality assurance techniques allowed recommendations to be made for treatment site commissioning with improved accuracy at the Princess Alexandra Hospital (PAH). Three thoracic spine IMRT plans at the PAH were used for data collection. Complex phantom models were built using CT data, and fields were simulated using Monte Carlo modelling. The simulated dose distributions were compared with the TPS using gamma analysis and DVH comparison. High-resolution QA was done for all fields using the MatriXX ion chamber array, the MapCHECK2 diode array (shifted), and the EPID to determine a procedure for commissioning new treatment sites. Basic spine simulations found that the TPS overestimated the absorbed dose to bone; however, within the spinal cord there was good agreement. High-resolution QA found the average gamma pass rate of the fields to be 99.1% for MatriXX, 96.5% for MapCHECK2 shifted and 97.7% for EPID. Preliminary results indicate agreement between the TPS and delivered dose distributions higher than previously believed for the investigated IMRT plans. The poor resolution of the MatriXX and normalisation issues with MapCHECK2 lead to a probable recommendation of the EPID for future IMRT commissioning due to its high resolution and minimal setup required.

  16. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    International Nuclear Information System (INIS)

    Yamashita, M; Kokubo, M; Takahashi, R; Takayama, K; Tanabe, H; Sueoka, M; Okuuchi, N; Ishii, M; Iwamoto, Y; Tachibana, H

    2016-01-01

Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) was released a few years ago. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for the general-purpose linac using a modified Clarkson-based algorithm was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: the CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT show similar results to those for the general-purpose linac. Conclusion: independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by Japan Agency for Medical Research and
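The confidence limits (CLs) reported in the abstract are simply the mean ± 2 SD of the per-plan percentage dose differences between the TPS and the independent calculation; a minimal sketch with hypothetical differences:

```python
import math

def confidence_limits(percent_diffs):
    """Confidence limit (CL) as mean and 2-SD band of the percentage dose
    differences between the TPS and the independent (secondary) calculation."""
    n = len(percent_diffs)
    mean = sum(percent_diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in percent_diffs) / (n - 1))
    return mean, 2.0 * sd

# Hypothetical per-plan differences (%) for one treatment class:
mean, band = confidence_limits([1.0, 2.0, 3.0, 2.0, 2.0])
print(f"CL = {mean:.1f} +/- {band:.1f}%")
```

A plan whose secondary-check difference falls outside the established CL band for its treatment class would be flagged for manual investigation.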

  17. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    Energy Technology Data Exchange (ETDEWEB)

    Yamashita, M; Kokubo, M [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Takayama, K [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tanabe, H; Sueoka, M; Okuuchi, N [Institute of Biomedical Research and Innovation, Kobe, Hyogo (Japan); Ishii, M; Iwamoto, Y [Kobe City Medical Center General Hospital, Kobe, Hyogo (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) was released a few years ago. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement had been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs, using a modified Clarkson-based algorithm, was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT, and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients’ treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: The CLs for conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) were 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: Independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
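
    The confidence-limit statistic used in studies like this one (mean ± 2SD of per-plan percent dose differences) is straightforward to reproduce. The sketch below is illustrative only and is not the SMU implementation; the function name and sample values are assumed.

    ```python
    # Illustrative confidence-limit (CL) computation: mean +/- 2 SD of the
    # per-plan percent dose differences between a TPS and an independent check.
    def confidence_limits(tps_doses, indep_doses):
        """Return (mean, 2*SD) of percent differences 100*(TPS - indep)/TPS."""
        diffs = [100.0 * (t - s) / t for t, s in zip(tps_doses, indep_doses)]
        n = len(diffs)
        mean = sum(diffs) / n
        sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
        return mean, 2.0 * sd
    ```

    A plan set whose CL stays inside the institutional tolerance (for example, the 5% level discussed for IMRT elsewhere in these records) would pass the secondary check.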

  18. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  19. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    International Nuclear Information System (INIS)

    Yang, D; Li, X; Li, H; Wooten, H; Green, O; Rodriguez, V; Mutic, S

    2014-01-01

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method to use 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report by including all required information for MR-IGRT and physics weekly review - the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g. MLC positions) were independently verified and deemed accurate and trustworthy. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection, and expedite physician daily IGRT review and physicist weekly chart review.
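
    The error statistics on plan-versus-delivery fluence difference maps can be sketched as follows. The array shapes, the 3% tolerance, and the function name are assumptions for illustration, not the commissioned program.

    ```python
    import numpy as np

    # Compare a planned composite fluence map with one reconstructed from the
    # delivery logs (MLC positions and beam-on times); report simple error stats.
    def fluence_error_stats(planned, delivered, tol=0.03):
        diff = delivered - planned
        # Normalize errors to the plan maximum to get a relative error map.
        rel = np.abs(diff) / max(float(planned.max()), 1e-9)
        return {"max_rel_err": float(rel.max()),
                "mean_err": float(diff.mean()),
                "within_tol": bool((rel <= tol).all())}
    ```

    A delivery would be flagged for review when `within_tol` is false for any beam.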

  20. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  1. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  2. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system. However, subsequent versions of the code were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: Reason(s) why a posteriori verification is to be performed; Scope and objectives for the level of verification selected; Development products to be used for the review; Availability and use of user experience; and Actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. 
The development products that were used

  3. Enhanced dynamic wedge and independent monitor unit verification

    International Nuclear Information System (INIS)

    Howlett, S.J.; University of Newcastle, NSW

    2004-01-01

    Full text: Some serious radiation accidents have occurred around the world during the delivery of radiotherapy treatment. The regrettable incident in Panama clearly indicated the need for independent monitor unit (MU) verification. Indeed, the International Atomic Energy Agency (IAEA), after investigating the incident, made specific recommendations for radiotherapy centres, which included an independent monitor unit check for all treatments. Independent monitor unit verification is practiced in many radiotherapy centres in developed countries around the world. It is mandatory in the USA but not yet in Australia. The enhanced dynamic wedge factor (EDWF) presents some significant problems in accurate MU calculation, particularly in the case of non-centre-of-field (COF) positions. This paper describes the development of an independent MU program, concentrating on the implementation of the EDW component. The difficult case of non-COF points under the EDW was studied in detail. A survey of Australasian centres regarding the use of independent MU check systems was conducted. The MUCalculator was developed with reference to MU calculations made by the Pinnacle 3D RTP system (Philips) for 4MV, 6MV and 18MV X-ray beams from Varian machines used at the Newcastle Mater Misericordiae Hospital (NMMH) in the clinical environment. Ionisation chamber measurements in Solid Water™ and liquid water were performed based on a published test data set. Published algorithms combined with a depth-dependent profile correction were applied in an attempt to match measured data with maximum accuracy. The survey results are presented. Substantial data are presented in tabular form, with extensive comparison against published data. Several different methods for calculating EDWF are examined. A small systematic error was detected in the Gibbon equation used for the EDW calculations. Generally, calculations were within ±2% of measured values, although some setups exceeded this variation. Results indicate that COF
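
    An independent MU check of the kind described divides the prescribed dose by a product of beam-data factors. The sketch below uses a generic, assumed factor set; it does not reproduce the MUCalculator or its EDWF/Gibbon handling, and all parameter names are illustrative.

    ```python
    # Generic first-order MU check: MU = dose / (output * Scp * TMR * wedge factor).
    def monitor_units(dose_cGy, output_cGy_per_MU, Scp, TMR, wedge_factor=1.0):
        return dose_cGy / (output_cGy_per_MU * Scp * TMR * wedge_factor)

    def within_tolerance(mu_check, mu_tps, tol_percent=2.0):
        """Flag MU disagreements beyond the +/-2% level discussed in the text."""
        return abs(mu_check - mu_tps) / mu_tps * 100.0 <= tol_percent
    ```

    In practice the wedge factor would be the point-specific EDWF, which is exactly where the non-COF difficulty described above arises.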

  4. High accuracy of family history of melanoma in Danish melanoma cases

    DEFF Research Database (Denmark)

    Wadt, Karin A W; Drzewiecki, Krzysztof T; Gerdes, Anne-Marie

    2015-01-01

    The incidence of melanoma in Denmark has increased immensely over the last 10 years, making Denmark a high-risk country for melanoma. In the last two decades multiple public campaigns have sought to increase the awareness of melanoma. Family history of melanoma is a known major risk factor...... but previous studies have shown that self-reported family history of melanoma is highly inaccurate. These studies are 15 years old and we wanted to examine if a higher awareness of melanoma has increased the accuracy of self-reported family history of melanoma. We examined the family history of 181 melanoma...

  5. TH-AB-202-02: Real-Time Verification and Error Detection for MLC Tracking Deliveries Using An Electronic Portal Imaging Device

    International Nuclear Information System (INIS)

    J Zwan, B; Colvill, E; Booth, J; J O’Connor, D; Keall, P; B Greer, P

    2016-01-01

    Purpose: The added complexity of real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real-time: (1) field size, (2) field location, and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) where real patient target motion was reproduced using a Hexamotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm² (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate, respectively. Field location errors (i.e. tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors as small as 3 mm were detected. The method was not found to be sensitive to random MLC errors or individual MLC calibration errors of up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection and the system has been shown to be sensitive to a range of delivery errors.
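
    The three comparison metrics can be illustrated on a binary aperture mask extracted from an EPID frame. The pixel area, the array convention, and the use of an intersection-over-union score for "field shape" are assumptions made for this sketch.

    ```python
    import numpy as np

    def field_metrics(aperture, pixel_area_cm2):
        """Metrics 1 and 2: field size (cm^2) and field location (centroid)."""
        ys, xs = np.nonzero(aperture)
        size = float(aperture.sum()) * pixel_area_cm2
        return size, (float(xs.mean()), float(ys.mean()))

    def shape_agreement(a, b):
        """Metric 3: overlap (intersection over union) of two aperture masks."""
        union = np.logical_or(a, b).sum()
        return float(np.logical_and(a, b).sum() / union) if union else 1.0
    ```

    Real-time verification would compare these metrics per frame against the planned aperture and raise an error when any threshold is exceeded.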

  6. IAEA verification experiment at the Portsmouth Gaseous Diffusion Plant

    International Nuclear Information System (INIS)

    Gordon, D.M.; Subudhi, M.; Calvert, O.L.; Bonner, T.N.; Cherry, R.C.; Whiting, N.E.

    1998-01-01

    In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a Verification Experiment at the plant with respect to the downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of UF6. This material is part of the 226 metric tons of fissile material that President Clinton has declared to be excess to US national-security needs and which will be permanently withdrawn from the US nuclear stockpile. In September 1997, the IAEA agreed to carry out this experiment, and during the first three weeks of December 1997, the IAEA verified the design information concerning the downblending process. The plant has been subject to short-notice random inspections since December 17, 1997. This paper provides an overview of the Verification Experiment, the monitoring technologies used in the verification approach, and some of the experience gained to date

  7. Simulation of GNSS reflected signals and estimation of position accuracy in GNSS-challenged environment

    DEFF Research Database (Denmark)

    Jakobsen, Jakob; Jensen, Anna B. O.; Nielsen, Allan Aasbjerg

    2015-01-01

    non-line-of-sight satellites. The signal reflections are implemented using the extended geometric path length of the signal path caused by reflections from the surrounding buildings. Based on real GPS satellite positions, simulated Galileo satellite positions, models of atmospheric effect...... on the satellite signals, designs of representative environments e.g. urban and rural scenarios, and a method to simulate reflection of satellite signals within the environment we are able to estimate the position accuracy given several prerequisites as described in the paper. The result is a modelling...... of the signal path from satellite to receiver, the satellite availability, the extended pseudoranges caused by signal reflection, and an estimate of the position accuracy based on a least squares adjustment of the extended pseudoranges. The paper describes the models and algorithms used and a verification test...
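
    The least-squares adjustment of (reflection-extended) pseudoranges can be sketched with a Gauss-Newton loop. Receiver clock bias and the atmospheric models are omitted here, so this is a deliberately simplified stand-in for the simulator's estimator; all names are illustrative.

    ```python
    import numpy as np

    def solve_position(sat_pos, pseudoranges, x0, iters=10):
        """Estimate a receiver position from satellite positions and pseudoranges."""
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            vec = x - np.asarray(sat_pos, dtype=float)   # receiver-to-satellite vectors
            rho = np.linalg.norm(vec, axis=1)            # modelled geometric ranges
            H = vec / rho[:, None]                       # Jacobian: unit line-of-sight rows
            dx, *_ = np.linalg.lstsq(H, pseudoranges - rho, rcond=None)
            x = x + dx
        return x
    ```

    Feeding this solver pseudoranges extended by reflected path lengths, as the paper describes, is what degrades the resulting position accuracy in the urban scenarios.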

  8. SU-E-T-49: A Multi-Institutional Study of Independent Dose Verification for IMRT

    International Nuclear Information System (INIS)

    Baba, H; Tachibana, H; Kamima, T; Takahashi, R; Kawai, D; Sugawara, Y; Yamamoto, T; Sato, A; Yamashita, M

    2015-01-01

    Purpose: AAPM TG114 does not cover independent verification for IMRT. We conducted a study of independent dose verification for IMRT in seven institutes to show the feasibility. Methods: 384 IMRT plans for the prostate and head-and-neck (HN) sites were collected from the institutes, where the planning was performed using Eclipse and Pinnacle3 with the two techniques of step and shoot (S&S) and sliding window (SW). All of the institutes used the same independent dose verification software program (Simple MU Analysis: SMU, Triangle Product, Ishikawa, JP), which is Clarkson-based and uses CT images to compute the radiological path length. An ion-chamber measurement in a water-equivalent slab phantom was performed to compare the doses computed using the TPS and the independent dose verification program. Additionally, the agreement in dose computed in patient CT images between the TPS and the SMU was assessed. The dose of the composite beams in the plan was evaluated. Results: The agreement between the measurement and the SMU was −2.3±1.9% and −5.6±3.6% for the prostate and HN sites, respectively. The agreement between the TPSs and the SMU was −2.1±1.9% and −3.0±3.7% for the prostate and HN sites, respectively. There was a negative systematic difference with a similar standard deviation, and the difference was larger in the HN site. The S&S technique showed a statistically significant difference from the SW technique, because the Clarkson-based method in the independent program cannot account for, and therefore underestimates, the dose under the MLC. Conclusion: The accuracy would be improved if the Clarkson-based algorithm were modified for IMRT; the tolerance level would then be within 5%.

  9. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Juntendo University, Hongo, Tokyo (Japan); Hongo, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Tsukuba University, Tsukuba, Ibaraki (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Hashimoto, H [Shonan Fujisawa Tokushukai Hospital, Fujisawa, Kanagawa (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: There have been few reports of independent dose verification for Tomotherapy. We evaluated the accuracy and the effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; the system implements a Clarkson-based dose calculation algorithm using a CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted into MU and MLC location information at more finely segmented control points. The performance of the SMU was assessed by a point dose measurement in non-IMRT and IMRT plans (simple target and mock prostate plans). Subsequently, 30 patients’ treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in non-IMRT plans. In the IMRT plan for the simple target, the differences (average ± 1SD) were −0.70±1.10% (SMU vs. TPS), −0.40±0.10% (measurement vs. TPS) and −1.20±1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40±0.60% (SMU vs. TPS), −0.50±0.90% (measurement vs. TPS) and −0.90±0.60% (measurement vs. SMU), respectively. For patients’ plans, the difference was −0.50±2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy is clinically available as a secondary check with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
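
    Clarkson-style calculation sums scatter data looked up at the field radius of each angular sector. This minimal sketch assumes a scatter lookup function `sar` and equal sector weights; it is far simpler than the SMU's CT-based implementation and serves only to show the sector-averaging idea.

    ```python
    # Average the scatter contribution over equal angular sectors of an irregular
    # field; `sector_radii` are the field-edge distances per sector and `sar` is
    # an assumed scatter lookup (e.g. a scatter-air ratio as a function of radius).
    def clarkson_scatter(sector_radii, sar):
        return sum(sar(r) for r in sector_radii) / len(sector_radii)
    ```

    For a modulated delivery such as Tomotherapy, this lookup would be repeated at each of the segmented control points mentioned above.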

  10. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    Full Text Available The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used to validate designs in industry. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses upon a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty in this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioning decompose a large design into sub-components and feed sub-components to different verification and/or simulation tools based upon the known existing strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. Various simulation and verification tools comprising IDV are evaluated and an example is used to illustrate the overall validation process. The overall process successfully validates the example to a high coverage rate within a short time. The experimental result shows that our approach is a very promising design validation method.

  11. Efficient Verification of Holograms Using Mobile Augmented Reality.

    Science.gov (United States)

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobiles still takes a long time and provides lower accuracy than inspection by human inspectors using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface that supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s, with better decisions on the evaluated samples than untrained users can achieve.
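
    One typical similarity measure for comparing a captured hologram patch against a reference is zero-mean normalized cross-correlation. This sketch is a plausible candidate measure, not necessarily the one the authors selected from their evaluation.

    ```python
    import numpy as np

    def ncc(a, b):
        """Zero-mean normalized cross-correlation of two same-sized patches."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float((a * b).sum() / denom) if denom else 0.0
    ```

    The score is invariant to affine brightness changes, which is useful when view-dependent hologram appearance is captured under varying illumination.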

  12. Real-time monitoring and verification of in vivo high dose rate brachytherapy using a pinhole camera

    International Nuclear Information System (INIS)

    Duan, Jun; Macey, Daniel J.; Pareek, Prem N.; Brezovich, Ivan A.

    2001-01-01

    We investigated a pinhole imaging system for independent in vivo monitoring and verification of high dose rate (HDR) brachytherapy treatment. The system consists of a high-resolution pinhole collimator, an x-ray fluoroscope, and a standard radiographic screen-film combination. Autofluoroscopy provides real-time images of the in vivo Ir-192 HDR source for monitoring the source location and movement, whereas autoradiography generates a permanent record of source positions on film. Dual-pinhole autoradiographs render stereo-shifted source images that can be used to reconstruct the source dwell positions in three dimensions. The dynamic range and spatial resolution of the system were studied with a polystyrene phantom using a range of source strengths and dwell times. For the range of source activity used in HDR brachytherapy, a 0.5 mm diameter pinhole produced sharp fluoroscopic images of the source within the dynamic range of the fluoroscope. With a source-to-film distance of 35 cm and a 400 speed screen-film combination, the same pinhole yielded well recognizable images of a 281.2 GBq (7.60 Ci) Ir-192 source for dwell times in the typical clinical range of 2 to 400 s. This 0.5 mm diameter pinhole could clearly resolve source positions separated by lateral displacements as small as 1 mm. Using a simple reconstruction algorithm, dwell positions in a phantom were derived from stereo-shifted dual-pinhole images and compared to the known positions. The agreement was better than 1 mm. A preliminary study of a patient undergoing HDR treatment for cervical cancer suggests that the imaging method is clinically feasible. Based on these studies we believe that the pinhole imaging method is capable of providing independent and reliable real-time monitoring and verification for HDR brachytherapy
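
    The simple stereo reconstruction from dual-pinhole images reduces to similar triangles. The geometry below (film plane at height 0, both pinholes at a common height H, one-dimensional lateral coordinates) is an assumed simplification for illustration, not the paper's algorithm.

    ```python
    def source_from_stereo(x1, x2, p1, p2, H):
        """Recover source (lateral position, height) from two pinhole images.

        x1, x2: image positions on the film through pinholes at lateral p1, p2;
        H: common pinhole height above the film plane.
        """
        d, b = x1 - x2, p1 - p2
        z = d * H / (d - b)               # source height from the stereo shift
        s = p1 + (x1 - p1) * (H - z) / H  # lateral position by similar triangles
        return s, z
    ```

    Repeating this for each dwell position on the dual-pinhole autoradiograph gives the three-dimensional source track to compare against the planned dwell positions.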

  13. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agents programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL has been originally designed to connect agent programming to agent theory and we present additional results here that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  14. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, which is demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as necessary numerical simulations are carried out to support the proposed verification method. (paper)
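
    A single phase-only correlation step can be written with FFTs as below (the system described cascades two such correlations and adds a nonlinear encoding); thresholding the output peak gives the verification signal. Names and structure here are a generic sketch, not the paper's algorithm.

    ```python
    import numpy as np

    def phase_only_correlation(f, g):
        """Correlate two images using only the phase of the cross-power spectrum."""
        F, G = np.fft.fft2(f), np.fft.fft2(g)
        phase = np.exp(1j * np.angle(F * np.conj(G)))   # discard magnitude, keep phase
        return np.real(np.fft.ifft2(phase))             # sharp peak when f matches g
    ```

    Matching inputs produce a near-delta peak at the origin, while mismatched keys spread the energy across the plane, which is why a peak threshold discriminates authorized from unauthorized keys.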

  15. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  16. Plutonium characterisation with prompt high energy gamma-rays from (n,gamma) reactions for nuclear warhead dismantlement verification

    Energy Technology Data Exchange (ETDEWEB)

    Postelt, Frederik; Gerald, Kirchner [Carl Friedrich von Weizsaecker-Centre for Science and Peace Research, Hamburg (Germany)

    2015-07-01

    Measurements of neutron-induced gammas allow the characterisation of fissile material (i.e. plutonium and uranium), despite self- and additional shielding. Most prompt gamma-rays from radiative neutron capture reactions in fissile material have energies between 3 and 6.5 MeV. Such high energy photons have a high penetrability and therefore minimise shielding and self-absorption effects. They are also isotope specific and therefore well suited to determine the isotopic composition of fissile material. As they are non-destructive, their application in dismantlement verification is desirable. Disadvantages are low detector efficiencies at high gamma energies and a high background of gammas resulting from induced fission reactions in the fissile material, as well as delayed gammas from both (n,f) and (n,gamma) reactions. In this talk, simulations of (n,gamma) measurements and their implications are presented. Their potential for characterising fissile material is assessed and open questions are addressed.

  17. Acoustic time-of-flight for proton range verification in water

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Kevin C.; Avery, Stephen, E-mail: Stephen.Avery@uphs.upenn.edu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States); Vander Stappen, François [Ion Beam Applications SA, Louvain-la-Neuve 1348 (Belgium); Sehgal, Chandra M. [Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2016-09-15

    Purpose: Measurement of the arrival times of thermoacoustic waves induced by pulsed proton dose depositions (protoacoustics) may provide a proton range verification method. The goal of this study is to characterize the required dose and protoacoustic proton range (distance) verification accuracy in a homogeneous water medium at a hospital-based clinical cyclotron. Methods: Gaussian-like proton pulses with 17 μs widths and instantaneous currents of 480 nA (5.6 × 10{sup 7} protons/pulse, 3.4 cGy/pulse at the Bragg peak) were generated by modulating the cyclotron proton source with a function generator. After energy degradation, the 190 MeV proton pulses irradiated a water phantom, and the generated protoacoustic emissions were measured by a hydrophone. The detector position and proton pulse characteristics were varied. The experimental results were compared to simulations. Different arrival time metrics derived from acoustic waveforms were compared, and the accuracy of protoacoustic time-of-flight distance calculations was assessed. Results: A 27 mPa noise level was observed in the treatment room during irradiation. At 5 cm from the proton beam, an average maximum pressure of 5.2 mPa/1 × 10{sup 7} protons (6.1 mGy at the Bragg peak) was measured after irradiation with a proton pulse with 10%–90% rise time of 11 μs. Simulation and experiment arrival times agreed well, and the observed 2.4 μs delay between simulation and experiment is attributed to the difference between the hydrophone’s acoustic and geometric centers. Based on protoacoustic arrival times, the beam axis position was measured to within (x, y) = (−2.0,  0.5) ± 1 mm. After deconvolution of the exciting proton pulse, the protoacoustic compression peak provided the most consistent measure of the distance to the Bragg peak, with an error distribution with mean = − 4.5 mm and standard deviation = 2.0 mm. Conclusions: Based on water tank measurements at a clinical hospital-based cyclotron
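
    The time-of-flight range computation itself is a one-liner: distance is the speed of sound times the (deconvolved) arrival time. The speed of sound in water used below (about 1480 m/s) is an assumed textbook value, not a figure from the paper.

    ```python
    # Protoacoustic range: distance = speed of sound * (arrival time - pulse delay).
    def bragg_peak_distance(arrival_time_s, pulse_delay_s=0.0, c_water=1480.0):
        """Distance in metres from the detector to the acoustic source region."""
        return c_water * (arrival_time_s - pulse_delay_s)
    ```

    The study's reported millimetre-level errors then come from how precisely the arrival-time metric locates the compression peak relative to the Bragg peak.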

  18. Verification of rapid method for estimation of added food colorant type in boiled sausages based on measurement of cross section color

    Science.gov (United States)

    Jovanović, J.; Petronijević, R. B.; Lukić, M.; Karan, D.; Parunović, N.; Branković-Lazić, I.

    2017-09-01

    During the previous development of a chemometric method for estimating the amount of added colorant in meat products, it was noticed that the natural colorant most commonly added to boiled sausages, E 120, shows different CIE-LAB behavior compared to the artificial colors used for the same purpose. This opened the possibility of transforming the developed method into a method for identifying the addition of natural or synthetic colorants in boiled sausages based on measurement of the cross-section color. After recalibration of the CIE-LAB method using linear discriminant analysis, verification was performed on 76 boiled sausages of either the frankfurter or Parisian sausage type. The accuracy and reliability of the classification were confirmed by comparison with the standard HPLC method. Results showed that the LDA + CIE-LAB method can be applied with high accuracy, 93.42 %, to estimate food colorant type in boiled sausages. Natural orange colors can give false positive results. Pigments from spice mixtures had no significant effect on CIE-LAB results.
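The classification step described above pairs CIE-LAB color coordinates with linear discriminant analysis. A pure-Python sketch of a two-class Fisher discriminant on hypothetical (a*, b*) values, using a diagonal-scatter simplification rather than the paper's calibrated model:

```python
# Minimal two-class Fisher linear discriminant on CIE-LAB (a*, b*) features.
# Training colors below are made-up illustrations, not data from the study.

def mean(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(len(vs[0]))]

def fisher_direction(class_a, class_b):
    """Simplified LDA direction, assuming (near-)diagonal within-class scatter."""
    ma, mb = mean(class_a), mean(class_b)
    def scatter(vs, m):  # per-feature within-class scatter
        return [sum((v[i] - m[i]) ** 2 for v in vs) for i in range(len(m))]
    sw = [x + y + 1e-9 for x, y in zip(scatter(class_a, ma), scatter(class_b, mb))]
    return [(mb[i] - ma[i]) / sw[i] for i in range(len(ma))]

natural = [[18.0, 12.0], [20.0, 14.0], [17.0, 11.0]]   # e.g. E 120-like hues
synthetic = [[30.0, 4.0], [32.0, 5.0], [29.0, 3.0]]
w = fisher_direction(natural, synthetic)
# Decision threshold at the projected midpoint between class means
threshold = sum(wi * (ma + mb) / 2
                for wi, ma, mb in zip(w, mean(natural), mean(synthetic)))

def classify(ab):
    score = sum(wi * x for wi, x in zip(w, ab))
    return "synthetic" if score > threshold else "natural"

print(classify([31.0, 4.5]))  # "synthetic"
```

A production version would use a full (non-diagonal) pooled covariance and a validated training set, as the recalibration in the study implies.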

  19. Simulations of pulsating one-dimensional detonations with true fifth order accuracy

    International Nuclear Information System (INIS)

    Henrick, Andrew K.; Aslam, Tariq D.; Powers, Joseph M.

    2006-01-01

    A novel, highly accurate numerical scheme based on shock-fitting coupled with fifth order spatial and temporal discretizations is applied to a classical unsteady detonation problem to generate solutions with unprecedented accuracy. The one-dimensional reactive Euler equations for a calorically perfect mixture of ideal gases whose reaction is described by single-step irreversible Arrhenius kinetics are solved in a series of calculations in which the activation energy is varied. In contrast with nearly all known simulations of this problem, which converge at a rate no greater than first order as the spatial and temporal grid is refined, the present method is shown to converge at a rate consistent with the fifth order accuracy of the spatial and temporal discretization schemes. This high accuracy enables more precise verification of known results and prediction of heretofore unknown phenomena. To five significant figures, the scheme faithfully recovers the stability boundary, growth rates, and wave-numbers predicted by an independent linear stability theory in the stable and weakly unstable regime. As the activation energy is increased, a series of period-doubling events are predicted, and the system undergoes a transition to chaos. Consistent with general theories of non-linear dynamics, the bifurcation points are seen to converge at a rate for which the Feigenbaum constant is 4.66 ± 0.09, in close agreement with the true value of 4.669201... As activation energy is increased further, domains are identified in which the system undergoes a transition from a chaotic state back to one whose limit cycles are characterized by a small number of non-linear oscillatory modes. This result is consistent with behavior of other non-linear dynamical systems, but not typically considered in detonation dynamics. The period and average detonation velocity are calculated for a variety of asymptotically stable limit cycles. The average velocity for such pulsating detonations is found
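The Feigenbaum-constant estimate quoted above comes from ratios of spacings between successive period-doubling bifurcation points. The same computation can be sketched on the well-known logistic-map bifurcation parameters rather than the paper's activation energies:

```python
# Estimating the Feigenbaum constant from successive period-doubling
# bifurcation points. The logistic-map onset parameters below stand in
# for the activation-energy bifurcation points of the detonation study.

r = [3.0, 3.449490, 3.544090, 3.564407]  # onsets of period 2, 4, 8, 16

# delta_n = (r_n - r_{n-1}) / (r_{n+1} - r_n) -> 4.669201... as n grows
deltas = [(r[i + 1] - r[i]) / (r[i + 2] - r[i + 1]) for i in range(len(r) - 2)]
print(deltas)  # successive estimates approaching 4.669201...
```

With only a few bifurcation points the estimate carries an uncertainty comparable to the ± 0.09 quoted in the abstract.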

  20. High-Accuracy Elevation Data at Large Scales from Airborne Single-Pass SAR Interferometry

    Directory of Open Access Journals (Sweden)

    Guy Jean-Pierre Schumann

    2016-01-01

    Full Text Available Digital elevation models (DEMs) are essential data sets for disaster risk management and humanitarian relief services as well as many environmental process models. At present, on the one hand, globally available DEMs only meet the basic requirements, and for many services and modeling studies they are not of high enough spatial resolution and lack accuracy in the vertical. On the other hand, LiDAR-DEMs are of very high spatial resolution and great vertical accuracy, but acquisition operations can be very costly for spatial scales larger than a couple of hundred square km, and they also have severe limitations in wetland areas and under cloudy and rainy conditions. The ideal situation would thus be to have a DEM technology that allows larger spatial coverage than LiDAR without compromising resolution and vertical accuracy, while still performing under some adverse weather conditions and at a reasonable cost. In this paper, we present a novel single-pass InSAR technology for airborne vehicles that is cost-effective and can generate DEMs with a vertical error of around 0.3 m for an average spatial resolution of 3 m. To demonstrate this capability, we compare a sample single-pass InSAR Ka-band DEM of the California Central Valley from the NASA/JPL airborne GLISTIN-A to a high-resolution LiDAR DEM. We also perform a simple sensitivity analysis to floodplain inundation. Based on the findings of our analysis, we argue that this type of technology can and should be used to replace large regions of globally available lower resolution DEMs, particularly in coastal, delta and floodplain areas where a high number of assets, habitats and lives are at risk from natural disasters. We conclude with a discussion on requirements, advantages and caveats in terms of instrument and data processing.
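The vertical-accuracy comparison described above comes down to differencing co-registered elevations from the candidate and reference DEMs and summarizing bias and RMSE. A sketch on made-up elevation samples, assuming co-registration has already been done:

```python
# Vertical-accuracy check of one DEM against a reference (e.g. LiDAR).
# Elevation samples are illustrative, not the GLISTIN-A comparison data.

insar = [10.2, 11.5, 9.8, 12.1, 10.9]   # candidate DEM elevations (m)
lidar = [10.0, 11.3, 10.1, 12.0, 10.6]  # reference DEM elevations (m)

diffs = [a - b for a, b in zip(insar, lidar)]
bias = sum(diffs) / len(diffs)                         # mean vertical offset
rmse = (sum(d * d for d in diffs) / len(diffs)) ** 0.5  # vertical error
print(f"bias = {bias:.2f} m, RMSE = {rmse:.2f} m")
```

On real rasters the same statistics would be computed per pixel over the overlap area, typically after masking vegetation and water.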

  1. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single-field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%; however, the spiral phantom QA technique provides a more complete dosimetric verification while being less time-consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy, for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan.
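The pass/fail criterion used above, agreement within a 5% clinical tolerance, can be sketched as a point-dose comparison. The dose values below are illustrative assumptions, not the paper's film or TLD readings:

```python
# Point-dose QA check: flag measurements deviating from the predicted dose
# by more than a clinical tolerance (5%, as in the abstract). Deviations
# are expressed relative to the 1.8 Gy fraction dose quoted there.

TOLERANCE = 0.05
FRACTION_DOSE = 1.8  # Gy

points = [  # (predicted Gy, measured Gy) -- illustrative values
    (1.80, 1.78),
    (1.20, 1.26),
    (0.45, 0.56),  # e.g. a point in a high dose gradient region
]

statuses = []
for pred, meas in points:
    dev = abs(meas - pred) / FRACTION_DOSE
    statuses.append("PASS" if dev <= TOLERANCE else "FAIL")
    print(f"pred={pred:.2f} Gy  meas={meas:.2f} Gy  dev={dev:.1%}  {statuses[-1]}")
```

Normalizing to the fraction dose rather than the local point dose avoids inflating relative errors in low-dose, high-gradient regions; clinics differ on this convention.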

  2. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources; these will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  3. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communication in Medicine) based toolbox, developed for the evaluation and the verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region

  4. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  5. Verification of Thermal Models of Internally Cooled Gas Turbine Blades

    Directory of Open Access Journals (Sweden)

    Igor Shevchenko

    2018-01-01

    Full Text Available Numerical simulation of temperature field of cooled turbine blades is a required element of gas turbine engine design process. The verification is usually performed on the basis of results of test of full-size blade prototype on a gas-dynamic test bench. A method of calorimetric measurement in a molten metal thermostat for verification of a thermal model of cooled blade is proposed in this paper. The method allows obtaining local values of heat flux in each point of blade surface within a single experiment. The error of determination of local heat transfer coefficients using this method does not exceed 8% for blades with radial channels. An important feature of the method is that the heat load remains unchanged during the experiment and the blade outer surface temperature equals zinc melting point. The verification of thermal-hydraulic model of high-pressure turbine blade with cooling allowing asymmetrical heat removal from pressure and suction sides was carried out using the developed method. An analysis of heat transfer coefficients confirmed the high level of heat transfer in the leading edge, whose value is comparable with jet impingement heat transfer. The maximum of the heat transfer coefficients is shifted from the critical point of the leading edge to the pressure side.

  6. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  7. Perspex in the verification routines for accelerator beam

    International Nuclear Information System (INIS)

    Paredes G, L.; Genis S, R.

    1998-01-01

    The use of a perspex solid phantom, suitably cross-referenced to a water phantom, is analyzed as an auxiliary alternative for the daily stability (constancy) verification of the radiation beam, an option for radiotherapy installations with a high accelerator workload and only basic dosimetry equipment. (Author)
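A daily constancy check of this kind compares the phantom reading against a baseline within an action level. A minimal sketch assuming a ±2% tolerance, which the record does not state:

```python
# Daily beam-output constancy check against a baseline reading taken in
# a perspex phantom cross-calibrated to water. Numbers are illustrative.

BASELINE = 1.000      # baseline ionization reading (normalized)
ACTION_LEVEL = 0.02   # assumed +/-2% constancy tolerance

def constancy_ok(reading, baseline=BASELINE, tol=ACTION_LEVEL):
    """True if the relative deviation from baseline is within tolerance."""
    return abs(reading / baseline - 1.0) <= tol

print(constancy_ok(1.012))  # True
print(constancy_ok(0.973))  # False
```

A failing check would normally trigger a repeat measurement and, if confirmed, a full output calibration in water.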

  8. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
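The "strong sense verification benchmark" of point (5) hinges on quantitatively evaluating code results against a benchmark solution; the standard metric is the observed order of accuracy obtained from grid refinement. A sketch with illustrative error values:

```python
import math

# Observed order of accuracy from errors on successively refined grids,
# the core quantitative step when comparing a code result against an
# exact benchmark solution. Error values below are illustrative for a
# nominally 2nd-order scheme.

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """p such that error ~ h**p, from errors at two grid spacings."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

p = observed_order(4.0e-3, 1.0e-3)  # halving h cut the error by 4x
print(f"observed order p = {p:.2f}")
```

An acceptance criterion of the kind described in (iii) would then require the observed order to match the scheme's formal order within a stated margin.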

  9. Testing, verification and application of CONTAIN for severe accident analysis of LMFBR-containments

    International Nuclear Information System (INIS)

    Langhans, J.

    1991-01-01

    Severe accident analysis for LMFBR containments has to consider various phenomena influencing the development of containment loads, such as pressures and temperatures, as well as the generation, transport, depletion and release of aerosols and radioactive materials. As most of these phenomena are linked together, their feedback has to be taken into account within the calculation of severe accident consequences; otherwise no best-estimate results can be assured. Under the sponsorship of the German BMFT, the US code CONTAIN is being developed, verified and applied at GRS for future fast breeder reactor concepts. In the first step of verification, the basic calculation models of a containment code have been proven: (i) flow calculation for different flow situations, (ii) heat transfer from and to structures, (iii) coolant evaporation, boiling and condensation, (iv) material properties. In the second step, the interaction of coupled phenomena has been checked. The calculation of integrated containment experiments relating natural convection flow, structure heating and coolant condensation, as well as parallel calculation of results obtained with another code, gives detailed information on the applicability of CONTAIN. The actual verification status allows the following conclusion: a cautious analyst experienced in containment accident modelling, using the proven parts of CONTAIN, will obtain results which have the same accuracy as other well-optimized and detailed lumped-parameter containment codes can achieve. Further code development, additional verification and international exchange of experience and results will assure an adequate code for application in safety analyses for LMFBRs. (orig.)

  10. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  11. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  12. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain the adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification, or to evaluate, using independent software, the performance of specific functions in the code. Here, we describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We also discuss some of the lessons learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  13. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting a Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with an structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  14. New possibilities of digital luminescence radiography (DLR) and digital image processing for verification and portal imaging

    International Nuclear Information System (INIS)

    Zimmermann, J.S.; Blume, J.; Wendhausen, H.; Hebbinghaus, D.; Kovacs, G.; Eilf, K.; Schultze, J.; Kimmig, B.N.

    1995-01-01

    We developed a method, using digital luminescence radiography (DLR), not only for portal imaging of photon beams with excellent quality, but also for verification of electron beams. Furthermore, DLR was used as the basic instrument for image fusion of portal or verification films with simulation films, and for image processing in ''beams-eye-view'' verification (BEVV) of rotating beams or conformation therapy. Digital radiographs of excellent quality are obtained for verification of photon and electron beams. In photon beams, the quality improvement vs. conventional portal imaging can be dramatic, even more so for high-energy beams (e.g. 15-MV photon beams) than for Co-60. In electron beams, excellent results are easily obtained. By digital image fusion of one or more verification films with the simulation film or MRI planning film, more precise judgement even of small differences between simulation and verification films becomes possible. Using BEVV, it is possible to compare computer-aided simulation of rotating beams or conformation therapy with the actually applied treatment. The basic principle of BEVV is also suitable for dynamic multileaf collimation. (orig.) [de

  15. High-accuracy drilling with an image guided light weight robot: autonomous versus intuitive feed control.

    Science.gov (United States)

    Tauscher, Sebastian; Fuchs, Alexander; Baier, Fabian; Kahrs, Lüder A; Ortmaier, Tobias

    2017-10-01

    Assistance of robotic systems in the operating room promises higher accuracy, so that demanding surgical interventions become realisable (e.g. the direct cochlear access). Additionally, an intuitive user interface is crucial for the use of robots in surgery. Torque sensors in the joints can be employed for intuitive interaction concepts; regarding accuracy, however, they lead to a lower structural stiffness and, thus, to an additional error source. The aim of this contribution is to examine whether an accuracy needed for demanding interventions can be achieved by such a system. The feasible accuracy of the robot-assisted process depends on each work-flow step; this work focuses on the determination of the tool coordinate frame. A method for drill axis definition is implemented and analysed. Furthermore, a concept of admittance feed control is developed, which allows the user to control feeding along the planned path by applying a force to the robot's structure. The accuracy is investigated in drilling experiments with a PMMA phantom and artificial bone blocks. The described drill axis estimation process results in a high angular repeatability ([Formula: see text]). In the first set of drilling results, an accuracy of [Formula: see text] at the entrance and [Formula: see text] at the target point, excluding imaging, was achieved. With admittance feed control, an accuracy of [Formula: see text] at the target point was realised. In a third set, twelve holes were drilled in artificial temporal bone phantoms, including imaging; in this set-up an error of [Formula: see text] and [Formula: see text] was achieved. The results of the conducted experiments show that accuracy requirements for demanding procedures such as the direct cochlear access can be fulfilled with compliant systems. Furthermore, it was shown that with the presented admittance feed control an accuracy of less than [Formula: see text] is achievable.

  16. Environmental Testing Campaign and Verification of Satellite Deimos-2 at INTA

    Science.gov (United States)

    Hernandez, Daniel; Vazquez, Mercedes; Anon, Manuel; Olivo, Esperanza; Gallego, Pablo; Morillo, Pablo; Parra, Javier; Capraro; Luengo, Mar; Garcia, Beatriz; Villacorta, Pablo

    2014-06-01

    In this paper the environmental test campaign and verification of the DEIMOS-2 (DM2) satellite are presented and described. DM2 will be ready for launch in 2014. Firstly, a short description of the satellite is presented, including its physical characteristics and intended optical performance. DEIMOS-2 is a LEO satellite for Earth observation that will provide high-resolution imaging services for agriculture, civil protection, environmental issues, disaster monitoring, climate change, urban planning, cartography, security and intelligence. Then, the verification and test campaign carried out on the SM and FM models at INTA is described, including mechanical tests for the SM and climatic, mechanical and electromagnetic compatibility tests for the FM. In addition, this paper includes centre-of-gravity and moment-of-inertia measurements for both models, and other verification activities carried out in order to ensure the satellite's health during launch and its in-orbit performance.

  17. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  18. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  19. System for verification in situ of current transformers in high voltage substations; Sistema para verificacao in situ de transformadores de corrente em substacoes de alta tensao

    Energy Technology Data Exchange (ETDEWEB)

    Mendonca, Pedro Henrique; Costa, Marcelo M. da; Dahlke, Diogo B.; Ikeda, Minoru [LACTEC - Instituto de Tecnologia para o Desenvolvimento, Curitiba, PR (Brazil)], Emails: pedro.henrique@lactec.org.br, arinos@lactec.org.br, diogo@lactec.org.br, minoru@lactec.org.br, Celso.melo@copel.com; Carvalho, Joao Claudio D. de [ELETRONORTE, Belem, PR (Brazil)], E-mail: marcelo.melo@eln.gov.br; Teixeira Junior, Jose Arinos [ELETROSUL, Florianopolis, SC (Brazil)], E-mail: jclaudio@eletrosul.gov.br; Melo, Celso F. [Companhia Paranaense de Energia (COPEL), Curitiba, PR (Brazil)], E-mail: Celso.melo@copel.com

    2009-07-01

    This work presents an alternative approach for executing the calibration of conventional current transformers in the field, using a verification system composed of an optical current transformer as a reference standard, suitable for installation on extra-high-voltage busbars.

  20. Fault-specific verification (FSV) - An alternative VV ampersand T strategy for high reliability nuclear software systems

    International Nuclear Information System (INIS)

    Miller, L.A.

    1994-01-01

    The author puts forth an argument that digital instrumentation and control systems can be safely applied in the nuclear industry, but that this will require changes to the way software for such systems is developed and tested. He argues for a fault-specific verification procedure to be applied to software development. This plan includes enumerating and classifying all software faults at all levels of the product, over the whole development process; while collecting these data, developing and validating different methods for software verification, validation and testing, and applying them against all detected faults; and driving all of this development toward an automated product for performing such testing. These testing methods would continue to be developed, expanded, tested, and shared across a wide array of software products.

  1. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  2. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  3. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is a fractured rock, and the porous medium the soil cover on top of the rock; the flows in mind are hence groundwater flows. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations; the five verification groups thus reflect the main areas of recent development. The present report focuses on the verification and validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases are reported in Appendix A (verification cases) and Appendix B (validation cases).
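Verification by "comparisons with analytical solutions for idealized situations", as described above, can be sketched for 1D steady Darcy flow between fixed heads, whose analytical head profile is linear. The numerical values below are illustrative, not DarcyTools output:

```python
# Verification-style comparison of a numerical result against an
# analytical solution: 1D steady Darcy flow between fixed-head
# boundaries has the linear head profile h(x) = h0 + (hL - h0) * x / L.

L, h0, hL = 10.0, 5.0, 1.0          # domain length (m), boundary heads (m)
xs = [1.0, 3.0, 5.0, 7.0, 9.0]      # sample points (m)
numerical = [4.61, 3.79, 3.02, 2.19, 1.41]  # illustrative code output (m)

analytic = [h0 + (hL - h0) * x / L for x in xs]
rel_err = max(abs(n - a) / abs(a) for n, a in zip(numerical, analytic))
print(f"max relative error = {rel_err:.3f}")
```

A verification case would additionally repeat this on refined grids to confirm that the error decreases at the expected rate.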

  5. 78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems

    Science.gov (United States)

    2013-01-25

    ... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...

  6. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals, using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs at least as well as either the IP algorithm or the verification algorithm. Simulation resul...
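
    The verification mechanism the abstract refers to can be illustrated for nonnegative signals and a 0/1 measurement matrix: a check with zero residual verifies all of its unresolved neighbors as zero, and a check with a single unresolved neighbor pins that entry to the residual. This toy decoder sketches only those two rules, not the authors' full verification-based IP algorithm:

```python
def verification_decode(A, y):
    """Recover a nonnegative sparse signal x from y = A x, where A is a 0/1
    measurement matrix (rows = checks).  Entries left as None are unresolved."""
    m, n = len(A), len(A[0])
    x = [None] * n
    changed = True
    while changed:
        changed = False
        for i in range(m):
            support = [j for j in range(n) if A[i][j]]
            unresolved = [j for j in support if x[j] is None]
            residual = y[i] - sum(x[j] for j in support if x[j] is not None)
            if not unresolved:
                continue
            if residual == 0:            # zero check: all unresolved neighbors are zero
                for j in unresolved:
                    x[j] = 0
                changed = True
            elif len(unresolved) == 1:   # degree-one check: entry equals the residual
                x[unresolved[0]] = residual
                changed = True
    return x

x = verification_decode([[1, 1, 0],
                         [0, 1, 1],
                         [1, 0, 1]], [3, 0, 3])   # recovers [3, 0, 0]
```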

  7. High accuracy of family history of melanoma in Danish melanoma cases.

    Science.gov (United States)

    Wadt, Karin A W; Drzewiecki, Krzysztof T; Gerdes, Anne-Marie

    2015-12-01

    The incidence of melanoma in Denmark has increased immensely over the last 10 years, making Denmark a high-risk country for melanoma. In the last two decades multiple public campaigns have sought to increase awareness of melanoma. Family history of melanoma is a known major risk factor, but previous studies have shown that self-reported family history of melanoma is highly inaccurate. These studies are 15 years old, and we wanted to examine whether higher awareness of melanoma has increased the accuracy of self-reported family history of melanoma. We examined the family history of 181 melanoma probands who reported 199 cases of melanoma in relatives, of which 135 cases were in first-degree relatives. We confirmed the diagnosis of melanoma in 77% of all relatives, and in 83% of first-degree relatives. In the 181 probands we validated the negative family history of melanoma in 748 first-degree relatives and found only 1 case of melanoma, which was not reported, in a 3-case melanoma family. Melanoma patients in Denmark report family history of melanoma in first- and second-degree relatives with a high level of accuracy, with a true positive predictive value between 77 and 87%. In 99% of probands reporting a negative family history of melanoma in first-degree relatives this information is correct. In clinical practice we recommend that melanoma diagnoses in relatives be verified if possible, but even unverified reported melanoma cases in relatives should be included in the indication for genetic testing and assessment of melanoma risk in the family.

  8. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of verification run(s): this software must be compiled specifically for the machine it is to be used on. To ease the verification process, the software therefore automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by comparing each new output file against the corresponding old output file. Any difference between the files causes a verification error. Because of the way the verification is performed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
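
    The compare-new-against-reference step can be sketched in a few lines. This is a generic sketch, not the actual MCNP verification script; the ignored-prefix filtering and the sample lines are assumptions:

```python
import difflib

def filtered_lines(text, ignore_prefixes=("date", "runtime")):
    """Drop lines (timestamps, run times) that legitimately differ between runs."""
    return [l for l in text.splitlines()
            if not l.lstrip().lower().startswith(ignore_prefixes)]

def compare_outputs(new_text, reference_text):
    """Unified diff of a fresh output against a stored reference; an empty
    result means the run reproduces the stored results."""
    return list(difflib.unified_diff(filtered_lines(reference_text),
                                     filtered_lines(new_text),
                                     "reference", "new", lineterm=""))

new = "Date: 2024-05-01\nkeff = 1.00012\n"
ref = "Date: 1997-03-14\nkeff = 1.00012\n"
assert compare_outputs(new, ref) == []   # identical apart from ignored lines
```

As in the record above, a non-empty diff is only a flag for closer inspection, not proof of a real problem.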

  9. Diagnostic accuracy of cone-beam computed tomography scans with high- and low-resolution modes for the detection of root perforations.

    Science.gov (United States)

    Shokri, Abbas; Eskandarloo, Amir; Norouzi, Marouf; Poorolajal, Jalal; Majidi, Gelareh; Aliyaly, Alireza

    2018-03-01

    This study compared the diagnostic accuracy of cone-beam computed tomography (CBCT) scans obtained with 2 CBCT systems in high- and low-resolution modes for the detection of root perforations in endodontically treated mandibular molars. The root canals of 72 mandibular molars were cleaned and shaped. Perforations measuring 0.2, 0.3, and 0.4 mm in diameter were created at the furcation area of 48 roots, simulating strip perforations, or on the external surfaces of 48 roots, simulating root perforations. Forty-eight roots remained intact (control group). The roots were filled using gutta-percha (Gapadent, Tianjin, China) and AH26 sealer (Dentsply Maillefer, Ballaigues, Switzerland). The CBCT scans were obtained using the NewTom 3G (QR srl, Verona, Italy) and Cranex 3D (Soredex, Helsinki, Finland) CBCT systems in high- and low-resolution modes, and were evaluated by 2 observers. The chi-square test was used to assess the nominal variables. For strip perforations, the accuracies of the low- and high-resolution modes were 75% and 83% for the NewTom 3G and 67% and 69% for the Cranex 3D. For root perforations, the accuracies of the low- and high-resolution modes were 79% and 83% for the NewTom 3G and 56% and 73% for the Cranex 3D. The accuracy of the 2 CBCT systems differed for the detection of strip and root perforations: the NewTom 3G had non-significantly higher accuracy than the Cranex 3D. In both scanners, the high-resolution mode yielded significantly higher accuracy than the low-resolution mode. The diagnostic accuracy of CBCT scans was not affected by the perforation diameter.
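
    The chi-square test mentioned above compares detection counts between groups. A hand-rolled Pearson chi-square for a 2x2 table looks like this (the counts below are hypothetical, not the study's data):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. detected/missed counts for two scan modes."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n, row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: one mode detects 40/48 perforations, the other 30/48.
stat = chi_square_2x2(40, 8, 30, 18)   # ~5.27
```

The statistic is then compared against the chi-square distribution with 1 degree of freedom (critical value 3.84 at p = 0.05).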

  10. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  12. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  13. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. In addition, scoreboards flag undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces UVM briefly and presents a set of tips and advice applicable at different stages of the verification process cycle.

  14. Factors Determining the Inter-observer Variability and Diagnostic Accuracy of High-resolution Manometry for Esophageal Motility Disorders.

    Science.gov (United States)

    Kim, Ji Hyun; Kim, Sung Eun; Cho, Yu Kyung; Lim, Chul-Hyun; Park, Moo In; Hwang, Jin Won; Jang, Jae-Sik; Oh, Minkyung

    2018-01-30

    Although high-resolution manometry (HRM) has the advantage of visual intuitiveness, its diagnostic validity remains under debate. The aim of this study was to evaluate the diagnostic accuracy of HRM for esophageal motility disorders. Six staff members and 8 trainees were recruited for the study. In total, 40 patients enrolled in manometry studies at 3 institutes were selected. Captured images of 10 representative swallows and a single swallow in analyzing mode, in both high-resolution pressure topography (HRPT) and conventional line tracing formats, were provided with calculated metrics. Assessments of esophageal motility disorders showed fair agreement for HRPT and moderate agreement for conventional line tracing (κ = 0.40 and 0.58, respectively). With the HRPT format, the κ value was higher in category A (esophagogastric junction [EGJ] relaxation abnormality) than in categories B (major body peristalsis abnormalities with intact EGJ relaxation) and C (minor body peristalsis abnormalities or normal body peristalsis with intact EGJ relaxation). The overall exact diagnostic accuracy for the HRPT format was 58.8%, and the rater's position was an independent factor for exact diagnostic accuracy. The diagnostic accuracy for major disorders was 63.4% with the HRPT format. The frequency of major discrepancies was higher for category B disorders than for category A disorders (38.4% vs 15.4%; P < 0.001). The interpreter's experience significantly affected the exact diagnostic accuracy of HRM for esophageal motility disorders. The diagnostic accuracy for major disorders was higher for achalasia than for distal esophageal spasm and jackhammer esophagus.
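
    The κ statistics quoted above measure chance-corrected inter-observer agreement. Cohen's kappa for two raters can be computed as follows (a generic sketch, not the study's statistical code):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement between two raters, corrected for
    the agreement expected by chance from each rater's label frequencies."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)
```

By the usual reading, κ around 0.2-0.4 is fair agreement and 0.4-0.6 moderate, matching the interpretation given in the abstract.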

  15. DIRECT GEOREFERENCING : A NEW STANDARD IN PHOTOGRAMMETRY FOR HIGH ACCURACY MAPPING

    Directory of Open Access Journals (Sweden)

    A. Rizaldy

    2012-07-01

    Direct georeferencing is a new method in photogrammetry, especially in the digital camera era. Theoretically, this method requires neither ground control points (GCP) nor aerial triangulation (AT) to transform aerial photography into ground coordinates. Compared with the old method, it has three main advantages at the same accuracy: faster data processing, a simpler workflow, and a less expensive project. Direct georeferencing uses two devices: GPS, which records the camera coordinates (X, Y, Z), and IMU, which records the camera orientation (omega, phi, kappa). Both sets of parameters are merged into the Exterior Orientation (EO) parameters required for the next steps of a photogrammetric project, such as stereocompilation, DSM generation, orthorectification and mosaicking. The accuracy of this method was tested on a topographic map project in Medan, Indonesia. The large-format digital camera Ultracam X from Vexcel was used, while the GPS/IMU was the IGI AeroControl. Nineteen independent check points (ICP) were used to determine the accuracy. Horizontal accuracy is 0.356 meters and vertical accuracy is 0.483 meters. Data with this accuracy can be used for a 1:2,500 map scale project.
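
    Horizontal and vertical accuracies of the kind reported above are root-mean-square errors over the independent check points. A minimal sketch of that computation (the residual values are hypothetical):

```python
import math

def check_point_accuracy(points):
    """Horizontal and vertical RMSE from independent check points.
    `points` holds (dx, dy, dz) residuals in metres between mapped and
    surveyed coordinates."""
    n = len(points)
    rmse_h = math.sqrt(sum(dx ** 2 + dy ** 2 for dx, dy, _ in points) / n)
    rmse_v = math.sqrt(sum(dz ** 2 for _, _, dz in points) / n)
    return rmse_h, rmse_v

# Hypothetical residuals at three check points:
rmse_h, rmse_v = check_point_accuracy([(0.2, -0.3, 0.4), (0.1, 0.2, -0.5), (-0.3, 0.1, 0.3)])
```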

  16. On the numerical verification of industrial codes

    International Nuclear Information System (INIS)

    Montan, Sethy Akpemado

    2013-01-01

    Numerical verification of industrial codes, such as those developed at EDF R and D, is required to estimate the precision and the quality of computed results, all the more for codes running in HPC environments where millions of instructions are performed each second. These programs usually use external libraries (MPI, BLACS, BLAS, LAPACK). In this context, a tool as non-intrusive as possible is required, to avoid rewriting the original code. In this regard, the CADNA library, which implements Discrete Stochastic Arithmetic, appears to be a promising approach for industrial applications. In the first part of this work, we are interested in an efficient implementation of the BLAS routine DGEMM (General Matrix Multiply) using Discrete Stochastic Arithmetic. A basic implementation of the matrix product using stochastic types leads to an overhead greater than 1000 for a 1024 x 1024 matrix, compared to the standard version and commercial versions of xGEMM. Here, we detail different solutions to reduce this overhead and the results we have obtained. A new routine, Dgemm-CADNA, has been designed; it reduces the overhead from 1100 to 35 compared to optimized BLAS implementations (GotoBLAS). We then focus on the numerical verification of Telemac-2D computed results. Performing a numerical validation with the CADNA library shows that more than 30% of the numerical instabilities occurring during an execution come from the dot product function. A more accurate implementation of the dot product with compensated algorithms is presented in this work. We show that implementing these algorithms to improve the accuracy of computed results does not alter the code's performance. (author)
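
    A compensated dot product of the kind mentioned can be built from the classic error-free transformations: Knuth's TwoSum and Dekker's TwoProduct. The sketch below is a generic Ogita-Rump-style Dot2, not the actual Telemac-2D implementation:

```python
def two_sum(a, b):
    """Error-free transformation: a + b = s + e exactly (Knuth TwoSum)."""
    s = a + b
    t = s - a
    e = (a - (s - t)) + (b - t)
    return s, e

def split(a, factor=2 ** 27 + 1):
    """Dekker splitting of an IEEE double into two non-overlapping halves."""
    c = factor * a
    high = c - (c - a)
    return high, a - high

def two_product(a, b):
    """Error-free transformation: a * b = p + e exactly (no FMA needed)."""
    p = a * b
    a_hi, a_lo = split(a)
    b_hi, b_lo = split(b)
    e = ((a_hi * b_hi - p) + a_hi * b_lo + a_lo * b_hi) + a_lo * b_lo
    return p, e

def dot2(xs, ys):
    """Compensated dot product: the result is as accurate as if computed
    in twice the working precision and then rounded."""
    s, comp = 0.0, 0.0
    for x, y in zip(xs, ys):
        p, e1 = two_product(x, y)
        s, e2 = two_sum(s, p)
        comp += e1 + e2
    return s + comp

# Ill-conditioned example: the naive dot product loses the 1.0 entirely.
xs, ys = [1e16, 1.0, -1e16], [1.0, 1.0, 1.0]
naive = sum(x * y for x, y in zip(xs, ys))   # -> 0.0
exact = dot2(xs, ys)                         # -> 1.0
```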

  17. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of an IMRT (intensity-modulated radiation therapy) plan, as used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology and of the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (multilamellar intensity-modulated collimator), together with the overall process of verifying the created plan. The aim of verification is, in particular, good control of the MIMiC functions and evaluation of the overall reliability of IMRT planning. (author)

  18. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate verification and validation of software on board satellites, and it is implemented in the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
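
    The final step, feeding constraints with on-target evidence, can be sketched as a simple deadline check. Task names and numbers are hypothetical, and real constraints would be richer (timed automata rather than scalar deadlines):

```python
def verify_timing(evidence, constraints):
    """Check measured response times (per task, in microseconds) against the
    deadlines from the scheduling analysis model.  Returns the violated
    constraints; an empty list means the scheduling analysis holds."""
    violations = []
    for task, deadline in constraints.items():
        worst = max(evidence.get(task, [0]))   # worst case observed on target
        if worst > deadline:
            violations.append((task, worst, deadline))
    return violations

constraints = {"acquire": 500, "process": 2000, "telemetry": 10000}
evidence = {"acquire": [310, 420, 480], "process": [1500, 2100], "telemetry": [7200]}
verify_timing(evidence, constraints)   # -> [("process", 2100, 2000)]
```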

  19. Development of film dosimetric measurement system for verification of RTP

    International Nuclear Information System (INIS)

    Chen Yong; Bao Shanglian; Ji Changguo; Zhang Xin; Wu Hao; Han Shukui; Xiao Guiping

    2007-01-01

    Objective: To develop a novel film dosimetry system based on a general laser scanner, in order to verify patient-specific radiotherapy treatment plans (RTP) in three-dimensional adaptable radiotherapy (3D ART) and intensity-modulated radiotherapy (IMRT). Methods: Several advanced methods, including film saturated development, wavelet filtering with multi-resolution thresholds, and discrete Fourier reconstruction, are employed in this system to reduce the artifacts, noise and distortion induced by film digitization with a general scanner. A set of coefficients derived from Monte Carlo (MC) simulation is adopted to correct the film over-response to low-energy scattered photons. A set of newly emerging criteria, including the γ index and the Normalized Agreement Test (NAT) method, is employed to quantitatively evaluate the agreement of 2D dose distributions between the results measured by the films and those calculated by the Treatment Planning System (TPS), so as to obtain straightforward presentations, displays and results with high accuracy and reliability. Results: Radiotherapy doses measured by the developed system agree within 2% with those measured by ionization chamber and the VeriSoft Film Dosimetry System, and the quantitative evaluation indexes are within 3%. Conclusions: The developed system can accurately measure the radiotherapy dose and reliably provide quantitative evaluation for RTP dose verification. (authors)
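
    The γ index mentioned among the evaluation criteria combines the dose difference and the distance-to-agreement into a single pass/fail metric per point. A minimal 1D sketch (hypothetical profiles, not the authors' implementation):

```python
import math

def gamma_index(ref_dose, eval_dose, spacing, dose_tol, dist_tol):
    """Global gamma index for 1D dose profiles sampled on the same grid.
    For each reference point, search the evaluated profile for the minimum
    combined dose-difference / distance metric; gamma <= 1 means it passes."""
    d_max = max(ref_dose)   # global normalisation of the dose difference
    gammas = []
    for i, dr in enumerate(ref_dose):
        best = math.inf
        for j, de in enumerate(eval_dose):
            dist = abs(i - j) * spacing
            ddose = de - dr
            g = math.sqrt((dist / dist_tol) ** 2 + (ddose / (dose_tol * d_max)) ** 2)
            best = min(best, g)
        gammas.append(best)
    return gammas

# Pass rate with the common 3%/3 mm criterion on made-up profiles:
ref = [0.2, 0.5, 1.0, 0.5, 0.2]
ev = [0.2, 0.52, 0.99, 0.5, 0.2]
g = gamma_index(ref, ev, spacing=1.0, dose_tol=0.03, dist_tol=3.0)
pass_rate = sum(v <= 1.0 for v in g) / len(g)
```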

  20. High-accuracy identification and bioinformatic analysis of in vivo protein phosphorylation sites in yeast

    DEFF Research Database (Denmark)

    Gnad, Florian; de Godoy, Lyris M F; Cox, Jürgen

    2009-01-01

    Protein phosphorylation is a fundamental regulatory mechanism that affects many cell signaling processes. Using high-accuracy MS and stable isotope labeling by amino acids in cell culture (SILAC), we provide a global view of the Saccharomyces cerevisiae phosphoproteome, containing 3620 phosphorylation sites ma...

  1. Accuracy of whole-body plethysmography requires biological calibration

    DEFF Research Database (Denmark)

    Poorisrisak, Porntiva; Vrang, Carsten; Henriksen, Jorn Molgaard

    2009-01-01

    procedure: (1) seven healthy young children were brought to each of the six centers for sRaw measurements; and (2) 105 healthy preschool children (52 boys; mean age, 5.1 years; interquartile range, 4.3 to 6.0) were recruited locally for sRaw measurements. RESULTS: (1) The sRaw of the seven-children study...... preschool children) were generated and were without significant difference between centers and independent of height, weight, age, and gender. We subsequently pooled these normative data (105 children) with our previous data from 121 healthy young children (overall mean sRaw, 1.27; SD, 0.25). CONCLUSION......: Control using biological standards revealed errors in the factory setting and highlights the need for developing methods for verification of resistance measures to assure accuracy. Normative data were subsequently generated. Importantly, other centers using such normative data should first consider proper...

  2. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  3. Formal Verification

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  4. Systemverilog for verification a guide to learning the testbench language features

    CERN Document Server

    Spear, Chris

    2012-01-01

    Based on the highly successful second edition, this extended edition of SystemVerilog for Verification: A Guide to Learning the Testbench Language Features teaches all verification features of the SystemVerilog language, providing hundreds of examples to clearly explain the concepts and basic fundamentals. It contains materials for both the full-time verification engineer and the student learning this valuable skill. In the third edition, authors Chris Spear and Greg Tumbush start with how to verify a design, and then use that context to demonstrate the language features,  including the advantages and disadvantages of different styles, allowing readers to choose between alternatives. This textbook contains end-of-chapter exercises designed to enhance students’ understanding of the material. Other features of this revision include: New sections on static variables, print specifiers, and DPI from the 2009 IEEE language standard Descriptions of UVM features such as factories, the test registry, and the config...

  5. High Accuracy Mass Measurement of the Dripline Nuclides $^{12,14}$Be

    CERN Multimedia

    2002-01-01

    State-of-the-art, three-body nuclear models that describe halo nuclides require the binding energy of the halo neutron(s) as a critical input parameter. In the case of $^{14}$Be, the uncertainty of this quantity is currently far too large (130 keV), inhibiting efforts at detailed theoretical description. A high-accuracy, direct mass determination of $^{14}$Be (as well as of $^{12}$Be, to obtain the two-neutron separation energy) is therefore required. The measurement can be performed with the MISTRAL spectrometer, which is presently the only possible solution due to the required accuracy (10 keV) and the short half-life (4.5 ms). Having achieved a 5 keV uncertainty for the mass of $^{11}$Li (8.6 ms), MISTRAL has proved the feasibility of such measurements. Since the current ISOLDE production rate of $^{14}$Be is only about 10/s, the installation of a beam cooler is underway in order to improve MISTRAL transmission. The projected improvement of an order of magnitude (in each transverse direction) will make this measureme...

  6. High Accuracy Beam Current Monitor System for CEBAF'S Experimental Hall A

    International Nuclear Information System (INIS)

    J. Denard; A. Saha; G. Lavessiere

    2001-01-01

    The CEBAF accelerator delivers continuous-wave (CW) electron beams to three experimental halls. In Hall A, all experiments require continuous, non-invasive current measurements, and a few experiments require an absolute accuracy of 0.2% in the current range from 1 to 180 µA. A Parametric Current Transformer (PCT), manufactured by Bergoz, has an accurate and stable sensitivity of 4 µA/V, but its offset drifts at the µA level over time preclude its direct use for continuous measurements. Two cavity monitors are calibrated against the PCT with at least 50 µA of beam current. The calibration procedure suppresses the error due to the PCT's offset drifts by turning the beam on and off, which is invasive to the experiment. One of the goals of the system is to minimize the calibration time without compromising the measurement's accuracy. The linearity of the cavity monitors is a critical parameter for transferring the accurate calibration done at high currents over the whole dynamic range. The method for measuring the linearity accurately is described.
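
    The beam-on/beam-off differencing described above can be illustrated with a small sketch. The 4 µA/V sensitivity comes from the record; the reading values and the through-origin least-squares gain fit are hypothetical:

```python
def pct_current(v_beam_on, v_beam_off, sensitivity=4.0):
    """Beam current in microamps from PCT voltage readings.  Differencing a
    beam-on and a beam-off reading cancels the slowly drifting offset;
    the quoted sensitivity is 4 uA/V."""
    return sensitivity * (v_beam_on - v_beam_off)

def calibrate_cavity(cavity_readings, pct_currents):
    """Least-squares gain (through the origin) mapping cavity monitor
    readings to the PCT-derived current, done at high beam current."""
    num = sum(r * i for r, i in zip(cavity_readings, pct_currents))
    den = sum(r * r for r in cavity_readings)
    return num / den

# Hypothetical calibration run: PCT reads 25.0 V with beam on, 0.05 V with beam off.
current = pct_current(25.0, 0.05)                      # ~99.8 uA
gain = calibrate_cavity([10.0, 20.0], [50.0, 100.0])   # -> 5.0 uA per unit
```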

  7. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office', and carried out verification tests of wet blasting technology for decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. In the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking blocks, and dense-graded asphalt pavement when applied to road decontamination. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off the decontamination object, and the sludge could be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  8. The US National Resources Defense Council/Soviet Academy of Sciences Nuclear Test Ban Verification Project

    International Nuclear Information System (INIS)

    Cochran, T.B.

    1989-01-01

    The first week in September 1987 was an extraordinary one for arms control verification. As part of the co-operative Test Ban Verification Project of the Natural Resources Defense Council (NRDC) and the Soviet Academy of Sciences, fourteen American scientists from the Scripps Institution of Oceanography (at the University of California-San Diego), the University of Nevada-Reno and the University of Colorado went to the region of the Soviets' principal nuclear test site near Semipalatinsk. Together with their Soviet counterparts from the Institute of Physics of the Earth (IPE) in Moscow, they fired off three large chemical explosions. The purpose of these explosions was to demonstrate the sensitivity of the three seismic stations surrounding the test site, to study the efficiency with which high-frequency seismic waves propagate in the region, and to study differences between chemical explosions, nuclear explosions and earthquakes in order more firmly to establish procedures for verification of a nuclear test ban. This paper presents a review of the results of these experiments, an update on the status of the joint project, and a review of the significance of high-frequency seismic data to test ban verification

  9. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    Science.gov (United States)

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
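
    As a concrete instance of the kind of automated check the authors advocate, here is a minimal sketch (an assumed example, not from the paper) that verifies a Markov cohort model's transition matrix before it is used in a cost-effectiveness analysis:

```python
def verify_transition_matrix(matrix, tol=1e-9):
    """Basic verification checks for a Markov cohort model: every probability
    lies in [0, 1] and each row sums to 1 (within a floating-point tolerance).
    Returns a list of error messages; an empty list means the checks pass."""
    errors = []
    for i, row in enumerate(matrix):
        if any(p < 0 or p > 1 for p in row):
            errors.append(f"row {i}: probability outside [0, 1]")
        if abs(sum(row) - 1.0) > tol:
            errors.append(f"row {i}: probabilities sum to {sum(row)}, not 1")
    return errors

# Hypothetical healthy / sick / dead model (dead is absorbing):
healthy_sick_dead = [
    [0.85, 0.10, 0.05],
    [0.00, 0.70, 0.30],
    [0.00, 0.00, 1.00],
]
assert verify_transition_matrix(healthy_sick_dead) == []
```

Running such assertions on every model build is a lightweight version of the software-engineering verification practices the paper recommends.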

  10. High-accuracy determination of the neutron flux at n{sub T}OF

    Energy Technology Data Exchange (ETDEWEB)

    Barbagallo, M.; Colonna, N.; Mastromarco, M.; Meaze, M.; Tagliente, G.; Variale, V. [Sezione di Bari, INFN, Bari (Italy); Guerrero, C.; Andriamonje, S.; Boccone, V.; Brugger, M.; Calviani, M.; Cerutti, F.; Chin, M.; Ferrari, A.; Kadi, Y.; Losito, R.; Versaci, R.; Vlachoudis, V. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Tsinganis, A. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); National Technical University of Athens (NTUA), Athens (Greece); Tarrio, D.; Duran, I.; Leal-Cidoncha, E.; Paradela, C. [Universidade de Santiago de Compostela, Santiago (Spain); Altstadt, S.; Goebel, K.; Langer, C.; Reifarth, R.; Schmidt, S.; Weigand, M. [Johann-Wolfgang-Goethe Universitaet, Frankfurt (Germany); Andrzejewski, J.; Marganiec, J.; Perkowski, J. [Uniwersytet Lodzki, Lodz (Poland); Audouin, L.; Leong, L.S.; Tassan-Got, L. [Centre National de la Recherche Scientifique/IN2P3 - IPN, Orsay (France); Becares, V.; Cano-Ott, D.; Garcia, A.R.; Gonzalez-Romero, E.; Martinez, T.; Mendoza, E. [Centro de Investigaciones Energeticas Medioambientales y Tecnologicas (CIEMAT), Madrid (Spain); Becvar, F.; Krticka, M.; Kroll, J.; Valenta, S. [Charles University, Prague (Czech Republic); Belloni, F.; Fraval, K.; Gunsing, F.; Lampoudis, C.; Papaevangelou, T. [Commissariata l' Energie Atomique (CEA) Saclay - Irfu, Gif-sur-Yvette (France); Berthoumieux, E.; Chiaveri, E. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Commissariata l' Energie Atomique (CEA) Saclay - Irfu, Gif-sur-Yvette (France); Billowes, J.; Ware, T.; Wright, T. [University of Manchester, Manchester (United Kingdom); Bosnar, D.; Zugec, P. [University of Zagreb, Department of Physics, Faculty of Science, Zagreb (Croatia); Calvino, F.; Cortes, G.; Gomez-Hornillos, M.B.; Riego, A. [Universitat Politecnica de Catalunya, Barcelona (Spain); Carrapico, C.; Goncalves, I.F.; Sarmento, R.; Vaz, P. 
[Universidade Tecnica de Lisboa, Instituto Tecnologico e Nuclear, Instituto Superior Tecnico, Lisboa (Portugal); Cortes-Giraldo, M.A.; Praena, J.; Quesada, J.M.; Sabate-Gilarte, M. [Universidad de Sevilla, Sevilla (Spain); Diakaki, M.; Karadimos, D.; Kokkoris, M.; Vlastou, R. [National Technical University of Athens (NTUA), Athens (Greece); Domingo-Pardo, C.; Giubrone, G.; Tain, J.L. [CSIC-Universidad de Valencia, Instituto de Fisica Corpuscular, Valencia (Spain); Dressler, R.; Kivel, N.; Schumann, D.; Steinegger, P. [Paul Scherrer Institut, Villigen PSI (Switzerland); Dzysiuk, N.; Mastinu, P.F. [Laboratori Nazionali di Legnaro, INFN, Rome (Italy); Eleftheriadis, C.; Manousos, A. [Aristotle University of Thessaloniki, Thessaloniki (Greece); Ganesan, S.; Gurusamy, P.; Saxena, A. [Bhabha Atomic Research Centre (BARC), Mumbai (IN); Griesmayer, E.; Jericha, E.; Leeb, H. [Technische Universitaet Wien, Atominstitut, Wien (AT); Hernandez-Prieto, A. [European Organization for Nuclear Research (CERN), Geneva (CH); Universitat Politecnica de Catalunya, Barcelona (ES); Jenkins, D.G.; Vermeulen, M.J. [University of York, Heslington, York (GB); Kaeppeler, F. [Institut fuer Kernphysik, Karlsruhe Institute of Technology, Campus Nord, Karlsruhe (DE); Koehler, P. [Oak Ridge National Laboratory (ORNL), Oak Ridge (US); Lederer, C. [Johann-Wolfgang-Goethe Universitaet, Frankfurt (DE); University of Vienna, Faculty of Physics, Vienna (AT); Massimi, C.; Mingrone, F.; Vannini, G. [Universita di Bologna (IT); INFN, Sezione di Bologna, Dipartimento di Fisica, Bologna (IT); Mengoni, A.; Ventura, A. [Agenzia nazionale per le nuove tecnologie, l' energia e lo sviluppo economico sostenibile (ENEA), Bologna (IT); Milazzo, P.M. [Sezione di Trieste, INFN, Trieste (IT); Mirea, M. [Horia Hulubei National Institute of Physics and Nuclear Engineering - IFIN HH, Bucharest - Magurele (RO); Mondalaers, W.; Plompen, A.; Schillebeeckx, P. 
[Institute for Reference Materials and Measurements, European Commission JRC, Geel (BE); Pavlik, A.; Wallner, A. [University of Vienna, Faculty of Physics, Vienna (AT); Rauscher, T. [University of Basel, Department of Physics and Astronomy, Basel (CH); Roman, F. [European Organization for Nuclear Research (CERN), Geneva (CH); Horia Hulubei National Institute of Physics and Nuclear Engineering - IFIN HH, Bucharest - Magurele (RO); Rubbia, C. [European Organization for Nuclear Research (CERN), Geneva (CH); Laboratori Nazionali del Gran Sasso dell' INFN, Assergi (AQ) (IT); Weiss, C. [European Organization for Nuclear Research (CERN), Geneva (CH); Johann-Wolfgang-Goethe Universitaet, Frankfurt (DE)

    2013-12-15

    The neutron flux of the n{sub T}OF facility at CERN was measured, after installation of the new spallation target, with four different systems based on three neutron-converting reactions, which represent accepted cross section standards in different energy regions. A careful comparison and combination of the different measurements allowed us to reach an unprecedented accuracy on the energy dependence of the neutron flux in the very wide range (thermal to 1 GeV) that characterizes the n{sub T}OF neutron beam. This is a pre-requisite for the high accuracy of cross section measurements at n{sub T}OF. An unexpected anomaly in the neutron-induced fission cross section of {sup 235}U is observed in the energy region between 10 and 30 keV, hinting at a possible overestimation of this important cross section, well above currently assigned uncertainties. (orig.)

  11. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  12. SU-E-T-563: Multi-Fraction Stereotactic Radiosurgery with Extend System of Gamma Knife: Treatment Verification Using Indigenously Designed Patient Simulating Multipurpose Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Bisht, R; Kale, S; Gopishankar, N; Rath, G; Julka, P; Agarwal, D; Singh, M; Garg, A; Kumar, P; Thulkar, S; Sharma, B [All India Institute of Medical Sciences, New Delhi (India)

    2015-06-15

    Purpose: The aim of the study is to evaluate the mechanical and radiological accuracy of a multi-fraction regimen and validate Gamma Knife based fractionation using a newly developed patient simulating multipurpose phantom. Methods: A patient simulating phantom was designed to verify fractionated treatments with the extend system (ES) of Gamma Knife; however, it could be used to validate other radiotherapy procedures as well. The phantom has options to insert various density material plugs and mini CT/MR distortion phantoms to analyze the quality of stereotactic imaging. An additional thorax part was designed to predict surface doses at various organ sites. The phantom was positioned using a vacuum head cushion and patient control unit for imaging and treatment. The repositioning check tool (RCT) was used to predict phantom positioning under the ES assembly. The phantom with special inserts for film in the axial, coronal and sagittal planes was scanned with X-Ray CT and the acquired images were transferred to the treatment planning system (LGP 10.1). The focal precision test was performed with the 4mm collimator and an experimental plan of four 16mm collimator shots was prepared for treatment verification of the multi-fraction regimen. The prescription dose of 5Gy per fraction was delivered in four fractions. Each fraction was analyzed using EBT3 films scanned with an EPSON 10000XL Scanner. Results: The measurement of 38 RCT points showed an overall positional accuracy of 0.28mm. Mean deviations of 0.28% and 0.31% were calculated for CT and MR image distortion, respectively. The radiological focus accuracy test showed a deviation from the mechanical center point of 0.22mm. The profile measurement showed close agreement between TPS planned and film measured dose. At a tolerance criterion of 1%/1mm, gamma index analysis showed a pass rate of > 95%. Conclusion: Our results show that the newly developed multipurpose patient simulating phantom is highly suitable for the verification of fractionated stereotactic

  13. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level, at which the visual verification was performed and documented

  14. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud including unauthorized duplication of identification cards and credit cards is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. 
The proposed optical processing device is designed to identify both the random phase mask and the

  15. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  16. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Science.gov (United States)

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
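The cumulative signal check described in this abstract can be illustrated with a toy sketch: running sums of predicted and measured frame signals are compared and the first frame exceeding a tolerance is flagged. The function name, threshold, and frame values below are assumptions for demonstration, not the paper's parameters.

```python
# Toy cumulative-signal check: flag the first frame at which the running
# measured signal deviates from the running predicted signal by more than tol.
def first_error_frame(predicted, measured, tol=0.10):
    cum_p = cum_m = 0.0
    for i, (p, m) in enumerate(zip(predicted, measured)):
        cum_p += p
        cum_m += m
        if cum_p > 0 and abs(cum_m - cum_p) / cum_p > tol:
            return i  # frame index where the cumulative deviation exceeds tol
    return None       # no gross error detected

# Simulated delivery: a 25% overdose begins at frame 10 of 20.
predicted = [10.0] * 20
measured = [10.0] * 10 + [12.5] * 10
print(first_error_frame(predicted, measured))  # → 16
```

The cumulative form smooths frame-to-frame noise, so the error is flagged only once the accumulated deviation crosses the tolerance, a few frames after the fault begins.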

  17. Precision, accuracy, cross reactivity and comparability of serum indices measurement on Abbott Architect c8000, Beckman Coulter AU5800 and Roche Cobas 6000 c501 clinical chemistry analyzers.

    Science.gov (United States)

    Nikolac Gabaj, Nora; Miler, Marijana; Vrtarić, Alen; Hemar, Marina; Filipi, Petra; Kocijančić, Marija; Šupak Smolčić, Vesna; Ćelap, Ivana; Šimundić, Ana-Maria

    2018-04-25

    The aim of our study was to perform verification of serum indices on three clinical chemistry platforms. This study was done on three analyzers: Abbott Architect c8000, Beckman Coulter AU5800 (BC) and Roche Cobas 6000 c501. The following analytical specifications were verified: precision (two patient samples), accuracy (the sample with the highest concentration of interferent was serially diluted and measured values compared to theoretical values), comparability (120 patient samples) and cross reactivity (samples with increasing concentrations of interferent were divided in two aliquots and the remaining interferents were added to each aliquot. Measurements were done before and after adding interferents). Best results for precision were obtained for the H index (0.72%-2.08%). Accuracy for the H index was acceptable for Cobas and BC, while on Architect, deviations in the high concentration range were observed (y=0.02 [0.01-0.07]+1.07 [1.06-1.08]x). All three analyzers showed acceptable results in evaluating accuracy of the L index and unacceptable results for the I index. The H index was comparable between BC and both Architect (Cohen's κ [95% CI]=0.795 [0.692-0.898]) and Roche (Cohen's κ [95% CI]=0.825 [0.729-0.922]), while Roche and Architect were not comparable. The I index was not comparable between all analyzer combinations, while the L index was only comparable between Abbott and BC. Cross reactivity analysis mostly showed that serum indices measurement is affected when a combination of interferences is present. There is heterogeneity between analyzers in the hemolysis, icterus, lipemia (HIL) quality performance. Verification of serum indices in routine work is necessary to establish analytical specifications.
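The dilution-based accuracy check described above amounts to comparing measured index values against the theoretical values expected from serial dilution. A minimal sketch with invented numbers (not the study's data):

```python
# Percent deviation of a measured value from its dilution-derived theoretical
# value; all concentrations here are invented illustrative figures.
def percent_deviation(measured, theoretical):
    return (measured - theoretical) / theoretical * 100.0

stock = 10.0                       # index value of the undiluted sample
dilution_factors = [1, 2, 4, 8]
theoretical = [stock / d for d in dilution_factors]
measured = [10.1, 5.2, 2.4, 1.3]   # hypothetical analyzer readings
devs = [round(percent_deviation(m, t), 1) for m, t in zip(measured, theoretical)]
print(devs)  # → [1.0, 4.0, -4.0, 4.0]
```

Each deviation would then be judged against the laboratory's acceptance criterion for that index.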

  18. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
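One simple way an integrated model can combine independent sensor technologies is through the probability that at least one subsystem detects an event. The sketch below is a hedged illustration with made-up detection probabilities; the actual IVSEM model is more elaborate (synergy, medium interfaces, evasion scenarios).

```python
# Combine per-technology detection probabilities assuming independent
# subsystems: the event is missed only if every subsystem misses it.
def integrated_detection_probability(p_subsystems):
    p_miss = 1.0
    for p in p_subsystems:
        p_miss *= (1.0 - p)   # all subsystems must miss simultaneously
    return 1.0 - p_miss

# Hypothetical seismic, infrasound, radionuclide, hydroacoustic probabilities.
p = integrated_detection_probability([0.9, 0.5, 0.6, 0.3])
print(round(p, 3))  # → 0.986
```

Even under this simplistic independence assumption, the integrated system outperforms any single subsystem, which is the qualitative point behind investigating technology synergy.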

  19. High accuracy microwave frequency measurement based on single-drive dual-parallel Mach-Zehnder modulator

    DEFF Research Database (Denmark)

    Zhao, Ying; Pang, Xiaodan; Deng, Lei

    2011-01-01

    A novel approach for broadband microwave frequency measurement by employing a single-drive dual-parallel Mach-Zehnder modulator is proposed and experimentally demonstrated. Based on bias manipulations of the modulator, conventional frequency-to-power mapping technique is developed by performing a...... 10−3 relative error. This high accuracy frequency measurement technique is a promising candidate for high-speed electronic warfare and defense applications....

  20. Innovative Technique for High-Accuracy Remote Monitoring of Surface Water

    Science.gov (United States)

    Gisler, A.; Barton-Grimley, R. A.; Thayer, J. P.; Crowley, G.

    2016-12-01

    Lidar (light detection and ranging) provides absolute depth and topographic mapping capability compared to other remote sensing methods, which is useful for mapping rapidly changing environments such as riverine systems and agricultural waterways. Effectiveness of current lidar bathymetric systems is limited by the difficulty in unambiguously identifying backscattered lidar signals from the water surface versus the bottom, limiting their depth resolution to 0.3-0.5 m. Additionally these are large, bulky systems that are constrained to expensive aircraft-mounted platforms and use waveform-processing techniques requiring substantial computation time. These restrictions are prohibitive for many potential users. A novel lidar device has been developed that allows for non-contact measurements of water depth down to 1 cm with an accuracy and precision of shallow to deep water allowing for shoreline charting, measuring water volume, mapping bottom topology, and identifying submerged objects. The scalability of the technique opens up the ability for handheld or UAS-mounted lidar bathymetric systems, which provides for potential applications currently unavailable to the community. The high laser pulse repetition rate allows for very fine horizontal resolution while the photon-counting technique permits real-time depth measurement and object detection. The enhanced measurement capability, portability, scalability, and relatively low-cost creates the opportunity to perform frequent high-accuracy monitoring and measuring of aquatic environments which is crucial for monitoring water resources on fast timescales. Results from recent campaigns measuring water depth in flowing creeks and murky ponds will be presented which demonstrate that the method is not limited by rough water surfaces and can map underwater topology through moderately turbid water.

  1. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    Science.gov (United States)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear. Even simplifications to the equations describing nature usually end up being nonlinear partial differential equations. The transport (ADR) equation is a pivotal equation in atmospheric sciences and water quality. This nonlinear equation needs to be solved numerically for practical purposes, so academicians and engineers rely heavily on the assistance of numerical codes. Thus, numerical codes require verification before they are utilized for multiple applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is properly solved, as described in the design document. CFD verification is not a straightforward and well-defined course. Only a complete test suite can uncover all the limitations and bugs. Results need to be assessed to make a distinction between bug-induced defects and innate limitations of a numerical scheme. As Roache (2009) said, numerical verification is a state-of-the-art procedure. Sometimes novel tricks work out. This study conveys a synopsis of the experiences we gained during a comprehensive verification process for a transport solver. A test suite was designed, including unit tests and algorithmic tests. Tests were layered in complexity in several dimensions, from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, a mesh convergence study, which is the main craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered. Also, a new solution was derived. In more general cases, the lack of an analytical solution could be overcome through Richardson extrapolation and manufactured solutions. Then, two bugs which were concealed during the mesh convergence
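A mesh convergence study of the kind mentioned above typically estimates the observed order of accuracy from the errors on two grid levels, assuming the error behaves as C·h^p. A minimal sketch with assumed error values (not data from the study):

```python
import math

# Observed order of accuracy p from errors on two grids whose spacings
# differ by refinement ratio r, assuming err ≈ C * h**p.
def observed_order(err_coarse, err_fine, r=2.0):
    return math.log(err_coarse / err_fine) / math.log(r)

# A second-order scheme should roughly quarter its error when h is halved.
p = observed_order(4.0e-3, 1.0e-3, r=2.0)
print(round(p, 2))  # → 2.0
```

Comparing the observed p against the scheme's formal order is exactly the kind of acceptance criterion that separates a bug from an innate limitation of the method.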

  2. Comparison of monitor units calculated by radiotherapy treatment planning system and an independent monitor unit verification software.

    Science.gov (United States)

    Sellakumar, P; Arun, C; Sanjay, S S; Ramesh, S B

    2011-01-01

    In radiation therapy, the monitor units (MU) needed to deliver a treatment plan are calculated by treatment planning systems (TPS). An essential part of quality assurance is to verify the MU with an independent monitor unit calculation to correct any potential errors prior to the start of treatment. In this study, we have compared the MU calculated by the TPS and by independent MU verification software. The MU verification software was commissioned and tested for data integrity to ensure that the correct beam data was considered for MU calculations. The accuracy of the calculations was tested by creating a series of test plans and comparing them with ion chamber measurements. The results show that there is good agreement between the two. The MU difference (MUdiff) between the monitor unit calculations of the TPS and the independent MU verification system was calculated for 623 fields from 245 patients and was analyzed by treatment site for head & neck, thorax, breast, abdomen and pelvis. A mean MUdiff of -0.838% with a standard deviation of 3.04% was observed for all 623 fields. The site-specific standard deviation of MUdiff was as follows: abdomen and pelvis (<1.75%), head & neck (2.5%), thorax (2.32%) and breast (6.01%). The disparities were analyzed and different correction methods were used to reduce the disparity. © 2010 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
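The MUdiff statistic reported above is a simple percent difference aggregated over fields; the sketch below reproduces that bookkeeping with invented placeholder values, not the study's measurements.

```python
import statistics

# Percent difference of the independent MU calculation relative to the TPS.
def mu_diff_percent(mu_tps, mu_independent):
    return (mu_independent - mu_tps) / mu_tps * 100.0

# Hypothetical per-field monitor units from the TPS and the independent check.
tps = [100.0, 250.0, 180.0]
indep = [99.0, 252.0, 178.0]
diffs = [mu_diff_percent(t, i) for t, i in zip(tps, indep)]
print(round(statistics.mean(diffs), 2), round(statistics.stdev(diffs), 2))
```

Fields whose MUdiff exceeds a clinic-defined tolerance (often a few percent) would be investigated before treatment.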

  3. A survey on the high reliability software verification and validation technology for instrumentation and control in NPP.

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Lee, Chang Soo; Dong, In Sook [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-01-01

    This document presents the technical status of the software verification and validation (V and V) efforts to support developing and licensing digital instrumentation and control (I and C) systems in nuclear power plants. We have reviewed codes and standards to identify consensus criteria among vendor, licensee and licenser. We then describe the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States that cope with the licensing barrier. Finally, we survey the technical issues related to developing and licensing high-integrity software for digital I and C systems. These technical issues indicate the direction in which our own software V and V methodology should be developed. (Author) 13 refs., 2 figs.

  4. Computer-aided diagnosis of mammographic masses using geometric verification-based image retrieval

    Science.gov (United States)

    Li, Qingliang; Shi, Weili; Yang, Huamin; Zhang, Huimao; Li, Guoxin; Chen, Tao; Mori, Kensaku; Jiang, Zhengang

    2017-03-01

    Computer-aided diagnosis of masses in mammograms is an important indicator of breast cancer. The use of retrieval systems in breast examination is increasing gradually. In this respect, the method of exploiting the vocabulary tree framework and the inverted file for mammographic mass retrieval has been shown to achieve high accuracy and excellent scalability. However, it treated the features in each image simply as visual words and ignored the spatial configurations of the features, which greatly affects retrieval performance. To overcome this drawback, we introduce a geometric verification method for mammographic mass retrieval. First, we obtain corresponding matched features based on the vocabulary tree framework and the inverted file. We then capture the local similarity characteristics of deformations by constructing circle regions around corresponding pairs. Meanwhile, we segment each circle to express the geometric relationship of local matches in the area and generate a strict spatial encoding. Finally, we judge whether the matched features are correct by verifying that all spatial encodings satisfy geometric consistency. Experiments show the promising results of our approach.

  5. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete ''design verification'' strategy comprises: informing the Agency of any changes in the plant system which are defined as ''safeguards relevant''; and reverification by the Agency, upon receiving notice of any changes from the Operator, of the ''design information''. 13 refs

  6. SU-E-T-442: Geometric Calibration and Verification of a GammaPod Breast SBRT System

    Energy Technology Data Exchange (ETDEWEB)

    Yu, C [Univ Maryland School of Medicine, Baltimore, MD (United States); Xcision Medical Systems, Columbia, MD (United States); Niu, Y; Maton, P; Hoban, P [Xcision Medical Systems, Columbia, MD (United States); Mutaf, Y [Univ Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: The first GammaPod™ unit for prone stereotactic treatment of early stage breast cancer has recently been installed and calibrated. Thirty-six rotating circular Co-60 beams focus dose at an isocenter that traverses throughout a breast target via continuous motion of the treatment table. The breast is immobilized and localized using a vacuum-assisted stereotactic cup system that is fixed to the table during treatment. Here we report on system calibration and on verification of geometric and dosimetric accuracy. Methods: Spatial calibration involves setting the origin of each table translational axis within the treatment control system such that the relationship between beam isocenter and table geometry is consistent with that assumed by the treatment planning system. A polyethylene QA breast phantom inserted into an aperture in the patient couch is used for calibration and verification. The comparison is performed via fiducial-based registration of measured single-isocenter dose profiles (radiochromic film) with kernel dose profiles. With the table calibrations applied, measured relative dose distributions were compared with TPS calculations for single-isocenter and dynamic (many-isocenter) treatment plans. Further, table motion accuracy and linearity was tested via comparison of planned control points with independent encoder readouts. Results: After table calibration, comparison of measured and calculated single-isocenter dose profiles show agreement to within 0.5 mm for each axis. Gamma analysis of measured vs calculated profiles with 3%/2mm criteria yields a passing rate of >99% and >98% for single-isocenter and dynamic plans respectively. This also validates the relative dose distributions produced by the TPS. Measured table motion accuracy was within 0.05 mm for all translational axes. 
Conclusion: GammaPod table coordinate calibration is a straightforward process that yields very good agreement between planned and measured relative dose distributions

  7. A generalized polynomial chaos based ensemble Kalman filter with high accuracy

    International Nuclear Information System (INIS)

    Li Jia; Xiu Dongbin

    2009-01-01

    As one of the most adopted sequential data assimilation methods in many areas, especially those involving complex nonlinear dynamics, the ensemble Kalman filter (EnKF) has been under extensive investigation regarding its properties and efficiency. Compared to other variants of the Kalman filter (KF), EnKF is straightforward to implement, as it employs random ensembles to represent solution states. This, however, introduces sampling errors that affect the accuracy of EnKF in a negative manner. Though sampling errors can be easily reduced by using a large number of samples, in practice this is undesirable as each ensemble member is a solution of the system of state equations and can be time consuming to compute for large-scale problems. In this paper we present an efficient EnKF implementation via generalized polynomial chaos (gPC) expansion. The key ingredients of the proposed approach involve (1) solving the system of stochastic state equations via the gPC methodology to gain efficiency; and (2) sampling the gPC approximation of the stochastic solution with an arbitrarily large number of samples, at virtually no additional computational cost, to drastically reduce the sampling errors. The resulting algorithm thus achieves a high accuracy at reduced computational cost, compared to the classical implementations of EnKF. Numerical examples are provided to verify the convergence property and accuracy improvement of the new algorithm. We also prove that for linear systems with Gaussian noise, the first-order gPC Kalman filter method is equivalent to the exact Kalman filter.
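    For reference, the classical stochastic EnKF analysis step that the gPC construction accelerates can be sketched as follows; this is the standard perturbed-observation update, not the gPC-based filter of the paper:

```python
import numpy as np

def enkf_update(ensemble, H, y, R, rng):
    """Stochastic EnKF analysis step with perturbed observations.
    ensemble : (n_state, n_ens) forecast members
    H        : (n_obs, n_state) linear observation operator
    y        : (n_obs,) observation vector
    R        : (n_obs, n_obs) observation error covariance"""
    n_ens = ensemble.shape[1]
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    P = X @ X.T / (n_ens - 1)                      # sample covariance (the source of sampling error)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    # perturbed observations: one noisy copy of y per member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return ensemble + K @ (Y - H @ ensemble)
```

Each member is nudged toward the observation by the sampled Kalman gain; the Monte Carlo error in the covariance estimate P is exactly what the gPC expansion suppresses by allowing arbitrarily large virtual ensembles at negligible extra cost.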

  8. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    Science.gov (United States)

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression could be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved, and we newly identified three incednine-binding proteins. This study revealed that our proposed protocol of predicting target proteins, combining in silico screening and experimental verification, is useful, and it provides new insight into a strategy for identifying the target proteins of small molecules.

  9. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications

    Directory of Open Access Journals (Sweden)

    Kobayashi Hiroki

    2012-04-01

    Full Text Available Abstract Background Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. Results We applied our protocol of predicting target proteins combining in silico screening and experimental verification for incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins, whose expression can be confirmed in our cell system. As a result, 40% accuracy of the computational predictions was achieved successfully, and we newly found 3 incednine-binding proteins. Conclusions This study revealed that our proposed protocol of predicting target protein combining in silico screening and experimental verification is useful, and provides new insight into a strategy for identifying target proteins of small molecules.

  10. An Unattended Verification Station for UF6 Cylinders: Development Status

    International Nuclear Information System (INIS)

    Smith, E.; McDonald, B.; Miller, K.; Garner, J.; March-Leuba, J.; Poland, R.

    2015-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by advanced centrifuge technologies and the growth in separative work unit capacity at modern centrifuge enrichment plants. These measures would include permanently installed, unattended instruments capable of performing the routine and repetitive measurements previously performed by inspectors. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide independent verification of the declared relative enrichment, U-235 mass and total uranium mass of all declared cylinders moving through the plant, as well as the application and verification of a ''Non-destructive Assay Fingerprint'' to preserve verification knowledge on the contents of each cylinder throughout its life in the facility. As IAEA's vision for a UCVS has evolved, Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory have been developing and testing candidate non-destructive assay (NDA) methods for inclusion in a UCVS. Modeling and multiple field campaigns have indicated that these methods are capable of assaying relative cylinder enrichment with a precision comparable to or substantially better than today's high-resolution handheld devices, without the need for manual wall-thickness corrections. In addition, the methods interrogate the full volume of the cylinder, thereby offering the IAEA a new capability to assay the absolute U-235 mass in the cylinder, and much-improved sensitivity to substituted or removed material. Building on this prior work, and under the auspices of the United States Support Programme to the IAEA, a UCVS field prototype is being developed and tested. This paper provides an overview of: a) hardware and software design of the prototypes, b) preparation

  11. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  12. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  13. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent; Hashemian, H. M.

    2017-03-01

    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off; cover more portions of the system; and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources on equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.

  14. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  15. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  16. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
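    A toy rendering of the idea, with Horn clauses in implication form, a forward-chaining engine run to fixpoint, and a monitor that keeps a single persistent fact set (the property and atom names below are invented for illustration, not taken from the paper):

```python
def forward_chain(facts, rules):
    """Apply Horn rules (body, head) to a fact set until fixpoint."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and body <= derived:
                derived.add(head)
                changed = True
    return derived

STATE_ATOMS = {"closed", "violation"}   # atoms that persist across events

class Monitor:
    """Single-state monitor: event atoms are transient, state atoms persist."""
    def __init__(self, rules):
        self.rules, self.state = rules, set()
    def step(self, event):
        self.state = forward_chain(self.state | {event}, self.rules) & STATE_ATOMS
        return "violation" in self.state

# hypothetical safety property: no 'write' may occur after 'close'
rules = [
    (frozenset({"close"}), "closed"),
    (frozenset({"closed", "write"}), "violation"),
]
```

The monitor has a fixed number of rules and a single fact set regardless of trace length, mirroring the scalability claim: each event triggers one fixpoint computation rather than a branching tableau expansion.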

  17. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    Full Text Available With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes losses to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for the detection of counterfeit banknotes are needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing tasks on these phones. In addition, the number of smartphone users has grown greatly and continues to increase. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and the surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered, since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes are very different in terms of the printing process, the ingredients used in preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.
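    As an illustration of the feature-extraction-plus-classifier pipeline described above, the following sketch uses synthetic stand-in features (mean, standard deviation, and a gradient-based roughness proxy) and a tiny Pegasos-style linear SVM in place of the full SVM used in the study:

```python
import numpy as np

def features(patch):
    """Statistical + roughness features (hypothetical stand-ins for the paper's set)."""
    g = np.gradient(patch.astype(float))
    roughness = np.mean(np.hypot(g[0], g[1]))   # mean gradient magnitude
    return np.array([patch.mean(), patch.std(), roughness])

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Tiny Pegasos-style linear SVM (sub-gradient descent on hinge loss); y in {-1, +1}."""
    rng = np.random.default_rng(0)
    w = np.zeros(X.shape[1]); b = 0.0; t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1; eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w + b) < 1:          # margin violated: hinge step
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                                  # only regularization shrinkage
                w = (1 - eta * lam) * w
    return w, b
```

On synthetic "genuine" (smooth) versus "counterfeit" (rough) patches the roughness features alone are strongly discriminative, which is the intuition behind the paper's feature choice.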

  18. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
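    The verification criteria surveyed (verification limits, delta check against a previous result) reduce to simple release/hold rules; a minimal sketch with illustrative limits, not the survey's values:

```python
def autoverify(result, ref_range, previous=None, delta_limit=0.2):
    """Hypothetical autoverification rule: hold a result for manual review if it
    falls outside the verification limits, or if it fails a delta check against
    the patient's previous value (relative change > delta_limit)."""
    low, high = ref_range
    if not (low <= result <= high):
        return "hold"                      # outside verification limits
    if previous is not None and previous != 0:
        if abs(result - previous) / abs(previous) > delta_limit:
            return "hold"                  # delta check failed
    return "release"
```

Real systems chain many such rules (instrument flags, sample quality, cross-parameter concordance) before a result is autoverified.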

  19. Getting ready for final disposal in Finland - Independent verification of spent fuel

    International Nuclear Information System (INIS)

    Tarvainen, Matti; Honkamaa, Tapani; Martikka, Elina; Varjoranta, Tero; Hautamaeki, Johanna; Tiitta, Antero

    2001-01-01

    Full text: Final disposal of spent nuclear fuel has long been known to be the solution for the back end of the fuel cycle in Finland. This has allowed the State system for accounting and control (SSAC) to prepare for the safeguards requirements in time. The Finnish SSAC includes the operator, the State authority STUK, and the parties above them, e.g. the Ministry for Trade and Industry. Undisputed responsibility for the safe disposal of spent fuel lies with the operator. The role of the safety authority STUK is to set up detailed requirements, to inspect the operator's plans, and, using the tools of a quality-audit approach, to verify that the requirements will be complied with in practice. Responsibility for safeguards issues is similar, with the addition of the roles of the regional and international verification organizations represented by Euratom and the IAEA. As the competent safeguards authority, STUK has decided to maintain its active role also in the future. This will be reflected in increasing cooperation between the SSAC and the IAEA in the new safeguards activities related to the Additional Protocol. The role of Euratom will remain the same concerning the implementation of conventional safeguards. Based on its SSAC role, STUK has continued carrying out safeguards inspections, including independent verification measurements on spent fuel, also after joining the EU and Euratom safeguards in 1995. Verification of operator-declared data is the key verification element of safeguards. This will remain the case also under Integrated Safeguards (IS) in the future. It is believed that the importance of high-quality measurements will increase rather than decrease when the frequency of interim inspections decreases. Maintaining the continuity of knowledge makes sense only when the knowledge is reliable and independently verified.
One of the cornerstones of the high quality of the Finnish SSAC activities is

  20. High-accuracy defect sizing for CRDM penetration adapters using the ultrasonic TOFD technique

    International Nuclear Information System (INIS)

    Atkinson, I.

    1995-01-01

    Ultrasonic time-of-flight diffraction (TOFD) is the preferred technique for critical sizing of throughwall orientated defects in a wide range of components, primarily because it is intrinsically more accurate than amplitude-based techniques. For the same reason, TOFD is the preferred technique for sizing the cracks in control rod drive mechanism (CRDM) penetration adapters, which have been the subject of much recent attention. Once the considerable problem of restricted access for the UT probes has been overcome, this inspection lends itself to very high accuracy defect sizing using TOFD. In qualification trials under industrial conditions, depth sizing to an accuracy of ≤ 0.5 mm has been routinely achieved throughout the full wall thickness (16 mm) of the penetration adapters, using only a single probe pair and without recourse to signal processing. (author)

  1. Z-2 Architecture Description and Requirements Verification Results

    Science.gov (United States)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard

    2016-01-01

    The Z-2 Prototype Planetary Extravehicular Space Suit Assembly is a continuation of NASA's Z series of spacesuits. The Z-2 is another step in NASA's technology development roadmap leading to human exploration of the Martian surface. The suit was designed for maximum mobility at 8.3 psid, reduced mass, and to have high fidelity life support interfaces. As Z-2 will be man-tested at full vacuum in NASA JSC's Chamber B, it was manufactured as Class II, making it the most flight-like planetary walking suit produced to date. The Z-2 suit architecture is an evolution of previous EVA suits, namely the ISS EMU, Mark III, Rear Entry I-Suit and Z-1 spacesuits. The suit is a hybrid hard and soft multi-bearing, rear entry spacesuit. The hard upper torso (HUT) is an all-composite structure and includes a 2-bearing rolling convolute shoulder with Vernier sizing mechanism, removable suit port interface plate (SIP), elliptical hemispherical helmet and self-don/doff shoulder harness. The hatch is a hybrid aluminum and composite construction with Apollo style gas connectors, custom water pass-thru, removable hatch cage and interfaces to primary and auxiliary life support feed water bags. The suit includes Z-1 style lower arms with cam brackets for Vernier sizing and government furnished equipment (GFE) Phase VI gloves. The lower torso includes a telescopic waist sizing system, waist bearing, rolling convolute waist joint, hard brief, 2 bearing soft hip thigh, Z-1 style legs with ISS EMU style cam brackets for sizing, and conformal walking boots with ankle bearings. The Z-2 Requirements Verification Plan includes the verification of more than 200 individual requirements. The verification methods include test, analysis, inspection, demonstration or a combination of methods. Examples of unmanned requirements include suit leakage, proof pressure testing, operational life, mass, isometric man-loads, sizing adjustment ranges, internal and external interfaces such as in-suit drink bag

  2. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available Design process of computing systems gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure that encompasses definition of the specification language and denotation and execution of conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  3. Analyzing thematic maps and mapping for accuracy

    Science.gov (United States)

    Rosenfield, G.H.

    1982-01-01

    Two problems which exist while attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors by commission, and the remaining elements of the columns represent the errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification.
The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by
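    The quantities described above follow directly from the error matrix; a small sketch using the row/column convention of the text (rows = interpretation, columns = verification):

```python
import numpy as np

def accuracy_from_error_matrix(M):
    """M[i, j]: count of samples interpreted as class i and verified as class j.
    Returns overall accuracy plus per-class commission and omission error rates."""
    M = np.asarray(M, dtype=float)
    diag = np.diag(M)
    overall = diag.sum() / M.sum()          # proportion correctly classified
    commission = 1 - diag / M.sum(axis=1)   # off-diagonal row share (commission)
    omission = 1 - diag / M.sum(axis=0)     # off-diagonal column share (omission)
    return overall, commission, omission
```

For the two-class matrix [[35, 5], [10, 50]], for example, overall accuracy is 85/100 and the class-0 commission error is 5/40.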

  4. High accuracy mantle convection simulation through modern numerical methods

    KAUST Repository

    Kronbichler, Martin

    2012-08-21

    Numerical simulation of the processes in the Earth's mantle is a key piece in understanding its dynamics, composition, history and interaction with the lithosphere and the Earth's core. However, doing so presents many practical difficulties related to the numerical methods that can accurately represent these processes at relevant scales. This paper presents an overview of the state of the art in algorithms for high-Rayleigh number flows such as those in the Earth's mantle, and discusses their implementation in the Open Source code Aspect (Advanced Solver for Problems in Earth's ConvecTion). Specifically, we show how an interconnected set of methods for adaptive mesh refinement (AMR), higher order spatial and temporal discretizations, advection stabilization and efficient linear solvers can provide high accuracy at a numerical cost unachievable with traditional methods, and how these methods can be designed in a way so that they scale to large numbers of processors on compute clusters. Aspect relies on the numerical software packages deal.II and Trilinos, enabling us to focus on high level code and keeping our implementation compact. We present results from validation tests using widely used benchmarks for our code, as well as scaling results from parallel runs. © 2012 The Authors Geophysical Journal International © 2012 RAS.

  5. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations, as used in nuclear power plant quality assurance programs in other countries, was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  6. Construction of accuracy-preserving surrogate for the eigenvalue radiation diffusion and/or transport problem

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.; Abdel-Khalik, H. S. [Dept. of Nuclear Engineering, North Carolina State Univ., Raleigh, NC 27695 (United States)

    2012-07-01

    The construction of surrogate models for high fidelity models is now considered an important objective in support of all engineering activities which require repeated execution of the simulation, such as verification studies, validation exercises, and uncertainty quantification. The surrogate must be computationally inexpensive to allow its repeated execution, and must be computationally accurate in order for its predictions to be credible. This manuscript introduces a new surrogate construction approach that reduces the dimensionality of the state solution via a range-finding algorithm from linear algebra. It then employs a proper orthogonal decomposition-like approach to solve for the reduced state. The algorithm provides an upper bound on the error resulting from the reduction. Different from the state-of-the-art, the new approach allows the user to define the desired accuracy a priori which controls the maximum allowable reduction. We demonstrate the utility of this approach using an eigenvalue radiation diffusion model, where the accuracy is selected to match machine precision. Results indicate that significant reduction is possible for typical reactor assembly models, which are currently considered expensive given the need to employ very fine mesh many group calculations to ensure the highest possible fidelity for the downstream core calculations. Given the potential for significant reduction in the computational cost, we believe it is possible to rethink the manner in which homogenization theory is currently employed in reactor design calculations. (authors)
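    The range-finding step borrowed from linear algebra is typically a randomized scheme in the spirit of Halko, Martinsson and Tropp; the sketch below (an assumed variant, not the authors' exact algorithm) lets a user-chosen tolerance control the reduction, echoing the a priori error bound the abstract emphasizes:

```python
import numpy as np

def range_finder(A, tol, block=10, seed=0):
    """Randomized range finder: grow an orthonormal basis Q until the
    projection error ||A - Q Q^T A||_2 falls below the user-chosen tol."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Q = np.zeros((m, 0))
    while Q.shape[1] < min(m, n):
        # probe the range of the current residual with random Gaussian vectors
        Y = (A - Q @ (Q.T @ A)) @ rng.standard_normal((n, block))
        Q = np.linalg.qr(np.hstack([Q, np.linalg.qr(Y)[0]]))[0]
        Q = Q[:, :min(m, n)]                 # never exceed full rank
        if np.linalg.norm(A - Q @ (Q.T @ A), 2) <= tol:
            break
    return Q
```

For a numerically low-rank operator (as in the assembly models discussed), the basis stops growing once the residual meets the requested accuracy, which can be set as tightly as machine precision.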

  7. Development and Performance Verification of the GANDALF High-Resolution Transient Recorder System

    CERN Document Server

    Bartknecht, Stefan; Herrmann, Florian; Königsmann, Kay; Lauser, Louis; Schill, Christian; Schopferer, Sebastian; Wollny, Heiner

    2011-01-01

    With present-day detectors in high energy physics one often faces fast analog pulses of a few nanoseconds length which cover large dynamic ranges. In many experiments both amplitude and timing information have to be measured with high accuracy. Additionally, the data rate per readout channel can reach several MHz, which leads to high demands on the separation of pile-up pulses. For an upgrade of the COMPASS experiment at CERN we have designed the GANDALF transient recorder with a resolution of 12bit@1GS/s and an analog bandwidth of 500 MHz. Signals are digitized with high precision and processed by fast algorithms to extract pulse arrival times and amplitudes in real-time and to generate trigger signals for the experiment. With up to 16 analog channels, deep memories and a high data rate interface, this 6U-VME64x/VXS module is not only a dead-time free digitization unit but also has huge numerical capabilities provided by the implementation of a Virtex5-SXT FPGA. Fast algorithms implemented in the FPGA may b...

  8. Numerical Verification Methods for Spherical $t$-Designs

    OpenAIRE

    Chen, Xiaojun

    2009-01-01

    The construction of spherical $t$-designs with $(t+1)^2$ points on the unit sphere $S^2$ in $\mathbb{R}^3$ can be reformulated as an underdetermined system of nonlinear equations. This system is highly nonlinear and involves the evaluation of a degree $t$ polynomial in $(t+1)^4$ arguments. This paper reviews numerical verification methods using the Brouwer fixed point theorem and Krawczyk interval operator for solutions of the underdetermined system of nonlinear equations...
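    The defining quadrature property being verified, exactness of the equal-weight average for all polynomials up to degree $t$, is easy to check in floating point for a candidate point set (this is, of course, not the rigorous interval-arithmetic verification of the paper):

```python
import numpy as np
from math import gamma
from itertools import product

def sphere_average(a, b, c):
    """Exact average of the monomial x^a y^b z^c over the unit sphere S^2."""
    if a % 2 or b % 2 or c % 2:
        return 0.0                     # odd exponent: integral vanishes by symmetry
    integral = 2 * gamma((a + 1) / 2) * gamma((b + 1) / 2) * gamma((c + 1) / 2) \
        / gamma((a + b + c + 3) / 2)
    return integral / (4 * np.pi)      # divide by surface area

def is_spherical_t_design(points, t, tol=1e-12):
    """Check the defining property: for every monomial of total degree <= t,
    the equal-weight average over the points equals the sphere average."""
    pts = np.asarray(points, dtype=float)
    for a, b, c in product(range(t + 1), repeat=3):
        if a + b + c > t:
            continue
        avg = np.mean(pts[:, 0]**a * pts[:, 1]**b * pts[:, 2]**c)
        if abs(avg - sphere_average(a, b, c)) > tol:
            return False
    return True
```

The six octahedron vertices, for instance, form a spherical 3-design but not a 4-design (the point average of $x^4$ is $1/3$ while the sphere average is $1/5$).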

  9. Measurement system with high accuracy for laser beam quality.

    Science.gov (United States)

    Ke, Yi; Zeng, Ciling; Xie, Peiyuan; Jiang, Qingshan; Liang, Ke; Yang, Zhenyu; Zhao, Ming

    2015-05-20

    Presently, most laser beam quality measurement systems collimate the optical path manually, with low efficiency and low repeatability. To solve these problems, this paper proposes a new collimation method to improve the reliability and accuracy of the measurement results. The system accurately controls the position of the mirror to change the laser beam propagation direction, so that the beam is perpendicularly incident on the photosurface of the camera. The experimental results show that the proposed system has good repeatability and that the measured deviation of the M2 factor is less than 0.6%.
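    Once the beam is collimated onto the camera, M2 is commonly obtained from an ISO 11146-style caustic fit: beam radii w(z) are measured at several positions along the axis and w(z)^2 is fitted to a quadratic in z. A sketch with illustrative values (not the paper's setup):

```python
import numpy as np

def m2_from_caustic(z, w, wavelength):
    """Estimate M^2 from beam radii w(z) via the hyperbola fit
    w(z)^2 = A + B z + C z^2, giving M^2 = (pi / wavelength) * sqrt(A*C - B^2/4)."""
    C, B, A = np.polyfit(z, np.asarray(w)**2, 2)   # highest degree first
    return np.pi / wavelength * np.sqrt(A * C - B**2 / 4)
```

The combination A*C - B^2/4 is invariant under shifts of the waist position z0, which is why neither z0 nor the waist radius needs to be known in advance.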

  10. Accuracy of daily image guidance for hypofractionated liver radiotherapy with active breathing control

    International Nuclear Information System (INIS)

    Dawson, Laura A.; Eccles, Cynthia; Bissonnette, Jean-Pierre; Brock, Kristy K.

    2005-01-01

    Purpose: A six-fraction, high-precision radiotherapy protocol for unresectable liver cancer has been developed in which active breathing control (ABC) is used to immobilize the liver and daily megavoltage (MV) imaging and repositioning is used to decrease geometric uncertainties. We report the accuracy of setup in the first 20 patients consecutively treated using this approach. Methods and materials: After setup using conventional skin marks and lasers, orthogonal MV images were acquired with the liver immobilized using ABC. The images were aligned to reference digitally reconstructed radiographs using the diaphragm for craniocaudal (CC) alignment and the vertebral bodies for anterior-posterior (AP) and mediolateral (ML) alignment. Adjustments were made for positioning errors >3 mm. Verification imaging was repeated after repositioning to assess for residual positioning error. Offline image matching was conducted to determine the setup accuracy using this approach compared with the initial setup error before repositioning. Real-time beam's-eye-view MV movies containing an air-diaphragm interface were also evaluated. Results: A total of 405 images were evaluated from 20 patients. Repositioning occurred in 109 of 120 fractions because of offsets >3 mm. Three to eight beam angles, with up to four segments per field, were used for each isocenter. Breath holds of up to 27 s were used for imaging and treatment. The average time from the initial verification image to the last treatment beam was 21 min. Image guidance and repositioning reduced the population random setup errors (σ) from 6.5 mm (CC), 4.2 mm (ML), and 4.7 mm (AP) to 2.5 mm (CC), 2.8 mm (ML), and 2.9 mm (AP). The average individual random setup errors (σ) were reduced from 4.5 mm (CC), 3.2 mm (AP), and 2.5 mm (ML) to 2.2 mm (CC), 2.0 mm (AP), and 2.0 mm (ML). The standard deviation of the distribution of systematic deviations (Σ) was also reduced from 5.1 mm (CC), 3.4 mm (ML), and 3.1 mm (AP) to 1.4 mm (CC
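    The population statistics quoted above (Σ, the spread of per-patient systematic errors, and σ, the day-to-day random errors) can be estimated from daily offset logs; a sketch using one standard decomposition (the ddof bookkeeping is one reasonable choice, not necessarily the authors'):

```python
import numpy as np

def setup_error_stats(offsets_by_patient):
    """offsets_by_patient: list of 1-D arrays, daily setup offsets (mm) per patient.
    Returns (Sigma, sigma): the SD of per-patient mean errors (systematic
    component) and the pooled SD of daily deviations about each patient's
    mean (random component)."""
    means = np.array([np.mean(o) for o in offsets_by_patient])
    Sigma = np.std(means, ddof=1)
    # pooling removes one degree of freedom per patient (each mean is subtracted)
    pooled = np.concatenate([np.asarray(o) - np.mean(o) for o in offsets_by_patient])
    sigma = np.std(pooled, ddof=len(offsets_by_patient))
    return Sigma, sigma
```

Applied separately to the CC, ML, and AP offsets before and after repositioning, this reproduces the kind of σ and Σ reductions reported in the abstract.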

  11. Accuracy of High-Resolution Ultrasonography in the Detection of Extensor Tendon Lacerations.

    Science.gov (United States)

    Dezfuli, Bobby; Taljanovic, Mihra S; Melville, David M; Krupinski, Elizabeth A; Sheppard, Joseph E

    2016-02-01

    Lacerations to the extensor mechanism are usually diagnosed clinically. Ultrasound (US) has been a growing diagnostic tool for tendon injuries since the 1990s. To date, there has been no publication establishing the accuracy and reliability of US in the evaluation of extensor mechanism lacerations in the hand. The purpose of this study is to determine the accuracy of US to detect extensor tendon injuries in the hand. Sixteen fingers and 4 thumbs in 4 fresh-frozen and thawed cadaveric hands were used. Sixty-eight 0.5-cm transverse skin lacerations were created. Twenty-seven extensor tendons were sharply transected. The remaining skin lacerations were used as sham dissection controls. One US technologist and one fellowship-trained musculoskeletal radiologist performed real-time dynamic US studies in and out of a water bath. A second fellowship-trained musculoskeletal radiologist subsequently reviewed the static US images. Dynamic and static US interpretation accuracy was assessed using dissection as "truth." All 27 extensor tendon lacerations and controls were identified correctly with dynamic imaging as either injury models that had a transected extensor tendon or sham controls with intact extensor tendons (sensitivity = 100%, specificity = 100%, positive predictive value = 1.0; all significantly greater than chance). Static imaging had a sensitivity of 85%, specificity of 89%, and accuracy of 88% (all significantly greater than chance). The results of dynamic real-time versus static US imaging were clearly different but did not reach statistical significance. Diagnostic US is a very accurate noninvasive study that can identify extensor mechanism injuries. Clinically suspected cases of acute extensor tendon injury scanned by high-frequency US can aid and/or confirm the diagnosis, with dynamic imaging providing added value compared to static. Ultrasonography, to aid in the diagnosis of extensor mechanism lacerations, can be successfully used in a reliable and
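
    The diagnostic metrics quoted for static imaging come from an ordinary 2×2 confusion matrix against the dissection gold standard (27 transections, 41 intact controls). A sketch with a hypothetical split of correct and incorrect calls chosen only to land near the reported percentages, not the study's actual counts:

    ```python
    # Hypothetical reader calls for the static-imaging arm:
    # 27 true transections and 41 intact sham controls (68 lacerations total).
    tp, fn = 23, 4   # transections correctly identified / missed
    tn, fp = 36, 5   # intact tendons correctly cleared / falsely flagged

    # Standard diagnostic-accuracy definitions.
    sensitivity = tp / (tp + fn)                 # true-positive rate
    specificity = tn / (tn + fp)                 # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)   # overall agreement with dissection
    ppv = tp / (tp + fp)                         # positive predictive value
    ```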

  12. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.
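
    The data-reduction idea the abstract describes (collapse a sensitive image to a few scalar attributes, store only non-sensitive template values, and emit a yes/no result) can be illustrated with a toy sketch. The feature set and tolerance scheme here are invented for illustration; the paper's actual algorithms are not specified at this level, and any real feature would itself need vetting for sensitivity before crossing an information barrier:

    ```python
    from math import sqrt

    def image_attributes(pixels):
        """Reduce a 2-D intensity image to a few scalar features.
        (Illustrative features only; not the paper's algorithms.)"""
        flat = [v for row in pixels for v in row]
        n = len(flat)
        mean = sum(flat) / n
        var = sum((v - mean) ** 2 for v in flat) / n
        above = sum(1 for v in flat if v > mean) / n  # fractional "object" area
        return {"mean": mean, "spread": sqrt(var), "area_frac": above}

    def verify(pixels, template, tolerance=0.1):
        """Yes/no attribute check: every feature must lie within a relative
        tolerance of the stored (non-sensitive) template value. Only this
        boolean ever leaves the information barrier."""
        attrs = image_attributes(pixels)
        return all(
            abs(attrs[k] - template[k]) <= tolerance * max(abs(template[k]), 1e-9)
            for k in template
        )
    ```

    The key property is that `verify` discards the image after feature extraction: neither the image nor anything from which it could be reconstructed is stored or output.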

  13. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  14. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing so, it does not repeat the basic definitions of the various types or classes of development activities, nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112, as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP.

  15. Accuracy of axial depth of cut in micromilling operations - Simplified procedure and uncertainty model

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    2005-01-01

    , due to the easy breakage particularly when milling on hard materials [1]. Typical values for the errors on the control of the axial depth of cut are in the order of 50 microns, while the aimed depth of cut can be as low as 5 microns. The author has developed a machining procedure for optimal control...... of this investigation is the determination of the uncertainty of the set depth of cut, using the developed procedure, in a range of practical operating conditions and thereby the estimation of the expected accuracy of the method prior to verification of the machined parts....

  16. Being known, intimate, and valued: global self-verification and dyadic adjustment in couples and roommates.

    Science.gov (United States)

    Katz, Jennifer; Joiner, Thomas E

    2002-02-01

    We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience the greatest relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with, and somewhat more committed to, partners whom they perceived to evaluate them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.

  17. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  18. Elasto-plastic benchmark calculations. Step 1: verification of the numerical accuracy of the computer programs

    International Nuclear Information System (INIS)

    Corsi, F.

    1985-01-01

    In connection with the design of nuclear reactor components operating at elevated temperature, design criteria require a level of realism in the prediction of inelastic structural behaviour. This leads to the necessity of developing non-linear computer programmes and, as a consequence, to the problems of verification and qualification of these tools. Benchmark calculations allow these two actions to be carried out, while at the same time increasing confidence in the analysis of complex phenomena and in inelastic design calculations. With the financial and programmatic support of the Commission of the European Communities (CEC), a programme of elasto-plastic benchmark calculations relevant to the design of structural components for LMFBR has been undertaken by those Member States which are developing a fast reactor project. Four principal progressive aims were initially identified, which led to the decision to subdivide the benchmark effort into a series of four sequential calculation steps, Steps 1 to 4. The present document summarizes Step 1 of the benchmark exercise, derives some conclusions on Step 1 by comparing the results obtained with the various codes, and offers some concluding comments on this first action. It should be pointed out that although the work was designed to test the capabilities of the computer codes, another aim was to increase the skill of the users concerned.

  19. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
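
    The Richardson-extrapolation side of solution verification has a compact standard form: given solutions on three grids refined by a constant ratio r, the observed order of accuracy is p = ln((f3 − f2)/(f2 − f1)) / ln(r), and the extrapolated ("grid-converged") value estimates the discretization error on the finest grid. A generic sketch of that textbook procedure (not code from GBS):

    ```python
    from math import log

    def observed_order(f_coarse, f_medium, f_fine, refinement_ratio=2.0):
        """Observed order of accuracy p from solutions on three grids
        refined by a constant ratio r:
        p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)."""
        return (log((f_coarse - f_medium) / (f_medium - f_fine))
                / log(refinement_ratio))

    def richardson_extrapolate(f_medium, f_fine, p, refinement_ratio=2.0):
        """Richardson estimate of the exact value; f_fine minus this
        estimate approximates the discretization error on the fine grid."""
        r = refinement_ratio
        return f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    ```

    Comparing the observed order against the scheme's formal order (e.g. second order for centered finite differences) is the acceptance criterion in solution verification.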

  20. Enhancing the Accuracy of Advanced High Temperature Mechanical Testing through Thermography

    Directory of Open Access Journals (Sweden)

    Jonathan Jones

    2018-03-01

    This paper describes the advantages and enhanced accuracy thermography provides to high temperature mechanical testing. This technique is used not only to monitor but also to control test specimen temperatures, where the infra-red technique enables accurate non-invasive control of rapid thermal cycling for non-metallic materials. Isothermal and dynamic waveforms are employed over a 200–800 °C temperature range to pre-oxidised and coated specimens to assess the capability of the technique. This application shows thermography to be accurate to within ±2 °C of thermocouples, a standardised measurement technique. This work demonstrates the superior visibility of test temperatures, previously unobtainable by conventional thermocouples or even more modern pyrometers, that thermography can deliver. As a result, the speed and accuracy of thermal profiling, thermal gradient measurements and cold/hot spot identification using the technique have increased significantly, to the point where temperature can now be controlled by averaging over a specified area. The increased visibility of specimen temperatures has revealed additional unknown effects such as thermocouple shadowing, preferential crack tip heating within an induction coil, and the fundamental response time of individual measurement techniques, which are investigated further.